The Coherence-Based Genealogical Method (CBGM): An In-depth Analysis (Mink, Gurry, Head)

The Coherence-Based Genealogical Method (CBGM) is a relatively new but increasingly influential approach to New Testament textual criticism. Pioneered by Gerd Mink and further explored by scholars such as Peter Gurry and Peter Head, the method attempts to resolve some of the most enduring complexities in the field. It does so by using computational tools to analyze relationships between manuscripts in ways that traditional methodologies could not.

The field of New Testament textual criticism is a complex landscape filled with diverging opinions and methodologies. Each approach carries its own set of merits and drawbacks, but what they all share is a commitment to understanding the text of the New Testament as faithfully as possible. What differentiates them is the weight they assign to various forms of evidence and the presuppositions they bring to the text. No methodology is without its criticisms, and the debates between these different schools of thought enrich the discipline as a whole, pushing it continually toward a more nuanced and accurate understanding of the New Testament text.

Theoretical Foundations

The CBGM is not just an empirical method; it has a theoretical foundation that distinguishes it from more traditional approaches. It operates on the premise that no single manuscript can be considered the closest to the original text in all its readings. Instead, each manuscript may have some readings closer to the original text, while other readings may be further away. The genealogical relationships between the manuscripts are, therefore, not linear but rather multidimensional, resembling a complex web. This is a radical departure from previous methods that aimed to prioritize certain “families” of texts or individual manuscripts.

Computational Methodology

One of the defining aspects of the CBGM is its reliance on computational algorithms, which allow for a more objective assessment of the coherence among different manuscripts. Each variant reading is treated separately, and the manuscripts that contain it are analyzed for their genealogical relatedness. For a given variant, manuscripts that share the same reading are grouped together, and their mutual coherence is evaluated. The coherence score is based on how often these manuscripts agree at other points of variation throughout the text. This approach is designed to approximate how closely the manuscripts in question may be related to one another and, ultimately, to the original text.
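
To make the idea concrete, here is a minimal, hypothetical sketch of what the CBGM literature calls "pre-genealogical coherence," understood simply as the percentage of agreement between two witnesses across the variant passages where both are extant. The witness names, readings, and data structures below are invented for illustration; the actual tooling is far more elaborate.

```python
from itertools import combinations

# Hypothetical data: witness -> {variant passage: reading label}.
witnesses = {
    "W1": {"1.1": "a", "1.2": "b", "1.3": "a", "1.4": "c"},
    "W2": {"1.1": "a", "1.2": "b", "1.3": "b", "1.4": "c"},
    "W3": {"1.1": "b", "1.2": "a", "1.3": "b", "1.4": "c"},
}

def agreement(w1: dict, w2: dict) -> float:
    """Percentage of variant passages, attested by both witnesses,
    at which they share the same reading."""
    shared = [p for p in w1 if p in w2]
    if not shared:
        return 0.0
    same = sum(1 for p in shared if w1[p] == w2[p])
    return 100.0 * same / len(shared)

# Rank every pair of witnesses by their mutual agreement.
for a, b in combinations(witnesses, 2):
    print(f"{a} ~ {b}: {agreement(witnesses[a], witnesses[b]):.1f}% agreement")
```

In the method itself, such agreement percentages are only a starting point; they are subsequently combined with text-critical decisions made at each variant passage in order to estimate the likely direction of change between witnesses.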

Strengths of the CBGM

The CBGM brings a level of statistical rigor and objectivity that is often lacking in more traditional methodologies. Its data-driven approach minimizes some of the subjective elements that have long plagued New Testament textual criticism. For example, it reduces the influence of scholars’ personal preferences for certain text types or individual manuscripts. Moreover, its ability to handle a large volume of data allows for a more comprehensive analysis of the extant manuscript evidence.

Another strength is its capacity to adapt to new manuscript discoveries. As the method is computational, incorporating additional data does not require a complete overhaul of the existing analysis; rather, the new data can be more seamlessly integrated.

Criticisms and Limitations

Though innovative, the CBGM is not without its detractors. One of the main criticisms is that it heavily emphasizes internal evidence and can overlook external evidence such as patristic citations and ancient translations. Critics argue that these external witnesses can offer valuable historical context that could aid in establishing the original text.

Another concern is that while the method might be objective in its calculations, the interpretation of its results still requires subjective judgments. The ‘coherence’ between manuscripts does not automatically prove their closeness to the original text; it simply shows they are related in some way.

Lastly, the method’s complexity and reliance on specialized software mean that it is less accessible to scholars who do not have a background in computational methods or statistical analysis.

The Coherence-Based Genealogical Method is a groundbreaking contribution to the field of New Testament textual criticism, providing new tools and avenues for understanding the relationship between manuscripts. Its data-driven, computational approach offers both a fresh perspective and a set of challenges. While it brings some advantages in terms of objectivity and comprehensiveness, its limitations and criticisms suggest that it should be used in conjunction with other methodologies for a more rounded approach to textual criticism. Like all methods, it is not a panacea, but it is an important step forward in the ongoing endeavor to get as close as possible to the original text of the New Testament.

Digging Deeper into the Weaknesses and Limitations of the Coherence-Based Genealogical Method (CBGM)

The Coherence-Based Genealogical Method (CBGM) has been hailed as a revolutionary approach in New Testament textual criticism, particularly in the reconstruction of its text. While the method has its merits, especially in its attempt to apply a consistent methodology to a complex problem, it is not without its weaknesses and limitations. The following points address these concerns in depth.

Subjectivity in Identifying “Initial Text”

One of the significant concerns is the subjective nature of determining the “Initial Text” (Ausgangstext) in the CBGM methodology. Critics argue that there’s no definitive way to establish the earliest recoverable form of the text. Given that the method depends on the Initial Text as a point of reference, the element of subjectivity at this stage casts a shadow over the entire process.

Failure to Adequately Account for the Weight of Individual Manuscripts

One notable weakness of the Coherence-Based Genealogical Method (CBGM) from a documentary approach perspective is its inadequacy in giving proper weight to individual manuscripts based on their historical value, age, or textual characteristics. While the CBGM attempts to trace textual flow and assess coherence among a set of manuscripts, it largely treats manuscripts as equals when generating its genealogical relationships. This is a stark contrast to the documentary approach, which emphasizes the need to weigh manuscripts differently based on multiple criteria, including their historical context, scribal habits, and paleographic dating.

Examples:

  1. Papyrus Manuscripts: Take, for instance, the importance given to the Bodmer Papyri (P66 and P75) or the Chester Beatty Papyri (P45, P46, and P47). These papyri are dated much earlier and often preserve a text that is closer to the originals. In a documentary approach, they would receive considerable weight, especially when they agree. The CBGM's more democratic treatment of manuscripts can dilute the importance of such key witnesses.

  2. Alexandrian Manuscripts: The Alexandrian text-type, as represented by Codex Sinaiticus and Codex Vaticanus, is often given weight in the documentary approach due to its early date and consistent textual character. However, under CBGM, the coherence of these manuscripts with later Byzantine copies might lessen their individual impact on the reconstruction of the text.

  3. Major Uncials: In the documentary approach, major uncial manuscripts like Codex Bezae for the Western text-type or Codex Alexandrinus for the Byzantine text-type (in the Gospels) often have their unique characteristics examined in depth. This allows scholars to account for scribal tendencies and possible intentional changes. CBGM’s algorithmic method lacks the nuance to weigh these individual traits fully.

  4. Patristic Citations: In the documentary approach, the use of Scripture by early Church Fathers can serve as an additional witness to the text. For example, quotations from Ignatius of Antioch or Origen can offer insights into early readings. CBGM largely ignores this wealth of evidence.

  5. Lectionaries and Liturgical Manuscripts: Some manuscripts were produced for specific ecclesiastical settings and may contain unique readings that reflect early Christian worship or doctrinal emphasis. These can offer valuable insights into the reception and use of the New Testament texts in early Christian communities. Again, these considerations fall outside the purview of the CBGM’s more algorithmic approach.

  6. Textual Clusters and Families: The documentary approach often analyzes manuscripts in groups or families, like Family 1 or Family 13 in the Gospels, or the textual clusters evident in the Pauline Epistles. These familial relationships can offer crucial insights into the textual history, yet CBGM does not incorporate these group dynamics in a nuanced way.

The ability to consider the weight of individual manuscripts, as well as the unique characteristics and historical circumstances that surround them, provides a robust framework for approximating the original words of the New Testament text. This is a dimension where CBGM, in its current form, falls short, offering a more flattened, generalized view of textual relationships that may not accurately reflect the multi-faceted reality of textual transmission. In essence, while CBGM focuses on textual flow and coherence, it misses the complex interplay of factors that a documentary approach incorporates to get as close as possible to the original words of the New Testament.

Theoretical Constraints

The CBGM operates within a theoretical framework that assumes the changes in the text occur in a particular manner that can be modeled genealogically. However, the reality of how scribes copied texts is far more complicated. The copying process could involve multiple exemplars, oral traditions, or other mechanisms that do not fit neatly into the CBGM’s theoretical constraints.

Statistical Complexity

The algorithmic nature of CBGM requires a level of statistical understanding that many textual scholars do not possess. While algorithms can provide valuable insights, they are also limited by the assumptions programmed into them. This complexity can serve as a barrier to many who wish to engage with the methodology critically.

Limitation to Greek Manuscripts

The CBGM focuses on the Greek manuscripts and largely disregards translations and citations from the early Church Fathers. While the Greek manuscripts are undoubtedly crucial, ignoring these other witnesses is a serious limitation. These sources can offer perspectives that are not present in the Greek manuscripts and can serve as a counterpoint to the type of coherence that the CBGM seeks to establish.

Ambiguity in Determining “Coherence”

The very term “coherence” can be misleading or ambiguous. Coherence does not necessarily indicate historical accuracy or reliability but rather refers to consistency within a predefined set of manuscripts. Two or more texts could be coherent with each other while still being incorrect representations of the original text.

Difficulty in Addressing Conflation

Conflation, where a scribe combines readings from different textual families, poses a significant problem for the CBGM. Given its genealogical approach, the method finds it challenging to address instances where scribes have merged readings from multiple ancestors, as this goes against the assumed ‘tree-like’ lineage that the method is designed to handle.

Lack of External Criteria

The CBGM is mainly concerned with the internal evidence, focusing less on external criteria like paleography, codicology, or even patristic citations. The historical context within which a manuscript was produced can offer invaluable insights into its textual characteristics, but these are largely sidelined in the CBGM approach.

Historical Anomalies

The method does not adequately account for historical anomalies, such as the widespread adoption of a particular reading due to ecclesiastical endorsement or persecution. Such external factors can dramatically influence the transmission process, and their absence in the CBGM model marks a considerable gap.

Fails to Fully Address the Human Element

Textual transmission was not a mechanical process but one filled with human decisions. Scribal activities were influenced by theological, sociopolitical, or even personal factors, which could cause variants to appear that are not strictly genealogical in nature. The CBGM, with its mechanistic approach, does not fully capture this human element.

Dependence on Existing Editions

The CBGM methodology has often been applied to modify or validate existing critical editions of the New Testament, such as the Nestle-Aland edition. This raises the question of circular reasoning, as the methodology can be used to confirm what is already assumed in the selected ‘base text.’

While the Coherence-Based Genealogical Method represents a significant attempt to standardize the process of New Testament textual criticism, it has considerable limitations that cannot be ignored. From its inception point with the “Initial Text” to its theoretical assumptions, statistical complexity, and other constraints, the method leaves much to be desired in addressing the intricate and human process of textual transmission. Whether one views these limitations as fatal flaws or challenges to be overcome could depend on one’s presuppositions. However, what is clear is that any textual critical method, including CBGM, should not be used in isolation but rather in conjunction with a variety of approaches to reach a more balanced and nuanced understanding of the New Testament text.

Let’s delve deeper into each point with specific examples to provide a clearer understanding of the weaknesses and limitations of the Coherence-Based Genealogical Method (CBGM). Below is an example for each of the points made above.

Examples of the Above Points

Subjectivity in Identifying “Initial Text”

The CBGM begins by identifying what it calls the “Initial Text,” which is assumed to be the earliest recoverable form of the text. The problem here is the subjectivity involved in determining what this “Initial Text” should be. For example, in the case of Mark 16:9-20, some might consider the longer ending as part of the “Initial Text,” while others might disagree. This initial choice then influences the entire genealogical model that follows.

Theoretical Constraints

The CBGM operates within certain theoretical assumptions about how the text has been transmitted. For instance, it assumes a tree-like genealogy of texts. However, we have instances like the “Western Non-Interpolations” in Luke, where scribes seemed to omit rather than add material. These cases are hard to fit into a tree-like model but are better understood as examples of scribal decision-making that defy simple genealogical representation.

Statistical Complexity

The CBGM is algorithmically complex, making it difficult for those without statistical training to fully engage with the method. Consider the “pre-genealogical coherence” calculations that the method employs. These calculations are crucial for determining relationships between manuscripts, but understanding them requires more than just a cursory knowledge of statistics.
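
As an illustration of why the statistics quickly become non-trivial, the sketch below moves one step beyond simple agreement to what the method calls "genealogical coherence": once an editor has decided, at each variant passage, which reading is prior and which is derived (the "local stemma"), the method counts how often each of two witnesses attests the earlier reading where they disagree and uses that tally to rank potential ancestors. Everything here (the passages, readings, and local stemmata) is hypothetical and greatly simplified.

```python
# Hypothetical local stemmata: passage -> {reading: the reading it derives
# from}; a value of None marks the reading judged prior at that passage.
local_stemmata = {
    "1.1": {"a": None, "b": "a"},
    "1.2": {"a": None, "b": "a", "c": "b"},
    "1.3": {"a": None, "b": "a"},
}

# Hypothetical witnesses: passage -> reading attested.
w1 = {"1.1": "a", "1.2": "b", "1.3": "a"}
w2 = {"1.1": "b", "1.2": "c", "1.3": "a"}

def derives_from(reading, ancestor, stemma):
    """True if `reading` descends (directly or indirectly) from `ancestor`
    in the local stemma of a single variant passage."""
    while reading is not None:
        if reading == ancestor:
            return True
        reading = stemma.get(reading)
    return False

w1_prior = w2_prior = 0
for passage, stemma in local_stemmata.items():
    r1, r2 = w1.get(passage), w2.get(passage)
    if r1 is None or r2 is None or r1 == r2:
        continue  # skip lacunae and agreements
    if derives_from(r2, r1, stemma):
        w1_prior += 1  # W1 attests the earlier reading here
    elif derives_from(r1, r2, stemma):
        w2_prior += 1  # W2 attests the earlier reading here

print(f"W1 prior at {w1_prior} passage(s); W2 prior at {w2_prior}.")
if w1_prior != w2_prior:
    print("Potential ancestor:", "W1" if w1_prior > w2_prior else "W2")
```

Even this toy version shows that the output depends entirely on the prior text-critical decisions encoded in the local stemmata, which is precisely where critics locate the method's residual subjectivity.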

Limitation to Greek Manuscripts

The CBGM mostly ignores early translations like the Latin Vulgate or the Syriac Peshitta, as well as citations from the Church Fathers. These can serve as significant witnesses to the text. For instance, certain readings in 1 Timothy 3:16 are supported by early Latin and Syriac versions but might be overlooked if one were focusing solely on the Greek manuscripts.

Ambiguity in Determining “Coherence”

The term “coherence” is rather ambiguous in the CBGM framework. For instance, in the text of Revelation, the CBGM might find that manuscripts A and B are coherent because they share a set of similar readings. However, this coherence doesn’t necessarily imply that A and B are closer to the original text—just that they share similarities.

Difficulty in Addressing Conflation

In John 7:53-8:11, the story of the woman caught in adultery, we have evidence of conflation from different textual traditions. Some manuscripts include the passage, some don’t, and some have it in different locations. The CBGM struggles with such scenarios because its algorithm is not designed to handle readings that are the result of the blending of different traditions.

Lack of External Criteria

For manuscripts like Papyrus 66 (P66), external evidence such as its paleographic dating to around 200 C.E. gives it particular weight in textual decisions. However, the CBGM largely sets aside these external criteria, focusing more on internal coherence, which could lead to skewed results.

Historical Anomalies

Take the example of the “Comma Johanneum” (1 John 5:7-8). This Trinitarian formula was most likely inserted into the Latin Vulgate due to ecclesiastical pressure. While this is an extreme example, it illustrates how the CBGM can be ill-equipped to handle readings influenced by external historical factors.

Fails to Fully Address the Human Element

Scribal changes were not purely accidental; sometimes they were deliberate for theological or other reasons. For instance, the change from “God was manifest” to “He who was manifest” in 1 Timothy 3:16 could have been motivated by early Christological debates. The CBGM is not designed to account for such intentional changes.

Dependence on Existing Editions

The CBGM has often been applied to validate or modify existing critical editions like the Nestle-Aland. When it was used in preparing the Editio Critica Maior of the Catholic Epistles, the “Initial Text” was generally aligned with the Nestle-Aland 27th edition. This raises questions about the independence of the CBGM from existing critical texts.

By examining these examples, it becomes evident that while the CBGM is a sophisticated tool with specific advantages, it also has limitations that require careful consideration. The complexities involved in the transmission of the New Testament text mean that no single method can capture all the variables at play, making it crucial to use a multi-faceted approach in textual criticism.

About the Author

EDWARD D. ANDREWS (AS in Criminal Justice, BS in Religion, MA in Biblical Studies, and MDiv in Theology) is CEO and President of Christian Publishing House. He has authored over 220 books. In addition, Andrews is the Chief Translator of the Updated American Standard Version (UASV).
