The field of New Testament Textual Criticism is pivotal for anyone who wishes to engage deeply with the text of the New Testament. It addresses the question of what the original text of the New Testament books most likely said, considering the multitudes of manuscript copies with their variant readings. A grounded understanding of its processes and principles not only helps scholars and translators but also bolsters the faith of the average Christian, who can better appreciate the meticulous care involved in preserving the Scriptures.
Understanding the Nature of the Textual Tradition
Before diving into the methodology, it is essential to understand the nature of the New Testament’s textual tradition. Unlike other ancient documents, the New Testament has been preserved in a multiplicity of manuscripts, numbering over 5,800 in Greek alone. This multitude is both a blessing and a challenge: it provides a robust foundation for establishing the original text but also requires a sophisticated methodology to sift through the variants.
The Manuscript Evidence
We possess the New Testament text in various forms of manuscripts, including papyri, uncials, minuscules, and lectionaries. These manuscripts serve as the primary evidence in textual criticism. Each manuscript category has its own set of characteristics, dating, and provenance, making some more reliable than others depending on the context.
The Basic Process of Textual Criticism
Textual criticism involves a carefully structured process. Starting from gathering the manuscripts, scholars move on to comparing textual variants and then to establishing a critical text.
Collation and Categorization
The first step is the collation of all available manuscripts and versions of the New Testament. These texts are compared word-for-word, and variations are noted. These variations are then categorized based on manuscript types, dates, and geographical origin.
Evaluating Variants
Once the variants have been collated and categorized, the next step is to evaluate them. The purpose is to determine which reading is most likely to be original. Several criteria are used for this evaluation, which are broadly categorized into “external” and “internal” evidence.
Principles of Evaluating Variants
External Evidence
External evidence pertains to the manuscript data. The age of the manuscript, its textual family, and the geographical distribution of a particular reading all contribute to the weight of external evidence. Generally, earlier manuscripts are given more weight, as are readings that appear in multiple textual families.
Internal Evidence
Internal evidence involves the intrinsic factors of the text itself. Questions asked include: Which variant best fits the author’s style and vocabulary? Which reading is the hardest and thus less likely to be the result of scribal alteration? Here, the principle of lectio difficilior potior (“the more difficult reading is the stronger”) often applies.
The Principle of Coherence
The coherence principle suggests that a reading that coheres with a particular manuscript’s overall textual character is to be preferred. This involves a deep understanding of individual manuscripts and textual families to make an educated decision.
Weighing the Evidence
After applying these principles, scholars weigh the evidence to arrive at the most probable original reading. Often, this is more art than science, requiring a nuanced understanding of both the textual data and the historical context in which a variant might have arisen.
Establishing a Critical Text
The final step is the compilation of a critical text. This text incorporates the most likely original readings and provides an apparatus noting significant variants and the evidence supporting them.
Textual Apparatus
The textual apparatus is an essential tool in any critical edition of the New Testament. It allows the reader to see the manuscript support for each reading and offers a transparent look at the choices made by the editors.
The Role of Modern Technology
Advancements in technology have significantly impacted New Testament textual criticism. Digital humanities, databases, and advanced software have streamlined the collation and comparison process, making it easier to manage the voluminous data involved.
The Limitations and Challenges
While New Testament textual criticism provides a robust method for getting as close as possible to the original text, it is not without its limitations. The absence of the autographs (original manuscripts), the subjective element in weighing evidence, and the ever-growing number of discovered manuscripts continually challenge the field.
The field of New Testament Textual Criticism is indispensable for anyone who takes the text of the New Testament seriously. Its rigorous methodology, built on both external and internal evidence, serves to establish the most probable original readings of the text. Though the field faces various challenges, the process has been remarkably successful in providing a text that is incredibly close to the original, a fact that should inspire confidence in the reliability of the New Testament Scriptures.
Modern Approaches to New Testament Textual Criticism: An Exhaustive Exploration
The discipline of New Testament textual criticism has undergone numerous shifts and modifications over the years. To truly comprehend its multi-faceted nature, we need to scrutinize the prevailing methodologies currently employed by scholars in the field. These methodologies provide the ideological frameworks through which these scholars examine and interpret textual variants in the New Testament manuscripts.
Radical Eclecticism (G. D. Kilpatrick, J. K. Elliott)
Radical Eclecticism is a school of thought that relies primarily on internal evidence, such as grammar, style, and immediate context, to decide between textual variants. For these scholars, the text of the New Testament is almost a tabula rasa, a blank slate, since they argue that it is nearly impossible to trace the history of its textual transmission with any degree of certainty. Therefore, they consider it futile to give significant weight to any specific manuscript or text type. This methodology has been heavily criticized for sidelining external evidence, including the extant Greek manuscripts, which many believe provide crucial insights into the original text. This lack of reverence for the manuscript tradition can make the approach feel ungrounded, perhaps even whimsical, to its critics.
Reasoned Eclecticism (B. M. Metzger, K. Aland)
Reasoned Eclecticism attempts to strike a balance between internal and external evidence. This methodology informs the textual basis of the Nestle-Aland and United Bible Societies’ Greek New Testaments. Unlike Radical Eclecticism, this approach does take into consideration the text types and manuscripts but refrains from giving primacy to any one of them. However, a subtle preference can often be seen for the Alexandrian text type, a legacy of the influence exerted by Westcott and Hort in the 19th century. Critics of Reasoned Eclecticism argue that, despite its claims of neutrality, it tends to produce a sort of modern Textus Receptus—a standardized text that could potentially stifle further textual inquiry.
Reasoned Conservatism (H. A. Sturz)
Reasoned Conservatism adopts a more balanced view and argues for the early and independent nature of the main text types, tracing them back to the 2nd century C.E. Like Reasoned Eclecticism, this approach values both internal and external evidence but without a bias for the Alexandrian text type. Instead, Reasoned Conservatism champions the geographical distribution of text types and the consensus among them as markers for their originality. A unique feature of this approach is its willingness to consider the Byzantine text type as an early witness to the text of the New Testament, given that Byzantine readings have been found in early Egyptian papyri. Critics say this methodology tends to restore a text type that many view as “corrupt” to a place of undue prominence.
Radical Conservatism (Z. Hodges, A. Farstad)
Radical Conservatism places the Byzantine text type on a pedestal, asserting that it most closely represents the original New Testament text. The methodology behind this approach often involves counting manuscripts, the majority of which are Byzantine, and considering these to be closer to the autographs. This line of thinking influenced the production of the New King James Version, among other translations. The primary criticism leveled against this approach is its quantitative methodology, counting manuscripts rather than weighing their worth, which can perpetuate scribal errors if they exist in a “parent” manuscript that was copied multiple times.
The Documentary Approach (Comfort, Andrews, Wilkins)
This approach is grounded in the principle that the New Testament text has been transmitted through successive stages of copying that introduced variations, both unintentional and intentional. The Documentary Approach employs a rigorous examination of manuscripts, considering their age, geographic origin, and textual characteristics. This involves both internal and external forms of evidence. External evidence could include quotations from early Church Fathers or other ancient texts that reference the New Testament. The emphasis here is on a careful evaluation of the manuscript witnesses, taking into account both their qualitative and quantitative aspects.
The Coherence-Based Genealogical Method (CBGM): An In-depth Analysis (Mink, Gurry, Head)
The Coherence-Based Genealogical Method (CBGM) is a relatively new but increasingly influential approach to New Testament textual criticism. Pioneered by Gerd Mink and further explored by scholars such as Peter Gurry and Peter Head, this method attempts to solve some of the most enduring complexities in the field. It does so by using computational methods to analyze relationships between manuscripts in a way that traditional methodologies could not.
The field of New Testament textual criticism is a complex landscape filled with diverging opinions and methodologies. Each approach carries its own set of merits and drawbacks, but what they all share is a commitment to understanding the text of the New Testament as faithfully as possible. What differentiates them is the weight they assign to various forms of evidence and the presuppositions they bring to the text. No methodology is without its criticisms, and the debates between these different schools of thought enrich the discipline as a whole, pushing it continually toward a more nuanced and accurate understanding of the New Testament text.
Theoretical Foundations
The CBGM is not just an empirical method; it has a theoretical foundation that distinguishes it from more traditional approaches. It operates on the premise that no single manuscript can be considered the closest to the original text in all its readings. Instead, each manuscript may have some readings closer to the original text, while other readings may be further away. The genealogical relationships between the manuscripts are, therefore, not linear but rather multidimensional, resembling a complex web. This is a radical departure from previous methods that aimed to prioritize certain “families” of texts or individual manuscripts.
Computational Methodology
One of the most defining aspects of the CBGM is its reliance on computational algorithms. This allows for a more objective assessment of the coherence among different manuscripts. Each variant reading is treated separately, and the manuscripts that contain it are analyzed for their genealogical relatedness. For a given variant, manuscripts that share the same reading are grouped together and their mutual coherence is evaluated. The coherence score is based on how often these manuscripts agree on other variants throughout the text. This approach is designed to approximate how closely the manuscripts in question could be related to the original text.
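The kind of agreement calculation described above can be sketched in a few lines of code. The following is a minimal illustration only, not the actual CBGM software used for the Editio Critica Maior: the manuscript sigla ("W1," "W2," "W3") and their readings are invented for demonstration, and the real method builds genealogical analysis on top of far richer collation data.

```python
# Illustrative sketch of "pre-genealogical coherence": for each pair of
# witnesses, count how often they attest the same reading across a set
# of variant units. All data below are invented for demonstration.

# Each row: a manuscript siglum mapped to its reading ("a", "b", ...)
# at five hypothetical variant units.
collation = {
    "W1": ["a", "a", "b", "a", "a"],
    "W2": ["a", "a", "b", "b", "a"],
    "W3": ["b", "a", "a", "b", "b"],
}

def agreement(ms1, ms2):
    """Percentage of variant units at which two witnesses share a reading."""
    r1, r2 = collation[ms1], collation[ms2]
    shared = sum(1 for x, y in zip(r1, r2) if x == y)
    return 100.0 * shared / len(r1)

print(agreement("W1", "W2"))  # agree at 4 of 5 units -> 80.0
print(agreement("W1", "W3"))  # agree at 1 of 5 units -> 20.0
```

On this toy data, W1 and W2 are far more "coherent" with each other than either is with W3, which is the kind of signal the method aggregates across thousands of variant units before attempting any genealogical conclusions.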
Strengths of the CBGM
The CBGM brings a level of statistical rigor and objectivity that is often lacking in more traditional methodologies. Its data-driven approach minimizes some of the subjective elements that have long plagued New Testament textual criticism. For example, it reduces the influence of scholars’ personal preferences for certain text types or individual manuscripts. Moreover, its ability to handle a large volume of data allows for a more comprehensive analysis of the extant manuscript evidence.
Another strength is its capacity to adapt to new manuscript discoveries. As the method is computational, incorporating additional data does not require a complete overhaul of the existing analysis; rather, the new data can be more seamlessly integrated.
Criticisms and Limitations
Though innovative, the CBGM is not without its detractors. One of the main criticisms is that it heavily emphasizes the internal evidence and sometimes overlooks the external evidence like patristic citations or ancient translations. Critics argue that these external witnesses can offer valuable historical context that could aid in establishing the original text.
Another concern is that while the method might be objective in its calculations, the interpretation of its results still requires subjective judgments. The ‘coherence’ between manuscripts does not automatically prove their closeness to the original text; it simply shows they are related in some way.
Lastly, the method’s complexity and reliance on specialized software mean that it is less accessible to scholars who do not have a background in computational methods or statistical analysis.
Digging Deeper into the Weaknesses and Limitations of the Coherence-Based Genealogical Method (CBGM)
The Coherence-Based Genealogical Method (CBGM) has been hailed as a revolutionary approach in New Testament textual criticism, particularly in the reconstruction of the text of the New Testament. While the method has its merits, especially in its attempt to apply a consistent methodology to a complex problem, it is not without its weaknesses and limitations. The following points address these concerns in depth.
Subjectivity in Identifying “Initial Text”
One of the significant concerns is the subjective nature of determining the “Initial Text” (Ausgangstext) in the CBGM methodology. Critics argue that there’s no definitive way to establish the earliest recoverable form of the text. Given that the method depends on the Initial Text as a point of reference, the element of subjectivity at this stage casts a shadow over the entire process.
Theoretical Constraints
The CBGM operates within a theoretical framework that assumes the changes in the text occur in a particular manner that can be modeled genealogically. However, the reality of how scribes copied texts is far more complicated. The copying process could involve multiple exemplars, oral traditions, or other mechanisms that do not fit neatly into the CBGM’s theoretical constraints.
Statistical Complexity
The algorithmic nature of CBGM requires a level of statistical understanding that many textual scholars do not possess. While algorithms can provide valuable insights, they are also limited by the assumptions programmed into them. This complexity can serve as a barrier to many who wish to engage with the methodology critically.
Limitation to Greek Manuscripts
The CBGM focuses on the Greek manuscripts and largely disregards translations and citations from the early Church Fathers. While the Greek manuscripts are undoubtedly crucial, ignoring these other witnesses is a serious limitation. These sources can offer perspectives that are not present in the Greek manuscripts and can serve as a counterpoint to the type of coherence that the CBGM seeks to establish.
Ambiguity in Determining “Coherence”
The very term “coherence” can be misleading or ambiguous. Coherence does not necessarily indicate historical accuracy or reliability but rather refers to consistency within a predefined set of manuscripts. Two or more texts could be coherent with each other while still being incorrect representations of the original text.
Difficulty in Addressing Conflation
Conflation, where a scribe combines readings from different textual families, poses a significant problem for the CBGM. Given its genealogical approach, the method finds it challenging to address instances where scribes have merged readings from multiple ancestors, as this goes against the assumed ‘tree-like’ lineage that the method is designed to handle.
Lack of External Criteria
The CBGM is mainly concerned with the internal evidence, focusing less on external criteria like paleography, codicology, or even patristic citations. The historical context within which a manuscript was produced can offer invaluable insights into its textual characteristics, but these are largely sidelined in the CBGM approach.
Historical Anomalies
The method does not adequately account for historical anomalies, such as the widespread adoption of a particular reading due to ecclesiastical endorsement or persecution. Such external factors can dramatically influence the transmission process, and their absence in the CBGM model marks a considerable gap.
Fails to Fully Address Human Element
Textual transmission was not a mechanical process but one filled with human decisions. Scribal activities were influenced by theological, sociopolitical, or even personal factors, which could cause variants to appear that are not strictly genealogical in nature. The CBGM, with its mechanistic approach, does not fully capture this human element.
Dependence on Existing Editions
The CBGM methodology has often been applied to modify or validate existing critical editions of the New Testament, such as the Nestle-Aland edition. This raises the question of circular reasoning, as the methodology can be used to confirm what is already assumed in the selected ‘base text.’
While the Coherence-Based Genealogical Method represents a significant attempt to standardize the process of New Testament textual criticism, it has considerable limitations that cannot be ignored. From its inception point with the “Initial Text” to its theoretical assumptions, statistical complexity, and other constraints, the method leaves much to be desired in addressing the intricate and human process of textual transmission. Whether one views these limitations as fatal flaws or challenges to be overcome could depend on one’s presuppositions. However, what is clear is that any textual critical method, including CBGM, should not be used in isolation but rather in conjunction with a variety of approaches to reach a more balanced and nuanced understanding of the New Testament text.
Let’s delve deeper into each point with specific examples to provide a clearer understanding of the weaknesses and limitations of the Coherence-Based Genealogical Method (CBGM). Below are examples illustrating each of the points just made.
Examples of the Above Points
Subjectivity in Identifying “Initial Text”
The CBGM begins by identifying what it calls the “Initial Text,” which is assumed to be the earliest recoverable form of the text. The problem here is the subjectivity involved in determining what this “Initial Text” should be. For example, in the case of Mark 16:9-20, some might consider the longer ending as part of the “Initial Text,” while others might disagree. This initial choice then influences the entire genealogical model that follows.
Theoretical Constraints
The CBGM operates within certain theoretical assumptions about how the text has been transmitted. For instance, it assumes a tree-like genealogy of texts. However, we have instances like the “Western Non-Interpolations” in Luke, where scribes seemed to omit rather than add material. These cases are hard to fit into a tree-like model but are better understood as examples of scribal decision-making that defy simple genealogical representation.
Statistical Complexity
The CBGM is algorithmically complex, making it difficult for those without statistical training to fully engage with the method. Consider the “pre-genealogical coherence” calculations that the method employs. These calculations are crucial for determining relationships between manuscripts, but understanding them requires more than just a cursory knowledge of statistics.
Limitation to Greek Manuscripts
The CBGM mostly ignores early translations like the Latin Vulgate or the Syriac Peshitta, as well as citations from the Church Fathers. These can serve as significant witnesses to the text. For instance, certain readings in 1 Timothy 3:16 are supported by early Latin and Syriac versions but might be overlooked if one were focusing solely on the Greek manuscripts.
Ambiguity in Determining “Coherence”
The term “coherence” is rather ambiguous in the CBGM framework. For instance, in the text of Revelation, the CBGM might find that manuscripts A and B are coherent because they share a set of similar readings. However, this coherence doesn’t necessarily imply that A and B are closer to the original text—just that they share similarities.
Difficulty in Addressing Conflation
In John 7:53-8:11, the story of the woman caught in adultery, we have evidence of conflation from different textual traditions. Some manuscripts include the passage, some don’t, and some have it in different locations. The CBGM struggles with such scenarios because its algorithm is not designed to handle readings that are the result of the blending of different traditions.
Lack of External Criteria
For manuscripts like Papyrus 66 (P66), external evidence such as paleography dates it to around 200 C.E., which gives it a particular weight in textual decisions. However, CBGM largely ignores these external criteria, focusing more on internal coherence, which could lead to skewed results.
Historical Anomalies
Take the example of the “Comma Johanneum” (1 John 5:7-8). This Trinitarian formula was more likely inserted into the Latin Vulgate due to ecclesiastical pressure. While this is an extreme example, it illustrates how the CBGM can be ill-equipped to handle readings influenced by external historical factors.
Fails to Fully Address Human Element
Scribal changes were not purely accidental; sometimes they were deliberate for theological or other reasons. For instance, the change from “God was manifest” to “He who was manifest” in 1 Timothy 3:16 could have been motivated by early Christological debates. The CBGM is not designed to account for such intentional changes.
Dependence on Existing Editions
The CBGM has often been applied to validate or modify existing critical editions like the Nestle-Aland. When it was used in preparing the Editio Critica Maior of the Catholic Epistles, the “Initial Text” was generally aligned with the Nestle-Aland 27th edition. This raises questions about the independence of the CBGM from existing critical texts.
By examining these examples, it becomes evident that while the CBGM is a sophisticated tool with specific advantages, it also has limitations that require careful consideration. The complexities involved in the transmission of the New Testament text mean that no single method can capture all the variables at play, making it crucial to use a multi-faceted approach in textual criticism.
The Significance of Documentary Evidence in Textual Criticism
Reasoned Eclecticism vs Documentary Priority
Textual scholars often use methods like “reasoned eclecticism” or the “local-genealogical” method to determine the original text of the New Testament. However, these approaches prioritize internal evidence, such as wording and style, over external evidence, such as the manuscripts and their history. Westcott and Hort argued for the opposite: prioritizing documentary evidence to get closer to the original text.
Scholarly Consensus and Challenges
E. C. Colwell also emphasized the importance of external, or documentary, evidence. However, many scholars have ignored this call. They doubt we can construct a stemma, or family tree of manuscripts, for the New Testament, fearing that the attempt will lead to subjective decisions about which manuscripts are most accurate.
Why Documentary Evidence Matters
Documentary evidence doesn’t just offer a lineage back to the original text, but also helps us understand how various manuscripts are related. The goal is to discover which manuscripts are the closest replications of the original text. For instance, the papyrus 𝔓75 has shown that we can’t dismiss documentary methods. It’s an accurate second-century manuscript and has about an 83-percent agreement with Codex Vaticanus, a fourth-century manuscript.
Shifting Perspectives: 𝔓75 as a Game-Changer
Before 𝔓75 was discovered, many scholars thought early texts were too varied to be reliable. They believed that the Codex Vaticanus was the result of careful scholarly editing. However, 𝔓75 has changed that perspective. It’s now clear that Codex Vaticanus was not a scholarly recension but a copy of a manuscript similar to 𝔓75.
The Alexandrian Text: A Preexisting Standard
Zuntz and others thought that Alexandrian scribes slowly refined the text. But 𝔓75 proves that a clean Alexandrian text existed as early as the late second century. This means there was no long “purification process”; the text was already quite accurate.
The Western Text: An Unreliable Alternative
Some scholars argue that the 𝔓75 and Codex Vaticanus type of text is not necessarily superior to another early text type, known as the Western text. However, many critics who have worked closely with these manuscripts find 𝔓75 and B to be more reliable due to fewer errors and alterations.
In sum, I advocate for a documentary approach in textual criticism over other methods. This position often diverges from the choices made by editors of the NU text, highlighting the critical role that documentary evidence should play in reconstructing the original text of the New Testament.
About the Author
EDWARD D. ANDREWS (AS in Criminal Justice, BS in Religion, MA in Biblical Studies, and MDiv in Theology) is CEO and President of Christian Publishing House. He has authored more than 220 books. In addition, Andrews is the Chief Translator of the Updated American Standard Version (UASV).
