Linguistics Development in the 20th Century

Introduction

The study of language can be approached from two different angles: it can be studied as human behavior, or it can be explored as a social phenomenon. However, by its basic definitions, language is generally accepted to be a pathway that provides communication between individuals. In other words, language is the means that enables humans to convey a specific message. Some language scholars also view a language as a collection of sentences and utterances that can be combined to carry certain meanings (Dunlea, 2006). The definition of linguistics can therefore be summarized as the systematic scientific study of language and all of its components.


The study of language has come a long way and can be traced back many centuries (Kebbe, 2010). However, the detailed study of language began relatively recently: scholars started to explore language systematically only in the twentieth century. Before this, all attempts to study languages were isolated and restricted to particular geographical regions and districts. For example, ancient materials on the study of the Arabic language have been found; they were recognized as early forms of linguistics but focused mainly on certain aspects of the field (Dunlea, 2006). Regardless of their specific and narrow focus, these materials made a big contribution to linguistics.

Before the twentieth century, the study of language was known as philology, and scholars were called philologists; they studied language from an external perspective (Kebbe, 2010). Many scholars were interested either in characterizing language as a social phenomenon or in studying it mainly as human behavior. Those who saw language as a social phenomenon equated it to various other phenomena and believed that language was subject to changes brought about by alterations in the social environment. Philologists also studied language from the point of view of its structure, so philology paved the path for modern linguistics.

During this period, linguistics was not studied as a unified subject; instead, its sub-specialties such as semantics, phonetics, or morphology were studied separately. In the twentieth century, all the various parts of linguistics were integrated together, which enabled a more detailed study of language.

The history of linguistics in the twentieth century has been divided into various periods depending on the predominant views at the time, each period represented by a hypothesis (Kebbe, 2010). This classification first emerged in the 1950s and has greatly aided the development of linguistics and the appearance of various schools of thought. It is noteworthy that the hypotheses form a continuum, each idea leading to the next: the genetic hypothesis (1786), the regularity hypothesis (1876), the quantizing hypothesis (1916), and the exact accountability hypothesis (1957) (Hockett, 1958).

The genetic hypothesis (1786)

It is believed that there are more than 5,000 languages in the world and that they can be grouped into families according to their similarities and close origins. This idea is based on the observation that languages undergo both semantic and phonetic changes over time (Kebbe, 2010). The genetic hypothesis holds that many modern languages that appear different can be traced back to a common ancient language as an ancestor (Hockett, 1958).

Languages that share a common ancestor and can be united into a family on this basis are called cognates. The genetic hypothesis is concerned with grouping languages by origin, and most scholars who follow it are interested mainly in the European languages. It made sense for most scholars to trace the origin of the European languages, since materials written in the old languages were available. This interest continued throughout the twentieth century and into the twenty-first, and the breakthrough was significant because it led to the development of more recent ideologies.


The genetic hypothesis has its origin in the study of Sanskrit, Greek, and Latin, whose similarity had been pointed out in 1767, though without in-depth comparison or any significant analysis (Kebbe, 2010). Interest in comparative linguistics arose in 1786, after Sir William Jones declared that the languages of Europe and Sanskrit were genetically related. An in-depth comparative study of Sanskrit and the European languages then followed and led to the view that these languages shared a common origin.

Scholars began by grouping languages into families of cognates. For instance, the semantic and phonetic similarities between English and German led to the idea that the two languages are cognates, and their common language of origin was named Proto-Germanic. Further examination of other languages yielded another family, which comprised languages derived from Latin, such as Spanish, Italian, and French. Comparison of the Latin and Germanic languages indicated that they too have a common ancestor, and their parent language has been named Proto-Indo-European; this followed studies that revealed semantic and phonetic similarities between Latin, Proto-Germanic, and Sanskrit. The resulting larger grouping, uniting the Latin and Proto-Germanic branches, was named the Indo-European family.

Later, linguists who studied the Indo-European languages realized that vowel changes occurred in some related words, which gave rise to new sounds; as a result, the new words carried different meanings. These observations were summarized in Grimm's Law and Verner's Law, whose authors were well-known philologists.
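The mechanical character of such a sound law can be sketched in a few lines of code. The sketch below is purely illustrative: it applies only the classic Grimm's Law shifts of the voiceless stops (p → f, t → th, k → h), ignores the law's real conditioning environments, and uses simplified spellings of Latin cognates to stand in for the Proto-Indo-European forms; the names `GRIMMS_LAW` and `apply_sound_law` are invented for this example.

```python
# Illustrative sketch of a sound law applied exhaustively, in the spirit
# of Grimm's Law: every occurrence of an affected sound shifts.
# Only the voiceless-stop shifts are modeled, and real conditioning
# environments are ignored; spellings are simplified.
GRIMMS_LAW = {"p": "f", "t": "th", "k": "h"}

def apply_sound_law(word, law):
    """Replace every segment of `word` that the sound law covers."""
    return "".join(law.get(segment, segment) for segment in word)

# Simplified Latin forms standing in for the unshifted ancestor words.
for latin, english in [("pater", "father"), ("tres", "three"), ("kornu", "horn")]:
    print(f"{latin} -> {apply_sound_law(latin, GRIMMS_LAW)}  (cf. English {english})")
```

Running the sketch shows the regularity at work: the same mapping turns the unshifted forms into their Germanic-looking counterparts across all three words at once.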

The regularity hypothesis (1876)

By 1836, the scholars involved in studying the Germanic and Romance languages had done extensive research on their subjects, boosted by the availability of written texts, especially in Latin (Kebbe, 2010). The Germanic scholars relied on reconstructing the parent language from the medieval Germanic languages (Fasold and Connor-Linton, 2006). After this intense interest in linguistics, a more systematic and structured approach to the study emerged.

In 1875, a group of German scholars who were later nicknamed the neogrammarians founded a school; they became the first scholars to put comparative studies in a logical historical order and to formulate the regularity hypothesis (Kebbe, 2010). The hypothesis held that sound laws have no exceptions: if a particular sound changed, the change would affect all the words containing that sound, and the new sound would appear in a particular place within a specified phonetic context (Fasold and Connor-Linton, 2006). The neogrammarians stated that one of four scenarios could explain words that did not follow the rules put forth by their hypothesis.

Chance correspondence

Sometimes words from two different language families exhibit some phonetic and semantic similarity. Such words should not be considered descendants of the same language even if they have closely related meanings and pronunciations; the similarity is merely coincidental (Kebbe, 2010).

Rule deficiency

Some exceptions may occur because of a deficiency in the sound laws themselves: a rule may fail to take into consideration all the words affected by the sound in a different way. The neogrammarians suggested that in such cases the laws should be adjusted to accommodate all the words affected by the sound.


Loan words

Some words borrowed from other languages may enter a language long after sound changes have occurred. The new words will not conform to the sound system of other words in the borrowing language.

Analogical reformation

Some borrowed words may not follow the sound laws owing to their reformation: a borrowed word can be modified to fit the existing language rules without changing its sound. This may be attributed to the fact that language speakers seek to remove irregularities from the language. However, it has been noted that some word clusters may move from a regular pattern to an irregular one.

Critique of the neogrammarian hypothesis

Some linguists have criticized the regularity hypothesis developed by the neogrammarians for being overly reliant on sound changes (Kebbe, 2010). Critics argue that it is not possible for a change in sound to affect all the words containing that sound at the same time. Furthermore, some words may retain their original sound, which indicates that the sound laws are not universal. The neogrammarians also refer to geographic boundaries in their work, yet the physical span of such geographic regions is never specified.

The quantizing hypothesis (1916)

A major breakthrough in the field of linguistics occurred in 1916, following a publication by Ferdinand de Saussure that led to renewed interest in the discipline (Kebbe, 2010). The quantizing hypothesis is based on the phonetic distinctiveness of words and focuses on the differences between the distinctive and non-distinctive sounds of a language (Yule, 2005). The study of phonemes, in turn, led to the emergence of a related idea known as the phoneme hypothesis.

De Saussure also made a significant contribution to linguistics through his work on speech and language, in which he pointed out that language and speech are different concepts (Kebbe, 2010). By separating the two, he was able to study language comprehensively, and his work gave rise to the view that language consists of rules and regulations. This implies that when people learn a new language, they are learning the rules governing that language. Speech, on the other hand, is an outward expression of the learnt rules and may be modified to suit certain contexts, so language usage may be adapted to suit the listener.

The quantizing hypothesis lays emphasis on sound distinctiveness and implies that sounds vary according to where and how they are used (Kebbe, 2010); words, after all, are collections of various sounds. The study of distinctiveness has led to the identification of two types of sounds: distinctive and non-distinctive ones. De Saussure maintained that sounds must occur in a similar phonetic setting for their difference to be noticed. Following his work, it was also discovered that most sounds vary depending on the position they occupy in a word: a sound may be stressed in one position and unstressed in another. Such a variation of the same sound is not considered a distinctive parameter and is referred to in phonetics as a non-distinctive sound.

Phonemes are distinctive sets of sounds that give words their meanings; they are numerous, versatile, and present in every language. The combination of phonemes varies with the language and context. Therefore, the phoneme hypothesis is concerned with the use of sounds in particular phonetic environments.
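The standard test for distinctiveness implied here is the minimal pair: two sounds count as separate phonemes when swapping one for the other in the same phonetic setting changes the meaning. A minimal sketch of that check is shown below; the helper `is_minimal_pair` is an invented name for this example, and ordinary spellings stand in for proper phonetic transcriptions.

```python
# Illustrative minimal-pair check: two words form a minimal pair when
# they differ in exactly one segment at the same position, so the two
# contrasting sounds are distinctive (separate phonemes).
# Ordinary spelling stands in for phonetic transcription here.
def is_minimal_pair(word_a, word_b):
    """True if the words differ in exactly one segment at the same position."""
    if len(word_a) != len(word_b):
        return False
    return sum(a != b for a, b in zip(word_a, word_b)) == 1

print(is_minimal_pair("pin", "bin"))   # True: /p/ vs /b/ change the meaning
print(is_minimal_pair("pin", "pan"))   # True: the vowels contrast as well
print(is_minimal_pair("pin", "spin"))  # False: different lengths, not a minimal pair
```

The same check applied to stressed versus unstressed variants of one sound would find no meaning contrast, which is why such variation is classed as non-distinctive.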


This breakthrough is significant in the field of linguistics because it laid the foundation for a more detailed, though somewhat restrictive, hypothesis. The sound schemes of a language are not subject to rigid rules, since certain sounds have always been present while others have emerged recently. It has therefore been suggested that the sound rules should serve as guidelines in the study of linguistics and should not be allowed to interfere with our understanding of language.

The exact accountability hypothesis (1957)

The exact accountability hypothesis followed the publication of Chomsky's Syntactic Structures and intended to provide an exact view of language as understood by a native speaker (Kebbe, 2010).

The hypothesis is an attempt to study language in its entirety, in contrast with the previous approaches that focused on structure only. The new school of thought encourages the incorporation of meaning into the study of languages, which was followed by the discovery that the knowledge people have of their language influences the way they use it. Thus, the exact accountability hypothesis discourages the study of other aspects of linguistics in the absence of meaning, implying that the focus should be on a fluent speaker's mastery of the language, which has come to be known as competence.

Therefore, it can be said that competence is determined by the understanding of meaning, and competence provides an avenue through which a language can be assessed. As a result, semantics, which had previously been neglected, gained widespread acceptance as a branch of linguistics and finally stopped being considered the weakest link in the study of languages. Some early scholars had thought that semantics could not be studied objectively because it was believed not to be measurable.

Children provide an opportunity to study language competence in detail, as it is thought that by the age of five a child has mastered the native language. The child learns the vocabulary and the phonemes and stores them in memory. Once competence has been attained, a child can make almost infinite use of the stored information: generate new sentences, use them in various ways to communicate, and produce new combinations of sentences and utterances. Chomsky was of the view that every child is born with an intrinsic ability to acquire language, but this ability is not language-specific and is not restricted to the parents' native language. The hypothesis holds that a child must be provided with an evaluation procedure that enables them to choose the right grammar for their language.

The exact accountability theory has given rise to other theories and explanations of language acquisition and usage. The generative grammarians hold that there must be a universal tool available to each child at birth; it has been named the language universal. All children appear to acquire their native language within a certain period, an observation used to support the conclusion that all children are born with an ability to acquire language. In 1963, after investigating a number of languages, Greenberg came up with a generalization on this topic: he noticed that languages show similarities in spite of their divergence, on the basis of which the similarities can be grouped as absolute universals or relative trends. The language universals have been put into the following categories:

  • Substantive universals: these comprise all the grammar requirements each language needs. The requirements enable users to construct grammatically correct sentences and use them appropriately; they include vocabulary, grammatical categories, and phonological categories.
  • Formal universals: these include the rules and regulations governing the use of language, which guide the speakers of a language. For instance, when a child speaks, it is thought that in addition to choosing the vocabulary, the child also reviews the rules applicable to the language. In general, formal universals cover grammatical, phonological, and semantic rules.
  • Functional universals: these are rules that guide the application of grammatical rules. Certain rules are not applicable all the time, and the exceptions rely heavily on functional universals; in some instances, the application of a grammar rule renders a sentence grammatically incorrect.

The language universals in their original state should be approached with caution and should serve as guidelines that enable us to understand languages, since looking for similarities in totally unrelated languages may be difficult. In addition, the view that children are born with some intrinsic language abilities should be questioned because this approach tends to focus on the individual; while it is important to understand the speaker of a language, the speaker should not be allowed to take precedence.

Conclusion

Language can be studied as human behavior or as a social phenomenon. However, it is generally accepted that language is a communication pathway that enables humans to pass a specific message. Language can also be viewed as a collection of sentences and utterances that can be combined to carry certain meanings. Therefore, the definition of linguistics can be summarized as the systematic scientific study of language.

The genetic hypothesis has its origins in the study of Sanskrit, Greek, and Latin; as early as 1767, the similarity between Latin and Sanskrit had been pointed out. The genetic hypothesis holds that many modern languages that appear different can be traced back to a common ancient language. This period is therefore concerned with grouping languages based on their origin.

In 1875, a group of German scholars who were later nicknamed the neogrammarians founded a language school. They hypothesized that sound laws did not have exceptions and theorized that if sound changed, it would affect all the words containing that sound. The hypothesis postulated that the new sound would affect words in a particular place within a specified phonetic context.

The quantizing hypothesis is based on the phonetic distinctiveness of words and focuses on the differences between the distinctive and non-distinctive sounds of a language. It lays emphasis on sound distinctiveness and implies that sounds vary according to where and how they are used. The study of sound distinctiveness has led to the identification of two types of sounds: distinctive and non-distinctive ones.

The exact accountability hypothesis is an attempt to study language in its entirety, which contrasts with the previous approach that focused on structure only. The new school of thought encourages the incorporation of meaning into the study of languages, which was followed by the discovery that the knowledge people have of their language influences how they use it. Therefore, the exact accountability hypothesis discourages the study of other aspects of linguistics in the absence of meaning.

References

Dunlea, A. (2006). Vision and the Emergence of Meaning. Cambridge, UK: Cambridge University Press.

Fasold, R., and Connor-Linton, J. (2006). An Introduction to Language and Linguistics. Cambridge, UK: Cambridge University Press.

Hockett, C. (1958). A Course in Modern Linguistics. New York, NY: Macmillan Company.

Kebbe, M. (2010). Lectures in General Linguistics: An Introductory Course. Aleppo: El Hannach University Press.

Yule, G. (2005). The Study of Language. Cambridge, UK: Cambridge University Press.
