An etymological fallacy is an argument of equivocation which holds that a word is defined by its etymology, and that its customary usage is therefore incorrect.[1][2]
The ancient Greeks taught that a word's meaning could be traced across time, a practice that created a distinction between formal and informal language; a similar tradition existed among ancient Vedic scholars. In modern linguistic anthropology, semiotics and semantics are intertwined and reflect a society's culture across time. The discipline operates on the principle that current meaning derives from earlier meaning, which remains embedded in subsequent derivatives. Removing the etymological history thus removes context on which present meaning ultimately depends.
An etymological fallacy becomes possible when a word's meaning shifts over time from its original meaning. Such changes can include a narrowing or widening of scope or a change of connotation (amelioration or pejoration). In some cases, modern usage can shift to the point where the new meaning has no evident connection to its etymon.[1]
An example of a word with a potentially misleading etymology is antisemitism. The structure of the word suggests that it denotes opposition to and hatred of Semitic peoples generally, but the term was coined in the 19th century to refer specifically to anti-Jewish beliefs and practices, and its coinage explicitly defined Jewish people as a racial class. Modern anthropology and evolutionary biology overwhelmingly reject the concept of race,[3][4] and the term Semite is now rarely used outside discussions of Semitic languages. An etymological fallacy emerges when a speaker asserts that antisemitism is not restricted to hatred of Jews but must also include opposition to all other Semitic peoples.[5][6] However, sources such as Encyclopædia Britannica still describe the term as a misnomer.[7]
Original source: https://en.wikipedia.org/wiki/Etymological_fallacy