Why does the definition of one word recall other n words and m definitions?
The question is somewhat opaque, but OP is getting at this: why is the definition of a word such a complex, and potentially circular, graph of links to other definitions? Your original question, OP, was in fact about circularity.
The answer is:
- Dictionary definitions aren’t particularly concerned about rigour or non-circularity: you’re assumed as a language learner to already have a baseline understanding of the definitional human language, which you can use to bootstrap any other definitions.
- Attempts at a rigorous semantics of definitions will inevitably have to bottom out in a list of semantic primes: a set of concepts that have to be taken as givens rather than defined themselves.
- Identifying that list of primes, and using them for definitions, has not been a popular pastime. It’s work. Natural Semantic Metalanguage (NSM) is an admirable initiative in that direction.
- Unfortunately, NSM also wanted to use those primes in human-intelligible definitions. That makes things dirtier. The initial Spartan beauty of Anna Wierzbicka’s Lingua Mentalis had 14 primes; now it’s in the 60s.
- Definitions of words in NSM are a valuable discipline to get into: they really force you to break concepts down. They are also a hilariously forced subset of English.
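The circularity point above can be made concrete with a toy sketch. The mini-dictionary and the choice of "primes" below are entirely made up for illustration (they are not NSM data): each word is "defined" by a set of other words, and a definition grounds out only if every definitional chain eventually reaches a word taken as a given.

```python
# Hypothetical mini-dictionary: each word maps to the set of words
# used in its definition. "big" and "small" define each other.
TOY_DICTIONARY = {
    "big":   {"not", "small"},
    "small": {"not", "big"},
    "happy": {"feel", "good"},
}

# Words taken as givens rather than defined (stand-ins for semantic primes).
PRIMES = {"not", "feel", "good"}

def grounds_out(word, dictionary, primes, path=()):
    """True if every definitional chain from `word` ends in a prime."""
    if word in primes:
        return True
    if word in path:          # looped back to a word already on this chain:
        return False          # the definition is circular
    definition = dictionary.get(word)
    if definition is None:    # undefined and not a prime: doesn't ground out
        return False
    return all(grounds_out(w, dictionary, primes, path + (word,))
               for w in definition)

print(grounds_out("happy", TOY_DICTIONARY, PRIMES))  # True
print(grounds_out("big", TOY_DICTIONARY, PRIMES))    # False: big <-> small
```

Without the primes acting as leaves of the graph, every chain of definitions in a finite dictionary must eventually loop, which is exactly why dictionary definitions can get away with being circular and rigorous semantics cannot.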
Look into Wierzbicka’s work, OP. Even if you don’t like the approach, it’s got some excellent insights. And start with the early stuff, including Lingua Mentalis itself.