A Tangle of Words Woven by Mind...

From Chomsky's universal grammar to the FOXP2 gene, an exploration of whether human language is a unique evolutionary leap or a sophisticated statistical trick.

From so simple a beginning, endless forms most beautiful and most wonderful have been, and are being, evolved.

— Charles Darwin, On the Origin of Species

In 1860s London, a shy man with a soon-to-be-iconic beard received a letter that must have rattled him. It wasn’t threatening, but to Charles Darwin, its contents bordered on blasphemy. The sender, Alfred Russel Wallace, co-discoverer of natural selection, had written:

“Natural Selection could only have endowed the savage with a brain a little superior to that of an ape, whereas he actually possesses one but very little inferior to that of the average members of our learned societies. [...] We must therefore admit the possibility that [...] a Higher Intelligence has guided the same laws [of variation, multiplication, and survival] for nobler ends.”

No supernatural force was needed for humans to emerge. Yet, Wallace’s claim remains useful to highlight something undeniable: humans are cognitively distinct, perhaps too distinct. What makes us different? Why only us? Why did we develop the kind of intelligence that reflects on itself?

We’re not physically impressive—we can’t fly or outrun cheetahs—but no other species comes close to our cognitive and linguistic complexity. And yet, both concepts are slippery. Defining what sets human intelligence apart remains one of the great scientific and philosophical challenges.

Many cognitive scientists argue that our ability to communicate—and the cultural accumulation it enables—explains our success. According to the “language-as-communication” view, language is a refined tool, evolved under pressure, just like other animal communication systems. We’re just smarter monkeys, with quantitative—not qualitative—differences.

This view has real philosophical weight. It downplays human uniqueness. And while some find that deflating, many scientists today accept it. Still, for those craving a sense of distinction, there’s another camp.

In a room at MIT, a young linguist named Noam Chomsky responded to behaviorist B.F. Skinner’s account of language. Skinner argued that children learn language the way pigeons learn to peck a lever: through reinforcement and repetition (ideas that, decades later, would help inspire reinforcement learning in AI). But Chomsky saw a problem: children acquire grammar too quickly and accurately from limited input, an argument now known as the poverty of the stimulus.

That exchange marked the birth of modern generative linguistics. Chomsky proposed three key ideas:

  1. All human languages share fundamental principles—recursive, hierarchical rules that allow infinite meaning from finite words.
  2. Language is innate to humans; animals don’t share this “faculty of language.”
  3. Language isn’t just for communication—it supports internal thought, letting us combine concepts into complex ideas.
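
The first idea can be made concrete with a toy context-free grammar, a hypothetical illustration rather than a model of any real language: because the noun-phrase rule can embed a clause that itself contains a noun phrase, a handful of finite rules generates an unbounded set of sentences, and allowing deeper recursion strictly enlarges what the same rules can produce.

```python
import itertools

# Toy grammar: a finite rule set whose NP rule is recursive
# (an NP may embed a relative clause containing another VP/NP).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "who", "VP"]],  # second rule is recursive
    "VP": [["V", "NP"], ["V"]],
    "N":  [["scientist"], ["theory"], ["book"]],
    "V":  [["wrote"], ["discovered"], ["slept"]],
}

def expand(symbol, depth):
    """Yield every phrase derivable from `symbol` within `depth` rule expansions."""
    if symbol not in GRAMMAR:          # terminal word
        yield [symbol]
        return
    if depth == 0:                     # recursion budget exhausted
        return
    for rule in GRAMMAR[symbol]:
        # expand each symbol of the rule, then combine the alternatives
        parts = [list(expand(s, depth - 1)) for s in rule]
        for combo in itertools.product(*parts):
            yield [word for part in combo for word in part]

shallow = {" ".join(s) for s in expand("S", 4)}
deeper  = {" ".join(s) for s in expand("S", 6)}
print(len(shallow), len(deeper))  # deeper recursion yields strictly more sentences
```

Every sentence derivable at the shallower depth is still derivable at the deeper one, but not vice versa: the fully embedded “the scientist who discovered the theory wrote the book” only appears once the recursion budget allows the relative clause to take its own object.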

Supporters of the “language-of-thought” hypothesis argue that while animals have sophisticated signaling—bees dance, primates gesture—these systems lack the flexibility, depth, and hierarchy of human language. More importantly, they only serve communication, whereas ours may not.

Chomsky claimed human syntax is “overkill” for communication alone. In computational terms it is mildly context-sensitive, sitting between the context-free and context-sensitive levels of the Chomsky hierarchy, which gives it immense expressive power. Often, language prioritizes structural elegance over ease of understanding.

EEG data backs this up. When parsing sentences, the brain doesn’t process words one by one; it builds tree-like structures, grouping words into nested units. This reduces memory load and makes comprehension efficient. For instance, we say:

“The scientist who discovered the theory wrote a book”

rather than,

“The scientist wrote a book who discovered the theory”.

The latter disrupts hierarchical grouping and makes processing harder.
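
One rough way to see why, sketched here as a toy measure (the `gap` helper is hypothetical, purely for illustration): count how many words separate the relative clause from the noun it modifies. Longer dependencies mean more material to hold in memory before the pieces can be grouped.

```python
good = "the scientist who discovered the theory wrote a book".split()
bad  = "the scientist wrote a book who discovered the theory".split()

def gap(words, head, dependent):
    """Number of words intervening between a head and its dependent."""
    return abs(words.index(dependent) - words.index(head)) - 1

# In the natural order the clause sits right next to its noun; in the
# awkward order "scientist" must be held in memory across the whole verb phrase.
print(gap(good, "scientist", "who"))  # 0
print(gap(bad,  "scientist", "who"))  # 3
```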

For “language-of-thought” proponents, this syntactic ability enabled higher cognition—a qualitative leap, not a gradual change. It was an abrupt evolutionary shift that distinguishes us from other species.

And yet, the tide turned again.

In 2001, geneticists at Oxford identified mutations in a gene called FOXP2 in a family with severe speech and motor-control deficits. The media dubbed it “the gene for language”, an oversimplification, but it was a clue. The gene is deeply conserved, nearly identical across species, suggesting extreme evolutionary importance. Yet a follow-up study in Leipzig found two small amino-acid differences between humans and chimps, hinting that selection shaped our linguistic abilities and supporting a gradual tweak over a sudden leap.

This kind of evidence sparked a revival of the quantitative hypothesis. At MIT, Evelina Fedorenko’s lab found that the brain’s language network is distinct from other systems supporting logic, math, or social reasoning. Patients with language deficits can still think clearly, suggesting that language isn’t the basis of cognition.

Fedorenko and others argue our brains didn’t evolve a unified “language-of-thought” system, but rather multiple systems, each evolving incrementally. Structurally, our brains are nearly identical to chimpanzees’—just scaled up.

So which is it? Are we unique? Or just better by degree?

If you’re confused, good: you’ve understood the state of the field. There’s no consensus. Evidence points both ways, with the truth perhaps lying somewhere between a sharp qualitative leap and a slow quantitative climb.

I won’t dive into how deep learning and Large Language Models have reshaped this debate (maybe next time), but here’s something to ponder. Recent studies show human language processing may reduce to linear mappings, much like LLMs. Models like BERT and GPT blend syntax and semantics in ways eerily close to human parsing.
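
To make “reduce to linear mappings” concrete, here is a minimal, entirely synthetic sketch of the logic behind such studies: if one system’s representations really are a linear function of another’s, ordinary least squares recovers the map. Everything below is invented toy data; no real model embeddings or brain recordings are involved.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))       # 200 "stimuli" x 16 toy model dimensions
W_true = rng.normal(size=(16, 8))    # hidden linear mapping to be recovered
Y = X @ W_true + 0.01 * rng.normal(size=(200, 8))  # noisy toy "responses"

# Fit the best linear map from X to Y by least squares
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Fraction of variance in Y explained by the fitted linear map
r2 = 1 - ((Y - X @ W_hat) ** 2).sum() / ((Y - Y.mean(0)) ** 2).sum()
print(round(r2, 3))  # near 1, since the data were linear by construction
```

The interesting empirical claim in those studies is that something like this works surprisingly well between real model activations and real neural data, which is exactly what a purely linear relationship would predict.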

Still, many dismiss LLMs as “just statistical tricks.” But then—what are we? What is intelligence, if not the most sophisticated statistical trick nature has ever pulled off? Maybe our uniqueness doesn’t lie in being special. Maybe it lies in not being. In that gap between our humble evolutionary roots and the stars we now reach for.

For the complete version, check the associated web page.

Written by Jacopo Minniti