Does Language Constrain The Speed of Thought?
And a bigger question: How is our mental life constrained by speech altogether?
These questions elicited a lot of lively discussion on my Instagram account. To the broader question, Rosemary reminded me of the truth and limitations of the Sapir-Whorf Hypothesis, also known as linguistic relativity, which argues that our thought is shaped by the language we speak.
One problem with such a theory is that it tends to assume that all thought is verbal. Artists, musicians, architects, and engineers know otherwise. We can form purely visual or musical ideas which surely qualify as a form of thought or reasoning. If you've ever had an artistic encounter with another artist with whom you share no spoken language, you know that those visual ideas can be shared between people no matter what languages they speak.
Regarding the first question about the speed of thought, it occurs to me that when we are speaking, our language necessarily places upper limits on the pace at which we can roll out ideas, a problem for human-computer interfaces. An artificial intelligence can generate paragraphs in milliseconds, but it takes us a lot of time to type or say a series of ideas.
Sometimes I have the opposite problem, where my brain works a little too slowly to articulate a sentence fluently, so the result almost sounds like aphasia. I believe that for most people, our receptive capacity for language outpaces our expressive capacity. You can test this by timing how fast you can read, or by listening to an audiobook at double or triple normal speech velocity. You can demonstrate this on YouTube or your favorite podcast app by increasing the speed settings on audio playback.
Certain non-linguistic modes of thought don't seem to be limited by velocity of expression. For example, the thought that goes into solving a Rubik's Cube seems almost like instantaneous pattern recognition, and the act of puzzle solving appears to be limited only by the neuro-muscular action of the hands.
Most people are familiar with the face / vase illusion (below). Psychologists refer to it as a "bistable percept."
A bistable percept is an image that can be perceived in two different ways. The perception can switch back and forth between the two interpretations, but you only see one at a time.
Another example of a bistable percept is the Necker cube which switches from appearing above you and projecting to the right, to appearing below you and projecting to the left.
Take a look at this picture. What do you see? When you look at it again, do you see something else?
Most people see a man, off balance, running into a snowy forest. Then, after looking again, they see a dog running toward them. Some people see the dog first and have a hard time seeing the human.
What's going on is that there are two opposite streams of information processing going on in your brain. One stream is like a camera. Light enters your eye and resolves into shapes and patterns that move to the back of the brain and up through the cerebral cortex to higher level processing.
But while this is going on, the brain is constantly generating theories of what it's seeing and delivering those theories down the pipeline, optimizing what you're actually seeing to fit its dominant conception.
All along you're reality-checking the top-down theory against the information coming up the pipeline from the eyes.
If the first top-down reading doesn't continue to fit the bottom-up facts, you start generating new interpretations.
A similar process happens with auditory processing when you hear a gunshot...or was that a firecracker?...or was someone popping a paper bag? You can feel your adrenaline surge when you think it's a gunshot, and all that changes when you realize it isn't.
Weird Science of Visual Perception
This was a cover feature I proposed for ImagineFX Magazine. They never used it, but I still believe it would be an awesome subject for artists.
Birdman, oil. I used mainly four colors: viridian, permanent alizarin, yellow ochre, and cerulean.
Read my previous posts on Visual Perception
The Role of Prediction in Perception
"The mind is a prediction engine, and nowhere is this more true than in visual perception."
In his book "The Mind: Consciousness, Prediction, and the Brain," E. Bruce Goldstein details current research on the process of visual perception.
What happens when we see is that the brain creates a top-down model of the world and continually checks it against the input coming from our senses. "The mind encompasses everything we experience," he says, "and these experiences are created by the brain--often without our awareness. Experience is private; we can't know the minds of others. But we also don't know what is happening in our own minds."
We bestow our visual attention very selectively, and that hierarchy of awareness is called the attentional (or foveal) spotlight. It's like exploring a pitch-black house at night with a narrow-beamed flashlight.
The fovea is the central spot of the retina, which is packed with photoreceptors, especially color receptors. In the peripheral retina there are fewer receptors, and they tend to be more responsive to tone and movement.
As I understand it, the attentional spotlight is more than just a structural feature of our photoreceptors. It also describes an aspect of our cognitive awareness of the world around us; some would say it's a central quality of consciousness itself. We focus our attention on elements of our world that match our conscious or unconscious search parameters, or distractions that pop up, competing for attention.
Painters can capture the experience of the attentional spotlight by helping the viewer know what's important and downplaying the rest. It helps to darken, simplify, or blur areas that are less important. In the painting by Robert Blum, look at how much he downplays the peripheral areas in the foreground and below the chairs, keeping our attention within the circle of illuminated faces and hands.
Introduction to the attentional spotlight
Brain Scanners that Recognize What You Have Looked At
In recent years, brain imaging studies have been able to recognize what image a person is looking at purely from brain activity. This is possible because the image maps onto the visual cortex almost like a blurry projection.
The research examines how you remember, and imagine, pictures that you've actually seen. It turns out that similar mechanisms come into play when you imagine something compared to how you process the real thing.
The scanning system is still in its infancy, but it portends the kind of mind-reading device described by science fiction authors. "It's what you would actually use if you were going to build a functional brain-reading device," said Jack Gallant, a neuroscientist from the University of California, Berkeley.
CNN: Brain scans reveal what you've seen
Brain Inspired Podcast: Thomas Naselaris | Seeing vs. Imagining