Is it like a rough inference of what’s being said based on mouth movements, or is it more precise somehow? Would it be a mistake to think you knew exactly what was said by reading lips (even if you were good at it)?
Seinfeld had an episode that revolved around lip reading.
Thanks! I appreciate the perspective on this, as lip-reading is kinda like “eye-reading” to me in that I’ve struggled to understand what’s involved.
To put my experience into perspective, which might work for at least a few people: subtitles. I mean… I’ve never asked anyone else, but y’all aren’t just reading them, right? To me they just clarify the speech subconsciously (for the most part), rather than me reading them off the screen when I need them. Subtitles are weird… Who knows if this is accurate to my experience or similar to others’.
This also helps me understand, as I often do watch stuff with subtitles to help better follow dialogue, and I’m usually not closely reading them all throughout.
When you say subtitles do you mean closed captions? Because I agree those are a boost for me to follow what I’m also seeing and hearing the person say. But with subtitles they’re speaking a different language so lip-reading isn’t helpful and hearing just adds tone of voice.
I mix them up all the time myself
Your experience seems very much like my own. I don’t have hearing loss, but what I assume is an auditory processing issue with speech.
It’s much easier for me to understand what someone is saying when I can see their mouth and microexpressions. If my back is turned, I don’t always catch everything. Sometimes I keep hearing the “wrong thing” no matter how much I ask them to repeat it… so I just learned to repeat back whatever nonsensical thing I “heard”; either that helps me process what they were trying to say, or they repeat it slower and more clearly. It’s frustrating sometimes, especially in noisier environments with a lot of other stimuli… that’s when seeing someone’s lips helps me the most.
And of course, I love subtitles. Otherwise I have to blast the TV, and I’ll still miss things. The subtitles just clarify what I’m pretty sure I heard, or what I missed. I’m not just reading my way through everything, unless it’s in another language… which then does feel like a switch in the way I “see subtitles.”
As far as being similar to others’ experiences: I don’t have any significant hearing loss, but you basically just described my subjective experience reading lips (and subtitles).
I’ve heard that it’s easier if you’re familiar with the person, past that I’m curious too!
Hearing loss my dude. Army. Necessity is the mother of invention. It makes date night fun with my partner though. I can read and sign remote convos to her and sometimes that’s spicy/fun. It’s not perfect but I can usually follow the thread. More so if the target is animated/angry/excited. Unfortunately the best ones are the hardest. We love the first date awkward convos, the public breakups, and admissions of guilt but those tend to be subdued and difficult. When you get them though it is choice.
As a linguist, I suspect that everyone lipreads to some extent as a conversation repair mechanism. Accuracy probably depends on skill and context. Family members with hearing loss are pretty good at understanding a speaker that they can see clearly, even when there’s no sound information available at all.
It’s inference based on mouth movements, but it isn’t as rough as it seems: context plays a huge role in disambiguation, just like it does for you with homophones that you hear. It’s just that the number of words that look similar when you read lips is larger than the number of words that sound the same, since some sounds are distinguished by articulations that you can’t readily see (such as [f] vs. [v] - you won’t see the vocal folds vibrating for the latter, so “fine” and “vine” look almost* the same).
Also, the McGurk effect hints that everyone uses a bit of lip reading on an unconscious level; it’s just that for most of us [users of spoken languages] it serves to disambiguate/reinforce the acoustic signal.
*still not identical - for [v] the teeth will touch the bottom lip a bit longer.
All I know is that when people started wearing masks, suddenly I had trouble understanding them. I guess I’d picked it up subconsciously alongside my hearing loss.
The muffling of the mask doesn’t help at all
I still wear masks in public, but holy shit does it make people harder to understand when I can’t see their lips. I wanna say someone made a “clear” mask for that exact reason. Dunno exactly how or where I remember that from, but it’s a good idea.
There’s others but this looks like a good one
I can’t do it at all; I’m scouring this thread for tips. I suspect it’s pattern recognition my brain hasn’t yet been trained to do.