
How a Conversation Can Really Become a Meeting of the Minds


AA: I'm Avi Arditti with Rosanne Skirble, and this week on WORDMASTER: When two people "click," that means they really understand each other. Well, that metaphorical clicking could be the sound of what researchers call "speaker-listener neural coupling."

RS: Studies to date have largely analyzed speech production and comprehension independently within individual brains. But new research at Princeton University examines their relationship in producing successful communication.

AA: We talked with lead authors, Lauren Silbert and Greg Stephens. They began the study by recording Silbert as she lay with her head inside an fMRI -- a functional magnetic resonance imaging machine, essentially a giant magnet that scans the brain at work. She reminisced about her life.

LAUREN SILBERT: "I spent a good amount of time in the fMRI scanner telling stories, just as if I was talking to a friend, trying to communicate something about my life. We then played the recordings back to eleven different listeners while they were also individually in the scanner.

Lauren Silbert

"After the scan we have a behavioral assessment of how much they actually did understand. So then we can sort of measure this coupling between the speaker and listener in correlation with how well the communication was."

RS: "And, Greg, what did you find?"

GREG STEPHENS: "At the end of the experiment, we have a functional scan of Lauren's brain and we have a functional scan of these eleven listener brains. So then we're faced with the task of how do we assess how similar are these brain patterns, how coupled are they.

"And so the first result was that actually it turns out that there's extensive coupling between the two, which extends well beyond sort of low-level auditory areas and it goes all the way up into sort of central cortex. It involves a lot of the language areas that people have seen.

"So that was kind of the first finding, which is that there was sort of this extensive coupling between the two processes that you might have thought, naively, that they would be kind of different, the production side and the comprehension side."
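The coupling Stephens describes is, at its core, a similarity measure between the speaker's and a listener's brain-activity time series. A minimal sketch of that idea, using Pearson correlation on synthetic data (the actual study analyzed whole-brain voxelwise fMRI with temporal shifts; all signals and numbers below are illustrative assumptions):

```python
# Sketch: speaker-listener "neural coupling" as the Pearson correlation
# between a speaker's and a listener's regional activity time series.
# All data are synthetic; the real study used whole-brain fMRI scans.
import numpy as np

rng = np.random.default_rng(0)

def coupling(speaker_ts, listener_ts):
    """Pearson correlation between two time series (z-score, then average product)."""
    s = (speaker_ts - speaker_ts.mean()) / speaker_ts.std()
    l = (listener_ts - listener_ts.mean()) / listener_ts.std()
    return float(np.mean(s * l))

# A shared story-driven signal, plus listener-specific noise.
story = rng.standard_normal(300)                        # speaker's activity
good_listener = story + 0.5 * rng.standard_normal(300)  # comprehending listener
poor_listener = rng.standard_normal(300)                # no comprehension, e.g.
                                                        # an unfamiliar language

print(coupling(story, good_listener))  # well above zero
print(coupling(story, poor_listener))  # near zero
```

The second case mirrors the study's control condition: when the listener does not understand the speech, the two signals are unrelated and the measured coupling vanishes.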

AA: "So, in plain English, you're basically saying it's a real meeting of the minds."

LAUREN SILBERT: [Laughs]

Greg Stephens

GREG STEPHENS: "Basically we are saying that, actually. And, as a little bit of context, you know, what we think might be going on is simply that we're all human, we have similar brains and so that when you comprehend speech you might use similar machinery, similar algorithms as you do when you produce it, because we're all sort of using the same fundamental machine, the same brain."

RS: "What does this tell us about who we are and how we operate?"

LAUREN SILBERT: "First, I think it tells us that these two processes that we look at as opposite processes are actually not so opposite. Sort of an extrapolation from the mirror neuron hypotheses, where in order to understand what somebody is saying you have to also produce what they're saying, which, you know, could tell us a bit about empathy or whatever it is you want to think from that.

"Another thing that it tells us is that our brains don't exist in isolation. We grow up, we develop in relation to our surroundings, to people around us, and we communicate in relation to other people. We have interactive processes. We don't exist solely in isolation."

AA: "What did you find, actually, when you started looking at people speaking two different languages to each other? What did you find when you looked at their brain activity?"

RS: "And how were they any different from what your paper relates?"

LAUREN SILBERT: "Well, so in the paper we use one control where we have a Russian speaker and non-Russian speaking listeners. So in that case there's no understanding that's going on at all. And in that case we see no coupling between the brains."

AA: "Now do you think you would get the same results, for example, if -- I mean, you were just using some technical language, you were talking about mirror neurons and this and that. And now if someone maybe wasn't familiar with it who was just listening to you talk about that, do you think if you were to look at their brain and your brain you would see different patterns, showing that you weren't coupled?"

LAUREN SILBERT: "Yes, I do, actually. I think that their brains would start searching for something that I have already moved on to something else, and there would be a difference in processing."

AA: "So they can just blame their brain, right? It's not just lack of understanding."

LAUREN SILBERT: "Or, on my side, I have to bring whoever I'm speaking with into my world in order to make it as successful a communication as possible."

RS: Lauren Silbert and Greg Stephens are researchers at Princeton University in New Jersey. Their study appears in the Proceedings of the National Academy of Sciences.

AA: And that's WORDMASTER for this week. With Rosanne Skirble, I'm Avi Arditti.
