Dictation fill-in-the-blanks: write only the content of the blanks (5-10 sentences), not the full transcript. No need to number your answers. Pay attention to punctuation. Words repeated in speech because of stuttering and the like should be written only once.

Nick Bostrom is director of the Future of Humanity Institute at Oxford University. [---1---] Those are potential events that could threaten your existence and mine, and the whole human species.

Nick Bostrom: Nuclear proliferation, biological weapons, pandemic disease, despotism, various forms of social and economic collapse scenarios rank high on the list of near- to mid-term potential catastrophes.

[---2,3---]

Nick Bostrom: [---4---]

[---5---]

Nick Bostrom: [---6,7---]

If you're concerned about surviving a global catastrophe, Bostrom's best advice is to stay healthy. After all, it's still survival of the fittest.

I'm Lindsay Patterson from EarthSky, a clear voice for science. We're at EarthSky.org.

[Proudly presented by the Audio-Visual Science Group]
Answers:

[---1---] He's organized a 2008 conference for scientists to gather to discuss catastrophic risks.

[---2,3---] Humans have experienced global catastrophes in the past, Bostrom said. But modern technology has brought new potential risks.

[---4---] We have risks that arise from powerful new technologies that we might develop, such as advanced nanotechnology and superintelligent machines.

[---5---] He said the events of this century could determine the survival of our species.

[---6,7---] This critical transition period might pose the biggest existential risk for humanity that we've ever faced, because we are developing very powerful technologies that we have no experience with. And it's unclear at this point whether we have the wisdom to use these technologies to our advantage rather than to our destruction.