
Hunan Primary and Secondary School Teacher Recruitment: English Subject Written Exam Practice Paper 4

http://hunan.hteacher.net 2023-08-07 13:45 Hunan Teacher Recruitment [Teacher Exam Network]


There was a time when the major concern with AI safety had been the one evil super intelligence, reflected in the movie “The Terminator”. However, the game “Tacoma” takes a different approach. It assumes that there will be numerous AGI (artificial general intelligence) in the world and that any AGI, even a safely designed one, in the wrong hands at the wrong time could cause lives to be lost. That’s the future that a growing number of AI safety experts are worried about.

This is not a new idea. In the book “Engineering a Safer World”, MIT professor Nancy G. Leveson addresses common misunderstandings about safety-critical systems engineering: engineering systems whose failure could lead to human loss. Such safety-critical technologies include aviation, nuclear power, automobiles, heavy chemicals, biotechnology, and, of course, AGI.

So what can be done?

Technology isn’t always the solution. A famous example is the invention of sonic radars (声波雷达) that were supposed to help ships detect nearby obstacles, but which only increased the rate of accidents. Why? Captains sailed faster, thinking they could get away with it thanks to the new safety technology.

Instead of technologies, Leveson’s book suggests, we should be making organizational changes. Additionally, Leveson suggests, among many complicated guidelines, organizations should be aware that safety guidelines will inevitably become lax over time. As a consequence, measures should be carried out to prevent potential disasters.

What lessons can we draw from concern with AI safety? The answer may lie in recent disaster narratives, which remind us that, especially in times like this, we shouldn’t forget the potential for other disasters. Public conscience really does matter. And if we’re all better at thinking about safety as citizens, maybe we really can prevent disasters.

12. Why does the author mention “The Terminator” in the first paragraph?

A. To arouse readers’ interest in The Terminator.

B. To introduce the topic of concern with AI safety.

C. To mention the similarity between “The Terminator” and “Tacoma”.

D. To make readers recall the evil super intelligence reflected in the movie.

12. [Answer] B

[Analysis] Inference question. From the first sentence of paragraph 1, “There was a time when the major concern with AI safety had been the one evil super intelligence…”, and the passage as a whole, the movie “The Terminator” is mentioned in the first paragraph to introduce the topic of concern with AI safety. Therefore, B.

13. Why did the rate of ship accidents still increase after the invention of sonic radars?

A. Because captains seldom used them.

B. Because the radars failed to work properly.

C. Because captains depended on them too much.

D. Because the ships couldn’t detect nearby obstacles.

13. [Answer] C

[Analysis] Detail comprehension question. According to the last sentence of paragraph 4, “Captains sailed faster, thinking they could get away with it thanks to the new safety technology”, the accident rate increased even after sonic radars were invented because captains, believing the new safety technology would protect them, sailed faster. Therefore, C.

14. What does the underlined word “lax” in paragraph 5 mean?

A. Safe. B. Important.

C. Unreliable. D. Unnecessary.

14. [Answer] C

[Analysis] Word-meaning inference question. From the last sentence of paragraph 5, “As a consequence, measures should be carried out to prevent potential disasters”, measures should be taken against potential disasters; from this we can infer that, over time, safety guidelines will no longer be dependable. Of the options Safe, Important, Unreliable, and Unnecessary, only “Unreliable” fits the context. Therefore, C.

15. Which of the following can be the best title for the text?

A. Disaster Prevention Lessons from AI.

B. Safety problems in modern society.

C. Engineering development in modern days.

D. Future applications of artificial intelligence.

15. [Answer] A

[Analysis] Main-idea question. From the first sentence of the last paragraph, “What lessons can we draw from concern with AI safety?”, and the passage as a whole, the text mainly discusses the lessons we can draw from concern with AI safety. Therefore, A.


Editor in charge: Xinxin
