Pipixue: free question search
[Short-answer question]
In the beginning of the movie I, Robot, a robot has to decide whom to save after two cars plunge into the water: Del Spooner or a child. Even though Spooner screams "Save her! Save her!" the robot rescues him because it calculates that he has a 45 percent chance of survival compared to Sarah's 11 percent. The robot's decision and its calculated approach raise an important question: would humans make the same choice? And which choice would we want our robotic counterparts to make?

Isaac Asimov evaded the whole notion of morality in devising his three laws of robotics, which hold that (1) robots cannot harm humans or allow humans to come to harm; (2) robots must obey humans, except where an order would conflict with the first law; and (3) robots must act in self-preservation, unless doing so conflicts with the first or second law. These laws are programmed into Asimov's robots; they don't have to think, judge, or value. They don't have to like humans or believe that hurting them is wrong or bad. They simply don't do it.

The robot that rescues Spooner's life in I, Robot follows Asimov's zeroth law: robots cannot harm humanity (as opposed to individual humans) or allow humanity to come to harm, an expansion of the first law that allows robots to determine what's in the greater good. Under the first law, a robot could not harm a dangerous gunman, but under the zeroth law, a robot could kill the gunman to save others.

Whether it's possible to program a robot with safeguards such as Asimov's laws is debatable. A word such as "harm" is vague (what about emotional harm? Is replacing a human employee harm?), and abstract concepts present coding problems. The robots in Asimov's fiction expose complications and loopholes in the three laws, and even when the laws work, robots still have to assess situations. Assessing situations can be complicated: a robot has to identify the players, conditions, and possible outcomes for various scenarios.
It's doubtful that a computer program can do that, at least not without some undesirable results. A roboticist at the Bristol Robotics Laboratory programmed a robot to save human proxies called "H-bots" from danger. When one H-bot headed for danger, the robot successfully pushed it out of the way. But when two H-bots became imperiled, the robot choked 42 percent of the time, unable to decide which to save and letting them both "die." The experiment highlights the importance of morality: without it, how can a robot decide whom to save or what's best for humanity, especially if it can't calculate survival odds?
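The "calculated approach" the passage attributes to the robot can be sketched as a toy decision rule. This is a hypothetical illustration only (the function name and data layout are invented here, not taken from the film or from Asimov):

```python
# Toy sketch of the survival-odds rule described in the passage:
# save whichever victim has the higher computed chance of survival.
# choose_rescue() and the (name, probability) tuples are hypothetical.

def choose_rescue(candidates):
    """Return the candidate with the highest survival probability."""
    return max(candidates, key=lambda c: c[1])

# The scenario from the passage: Spooner at 45%, Sarah at 11%.
victims = [("Spooner", 0.45), ("Sarah", 0.11)]
print(choose_rescue(victims)[0])  # -> Spooner
```

The Bristol experiment suggests the hard part is not this one-line maximization but producing the probabilities and handling ties or near-ties, which is exactly where the real robot "choked."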
Reference answer:
Related practice:
[True/False] The average current through the diode in a single-phase half-wave rectifier equals the average current flowing through the load.
A. True
B. False
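A quick numerical check for the item above (a sketch, assuming an ideal diode, a resistive load, and a unit-amplitude sinusoidal source): in a half-wave rectifier the diode and the load are in series, so they carry the same current waveform, and its average is Im/π.

```python
import numpy as np

# Ideal single-phase half-wave rectifier with a resistive load:
# the diode conducts only on positive half-cycles, and since diode
# and load are in series they carry the *same* current. The average
# of a half-wave rectified unit sine over one period is 1/pi.
t = np.linspace(0, 2 * np.pi, 1_000_000, endpoint=False)
i_load = np.maximum(np.sin(t), 0.0)   # half-wave rectified current
avg = i_load.mean()
print(avg)  # ~0.3183, i.e. 1/pi
```

This confirms the statement is true, and it also answers the multiple-choice variant further down: ID = IO.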
[Multiple choice] Which of the following descriptions of information retention levels is incorrect? ( )
A. Basic information is information needed for facility operation; it must be retained throughout the facility's entire life cycle.
B. Legally mandated information is information generally not needed during operation, but which must be archived for a certain period when legal or contractual liabilities arise; this information does not need an explicitly specified retention period.
C. Stage-specific information is created at one stage of the facility life cycle and needed at a later stage, but is not needed for long-term operation; it must be annotated with the facility stages in which it is used.
D. Temporary information is information not needed in subsequent life-cycle stages; it need not be included in information delivery requirements.
[Multiple choice] In a single-phase half-wave rectifier circuit, if the average current through the load resistor is IO, the average current ID through the diode equals ( ).
A. 0
B. IO
C. (1/2)IO
[Short-answer question] n. transition, transformation: A. transition
[Short-answer question] For the reaction N2(g) + 3H2(g) = 2NH3(g), ΔrGΘm = 48.5 kJ·mol⁻¹ at 673 K and ΔrHΘm = −104 kJ·mol⁻¹. Estimate ΔrGΘm for this reaction at 773 K.
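A worked sketch for the question above, using the standard textbook approximation that ΔrHΘ and ΔrSΘ are roughly temperature-independent between 673 K and 773 K: first recover ΔrSΘ from ΔG = ΔH − TΔS at 673 K, then extrapolate to 773 K.

```python
# Estimate ΔrGΘm(773 K) for N2 + 3H2 = 2NH3, assuming ΔrHΘ and ΔrSΘ
# are constant over 673-773 K (an approximation, not an exact result).
dH = -104.0          # kJ/mol, given
dG_673 = 48.5        # kJ/mol, given at T1 = 673 K
T1, T2 = 673.0, 773.0

dS = (dH - dG_673) / T1   # kJ/(mol*K), from dG = dH - T*dS
dG_773 = dH - T2 * dS     # extrapolate to 773 K
print(round(dS, 4), round(dG_773, 1))  # ~ -0.2266 kJ/(mol*K), ~ 71.2 kJ/mol
```

The positive, growing ΔrGΘm reflects the large negative reaction entropy: ammonia synthesis becomes even less favorable as temperature rises.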
[True/False] A solid shaft and a hollow shaft have the same outer diameter and length and carry the same torque; the solid shaft then has the smaller relative angle of twist.
A. True
B. False
[Short-answer question] A solid shaft has diameter D1; a hollow shaft has inner diameter d2, outer diameter D2, and diameter ratio d2/D2 = 0.8. If the two shafts have the same length, material, applied torque, and angle of twist per unit length, then the weight ratio of the hollow shaft to the solid shaft is W2/W1 = ( ).
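A worked sketch for the question above: equal torque, material, and twist per unit length force the torsional rigidities GJ, and hence the polar moments J, to be equal, so D1⁴ = D2⁴(1 − 0.8⁴). With equal length and material, the weight ratio reduces to the ratio of cross-sectional areas.

```python
# Hollow vs. solid shaft: same length, material, torque, and twist per
# unit length => same G*J => equal polar moments of inertia J.
# alpha = d2/D2 = 0.8 is given in the problem.
alpha = 0.8

# J_solid = pi*D1^4/32, J_hollow = pi*D2^4*(1 - alpha^4)/32.
# Equal J  =>  (D1/D2)^4 = 1 - alpha^4
d1_over_d2 = (1 - alpha**4) ** 0.25

# Weight ratio = area ratio: W2/W1 = D2^2*(1 - alpha^2) / D1^2
w2_over_w1 = (1 - alpha**2) / d1_over_d2**2
print(round(w2_over_w1, 3))  # ~ 0.469
```

The hollow shaft weighs less than half as much for the same torsional stiffness, which is why hollow sections are preferred in torsion.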
[Short-answer question] One reaction has an activation energy of 180 kJ·mol⁻¹; another has an activation energy of 48 kJ·mol⁻¹. Under similar conditions, which of the two reactions proceeds faster, and why?
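By the Arrhenius equation, the reaction with the lower activation energy is faster (assuming comparable pre-exponential factors). The sketch below quantifies the rate-constant ratio; T = 298 K is an assumed illustrative temperature, since the question says only "similar conditions":

```python
import math

# Arrhenius comparison: k = A * exp(-Ea / (R*T)). With similar A,
# k2/k1 = exp((Ea1 - Ea2) / (R*T)). T = 298 K is assumed here purely
# for illustration.
R = 8.314                # J/(mol*K)
T = 298.0                # K, assumed
Ea1, Ea2 = 180e3, 48e3   # J/mol, given

ratio = math.exp((Ea1 - Ea2) / (R * T))  # k2 / k1
print(f"{ratio:.1e}")  # on the order of 1e23
```

At room temperature the 48 kJ·mol⁻¹ reaction is faster by roughly twenty-three orders of magnitude, which is why activation energy dominates qualitative rate comparisons.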
[Short-answer question] Glucose (C6H12O6) releases 2870 kJ·mol⁻¹ of energy on combustion, while palmitic acid (C16H32O2) releases 9790 kJ·mol⁻¹. Taking them as representatives of carbohydrates and fats respectively, how much energy does each release per gram?
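A worked sketch for the question above: divide each molar combustion energy by the molar mass, computed from standard atomic weights (C = 12.011, H = 1.008, O = 15.999).

```python
# Energy released per gram, from the given molar combustion energies.
M_glucose = 6 * 12.011 + 12 * 1.008 + 6 * 15.999    # C6H12O6, ~180.2 g/mol
M_palmitic = 16 * 12.011 + 32 * 1.008 + 2 * 15.999  # C16H32O2, ~256.4 g/mol

e_glucose = 2870 / M_glucose     # ~15.9 kJ/g
e_palmitic = 9790 / M_palmitic   # ~38.2 kJ/g
print(round(e_glucose, 1), round(e_palmitic, 1))
```

Per gram, fat stores roughly 2.4 times as much energy as carbohydrate, consistent with the familiar 4 kcal/g vs. 9 kcal/g rule of thumb.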