In this age of groundbreaking technology and fully autonomous cars, we face new and morally challenging questions. This is the future, and we have to answer them; if we don't, we risk never advancing as a species. Lauren Davis makes a similar point in her Atlantic article "Would You Pull the Trolley Switch? Does It Matter?": "you eventually reach a point where you have to make some decisions, and not everybody will agree." Whose life should be valued more in the following situation: a fully autonomous car is on a collision course with one or more pedestrians, and the only other option is to swerve and kill the passengers? Who should the car protect? In most variations of this scenario, the car's default should be programmed to swerve and sacrifice the passengers, with the exceptions of unbuckled passengers and infant or pregnant passengers. Imagine you were behind the wheel and there was no way to stop in time. You were going the speed limit. You kept your eyes on the road. You followed all the laws. Your choices would still be either hitting the pedestrians or swerving and killing yourself. Cars do not have the luxury of emotions to guide what they do; all they have are algorithms. If someone stepped in front of the car, the car should assess the safest course of action. To avoid the pedestrian at …
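The default rule described above can be made concrete as a toy sketch. This is purely illustrative, assuming a made-up `Passenger` record and `choose_action` function; no real autonomous-vehicle system exposes anything like this interface.

```python
# Illustrative sketch only: a toy version of the default described above
# (swerve and sacrifice the passengers, EXCEPT when a passenger is
# unbuckled, an infant, or pregnant). All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Passenger:
    buckled: bool = True
    is_infant: bool = False
    is_pregnant: bool = False

def choose_action(passengers):
    """Return 'swerve' (risk the passengers) by default, or 'brake'
    (stay in lane) when one of the stated exceptions applies."""
    for p in passengers:
        if not p.buckled or p.is_infant or p.is_pregnant:
            return "brake"
    return "swerve"

print(choose_action([Passenger()]))                             # swerve
print(choose_action([Passenger(), Passenger(is_infant=True)]))  # brake
```

The point of the sketch is only that the essay's rule is expressible as a fixed, emotionless algorithm, which is exactly what the paragraph claims cars are limited to.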
And I’m not sure that’s the best thing,” says Oren Etzioni, a computer scientist at the University of Washington and CEO of the Allen Institute for Artificial Intelligence. These situations are terrible, and nobody likes the thought of a car that kills, but these decisions must be made when there is no human being behind the wheel. So why not make them follow the law? When a person is driving, it is a reflex to swerve if it is safe to do so. Modeling the car's decisions on human reactions is the most ethical way to handle the swerve-or-hit dilemma with completely autonomous cars.
SUMMARY: Business reporter Drew Harwell, in the article "We Drove Cars That Can Drive Themselves — and Cost Only $20,000," published on April 5, 2016, addresses the issue of driverless cars in the 21st century and explains why we should not rely on these relatively new cars. Harwell supports his claim first by framing his argument with evidence: he explains in detail what these cars actually are, citing an experiment in which he drove cars of this type and recorded the price, miles per gallon, technology, and efficiency of each car tested. Second, he appeals to the reader's emotions.
Easterbrook explains why the problem of road fatalities is ignored by society (1). In his article "Road Kill," Gregg Easterbrook writes of "the first fundamental difference between harm because of accidents and harm because of deliberate action; the second, society’s strange assumption that traffic fatalities cannot be avoided" (1). His point is that, unlike terrorism, car crashes are not planned to cause harm. Easterbrook writes that …
You’re driving down an interstate highway in the right lane; there’s an 18-wheeler to your left. You’re driving at the 70-mph speed limit. Suddenly, four people appear in your lane in front of you. You can’t stop in time to avoid killing them.
For example, car accidents occur every day. Drivers who pass by have the choice of either pulling over to help or ignoring the scene and going on with their lives. Little do they know, the person in the accident could be severely injured, but this person …
Gregg Easterbrook, a fellow of the Brookings Institution and author of The Progress Paradox, argues in his article "Road Kill" that people in the United States are not paying attention to a major killer: our roads. In his essay, Easterbrook explains the lack of attention to the threat of road accidents, even comparing it to 9/11; these accidents have become a serious threat to Americans, and indeed to people worldwide. Many Americans dismiss the problem, even though it is enormous, simply because it is not perceived as a threat the way terrorism is. Among the causes of rising accident casualties, Easterbrook points to distracted driving and the rapid increase in horsepower. He proposes multiple solutions to the problem, such as increased legislation against distracted driving and reduced horsepower in cars (A1-4).
As a company or organization ages, its ethical standpoint can shift and adjust to the surrounding climate. Ethics can even be challenged, or vastly changed, simply by altering the viewpoint on a problem slightly, even when the final outcome is the same. The clearest example of this, a person's ethical standpoint changing with only a small shift in viewpoint, is the "Trolley Problem" first outlined by philosopher Philippa Foot. In Foot's original ethical problem, the driver of a runaway tram must make a choice: stay on route and kill five workmen ahead of him, or steer the tram onto another spur of track, killing only one man.
How to Fight Distracted Driving
by Lianna Thompson

Safer roads are a goal that virtually everyone benefits from, since we all share the road infrastructure. However, this goal can be difficult to reach, because many circumstances can put drivers in dangerous situations. Perhaps the one that has drawn the most attention of late is distracted driving. There have been countless attempts to curb this problem, yet it remains one of the leading causes of auto accidents.
Robert Peterson’s article, "Will Self Driving Cars Be Good for America?" (2016), asserts that Americans are ready for autonomous cars and that self-driving cars have many advantages over their conventional counterparts. Peterson first develops his claim by noting that Americans have used autonomous travel ever since horses pulled buggies; autonomous travel is not new, it is just better. He then supports his claim with a statistic: roughly 32,000 people die each year in vehicular accidents, and 93-95% of those accidents are caused by human error. Peterson argues that self-driving cars would decrease the number of fatalities from such accidents, as the vehicles' technology would work to avoid these tragedies, and …
It can lead to death, and in that case life is much more important than a person's …
The moral philosophy for crash decisions that seems the most ethical to me is utilitarianism. The goal of utilitarianism is to "make decisions that result in the greatest total utility, or the greatest benefit for all those affected by a decision" (Ferrell, Fraedrich, & Ferrell, 2013). When considering autonomous vehicles (AVs), who decides what the greatest benefit for all those affected is? Is it the programmer, the government, or the drivers, who will be affected by the decisions the most? There is little argument that AVs are the transportation of the future, and when deciding whether utilitarianism is the more ethical choice, you must weigh several facts.
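The utilitarian rule quoted above can be sketched as a toy calculation: pick whichever action produces the least total harm (equivalently, the greatest total utility) across everyone affected. The function name and the harm numbers below are invented for illustration; they are not from the cited textbook or from any AV system.

```python
# Hedged, illustrative sketch of the utilitarian rule: choose the action
# that minimizes total harm summed over all affected people. The harm
# values are made-up placeholders, not real measurements.

def utilitarian_choice(outcomes):
    """outcomes maps each action to a list of harms, one per affected
    person. Returns the action whose total harm is smallest."""
    return min(outcomes, key=lambda action: sum(outcomes[action]))

# Toy crash dilemma: staying in lane harms four pedestrians; swerving
# harms one passenger.
outcomes = {
    "stay":   [1.0, 1.0, 1.0, 1.0],  # four pedestrians
    "swerve": [1.0],                 # one passenger
}
print(utilitarian_choice(outcomes))  # swerve
```

The sketch also makes the essay's open question concrete: someone still has to choose the harm values, and that is exactly the programmer/government/driver question raised above.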
“It is easy to argue that people should be stopped from putting themselves in danger.” Some people may say that those in life-or-death situations deserve to pay the consequences, since they were the ones who put themselves in that position. These people should not be held responsible for their actions or the situations they are in, for the following reason from the story "The Cost of Survival": "Usually, when people need to be rescued, it is because something unexpected happened." First of all, people in these types of situations may not have known that this would occur or that they were putting themselves in danger. According to the story "The Most Dangerous Game," "he realized he had reached too far and had lost his balance …
The companies attempting to make self-driving cars today have had huge success, and this is a good thing, because they have found that if the road were occupied only by self-driving cars, it would be safer in a number of ways. One reason, stated in the article "The 3 biggest ways self-driving cars will improve our lives" by Cadie Thompson, is that "if 90% of cars were autonomous, the number of accidents would fall from 6 million a year to 1.3 million a year." This would be a drastic change and would help save thousands of lives each year. Also, in the article "3 reasons you should embrace self driving cars" by Drew Hendricks, it says that "there's no emotion involved, and certainly no distractions." This means that all distractions …
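As a quick arithmetic check of the accident statistic quoted above (a drop from 6 million to 1.3 million accidents per year if 90% of cars were autonomous):

```python
# Checking the quoted figures: how large is the projected drop in
# accidents, both in absolute terms and as a percentage?
accidents_now = 6_000_000
accidents_with_autonomy = 1_300_000

reduction = accidents_now - accidents_with_autonomy
percent = round(reduction / accidents_now * 100)

print(reduction)  # 4700000 fewer accidents per year
print(percent)    # 78 (about a 78% reduction)
```

Note that the quoted figure counts accidents, not deaths, so the lives saved would be some fraction of this reduction.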
Consider a group of schoolchildren at the corner of the street; suppose the driverless car hits a child among …
the driverless cars parked in spots where they weren't supposed to park? As robots become more popular, lawmakers will have to come up with ways to control machines and hold computers responsible. "Only four states and the District of Columbia have passed laws specific to driverless cars," says Miller. Lawyers and the cars' designers say none of these issues will prevent self-driving cars from being legalized, because current …
The ethical problem here is whether the passenger of the car …