Is there anyone to blame?

A self-driving or autonomous car is a car that does not require a driver to operate. It can use motion sensors, cameras, audio receivers, and advanced programming to move on its own, analyze its surroundings, and take appropriate action regarding its speed and direction. In other words, it is a machine that does what most cars can do these days, but also performs the actions otherwise required of a driver. All one has to do is direct the car to a destination or set a particular route, and the car will drive itself, much like a computer program that produces an output when given certain inputs.[1]

Self-driving cars sit on a scale of six levels of automation, where each level has a progressively greater degree of automated function, with Level 0 possessing the lowest degree of automation and Level 5 the highest. With improving engineering and technology, each successive level of automation becomes feasible. Currently, semi-automated cars that retain features and requirements for manual human control are being manufactured, and most cars sold fall under Levels 0-2. However, at the speed at which innovation is advancing, fully autonomous Level 5 vehicles will be available in the coming decades.[2] With their imminent arrival, the law must adapt appropriately to accommodate them without harming the interests of the stakeholders involved.
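The six-level scale referred to above follows the widely used SAE-style taxonomy. A minimal sketch of that classification, with level descriptions paraphrased for illustration rather than taken from any official wording, might look like this:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """Six levels of driving automation (0 = none, 5 = full).
    Names and comments are paraphrased for illustration only."""
    NO_AUTOMATION = 0          # human performs all driving tasks
    DRIVER_ASSISTANCE = 1      # a single assist feature, e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2     # combined steering/speed assist; driver supervises
    CONDITIONAL_AUTOMATION = 3 # car drives itself but may ask the driver to take over
    HIGH_AUTOMATION = 4        # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5        # no driver needed anywhere, under any conditions

def driver_must_supervise(level: AutomationLevel) -> bool:
    """At Levels 0-2 a human must monitor the road at all times."""
    return level <= AutomationLevel.PARTIAL_AUTOMATION

# Most cars sold today fall at or below Level 2:
print(driver_must_supervise(AutomationLevel.PARTIAL_AUTOMATION))  # True
print(driver_must_supervise(AutomationLevel.FULL_AUTOMATION))     # False
```

The legal significance of the scale tracks the `driver_must_supervise` boundary: below it, negligence doctrines aimed at human drivers still apply; above it, they begin to break down, as the article discusses.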

The importance of such vehicles comes from their revolutionary nature, which surpasses anything that mankind has seen in the field of locomotion.

Automated cars have two main advantages, among others. First, they are more inclusive: those who cannot drive due to age or disability would gain access to an efficient means of getting from place to place. Second, because they are automated and computer-driven, they are much less likely to make mistakes than human drivers. This absence of human error would make their use safer, since most road accidents are due to driving errors and lapses in concentration or judgment. Self-driving cars would therefore significantly reduce the occurrence of road accidents.[3][4] Another benefit they offer is better traffic management.[5] With speed management that adapts to the environment, efficient navigation technology, machine-learning algorithms and vehicle-to-vehicle communication, self-driving cars can not only reduce travel times but also ease traffic jams and optimize the use of road space.[6]

Along with this impressive array of benefits, however, comes a list of concerns that cannot be ignored, which is what makes these cars such a significant talking point. One of the main concerns is that they would be extremely expensive and therefore unaffordable for the general public. Additionally, with full and almost exclusive reliance on computer programming, various security issues come into play, such as susceptibility to hacking and security breaches. Added to this is a problem encountered with the advent of most machine production: the loss of jobs. For delivery workers, postal workers, taxi drivers and other road-transport workers, their jobs would effectively be replaced by machines capable of working more efficiently. From a more legal and ethical point of view, however, the question of liability must be carefully examined. With automated cars, many legal issues arise with respect to traditional traffic laws on negligence, accidents and accident liability.

Before self-driving cars are mainstreamed into production and access, authorities around the world need to ensure the law is equipped to accommodate them. For this, one of the major legal principles that deserves attention is that of negligence and the nature of liability.

Usually, with cars that rely entirely on driver input, drivers bear responsibility and liability when their actions are the primary cause of accidents or of damage to other people or property. Examples include damage caused by speeding, traffic-signal violations, drunk driving and other negligent acts, for which drivers are held responsible.[7] This is a simple way to impose accountability: since the driver controls the car, the driver should be held liable for errors causing harm.

However, the issue of responsibility begins to get complicated when the degree of control over the mechanics of the vehicle shifts from the driver to the car itself, as with self-driving cars. With a higher degree of automation it would be natural to lay the blame on the car itself, since the car controls itself. However, assigning liability to the vehicle is not straightforward. Even with automated cars, the driver can be held liable for negligence if the car is at an intermediate level of automation where features such as manual control are still present and the driver fails to take control even after being warned by the car to do so.[8] With full automation, however, holding drivers liable for negligence would be unfair to them, because they do not control the actions of the car beyond simply setting a destination or route, and their own actions therefore cannot directly harm others.[9]

With regard to automated cars, three main actors may bear liability: the driver, the car manufacturer and the developer of the automation software.[10] To identify the person responsible, the functions of each stakeholder must be well defined and distinguished. At lower levels of automation, the driver's function would be to heed warnings and to control the car in situations requiring manual control; failure to perform this function could amount to negligence. The manufacturer and the software developer could be held liable for negligence if they failed to take proper precautions in the manufacture or maintenance of their product, resulting in inherent defects. Beyond that, however, negligence does not come into play with self-driving cars, as the concept of due diligence, being an inherently human attribute, would not apply to machines.[11]

Just because self-driving cars cannot be careless does not mean they cannot cause damage or be involved in accidents. Despite their freedom from human judgment and human error, they are still prone to “machine errors”: situations where the machine's actions, although fully consistent with its software, lead to undesirable results. Illustrations include emergency scenarios where the car is forced to choose between protecting a passenger or a pedestrian, or between swerving in one direction and injuring some pedestrians or swerving in the other and injuring others.[12] Such situations do not yet have definitive answers, and researchers are looking for models that can clearly assign responsibility and identify liability.

A popular model currently followed requires drivers to keep control of the steering wheel, which gives the driver greater responsibility and accountability. But this model would not work with fully automated cars, because the driver would have almost no role to play. Other possible models include imposing strict liability on car owners, transferring liability fully to the manufacturer, sharing liability between the two, and compulsory insurance for road accidents.[13] Some researchers have even suggested adopting the system once used for horse-drawn transport, since horses were intelligent beings prone to miscalculation and misunderstanding leading to accidents, much like automated cars.[14] However, none of these models is perfect, so the search for solutions continues.

There is hope that driverless technology will progress and develop to a stage where errors and accidents no longer occur at all. But until that promise comes true, laws must adapt using various models and contingency plans. And if that is not possible, then innovation must wait for the law to catch up, because only then can innovation truly benefit mankind.

The author is a 2nd year student at West Bengal National University of Legal Sciences (WBNUJS). Opinions are personal.

[1] Synopsis, What is an autonomous car, available at (last visited December 1, 2021).

[2] Peter Y. Kim, Where We’re Going, We Don’t Need Drivers: Autonomous Vehicles and the Responsibility of AI Attendants, 69 Catholic University Law Review 343 (October 19, 2020).

[3] NYU Journal of Intellectual Property and Entertainment Law Blog, Self-driving Cars: negligence, product liability and warranties, April 20, 2018, available at (last visited December 3, 2021).

[4] Valiente Mott, Self-driving cars: pros and cons, available at (last visited December 2, 2021).

[6] Grace Chen and Vishrut Rana, Say goodbye to rush hour traffic, August 18, 2019, available at (last visited December 3, 2021).

[7] UNC School of Law, Who bears the responsibility when self-driving cars cause an accident?, October 12, 2020, available at (last visited December 3, 2021).

[9] Bogdan Cialci, Accountability for self-driving cars: getting rid of negligence?, 2019, available at (last visited December 3, 2021).

[10] Stephani R Johnson, Autonomous vehicles and new tort implications, April 11, 2019, available at (last visited December 3, 2021).

[11] NYU Journal of Intellectual Property and Entertainment Law Blog, Self-driving cars: negligence, product liability and warranties, April 20, 2018, available at (last visited December 3, 2021).

[14] Stephani R Johnson, Autonomous vehicles and new tort implications, April 11, 2019, available at (last visited December 3, 2021).