
Autonomous Vehicles

Autonomous vehicles are cars or trucks that depend on connected, intelligent components, such as cameras, radars, and steering systems, to support situational awareness and planning. The fusion of these components with intelligence is what distinguishes autonomous vehicles from conventional vehicles. 

We differentiate autonomous vehicles from autonomous transport systems. Whereas autonomous transport systems are interconnected fleets of vehicles owned by a business to service a particular need systematically, autonomous vehicles serve individual passengers (who may or may not own the vehicle). 

Autonomous vehicles are commonly divided into six levels of autonomy, from level zero to level five. The movement towards greater autonomy is impacted by technical, environmental, and regulatory or legal factors. A given vehicle may be technically capable of an advanced level of autonomy but be unable to perform at that level in a highly chaotic environment, or may be prevented from doing so by regulatory prohibition or legal risk. 

Level Zero – No Automation

At level zero, the operator performs all tasks. The vehicle has no autonomy.

Level One – Driver Assistance

At level one, the vehicle can assist with specific functions, such as applying modest braking force when the vehicle approaches an obstacle too closely. However, the vehicle operator remains responsible for accelerating, braking, and monitoring the surrounding environment. 

Level Two – Partial Automation

At level two, the vehicle can assist with steering or acceleration functions and allow the operator to disengage from some of their tasks for a limited duration. However, the operator must always be ready to take control of the vehicle and is responsible for safety-critical functions and monitoring of the environment. Many vehicle manufacturers are developing vehicles at this level.

Level Three – Conditional Automation

At level three, the vehicle controls all monitoring of the environment using sensors such as LiDAR. The operator's attention remains critical, but the operator can disengage from "safety-critical" functions such as braking and expect the vehicle to navigate safely under normal conditions. In the case of trucks, many level three vehicles require no human attention to the road at speeds under 37 miles per hour. 

Level Four – High Automation

At level four, the vehicle is capable of steering, braking, accelerating, monitoring the vehicle and environment, and responding to unexpected events in most driving conditions. At level four, the vehicle notifies the driver when conditions are safe for autonomous operation. The vehicle is then expected to operate as well as a typical human operator. However, the vehicle may request to transfer control back to the human operator under highly dynamic circumstances.

Level Five – Complete Automation

At level five, no human attention is required. Level five vehicles do not require space for an operator. Likewise, there is no need for pedals, brakes, a steering wheel or other manual controls. The autonomous vehicle system controls all critical tasks, monitoring of the environment and identification of unique operating conditions.

As noted above, it is significantly easier to reach level five automation in a controlled environment such as a mine or metro track than in a highly dynamic environment such as a city road. 
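The six levels described above can be summarized in a short sketch. The responsibility flags below paraphrase this article's descriptions; the dictionary layout and function name are illustrative, not part of any standard API:

```python
# Sketch of the six autonomy levels as described above. The level names come
# from this article; the responsibility flags paraphrase its descriptions.
LEVELS = {
    0: {"name": "No Automation",          "vehicle_monitors_environment": False, "operator_required": True},
    1: {"name": "Driver Assistance",      "vehicle_monitors_environment": False, "operator_required": True},
    2: {"name": "Partial Automation",     "vehicle_monitors_environment": False, "operator_required": True},
    3: {"name": "Conditional Automation", "vehicle_monitors_environment": True,  "operator_required": True},
    4: {"name": "High Automation",        "vehicle_monitors_environment": True,  "operator_required": True},
    5: {"name": "Complete Automation",    "vehicle_monitors_environment": True,  "operator_required": False},
}

def operator_must_monitor(level: int) -> bool:
    """Below level three, the human operator is responsible for
    monitoring the surrounding environment."""
    return not LEVELS[level]["vehicle_monitors_environment"]
```

The key transition happens at level three, where responsibility for environmental monitoring shifts from the operator to the vehicle, and at level five, where the operator disappears entirely.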

Read More

In 2016, McKinsey forecast that the self-driving vehicle market could be worth USD 1.5 trillion in 2030.

Source: McKinsey

A study, prepared by Strategy Analytics, predicts autonomous vehicles will create a massive economic opportunity that will scale from USD 800 billion in 2035 (the base year of the study) to USD 7 trillion by 2050.

Source: Strategy Analytics

The global autonomous vehicles market revenue is expected to grow at a CAGR of 39.6% during the forecast period 2017-2027 reaching USD 126.8 billion by 2027.

Source: PR Newswire

What is the business value of this IoT use case and how is it measured?

What is the business value of autonomous vehicles?

1. Improved safety: Research indicates that up to 90% of road traffic accidents are caused by driver error. Advocates for driverless vehicles argue that autonomous systems make better and faster decisions than humans. Self-driving vehicles can monitor and adapt to varying traffic and weather conditions with more diligence, speed, and safety than human drivers.

2. Lower environmental impact: Autonomous systems can be programmed to minimize their environmental impact or to achieve specific regulatory targets. This is done by, for example, optimizing acceleration and minimizing idling time. 

3. Traffic efficiency: Congestion can be reduced by coordinating the flow of vehicles. Using vehicle-to-vehicle communication, autonomous vehicles can drive safely in close proximity at high speeds. They can also reroute to avoid congestion points in road networks. The reduction in accidents will also reduce unanticipated congestion. 

4. Greater comfort: Commuters will be able to enjoy their morning coffee, read the news, participate in meetings, or catch up on sleep during their daily commute. Vehicles will be redesigned to optimize for comfort and convenience once equipment such as steering wheels and brakes is no longer required. And the erratic driving behavior of tired or stressed drivers will no longer disturb the smooth flow of traffic. 

Which technologies are used in a system and what are the critical technologies?

What sensors are typically used to enable autonomous control of vehicles?

Sensors are critical to the functioning of an autonomous vehicle. They provide dynamic environmental data that supplements system data, such as maps, to enable safe operations. There are two domains where sensors can be deployed: on the vehicle and in the environment. Over time, more sensors, such as cameras deployed on traffic lights, will be placed in the environment and provide data feeds to all vehicles in the vicinity. However, in the absence of highly aware and connected environments, autonomous vehicles rely primarily on the sensors installed in the vehicle itself.

The data from these sensors must be processed at the edge (i.e., on the vehicle) rather than in the cloud, due to the need for ultra-low latency and relatively high throughput. Sensor data is fused by a central computing system, but sensors can also be distributed across the vehicle to give sub-systems a higher degree of autonomy. In either case, the communication system must support the flow of data between subsystems.
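The data flow between sensor and planning subsystems is often organized as publish/subscribe. The sketch below is a minimal in-process illustration of that pattern; the class, topic names, and message format are invented for illustration, and a production vehicle would use a real-time protocol such as DDS rather than this toy bus:

```python
from collections import defaultdict

class Databus:
    """Minimal in-process publish/subscribe bus, sketching how sensor
    subsystems can push data to consumers on the edge. Illustrative only;
    not a real-time or networked implementation."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Register a consumer (e.g., a planning subsystem) for a topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        # Deliver a sensor sample to every subscriber of the topic.
        for callback in self._subscribers[topic]:
            callback(sample)

bus = Databus()
readings = []
bus.subscribe("lidar/points", readings.append)   # planning side
bus.publish("lidar/points", {"range_m": 12.4})   # sensor side
```

The decoupling matters: the sensor subsystem does not need to know which consumers exist, so new subsystems can be added without changing the publishers.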

There are three common sensor categories:

RADAR sensors acquire information about nearby objects, such as their distance, size, and velocity (if they are moving), and warn the driver if an imminent collision is detected. Should the driver fail to intervene within the stipulated time after the warning, the radar's input may even engage advanced steering and braking controls to prevent the crash. The high precision and weather-agnostic capabilities of radar make it a permanent fixture in autonomous vehicle prototypes, regardless of ambient conditions.

LiDAR sensors are "light-based radars" that emit invisible laser pulses and measure their return time to build a 3D profile of the car's surroundings. Unlike cameras and radars, LiDAR does not technically detect nearby objects; rather, it "profiles" them by illuminating them and analyzing the path of the reflected light. Repeated over a million times per second, this yields a high-resolution image. Because LiDAR uses emitted rather than ambient light, its operation is unimpaired by lighting conditions: it performs the same at night or during the day, in clouds or sun, in shadow or sunlight. The result is greater accuracy of perception and high resilience to interference.

Camera-based systems are either mono-vision (a single source of vision) or stereo-vision (a set of multiple, normally two, mono-vision cameras, much like human eyesight). Depending on the need, cameras may be mounted on the front grille, side mirrors, rear doors, or rear windshield. They closely monitor nearby vehicles, lane markings, speed signs, high beams, and so on, and warn the driver when the car is in danger of an imminent collision with a pedestrian or an approaching vehicle. The most advanced camera systems not only detect obstacles but also identify them and predict their immediate trajectories using advanced algorithms.
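The LiDAR principle described above, measuring the return time of an emitted pulse, reduces to a one-line time-of-flight calculation: the pulse travels to the object and back, so the round-trip time is halved. The function name is illustrative:

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to a reflecting object from the round-trip time of a
    laser pulse. The pulse covers the distance twice, hence the halving."""
    return C * round_trip_s / 2.0

# A pulse returning after 100 nanoseconds indicates an object
# roughly 15 meters away.
distance = lidar_range_m(100e-9)
```

Repeating this measurement across millions of pulses per second, each fired at a slightly different angle, is what produces the 3D point cloud described above.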

What types of analysis are typically used to transform data into actionable information?

Autonomous vehicles require subsystems that perform data fusion in order to provide localization, situational awareness, route planning, vehicle control, and other functions. This data is typically stored and searched to provide results in real time. For example, a system could include the DDS interoperability protocol, a databus to move data in real time, an administrative console, web-integrated services, a code generator, and other components.
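As a minimal sketch of the data fusion mentioned above, the snippet below combines two noisy estimates of the same quantity (say, a distance reported by radar and by a camera) using inverse-variance weighting, a standard fusion technique. The sensor values and variances are hypothetical:

```python
def fuse(estimate_a: float, var_a: float,
         estimate_b: float, var_b: float) -> tuple[float, float]:
    """Combine two independent noisy estimates of the same quantity,
    weighting each by the inverse of its variance. The fused estimate
    always has lower variance than either input."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical readings: radar reports 20.0 m (variance 0.25),
# the camera reports 21.0 m (variance 1.0).
distance, variance = fuse(20.0, 0.25, 21.0, 1.0)
```

Note how the fused estimate sits closer to the radar's value, since radar is the more confident (lower-variance) sensor in this example; a full localization stack applies the same idea recursively, e.g., in a Kalman filter.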


What business, integration, or regulatory challenges could impact deployment?

What business challenges could impact autonomous vehicle deployment?

Safety, and cybersecurity with its potential safety implications, are the most significant impediments to wide-scale deployment of autonomous vehicles. These challenges are compounded by the current lack of clarity regarding legal liability for accidents involving an autonomous vehicle. 

Stan Schneider