At first glance, it seems easy for autonomous vehicles to collect information from their sensors and act accordingly. In reality, perception is the most challenging and complex phase to model. The reason lies in the simultaneous operation of multiple sensors and the need to fuse their outputs into a single, coherent body of information. Depending on the environment in which the autonomous vehicle is driven, the volume of data to be processed can be enormous; a vehicle can generate a massive amount of data in a fraction of a second, which hints at the formidable processing power required to handle it. That is a challenge in its own right, but for the time being, let's stick to the interpretation of perception signals and their influence on the control of the autonomous vehicle. To understand how perception signals are translated into commands that steer the vehicle's control system, it is essential first to understand how those signals are interpreted.
From the autonomous vehicle’s perspective, the perception model is a combination of different sensors. These sensors sense the external environment and pass the information to a translation block, or model, known as the interpreter. Since the data obtained can be scattered, the interpretation block not only processes the information but also fine-tunes it into a more discrete, understandable form.
For example, consider a situation where an autonomous vehicle’s perception has detected an obstacle just 10 m ahead, and a collision is possible. Based on this information, the emergency brake system is triggered, and the vehicle eventually stops to avoid the collision. Notice that the discrete piece of information, the distance being 10 metres, was not conveyed by the sensor or perception model itself. Instead, the sensor detected the obstacle and passed its raw readings to the interpreter. The interpreter then filtered the data and processed it, using the relationship between speed and time, to compute the distance of 10 metres. This is how interpretation works in autonomous vehicles.
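The kind of computation the interpreter performs here can be sketched in a few lines. The snippet below is a minimal, hypothetical example assuming a radar-style time-of-flight sensor: distance follows from how long an emitted pulse takes to echo back, and an emergency-brake action is raised when the result violates a safe-distance threshold. The function names and the threshold are illustrative assumptions, not any vendor’s actual API.

```python
# Hypothetical interpreter step: turn a raw radar echo time into a distance
# and decide whether an emergency brake is warranted. Illustrative only.

SPEED_OF_LIGHT_M_S = 3.0e8   # propagation speed of the radar pulse (m/s)
SAFE_DISTANCE_M = 15.0       # assumed safe-distance threshold

def distance_from_echo(round_trip_time_s: float) -> float:
    """The pulse travels to the obstacle and back, so halve the path."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

def interpret_echo(round_trip_time_s: float) -> str:
    """Map the computed distance onto a control action."""
    distance_m = distance_from_echo(round_trip_time_s)
    return "emergency_brake" if distance_m < SAFE_DISTANCE_M else "no_action"

# An echo returning after roughly 66.7 nanoseconds implies about 10 m,
# which violates the safe-distance threshold, so braking is requested.
action = interpret_echo(66.7e-9)
```

The point of the sketch is the division of labour: the sensor supplies only the echo time, and it is the interpreter that turns it into the discrete "10 m" figure the control logic acts on.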
However, depending on the features and working model of different autonomous vehicles, the information processing and algorithms may differ. The general essence of the interpretation model nevertheless remains the same.
It is said that humans take about 0.15 seconds to a few seconds to decide and act in the event of a road calamity. This time covers analysing the situation, deciding on a course of action, and then acting to control the unfavourable circumstances. Critics note that this is not the same in every case, and it may extend to several seconds. In this light, it would not be wrong to say that the time taken to generate and interpret the perceptions, plus the time for the vehicle controls to act on those interpretations, should be close to this human perception-reaction time.
When the perception model detects something, it sends the collected data to the interpreter model. There, the information is first filtered and fine-tuned for better processing. The discrete data is then matched against the underlying conditions: does it correspond to any of the triggering values, say a violation of the safe distance from surrounding objects? If the received information demands some contingency action, it is translated into a format that the defined algorithms and processes can understand. The relevant algorithm is then called, and the corresponding control mechanisms are triggered to perform their designated operations. That is the overview of how a perception signal reaches the vehicle controls. Let’s make it more concrete with a specific scenario.
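The steps just described (filter the raw data, match it against triggering conditions, then dispatch the matching control action) can be sketched as a tiny pipeline. This is a hedged illustration under assumed names and a single trigger condition; a real stack would use far richer data structures and many conditions.

```python
# Illustrative sketch of the interpretation pipeline described above.
# Function names and the safe-distance trigger are assumptions for this example.

SAFE_DISTANCE_M = 15.0

def filter_readings(raw_distances):
    """Fine-tune the raw data: drop invalid (non-positive) samples."""
    return [d for d in raw_distances if d > 0]

def interpret(raw_distances):
    """Match filtered data against triggering conditions and pick an action."""
    readings = filter_readings(raw_distances)
    if not readings:
        return ("no_data", None)
    nearest = min(readings)
    if nearest < SAFE_DISTANCE_M:          # triggering value violated
        return ("emergency_brake", nearest)
    return ("maintain_speed", nearest)     # nothing demands contingency action

# A spurious -1.0 reading is filtered out; the 10.0 m obstacle triggers braking.
action, distance = interpret([22.4, -1.0, 10.0, 18.7])
```

The returned action name stands in for "calling the relevant algorithm": in a real system it would select and invoke the corresponding control mechanism.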
Consider a situation where an autonomous vehicle is on its way to Point X. While commuting on the highway, it constantly takes in perceptions from a number of sensors, such as cameras and radars. Suppose a road sign is detected calling for the speed to be limited to 90 km/hr. The vehicle captures and locates the road sign in the camera image and sends it to the interpreter. There, with the help of image-processing algorithms, the image is refined until it can be established that the detected object resembles a road sign and that "90" is written on it. A question arises: how does the interpreter recognise it as a road sign rather than, say, a sticker on a passing car? The answer lies in the intelligence of autonomous vehicles: using neural networks, they are trained to distinguish among different road signs. Once processed, the detected sign is passed on to the speed-control algorithm, which activates the vehicle control system to trigger deceleration. Deceleration continues until the vehicle reaches 90 km/hr, and from then until the next relevant perception, the speed is maintained at 90 km/hr.
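The final control step in this scenario (decelerate until the recognised limit is reached, then hold it) can be sketched as a simple loop. This is an illustrative toy controller, not an actual vehicle control algorithm; real systems use smooth, continuously sensor-fed feedback control, and the step size here is an arbitrary assumption.

```python
# Toy speed controller for the scenario above: once the interpreter reports
# a 90 km/hr limit, reduce speed step by step until the target is reached,
# then hold it. Step size and update model are illustrative assumptions.

def control_step(current_kmh: float, target_kmh: float, decel_kmh: float = 5.0) -> float:
    """One control-loop iteration: move toward the target, never overshoot it."""
    if current_kmh > target_kmh:
        return max(target_kmh, current_kmh - decel_kmh)
    return current_kmh

speed = 110.0    # speed when the sign was perceived
target = 90.0    # limit extracted by the interpreter
while speed > target:
    speed = control_step(speed, target)
# speed now sits at 90.0 and stays there until the next perception
# changes the target
```

The `max(...)` clamp models the "until and unless the vehicle reaches 90 km/hr" behaviour: deceleration stops exactly at the target rather than undershooting it.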
All of this systematic processing might seem to introduce a long delay. In reality, with the help of fast processors and intelligent control strategies, the journey from perception signal to triggered vehicle control is extremely fast.
There are many myths associated with the working mechanism of autonomous vehicles. To some extent these myths make sense, as people are generally reluctant to adopt something they have never heard of. In the case of autonomous vehicles, the invention is so rich in its essence that people are astonished; for them, such inventions belong in science-fiction movies, and when it comes to realising those concepts, they can hardly believe it. Walking through the process from perception building to vehicle control, as above, makes the idea of autonomous vehicles much easier to comprehend. The involvement of sophisticated algorithms and high-end perception models can surely convince people of the efficiency of autonomous vehicles.