Engineering

Assisted driving to autonomous driving


Autonomous vehicle technology is revolutionizing transportation as we know it, and Tesla has earned a reputation for its “self-driving” cars. But many people misunderstand the difference between advanced driver assistance systems (ADAS) and fully autonomous vehicles. A Tesla can handle some driving tasks on its own, but it lacks much of the environmental sensing and decision-making capability of a truly self-driving vehicle, and its manufacturer does not take on the liability that comes with full autonomy.

Differences between ADAS and full autonomy

SAE International defines six levels of driving automation, from Level 0 to Level 5; the lower levels correspond to driver assistance (ADAS), while the upper levels describe genuine autonomy. Each level is determined by a range of features, scenarios, and capabilities.

A Level 0 car has no automated driving systems and is controlled entirely by a human driver. A Level 1 car has a single automated system, such as adaptive cruise control, but the driver remains responsible for steering, braking, and monitoring the environment. Level 2 covers partially automated vehicles such as those running Tesla Autopilot: a Level 2 vehicle can steer, accelerate, and decelerate itself, but the human driver must be ready to take over at any time.

A Level 3 vehicle is conditionally automated. These vehicles gather environmental data and make driving decisions based on it, but if the system encounters a task it cannot handle, the human driver must take over.

Level 4 and Level 5 are the highest levels of automation. A Level 4 vehicle can drive itself within a defined operational domain and can still be overridden by a human driver, although in practice there should be little need. A Level 5 vehicle is truly autonomous, requiring no human attention under any conditions.
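As a quick illustration, the taxonomy above can be captured in a small data structure. The sketch below is a hypothetical Python summary of the six levels; the wording of each comment is a paraphrase for illustration, not SAE's official definitions.

from enum import IntEnum

class AutomationLevel(IntEnum):
    """Driving automation levels, roughly following the SAE 0-5 scale (illustrative)."""
    L0_NO_AUTOMATION = 0           # human performs all driving tasks
    L1_DRIVER_ASSISTANCE = 1       # one assist feature, e.g. adaptive cruise control
    L2_PARTIAL_AUTOMATION = 2      # steering plus speed control; driver must supervise
    L3_CONDITIONAL_AUTOMATION = 3  # system drives; human takes over when asked
    L4_HIGH_AUTOMATION = 4         # no human needed within a defined operational domain
    L5_FULL_AUTOMATION = 5         # no human needed anywhere, under any conditions

def driver_must_supervise(level: AutomationLevel) -> bool:
    """At Level 2 and below, the human driver remains responsible at all times."""
    return level <= AutomationLevel.L2_PARTIAL_AUTOMATION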

A critical factor in defining each level is human intervention and, thus, liability. If a car rated Level 3 or higher is running in autonomous mode when an accident occurs, the liability generally rests with the OEM. If the vehicle was in manual mode, liability may fall to the driver or be shared with the OEM.

The complex needs of successful AV software

Every driving function, whether executed by a human driver or an autonomous system, follows a three-step process: perception, planning, and execution. The immediate environment is perceived, a decision is made based on that data, and the appropriate action is executed. This can be as simple as turning into a parking lot or as complex as avoiding a collision. A fully autonomous system must evaluate a vast array of event data and environmental inputs very quickly, and all of that computing power must fit within a relatively small onboard computing unit.
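To make the three-step process concrete, here is a minimal, purely illustrative sketch of a sense-plan-act loop in Python. The sensor, planner, and actuator interfaces are hypothetical placeholders, not any production AV stack.

from dataclasses import dataclass

@dataclass
class Observation:
    obstacle_ahead: bool
    distance_m: float

def perceive() -> Observation:
    # Placeholder: a real system would fuse camera, radar, and lidar data here.
    return Observation(obstacle_ahead=True, distance_m=12.0)

def plan(obs: Observation) -> str:
    # Decide an action from the perceived state; real planners weigh many more factors.
    if obs.obstacle_ahead and obs.distance_m < 20.0:
        return "brake"
    return "maintain_speed"

def execute(action: str) -> None:
    # Placeholder for actuation (steering, throttle, brake commands).
    print(f"executing: {action}")

# One iteration of the perception -> planning -> execution cycle.
execute(plan(perceive()))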

A system this sophisticated requires an advanced, adaptive software stack: high-end artificial intelligence (AI) algorithms, high-performance systems-on-chip (SoCs), and cloud-based data management that can handle the massive quantities of data a fully autonomous vehicle needs to move safely through the world. It also requires a deep understanding of the complexities of transportation networks and the many systems within a motor vehicle.
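As one hypothetical illustration of the data-management side, the sketch below buffers sensor frames on the vehicle and hands them off in batches for cloud-side storage and analysis. The frame format and the upload callback are assumptions made for illustration only, not a description of any particular pipeline.

import json
from typing import Callable

class SensorLogBuffer:
    """Collect sensor frames on-vehicle and flush them in batches (illustrative only)."""

    def __init__(self, upload: Callable[[str], None], batch_size: int = 100):
        self.upload = upload          # e.g. a cloud-storage client call (assumed)
        self.batch_size = batch_size
        self.frames: list[dict] = []

    def record(self, frame: dict) -> None:
        self.frames.append(frame)
        if len(self.frames) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if self.frames:
            self.upload(json.dumps(self.frames))
            self.frames.clear()

# Usage: in place of print(), a real system would push the payload to object storage.
buffer = SensorLogBuffer(upload=lambda payload: print(f"uploading {len(payload)} bytes"))
buffer.record({"t": 0.0, "speed_mps": 13.4, "lidar_points": 13000})
buffer.flush()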

At Quest Global, solving engineering problems is in our DNA. Our core competencies are exceptionally well suited to the challenge of designing complex autonomous vehicle safety systems. Our global teams have a strong foundation in AI algorithms, cloud solutions, and transportation systems, offering unparalleled depth and breadth of expertise. We can provide the systems expertise, technical competencies, and human power needed to deliver autonomous driving and ADAS solutions.

Autonomy is an engineering problem that aligns perfectly with Quest Global’s greatest strengths. With our unrivaled resources and expertise, we can help our customers in the transportation sector journey from assisted driving to full autonomy.

Kamal Sethi

Kamal Deep Sethi is the Global ADAS/Autonomous Mobility CoE Leader at Quest Global. With over 19 years of experience, Kamal is an enthusiastic leader dedicated to planning, leading, and enhancing business operations. He excels in developing technical roadmaps, managing teams, and fostering technological innovation. Kamal believes in the four pillars of organizational growth: Employees, Customers, Culture & Values, and Innovation. His expertise spans Advanced Driver Assistance Systems (ADAS), Autonomous Driving, and various sensor technologies including cameras, radars, and lidars. He is also proficient in Artificial Intelligence, Image Processing, Machine Learning, Deep Learning, and Computer Vision. Kamal's leadership and strategic vision drive the success of work streams in Autonomous Driving Operations, Data Pipeline, ML Ops, Validation Ops, and Software Engineering.