Ultimate Guide to Autonomous Systems
What are autonomous systems?
Autonomous systems, such as self-driving cars and some robots, operate without human intervention. In this section, you’ll learn what defines autonomous systems and the various levels of autonomy. You will also find examples of autonomous systems in the world today.
An autonomous system is one that can achieve a given set of goals in a changing environment – gathering information about the environment and working for an extended period of time without human control or intervention. Driverless cars and autonomous mobile robots (AMRs) used in warehouses are two common examples.
Autonomy requires that the system be able to do the following:
- Sense the environment and keep track of the system’s current state and location
- Perceive and understand disparate data sources
- Determine what action to take next and make a plan
- Act only when it is safe to do so, avoiding situations that pose a risk to human safety, property or the autonomous system itself
An autonomous system can sense, perceive, plan, and act without intervention.
Autonomous vs. semi-autonomous systems
The vast majority of current systems known as autonomous are semi-autonomous, rather than fully autonomous. Cars with driver support systems such as lane keep assist and advanced braking systems are semi-autonomous, as are robotic surgery systems, robot vacuums and most unmanned aerial vehicles (UAVs and drones). Most fully autonomous systems, such as driverless cars, remain too costly, data intensive, power consuming or unsafe for widespread public use.
Significant progress is being made. Electric vehicles, autonomous vehicle perception systems and connected car platforms have passed Gartner’s "peak of inflated expectations," moved through the "trough of disillusionment" and reached the "slope of enlightenment" in the Connected Vehicle and Smart Mobility Hype Cycle.
Examples of autonomous systems
Autonomous vehicles, autonomous robots, autonomous warehouse and factory systems and autonomous drones are some examples of autonomous systems.
Both passenger and commercial vehicles include autonomous capabilities, which exist on a continuum, as identified by the six levels of driving automation defined by SAE J3016:
- SAE levels 0–2 provide driver support, ranging from warnings and momentary assistance to collision avoidance features, such as autonomous steering, braking and acceleration support.
- SAE levels 3–5 provide fully automated driving features that range from those suitable for limited conditions or applications, such as a traffic jam or local taxi service, to those that enable a driverless vehicle to take passengers or cargo anywhere and in all conditions.
Many passenger vehicles in widespread use meet SAE J3016 Level 0, 1 or 2.
To deliver the types of autonomy described in SAE J3016, passenger vehicles have transformed into semi-autonomous, connected mobile devices on wheels. Cars of the future will provide even more autonomous features and eventually become fully autonomous. In many cities, autonomous cars and shuttles, such as an all-electric shuttle without manual controls in San Francisco, are common sights.
Autonomous vehicles will account for 15 percent of global light vehicle sales by 2030.
As for autonomous commercial vehicles, while many fewer trucks are built annually than cars, autonomous trucks will radically change cargo transport and the use of public highways. One autonomous cargo transport technique is known as truck platooning. In this, a human driver in a tractor-trailer truck leads a convoy of autonomous tractor-trailer trucks, which enables a single driver to move much more cargo. Automated driver support systems allow each autonomous truck to follow the truck in front of it at a set distance. The human driver can drop off the follower trucks, and pick up new ones, at predetermined locations along a highway.
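The set-distance following behavior described above can be pictured as a simple gap-keeping controller. The sketch below is illustrative only; the gains, target gap and acceleration limits are assumed values, not figures from any production platooning system.

```python
def follower_accel(gap_m, closing_speed_mps, target_gap_m=20.0,
                   kp=0.5, kd=0.8, max_accel=2.0):
    """Toy proportional-derivative control of a follower truck's acceleration.

    gap_m: current distance to the truck ahead (meters)
    closing_speed_mps: lead speed minus follower speed (m/s); negative
        means the follower is gaining on the leader.
    Returns a commanded acceleration (m/s^2), clamped for safety and comfort.
    """
    gap_error = gap_m - target_gap_m              # positive -> too far back
    accel = kp * gap_error + kd * closing_speed_mps
    return max(-max_accel, min(max_accel, accel))
```

A follower that drifts too far back is commanded to speed up; one that creeps inside the target gap is commanded to brake, which is how each truck in the convoy holds its set distance.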
Autonomous vehicles also include those that use rail. Autonomous freight trains will move cargo with a human as merely an observer.
Autonomous robots vary from simple robot floor cleaners to complex autonomous helicopters. Otto, the first autonomous snowplow in North America, keeps runways clear at an airport in Manitoba.
In agriculture, the idea of fully autonomous tractors has engaged generations of farmers, but currently – like the snowplow – autonomous tractors require a supervising operator. Other autonomous systems used on farms include automatic milking machines and strawberry-picking robots.
In the medical field, robots assist surgeons in performing high-precision procedures, such as coronary artery bypass and cancer removal.
High-mobility autonomous systems, such as four-legged walking robots, can navigate obstacles, access difficult-to-reach locations and perform tasks hazardous to humans. Examples include industrial inspection robots and rescue mission robots.
Autonomous warehouse and factory systems
From mail sorting systems to material conveyors to assembly robots, a diverse array of autonomous systems performs routine and repetitive tasks, enabling better use of human labor. One type of warehouse autonomous system is a robot forklift that moves products around an e-commerce giant’s automated distribution center. On assembly lines, autonomous factory robot arms perform many heavy and precision tasks – such as arc welding, painting, finishing and packaging.
Unmanned aerial vehicles, known as UAVs or drones, are small self-piloting autonomous aircraft. Drones have long been used for reconnaissance, surveying, asset inspection and environmental studies. Two common uses for drones are agriculture and oil well inspection.
Autonomous UAV technology has its basis in the autopilot technology used in commercial passenger aircraft. Fully autonomous (pilotless) passenger aviation remains a distant goal. Similarly, most UAVs are piloted remotely rather than directing their own flight paths. Two UAV tasks commonly performed by fully autonomous systems are airborne refueling and ground-based battery switching.
Sensors and sensor fusion
Sensors and sensor fusion play a vital role in autonomous systems. They enable such systems to gather data from sources in the environment and make use of the data to plan and take action. In this section, you’ll learn about the diverse types of sensors used in autonomous systems and how sensor fusion helps an autonomous system acquire and develop a more accurate assessment of its environment.
Sensors used in autonomous systems
Sensors help an autonomous system determine its location, identify informational signs and avoid obstacles and other hazards. The following describes common types of sensors and their purposes, and explains how pairing one type of sensor with another that has different strengths and weaknesses can overcome the limitations of either used alone.
- Camera sensors - Video and still-image cameras are common types of optical sensors. Using multiple cameras provides a 360° view of the environment, which is vital for identifying cars, pedestrians, traffic signs and signals, guardrails and road markings in a traffic setting. Camera sensors are equally valuable for autonomy in other environments, such as warehouses and stores.
- Ultrasonic sensors - Using high-frequency sound waves, ultrasonic sensors can estimate distances between the sensor and an object. Ultrasonic sensors are typically used in short-distance applications, such as a parking assistance system in a car.
- Infrared sensors - Infrared sensors provide images under low-lighting conditions, such as for night-vision systems.
- LiDAR sensors - LiDAR sensors use lasers to estimate distance and create high-resolution 3D images of the environment. Rotating LiDAR systems that offer a full 360° view around an autonomous device are cost-prohibitive for mass-market use; however, some cheaper solid-state LiDAR systems (which offer a narrower field of view) are feasible for mass deployment.
- Radar sensors - Using short-range (24 GHz) and long-range (77 GHz) radio waves, radar sensors monitor blind spots in autonomous vehicles for safer distance control and braking assistance applications.
Radar sensors are unaffected by weather conditions, unlike image sensors and LiDAR.
Radar sensors are often paired with cameras to achieve image quality that is closer to the high-resolution images provided by LiDAR.
A comparison of radar and LiDAR is shown in this table:
| | Radar | Rotating LiDAR |
| --- | --- | --- |
| Source of name | Radio detection and ranging | Light detection and ranging |
| Types of waves | Radio waves | Light waves from laser beams |
| When invented | 1930s – detect hostile aircraft | 1960s – Apollo moon missions |
| Image quality | May lack detail to identify objects | 3D monochromatic image |
| Spatial resolution | Even high-resolution radar is much lower resolution than LiDAR | Good at detecting small objects, such as the hand signals of a bicyclist |
| Distance | Longer, allowing more time for reaction | Shorter, 30-200 meters |
| Reliability | More reliable due to solid-state components | Less reliable due to more moving parts |
| Velocity | Accurately determines relative traffic speed | Poor detection of the speed of other vehicles |
| Field of view | Loses sight around curves | 360° view and can track direction of movement |
| Object identification | Imprecise; may confuse multiple smaller objects as one larger object | Precise; can detect which direction a pedestrian is facing |
| Effect of weather | Unaffected by snow, fog, rain and dusty wind | Impaired by snow, fog, rain and dusty wind |
| Power requirements | Less power | More power |
| Best uses | Low-speed vehicles, such as an autonomous plow, and car parking assistance systems | Vehicles that travel at higher speeds |
- Accelerometer sensors - Accelerometers measure acceleration, the rate of change of velocity. Accelerometers are used for orientation, slope and collision detection.
- Gyro sensors - Gyro sensors measure orientation and angular velocity. A traditional mechanical gyroscope uses a spinning disk whose angular momentum keeps its orientation fixed in space; modern MEMS gyro sensors measure angular velocity electronically. Gyro sensors help maintain direction in applications such as tunnel mining and aerospace.
- IMU sensor - An inertial measurement unit (IMU) uses gyro sensors and accelerometers to measure position and orientation.
- Inertial navigation system - An inertial navigation system (INS) includes an IMU and is used to estimate vehicle position, orientation and speed.
- Global positioning system (GPS) receiver - A GPS receiver is a type of sensor that identifies the system’s position by measuring signal travel times from multiple satellites (trilateration). GPS receivers are used with maps to estimate position, direction and speed.
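As an illustration of how an IMU combines its gyro and accelerometer readings, here is a minimal complementary-filter sketch for estimating pitch angle. This is a simplified illustration, not a production algorithm; the blend factor of 0.98 is a commonly used illustrative choice.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z,
                         dt, alpha=0.98):
    """Fuse a gyro rate and an accelerometer reading into a pitch estimate (radians).

    The gyro integrates smoothly but drifts over time; the accelerometer
    gives an absolute, gravity-referenced angle but is noisy. Blending the
    two keeps the short-term smoothness of the gyro and the long-term
    stability of the accelerometer.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt    # integrate gyro (drifts)
    accel_pitch = math.atan2(accel_x, accel_z)  # gravity-based angle (noisy)
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```

The same complementary idea recurs throughout sensor fusion: each source corrects the characteristic weakness of the other.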
Wireless vehicle communications
In addition to sensors on an autonomous system, machines in the environment can also provide information to the system. Autonomous vehicles will pass information to nearby systems, and vice versa. Vehicle-to-everything (V2X or V2E) communication between a vehicle and its environment improves safety by enabling communication with the following types of systems:
- Vehicle-to-device (V2D): onboard devices within the vehicle
- Vehicle-to-grid (V2G): plug-in electric vehicles
- Vehicle-to-infrastructure (V2I): smart traffic signals and connected parking spaces
- Vehicle-to-network (V2N): services ranging from cooperative intelligent transport systems (C-ITS) to advanced driver assistance systems (ADAS) to OEM telematics and aftermarket services
- Vehicle-to-pedestrian (V2P): roadway signaling and alerts to pedestrian smartphones
- Vehicle-to-vehicle (V2V): speed and position of other vehicles for collision detection and avoidance
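The V2V case above can be pictured as a small structured message exchanged between vehicles. The field names and headway calculation below are illustrative assumptions, loosely modeled on a basic safety message rather than any standard’s exact schema.

```python
from dataclasses import dataclass

@dataclass
class V2VMessage:
    """Illustrative V2V payload: where a vehicle is and how it is moving."""
    vehicle_id: str
    x_m: float          # position east of a shared reference point (meters)
    y_m: float          # position north of the reference point (meters)
    speed_mps: float
    heading_deg: float

def headway_seconds(follower: V2VMessage, leader: V2VMessage) -> float:
    """Time for the follower to reach the leader's current position,
    assuming constant follower speed (a same-lane simplification).
    A small headway is a cue for collision avoidance to intervene."""
    distance = ((leader.x_m - follower.x_m) ** 2 +
                (leader.y_m - follower.y_m) ** 2) ** 0.5
    return distance / follower.speed_mps if follower.speed_mps > 0 else float("inf")
```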
Such communication is achieved with one of two technologies: 5G or dedicated short-range communication (DSRC) using the IEEE Wireless Access in Vehicular Environments (WAVE) 802.11p standard. Current LTE cellular technology does not allow for the peer-to-peer communication necessary for V2V or V2P. In the future, 5G vehicle-to-everything communication, called C-V2X, will make cars smarter and driving safer.
Each type of sensor and source of environmental information has advantages and disadvantages. Sensor fusion produces more detailed and more accurate information about the environment than could be achieved with a single source. As a branch of machine learning and signal processing that focuses on perception, sensor fusion combines data from multiple sensors and databases to produce higher-quality information so that an autonomous system can make better, safer decisions. Sensor fusion is also known as data fusion, filtering or target tracking.
How does sensor fusion work? Sensor fusion systems typically take data from cameras, LiDAR, radar, sonar and other sensors and fuse it together, essentially augmenting data from one type of sensor with data from another type of sensor. Highly efficient, high-throughput software and special-purpose hardware called accelerators handle the massive quantity of sensor data that must be processed for quick decision making by an autonomous system.
Sensor fusion combines data from multiple types of sensors to produce a more accurate picture of the environment.
To perceive and understand the sensor data, the system must filter disparate data sources, localize data assets, interpret those assets and classify data. Sensor fusion algorithms – such as the Kalman filter, Bayesian networks and convolutional neural networks – help the autonomous system extract the maximum possible information from the sensor data.
- The Kalman filter is used in navigation and tracking to predict the future location and state of a moving object and then update that prediction as new measurements arrive.
- Bayesian networks provide a probabilistic computational model for decision making in the presence of uncertainty.
- Convolutional neural networks are used in machine learning for pattern recognition.
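As a concrete illustration of the predict-then-update cycle the Kalman filter performs, here is a minimal one-dimensional sketch for tracking a scalar quantity, such as the distance to an object, from noisy measurements. The noise variances are illustrative assumptions; a real tracker would use a multidimensional state and tuned noise models.

```python
def kalman_1d(measurements, process_var=1.0, meas_var=4.0):
    """Track a scalar state with a 1-D Kalman filter.

    Each step: predict (the estimate's uncertainty grows by the process
    variance), then update (blend the prediction with the new measurement,
    weighted by the Kalman gain).
    """
    x, p = measurements[0], 1.0       # initial estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p += process_var              # predict: uncertainty grows
        k = p / (p + meas_var)        # gain: trust in the new measurement
        x = x + k * (z - x)           # update: move estimate toward z
        p = (1.0 - k) * p             # update: uncertainty shrinks
        estimates.append(x)
    return estimates
```

Because the gain weighs the prediction against the measurement by their variances, a noisy reading nudges the estimate rather than replacing it, which is why the filter smooths jittery sensor data.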
Software for autonomous systems
Building safe, secure and reliable autonomous systems begins with the architecture, then the design, and then the software foundation. In this section, you’ll learn about autonomous systems architecture, functional safety systems, the future of autonomous system software, a safe software foundation and the many security considerations for autonomous systems.
The growth of autonomous systems will involve a transition from hardware-driven equipment to software-driven electronics, regardless of whether the autonomous system is a snowplow or a surgical system. The software requirements for real-time control, safety, security and reliability are similar across industries. At BlackBerry® QNX®, we provide the operating system or software foundation for developers facing the myriad challenges of building safe, secure and reliable software for use in a wide variety of autonomous systems.
Autonomous system software architecture
The four software components for autonomous systems are sense, perceive and understand, decide, and act, as shown in the figure.

- Sense - Gather information from sources at hand
- Perceive and understand - Filter, localize, interpret and classify the data
- Decide - Determine next actions based on classified data and plan
- Act - Perform the action(s) when it is safe to do so
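The four components can be pictured as one control loop. The class and function names below are hypothetical placeholders standing in for real subsystems; this is a toy sketch of the pipeline's shape, not an actual autonomy stack.

```python
class SimpleWorldModel:
    """Toy world model: refuse to move if any range reading is inside a margin."""
    def __init__(self, safety_margin_m=2.0):
        self.safety_margin_m = safety_margin_m
        self.min_range_m = float("inf")

    def update(self, readings):
        # Perceive: reduce raw readings to the nearest detected obstacle.
        self.min_range_m = min(readings.values())

    def next_action(self):
        return "drive_forward"

    def is_safe(self, plan):
        return self.min_range_m > self.safety_margin_m

def autonomy_step(sensors, world_model):
    """One iteration of the sense -> perceive -> decide -> act pipeline."""
    readings = {name: read() for name, read in sensors.items()}   # sense
    world_model.update(readings)                                  # perceive and understand
    plan = world_model.next_action()                              # decide
    return plan if world_model.is_safe(plan) else "stop"          # act only when safe
```

Note that the safety check gates the final step: when the plan is unsafe, the system falls back to a safe action, mirroring the "act only when it is safe to do so" requirement above.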
Functional safety systems
Unsafe autonomous systems are a grave concern of consumers, governments and industry watchers. It may not be fair, but autonomous systems are held to a higher standard of safe behavior than humans. That is, they must prove a better safety record than humans or they will not be accepted by the public.
In passenger vehicles, safety features, such as lane-departure warnings and collision detection, and active-control features, such as automatic braking and lane-keep assist, must always work. In comparison, the public is willing to accept mistakes from human drivers.
People are willing to accept that humans make mistakes, but autonomous systems are held to a higher standard.
To address this challenge, developers of embedded software for autonomous systems recommend that all software be safety-certified in safety-critical systems like autonomous vehicles. BlackBerry QNX has pre-certified the QNX® Neutrino® RTOS and QNX® Hypervisor to functional safety standards including ISO 26262 (automotive) and IEC 61508 (industrial). BlackBerry QNX also provides expert consulting services to help our customers achieve the safety certifications they need.
Autonomous systems today are proprietary networks of systems from multiple suppliers integrated by a system developer, which makes it difficult for developers to scale software and limits the availability of third-party after-market applications.
In the future, developers will interact with autonomous systems via a software platform that abstracts the hardware, abstracts the sensors and pushes the interface to a higher-level set of software services via an application programming interface (API). This will free developers from having to interact with a specific type of LiDAR, radar or camera in use – and enable them to simply request information via high-level services provisioned with APIs. Connectivity will be similarly abstracted, so the communications within the car, with other vehicles and with the cloud will be simpler for a developer to implement without requiring a deep understanding of a specific communication technology.
In the future, a developer will call an API to get information from a car, without needing to understand the underlying hardware.
Software foundation for safe systems
At the core of any safety-critical autonomous system is a safe, reliable and secure software foundation or operating system. The following components provide the building blocks for safe autonomous systems.
- RTOS – A deterministic real-time operating system ensures that higher-priority tasks get the time they need, when they need it, by pre-empting lower-priority tasks.
- Hypervisor – A hypervisor provides safety and security through secure virtualized isolation – and enables secure and reliable consolidation of multiple subsystems (virtual domains) on a single system-on-chip (SoC).
- Secure data communications and over-the-air updates (OTA) – Autonomous systems depend on reliable data communications at every level: on the system, among systems and with the environment. Sensor data and authenticated OTA updates must arrive without interference.
- Advanced driver assistance systems (ADAS) – Pre-configured frameworks for automated driver assistance systems provide a sensor framework and middleware to ease the burden of sensor fusion for developers.
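The pre-emption behavior described for the RTOS above can be illustrated with a toy simulation: at every tick the highest-priority ready task runs, so a newly released high-priority task immediately pre-empts a lower-priority one. This is a sketch of the scheduling policy, not of how an RTOS is actually implemented.

```python
import heapq

def run_schedule(tasks):
    """Simulate fixed-priority pre-emptive scheduling.

    tasks: list of (release_tick, priority, name, duration); a LOWER
    priority number means MORE urgent, as in many RTOS conventions.
    Returns the name of the task that ran during each tick.
    """
    pending = sorted(tasks)          # ordered by release time
    ready = []                       # heap of (priority, seq, name, remaining)
    timeline, tick, seq = [], 0, 0
    while pending or ready:
        # Move any newly released tasks onto the ready heap.
        while pending and pending[0][0] <= tick:
            _, prio, name, dur = pending.pop(0)
            heapq.heappush(ready, (prio, seq, name, dur))
            seq += 1
        if ready:
            prio, s, name, remaining = heapq.heappop(ready)
            timeline.append(name)    # highest-priority ready task gets this tick
            if remaining > 1:
                heapq.heappush(ready, (prio, s, name, remaining - 1))
        else:
            timeline.append("idle")
        tick += 1
    return timeline
```

In the test below, a low-priority task starts first but is pre-empted as soon as the high-priority task becomes ready, and only resumes once the urgent work is done, which is exactly the determinism guarantee the RTOS bullet describes.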
How BlackBerry QNX can help you
Building software for autonomous systems is time intensive, costly and high risk. BlackBerry QNX reduces the burden for developers through a safe, reliable and secure real-time operating system designed specifically for safety-critical systems. We also offer engineering, security and safety certification services and have extensive experience with autonomous system development. In this section, you’ll learn more about our software solutions, engineering services and industry leadership.
BlackBerry QNX Software
From surgical robots and cars to trains and traffic lights, BlackBerry QNX software provides the foundation for safe, reliable and secure systems:
- QNX OS for Safety is designed specifically for mission- and safety-critical software for autonomous systems. This pre-certified RTOS reduces development time, risk and cost for autonomous system developers.
- QNX Hypervisor enables developers to partition and isolate safety-critical systems from non-safety critical systems or run multiple OS versions on one SoC, helping to ensure that critical systems will continue to run safely.
- QNX® Black Channel Communications Technology helps to ensure the safety and integrity of data during communication by encapsulating data in transit in a safety header and performing safety checks to validate it at both ends.
- Development of ADAS (advanced driver-assistance systems) and automated driving systems is a highly collaborative effort, requiring IP from an array of technology partners. The BlackBerry QNX ADAS framework minimizes integration effort and speeds development of advanced driver assistance systems.
The BlackBerry QNX RTOS is used in hundreds of millions of systems, including driverless vehicles and autonomous robots.
BlackBerry QNX Engineering and Security Services
Developing safe, reliable and secure software for mission-critical and safety-critical applications of all kinds is a core competency at BlackBerry QNX. We offer training, consulting, custom development, root cause analysis and troubleshooting, system-level optimization and onsite services for developers of a variety of embedded systems in automotive, medical, robotics, industrial automation, defense and aerospace, among other industries.
Learn more about our engineering, security and safety certification services.
BlackBerry QNX's Autonomous Vehicle Innovation Centre (AVIC) helps developers of software for connected and self-driving vehicles through production-ready software developed independently and in collaboration with partners in the private and public sectors.
In addition, as part of a program with L-SPARK, Canada’s largest software-as-a-service accelerator, BlackBerry QNX helps companies research and develop product prototypes in areas such as robotics, device security, sensor fusion (e.g. LiDAR, radar, cameras and GPS), functional safety, analytics, medical devices and autonomous vehicles.