Vision Systems Are Finding Wider Application in Contemporary Cars

One of a driver's most important tasks is to observe the surrounding environment and respond accordingly. As automakers implement Advanced Driver Assistance Systems (ADAS), vision systems have drawn attention for the role they can play.



According to the National Highway Traffic Safety Administration (NHTSA), every vehicle moving on the road contains a dangerous component: the person behind the wheel. The NHTSA conducted a detailed survey of 2,189,000 road traffic accidents and concluded that 94% (2,046,000) were due to driver error. The most common category of driver error, recognition error, covers inattention and the various ways a driver can be distracted, and accounts for 41% of driver-caused accidents. With the safety of road users in mind, the world's leading car brands are constantly adding new features and developing more advanced systems to help drivers drive more safely. In the long run, these advanced systems will mean that driving no longer depends entirely on the driver.



One of the most important tasks for the driver is to observe the surrounding environment and react accordingly, usually by adjusting speed or direction. To do this, the driver must be aware of many things, including actual obstacles, potential obstacles, speed limits, other road signs, and weather conditions. If the vehicle is to assist the driver and ultimately assume full responsibility for autonomous driving, it must be at least as capable as a human at performing these tasks.



As automakers implement Advanced Driver Assistance Systems (ADAS), various types of sensors are being integrated into vehicle designs to provide the supporting data the vehicle needs, and vision-based sensing technology is developing the fastest. It has moved well beyond the simple reversing camera to provide powerful and efficient imaging solutions.


Important automotive camera applications


Forward cameras have begun to be widely adopted in the market, not only in high-end cars but also in mid-range models. Combined with appropriate image-processing software, a forward camera can detect obstacles ahead and can be coupled with the steering-wheel sensor so that the system can determine whether an obstacle lies within the vehicle's trajectory. More advanced systems can even plot a pedestrian's trajectory to determine whether it is likely to intersect the vehicle's path and act pre-emptively to avoid an accident. The vehicle's ADAS can then make a range of responses, from a simple warning to evasive action, depending on its sophistication.
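As a rough illustration of the trajectory check described above, the sketch below uses a simple bicycle model to decide whether a detected obstacle lies within the vehicle's predicted path. The function name, coordinate convention, and geometry values are illustrative assumptions, not any manufacturer's implementation.

```python
import math

def obstacle_in_path(steering_angle_rad, wheelbase_m, vehicle_width_m,
                     obstacle_x_m, obstacle_y_m, margin_m=0.3):
    """Rough check of whether a detected obstacle lies inside the
    vehicle's predicted path (illustrative only).

    Coordinates: x forward, y left, origin at the rear-axle centre.
    A simple bicycle model turns the steering angle into a turning
    radius; the obstacle is "in path" if its distance from the turn
    centre falls within half the vehicle width plus a safety margin.
    """
    half_width = vehicle_width_m / 2.0 + margin_m
    if abs(steering_angle_rad) < 1e-3:          # essentially driving straight
        return abs(obstacle_y_m) <= half_width and obstacle_x_m > 0
    radius = wheelbase_m / math.tan(steering_angle_rad)   # signed turn radius
    centre_y = radius                                      # turn centre at (0, R)
    dist = math.hypot(obstacle_x_m, obstacle_y_m - centre_y)
    return abs(dist - abs(radius)) <= half_width

# Example: slight left turn, pedestrian 12 m ahead and 1 m to the left
print(obstacle_in_path(0.05, 2.8, 1.9, 12.0, 1.0))
```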



However, the vision system is much more than just a forward camera. Many high-end models are now equipped with multiple cameras, giving a 360-degree view around the vehicle when maneuvering. While these cameras can prevent minor but still costly collisions, their real value lies in their ability to spot pets or children close to the vehicle that have escaped the driver's attention.



Ignoring the posted speed limit may result in nothing worse than a hefty fine, but it can also have far more serious consequences, such as major accidents that cause injury or, in the worst cases, death. Today's vision systems use the existing forward camera, or a dedicated camera, to identify and read roadside speed signs. The vehicle's response may be to reproduce the sign in the driver's line of sight, to sound an audible alarm if the limit is exceeded, or to limit the vehicle's speed so that it always complies with the posted limit.
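A minimal sketch of the decision logic such a system might apply once a sign has been recognized; the function, the returned actions, and the limiter option are hypothetical and only illustrate the escalating responses described above.

```python
def speed_limit_response(current_speed_kph, recognized_limit_kph,
                         limiter_enabled=False):
    """Decide how to respond to a recognized speed-limit sign
    (illustrative sketch; names and thresholds are assumptions)."""
    actions = [f"display limit {recognized_limit_kph} km/h in driver's view"]
    if current_speed_kph > recognized_limit_kph:
        actions.append("sound audible warning")
        if limiter_enabled:
            actions.append(f"cap speed at {recognized_limit_kph} km/h")
    return actions

# Example: driving 58 km/h past a 50 km/h sign with the limiter active
print(speed_limit_response(58, 50, limiter_enabled=True))
```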



The shift from traditional luminaires to LED lighting, whether in street lights or headlights, has made flicker a challenge for vision systems. Because of the persistence of human vision, the flicker is not obvious to the driver, but it can appear as dark bands in periodically captured images, making character recognition (such as reading the numbers on a speed-limit sign) even more difficult. Advanced image-capture algorithms capture multiple images in quick succession and combine them to produce a high-quality composite image that can be read accurately. This technique, commonly referred to as flicker mitigation, is now used in leading automotive image sensors.
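The sketch below is a simplified illustration of the idea of combining multiple captures: it takes a per-pixel maximum across frames so that bands darkened during an LED source's "off" phase are filled in from frames where the source was "on". This is only one simple combination scheme, not the proprietary algorithm of any particular sensor.

```python
import numpy as np

def mitigate_flicker(frames):
    """Combine several frames captured in quick succession so that
    pixels darkened by the 'off' phase of a pulsed LED source are
    recovered from frames where the source was 'on'.

    frames: list of identically sized grayscale images (H x W).
    """
    stack = np.stack(frames, axis=0)
    return stack.max(axis=0)          # per-pixel maximum across captures

# Example: three simulated captures of the same 4x6 scene,
# each with a different row darkened by LED flicker
rng = np.random.default_rng(0)
scene = rng.integers(100, 200, size=(4, 6), dtype=np.uint8)
captures = []
for band in range(3):
    frame = scene.copy()
    frame[band, :] = frame[band, :] // 4   # dark band from flicker
    captures.append(frame)
print(mitigate_flicker(captures))          # banded rows are restored
```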



Cameras are suitable not only for use outside the car but also inside it. Since the driver still plays a central role in vehicle control (at least for now), imaging technology is needed to monitor the driver as well, ensuring that the driver is alert and ready to react quickly to upcoming events; if the driver is not, the ADAS can intervene. A camera continuously monitors the driver's face and head position and uses artificial intelligence (AI) to track how frequently the driver checks the mirrors, assessing drowsiness and attention to traffic. If the driver becomes unfocused or obviously tired, the system can issue a reminder or suggest a break. The same monitoring can also be used to gauge the height and size of front-seat passengers so that the airbag deploys appropriately in the event of a collision.
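One widely used way to quantify drowsiness from eye closure is a PERCLOS-style metric (not named in the original text): the fraction of recent time the eyes are mostly closed. The sketch below assumes a per-frame eye-openness score supplied by an upstream face-landmark model; the window length and thresholds are illustrative assumptions.

```python
from collections import deque

class DrowsinessMonitor:
    """Sliding-window eye-closure metric in the spirit of PERCLOS.
    eye_openness is per-frame (0.0 = closed, 1.0 = wide open) and is
    assumed to come from an upstream face-landmark model."""

    def __init__(self, window_frames=900, closed_threshold=0.2,
                 alert_ratio=0.15):
        self.history = deque(maxlen=window_frames)   # e.g. 30 s at 30 fps
        self.closed_threshold = closed_threshold
        self.alert_ratio = alert_ratio

    def update(self, eye_openness):
        self.history.append(eye_openness < self.closed_threshold)
        closed_ratio = sum(self.history) / len(self.history)
        return closed_ratio >= self.alert_ratio      # True -> suggest a break

# Example with a few simulated frames
monitor = DrowsinessMonitor()
drowsy = False
for openness in [0.9, 0.8, 0.1, 0.05, 0.1, 0.9]:
    drowsy = monitor.update(openness)
print("suggest a break" if drowsy else "driver looks alert")
```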



The camera used to monitor the driver can also add many other conveniences. Simple face-recognition software identifies the driver and sets cabin preferences (including seat position, steering-wheel position, mirrors, and cabin temperature) to pre-configured values for a more comfortable driving environment. Imaging can also be used to monitor the vehicle's other occupants, for example to check whether they are comfortable and to adjust the air-conditioning system if they are not.

Automotive vision technology

Almost everything in the technology world changes constantly, and automotive vision-sensing hardware (and its accompanying software) is evolving especially quickly. As in-vehicle communication migrates from the CAN bus to automotive Ethernet, higher frame rates and higher-resolution video become feasible, which improves image quality and enables greater safety.
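A back-of-envelope calculation shows why this migration matters. The resolution, frame rate, and bit depth below are assumed example figures for a 1-megapixel automotive camera, and the bus capacities are nominal rates.

```python
# Back-of-envelope bandwidth comparison (assumed example figures)
width, height = 1280, 800          # 1-megapixel sensor
frame_rate = 60                    # frames per second
bits_per_pixel = 12                # raw Bayer data

raw_bitrate = width * height * frame_rate * bits_per_pixel
print(f"raw video: {raw_bitrate / 1e6:.0f} Mbit/s")   # ~737 Mbit/s

buses = {
    "classical CAN": 1e6,              # ~1 Mbit/s
    "CAN FD": 5e6,                     # up to ~5 Mbit/s data phase
    "100BASE-T1 Ethernet": 100e6,
    "1000BASE-T1 Ethernet": 1e9,
}
for name, capacity in buses.items():
    verdict = "fits" if raw_bitrate <= capacity else "does not fit"
    print(f"{name}: {verdict} ({capacity / 1e6:.0f} Mbit/s)")
```

Even with compression, raw camera streams overwhelm the classical CAN bus, which is why automotive Ethernet is attractive for vision data.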


One of the most attractive automotive image sensors today is ON Semiconductor's AR0140AT, a 1/4-inch, 1-megapixel CMOS device with an active pixel array of 1280x800. Optimized for low-light driving conditions, it offers a high dynamic range (HDR) mode with rolling-shutter readout. The AR0140AT produces sharp, clear digital images in both still mode and video mode (60 fps). It also integrates next-generation camera functions such as in-pixel binning and windowing, and it is fully AEC-Q100 qualified (with PPAP support).
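The sketch below illustrates what binning and windowing do to a raw frame, using NumPy as a stand-in; the sensor itself performs these operations on-chip, and the frame size here is simply chosen to match a 1280x800 array.

```python
import numpy as np

def bin_2x2(raw):
    """2x2 pixel binning: average each 2x2 block of the raw image,
    halving resolution while improving low-light signal-to-noise."""
    h, w = raw.shape
    return (raw[:h - h % 2, :w - w % 2]
            .reshape(h // 2, 2, w // 2, 2)
            .mean(axis=(1, 3)))

def window(raw, top, left, height, width):
    """Windowing: read out only a region of interest, which allows a
    higher frame rate for that region."""
    return raw[top:top + height, left:left + width]

full = np.random.randint(0, 4096, size=(800, 1280))   # simulated 12-bit frame
print(bin_2x2(full).shape)                 # (400, 640)
print(window(full, 100, 200, 480, 640).shape)   # (480, 640)
```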



As contemporary cars rely more heavily on ADAS, the safety and security of this technology are primary considerations for designers. The FS32V234, for example, includes a CSE with 16 kB of on-chip secure RAM and ROM, and a system-level JTAG controller that supports Arm's TrustZone security architecture. The device supports ISO 26262 Automotive Safety Integrity Levels (ASIL) for functional safety and is supplied with a safety manual and a complete FMEDA report. FD-CAN and FlexRay interfaces simplify integration with today's vehicle ADAS infrastructure.


Vision technology for vehicle control


Vehicle vision systems are expanding into other innovative applications. As automakers seek to provide an intuitive human-machine interface (HMI) for increasingly complex vehicle systems within the limited space of the cabin, complementing and improving existing controls, gesture-based HMIs allow drivers to issue commands without having to look away from the road, so they are not distracted while driving.


Melexis' new chipset is based on Time-of-Flight (ToF) technology and provides a simple, modular basis for the 3D vision systems used to create gesture-based HMIs. The chipset comprises the MLX75023 1/3-inch optical-format ToF sensor and the MLX75123 companion IC, which integrates many of the external components otherwise required to develop a 3D vision solution; Melexis also offers an evaluation module. Because the chipset is highly integrated, designers no longer need to add expensive (and space-consuming) external FPGAs and ADCs, which reduces size, per-unit cost, and time-to-market.
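For context, the sketch below shows the standard four-phase demodulation that continuous-wave ToF sensors in general use to turn phase samples into a distance. It is a generic illustration rather than Melexis' specific implementation, and the 20 MHz modulation frequency is an assumed value.

```python
import math

C = 299_792_458.0   # speed of light, m/s

def tof_distance(a0, a90, a180, a270, f_mod_hz=20e6):
    """Distance from the standard four-phase continuous-wave ToF
    demodulation. a0..a270 are correlation samples taken at
    0/90/180/270-degree phase offsets; f_mod_hz is the illumination
    modulation frequency (assumed value here)."""
    phase = math.atan2(a90 - a270, a0 - a180)   # phase shift of reflected light
    if phase < 0:
        phase += 2 * math.pi                    # keep phase in [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod_hz)

# Unambiguous range at 20 MHz modulation is c / (2 * f_mod), about 7.5 m
print(f"max range: {C / (2 * 20e6):.2f} m")
print(f"distance:  {tof_distance(180, 240, 220, 160):.2f} m")
```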


MLX75023 ToF Evaluation Board




The MLX75023 ToF sensor incorporates Melexis' advanced pixel technology to provide HDR capability for challenging lighting conditions. The MLX75123 companion chip connects the sensor IC directly to the host MCU and provides fast data readout. The modular design, splitting the sensor from the companion (processing) chip, means the sensor unit can be upgraded without changing the overall system architecture.




Summary

  

The automotive industry is adding an unprecedented amount of technology to vehicles to make them more economical, more convenient and, most importantly, safer. Advanced sensor technology has enhanced drivers' abilities through systems such as ADAS, and as the technology matures, automated driving systems will eventually take over from the driver. The variety of sensors is also proliferating, and sensor fusion ensures that they work together so that the vehicle can ultimately be operated safely. Vision technology is key to future vehicles: many mid-range models now integrate multiple vision sensors to perceive the world around them more completely, read roadside signs, and monitor the driver. Image sensors are available at many resolution levels to capture the necessary detail, and advances in pixel technology keep vision systems working accurately even in the low-light conditions often encountered during night driving or bad weather.
