
Automotive Electronic Systems—a View of the Future

By Ron Wilson, Editor-in-Chief, Altera Corporation

Automotive driver assist systems (ADAS) are the hot topic today in automotive electronics. These systems range from passive safety systems that monitor lane exits, through active safety systems like adaptive cruise control, to, in the future, situation-aware collision-avoidance systems. The increasing demands ADAS evolution places on data transport and computing are fundamentally changing automotive electronics architectures. And it is becoming clear that these changes foreshadow the future for many other kinds of embedded systems.

 

Goals and Requirements

Today, vehicle-safety electronic systems are isolated functions that control a specific variable in response to a specific set of inputs. An air-bag controller, for example, detonates its explosive charge when an accelerometer output trips a limit comparator. A traction-control system applies a brake to reduce torque on a wheel when the data stream from a shaft encoder indicates sudden acceleration. While these systems contribute to vehicle safety, they can also act inappropriately, because their inputs give them a very narrow view of the world. Hitting a pot-hole or bumping into a car while parking can fire an air bag. A rough road can puzzle traction control.
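As a concrete illustration, here is a minimal sketch of that kind of single-input limit comparator. The trip threshold and function name are hypothetical, not drawn from any real controller:

```python
# Illustrative sketch of a single-input limit comparator, the isolated
# control style described above. The trip threshold is hypothetical.
CRASH_DECEL_G = 20.0  # assumed trip point, not a real calibration value

def airbag_should_fire(accel_g: float) -> bool:
    """Fire as soon as one accelerometer reading exceeds a fixed limit."""
    return accel_g > CRASH_DECEL_G

# The narrow-view failure mode: a sharp pot-hole jolt can exceed the
# limit exactly the way a crash does.
print(airbag_should_fire(25.0))  # True -- even if this was only a pot-hole
```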

All that is about to change, according to Steve Ohr, semiconductor research director at Gartner. “Advanced air-bag controllers have multiple sensors that literally vote on whether a crash is happening,” Ohr explained as he introduced his panel at the GlobalPress Summit in Santa Cruz, California, on April 24. “In the near future, the controllers will consult sensors that monitor passengers and cargo to identify how best to deploy the various air bags during a crash.”
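A hedged sketch of the voting idea Ohr describes might look like the following; the sensor count, threshold, and strict-majority rule are all illustrative assumptions:

```python
# Sketch of multi-sensor voting: require a strict majority of independent
# accelerometers to agree before deploying. All figures are illustrative.
CRASH_DECEL_G = 20.0

def airbag_should_fire(accel_readings_g: list[float]) -> bool:
    votes = sum(1 for a in accel_readings_g if a > CRASH_DECEL_G)
    return votes > len(accel_readings_g) // 2  # strict majority wins

# A pot-hole tends to spike one sensor; a real crash spikes several.
print(airbag_should_fire([25.0, 3.0, 2.5]))    # False: lone outlier is outvoted
print(airbag_should_fire([25.0, 22.0, 21.0]))  # True: majority agrees
```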

At this point, the air-bag controller has crossed a critical threshold: from responding to an input to maintaining—and responding to—a dynamic model of the vehicle. This change, Ohr emphasized, is being echoed in other systems throughout the vehicle, with profound consequences. “We see the same pattern in safety systems such as lane-exit monitors and impending-hazard detectors,” Ohr stated. “Each system is getting more intelligence, moving to sensor integration and then to sensor fusion.”

This evolution is happening in an already astoundingly complex environment. Panelist Frank Schirrmeister, senior director of product marketing at Cadence Design Systems, observed, “In 2010, a high-end car could have 750 CPUs, performing 2,000 different functions, and requiring one billion lines of code.” Schirrmeister said that this degree of complexity was forcing developers to adopt hardware-independent platforms such as the Automotive Open System Architecture (AUTOSAR) and integrated mechanical-electrical-software development suites. In this fog of complexity, system designers are struggling to cope with a sudden surge of change in the way the systems handle data.

 

Isolation to Fusion

Hazard-avoidance systems offer a microcosm of these sweeping changes, according to panelist Brian Jentz, automotive business-unit director at Altera Corporation. Today, relatively simple systems like back-up cameras can already have significant processing requirements, Jentz said. “Inexpensive cameras need fish-eye correction to fix the perspective so drivers can interpret the display easily.” These cameras also need compensation to produce useful images in low light, and often they will require automated object recognition. These functions can be done better in the camera, but it’s often cheaper to do them in the central engine control unit (ECU). “Cameras are moving to high-definition,” Jentz continued, “and this can mean megapixels per frame. If you are sending images to the ECU, you may have to compress the data before it leaves the camera.”
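To make the fish-eye correction step concrete, here is a minimal sketch using a single-coefficient radial lens model. The coefficient value is an assumption; real cameras are calibrated per lens:

```python
import numpy as np

# Minimal sketch of radial (fish-eye style) distortion correction with a
# one-coefficient polynomial lens model. The coefficient k1 is an
# illustrative assumption; production cameras calibrate it per lens.
def undistort(img: np.ndarray, k1: float = 0.25) -> np.ndarray:
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    # Normalized offset of each *output* pixel from the optical center.
    xn, yn = (x - cx) / cx, (y - cy) / cy
    r2 = xn * xn + yn * yn
    # Inverse mapping: sample the distorted frame at a radially scaled
    # position, then fetch with nearest-neighbor indexing.
    scale = 1.0 + k1 * r2
    src_x = np.clip(xn * scale * cx + cx, 0, w - 1).round().astype(int)
    src_y = np.clip(yn * scale * cy + cy, 0, h - 1).round().astype(int)
    return img[src_y, src_x]

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in frame
corrected = undistort(frame)
```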

Further evolution will only complicate the data-transport problem. Hazard detection will move from simply showing an image from a rear-facing camera to modeling the whole dynamic environment surrounding the car. At this point the system must stitch together images from multiple cameras—at least eight for a 360-degree view with range and velocity detection, as shown in Figure 1. A central processor is absolutely necessary, and the ADAS must transport many streams of compressed video to the ECU concurrently.
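Some back-of-envelope arithmetic shows the scale of the transport problem. Every figure below is an illustrative assumption, not a vendor specification:

```python
# Back-of-envelope arithmetic for the multi-camera transport problem.
# All figures are illustrative assumptions, not vendor specifications.
CAMERAS        = 8           # 360-degree coverage, per the article
WIDTH, HEIGHT  = 1920, 1080  # one HD frame
BITS_PER_PIXEL = 12          # e.g., raw sensor data
FPS            = 30

raw_per_camera = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS        # bits/s
print(f"raw, one camera : {raw_per_camera / 1e6:.0f} Mbit/s")  # ~746 Mbit/s
print(f"raw, all cameras: {CAMERAS * raw_per_camera / 1e9:.1f} Gbit/s")

COMPRESSION = 50  # hypothetical in-camera codec ratio
compressed = CAMERAS * raw_per_camera / COMPRESSION
print(f"compressed, all : {compressed / 1e6:.0f} Mbit/s")      # ~119 Mbit/s
```

Even with aggressive in-camera compression, eight such cameras together approach the capacity of a single 100 Mbit/s link under these assumptions, which is one reason the network debate later in this article matters.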

Figure 1. Placement and use of cameras determines the algorithms required to process the images.


But things get harder still. Video cameras are hampered by darkness and disabled by rain, snow, road spray, and other sorts of optical interference. So designers team the video cameras with directed-beam, millimeter-wave radar to improve reliability in low-visibility conditions. Now the ECU must fuse the video data with the very different radar signal in order to interpret its surroundings. This fusion will probably be done using a system-estimation technique called a Kalman filter.

Kalman and Its Discontents

A Kalman filter can take in multiple streams of noisy data from different sorts of sensors and combine them into a single, less-noisy model of the system under observation. It does this, roughly speaking, by maintaining three internal data sets: a current estimate of the state of the system, a “dead reckoning” model—usually based on physics—for predicting the next state of the system, and a table rating the credibility of each input.

On each cycle, the Kalman filter assembles the sensor data and uses it to create a provisional estimate of the system state: for example, the locations and velocities of the objects surrounding your car. Simultaneously, the filter creates a second estimate by applying the dead-reckoning model to the previous state: the other cars should have moved to here, here, and here, the pedestrian should have walked that far, and the trees should have stayed where they were. Next, the filter compares the two state estimates, and taking into account the credibility ratings of the inputs, updates the previous state with a new best estimate: here’s where I think everything is really. Finally, the Kalman filter sends the new state estimate to the analysis software so it can be evaluated for potential hazards, and it updates its sensor-credibility table to make note of any questionable inputs.
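The loop just described maps naturally onto the standard Kalman predict/update equations. Below is a minimal one-dimensional sketch, assuming a constant-velocity motion model and two position sensors of unequal trust (standing in for camera and radar); all noise figures are invented for illustration:

```python
import numpy as np

# Minimal 1-D Kalman filter sketch of the fusion loop described above: a
# constant-velocity "dead reckoning" model predicts the next state, and two
# unequally trusted sensors correct it. Noise figures are illustrative.
dt = 0.1                                  # sensor cycle, seconds
F  = np.array([[1.0, dt], [0.0, 1.0]])    # physics model: x' = x + v*dt
Q  = np.diag([0.01, 0.01])                # process (model) noise
H  = np.array([[1.0, 0.0]])               # both sensors measure position only
R  = {"camera": 4.0, "radar": 0.25}       # "credibility table": radar trusted more

x = np.array([0.0, 10.0])                 # initial guess: 0 m away, 10 m/s closing
P = np.eye(2)                             # uncertainty of that guess

def step(x, P, readings):
    # 1. Dead-reckoning prediction from the previous state.
    x = F @ x
    P = F @ P @ F.T + Q
    # 2. Fold in each sensor, weighted by its credibility (R).
    for sensor, z in readings.items():
        S = H @ P @ H.T + R[sensor]          # innovation covariance
        K = P @ H.T / S                      # Kalman gain
        x = x + (K * (z - H @ x)).ravel()    # blend prediction and measurement
        P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
for k in range(1, 6):
    true_pos = 10.0 * k * dt
    readings = {"camera": true_pos + rng.normal(0, 2.0),
                "radar":  true_pos + rng.normal(0, 0.5)}
    x, P = step(x, P, readings)
    print(f"t={k*dt:.1f}s  fused position estimate: {x[0]:6.2f} m")
```

Here the per-sensor entries in R play the role of the credibility table: the filter automatically leans harder on the radar. Adapting R online, as the article describes, is what lets a filter down-weight sensors that have recently produced questionable readings.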

The good news is that the Kalman filter can assemble a stable and accurate model of the outside world despite intermittent readings, high noise levels, and a mix of very different kinds of sensor data. But there are issues, too. Kalman filters working with high-definition (HD) video inputs can consume huge amounts of computing power, and the analytic routines they enable can take far more, as suggested in Figure 2. “Algorithm development is already ahead of silicon performance,” Jentz noted. “There is basically an unlimited demand for performance.”

Figure 2. Sensor fusion concentrates many heavy algorithms and network terminations on one chip.


There is another issue with important system implications. While Kalman filters are inherently tolerant of noise, they cannot be immune to it. And variations in the latency between the sensors and the ECU—particularly if the variation is large enough for samples to arrive out of order—appear as noise. Such latency variations can cause the filter to reduce its reliance on some sensors, or to ignore altogether information that could have made a vital difference.
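One common mitigation, sketched below, is a reorder buffer: hold time-stamped samples until an assumed jitter window has passed, then release them in timestamp order. The class shape and window length are illustrative, and the buffering itself is the memory cost the article returns to later:

```python
import heapq

# Illustrative sketch of a reorder buffer: time-stamped samples are held
# until no earlier-stamped sample can still arrive (the jitter window),
# then released in timestamp order. Window length is an assumed bound.
class ReorderBuffer:
    def __init__(self, jitter_window: float):
        self.window = jitter_window
        self.heap: list[tuple[float, str]] = []   # (timestamp, payload)

    def push(self, timestamp: float, payload: str) -> None:
        heapq.heappush(self.heap, (timestamp, payload))

    def pop_ready(self, now: float):
        # Release samples old enough that nothing earlier is still in flight.
        while self.heap and self.heap[0][0] <= now - self.window:
            yield heapq.heappop(self.heap)

buf = ReorderBuffer(jitter_window=0.02)   # 20 ms assumed jitter bound
buf.push(0.010, "radar frame 1")
buf.push(0.005, "camera frame 1")         # arrived late, stamped earlier
print(list(buf.pop_ready(now=0.040)))     # emitted in timestamp order
```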

This is important because of trends in vehicle network architectures. Purpose-built control networks such as the controller-area network (CAN) or the perhaps-emerging FlexRay network can limit jitter and guarantee delivery of packets carrying some sensor data, although they may lack the bandwidth for even compressed HD video. In principle, system designers could calculate the bandwidth they need for a given maximum jitter, and then provision the system with enough network links to meet the need, even if that resulted in dedicated CAN segments for each camera and radar receiver. But in practice, automotive manufacturers are headed in a different direction: cost control.
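To see why dedicated segments do not scale, consider a rough calculation. The 1 Mbit/s figure is the classical high-speed CAN limit; the per-camera stream figure is the assumed compressed-HD estimate from the earlier sketch:

```python
# Rough provisioning arithmetic for dedicated control-network links.
# Classical high-speed CAN tops out at 1 Mbit/s; the camera figure is
# the assumed compressed-HD estimate from the earlier sketch.
CAN_BITRATE   = 1_000_000    # bits/s
CAMERA_STREAM = 15_000_000   # bits/s per compressed HD camera (assumed)
CAMERAS       = 8

links_needed = -(-CAMERA_STREAM * CAMERAS // CAN_BITRATE)  # ceiling division
print(f"dedicated CAN segments required: {links_needed}")  # 120 -- impractical
```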

“The direction is Ethernet everywhere in the car,” argued panelist Ali Abaye, senior director of product marketing at Broadcom. Abaye said that as the number of sensors increases, cost-averse manufacturers—including the high-end brands—are trying to collapse all their various control, data, and media networks onto a single twisted-pair Ethernet running at 100 Mbits/s or 1 Gbit/s.

But a shared network raises the latency issue again. Because Ethernet creates delivery uncertainties, some sort of synchronizing protocol—IEEE 1588, Time-Triggered Protocol (TTP), or Audio Video Bridging (AVB)—would appear necessary. “This is still an active discussion,” Schirrmeister said. “The existing protocols are not yet sufficient for everything these systems need to do.” Abaye, who has 100 Mbit/s transceivers to sell, is more confident. “Our opinion is that the AVB protocol is sufficient,” he stated.
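All three candidates rest on precise time distribution; AVB's timing layer, IEEE 802.1AS, is itself a profile of IEEE 1588. The sketch below shows the offset calculation at the core of that scheme, assuming a symmetric path delay; the timestamps are invented for illustration:

```python
# Sketch of the clock-offset calculation at the heart of IEEE 1588-style
# synchronization: two timestamp exchanges let a node estimate its offset
# from the master clock, assuming the path delay is symmetric.
def ptp_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """t1: master sends Sync; t2: slave receives it;
    t3: slave sends Delay_Req; t4: master receives it."""
    return ((t2 - t1) - (t4 - t3)) / 2.0

# Invented timestamps: path delay 10 ms, slave clock 5 ms fast.
offset = ptp_offset(t1=100.000, t2=100.015, t3=100.020, t4=100.025)
print(f"slave clock offset: {offset * 1e3:+.1f} ms")  # +5.0 ms
```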

These debates will have system implications well beyond the cost of cabling. Gigabit Ethernet implies silicon at advanced process nodes, where cost, availability, and soft-error rates all become real questions. Synchronizing protocols are not exactly lightweight, implying the need for more powerful network adapters. And the need to store and possibly reorder frames of time-stamped data from many sensors could increase memory footprints.

 

A Multibody Problem

As a final point, when you put radar or scanning lasers into the ADAS architecture, you get a fascinating side-effect. The ADAS on nearby vehicles can now interact with each other. This could lead to sensor interference, or even to an unstable multivehicle system in which two cars hazard-avoid right into each other. This is not a whimsical concern: there are hazard-avoidance algorithms that, when used by multiple vehicles in the same traffic stream, are known to lead inevitably to crashes.
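A deliberately tiny toy model, not any real ADAS algorithm, can show how identical deterministic dodge rules deadlock two symmetric vehicles; everything below is invented for illustration:

```python
# Toy model only: two vehicles approach head-on and each runs the *same*
# deterministic dodge rule, "move toward whichever side of the other
# vehicle has more room; on a tie, go up." Because the rule is identical
# and the situation symmetric, the cars mirror each other forever and
# stay on a collision course -- the instability described above.
def dodge(my_y: float, other_y: float) -> float:
    if my_y > other_y:
        return my_y + 1.0   # room above: keep climbing
    if my_y < other_y:
        return my_y - 1.0   # room below: keep diving
    return my_y + 1.0       # tie: both choose "up"

a_y = b_y = 0.0
for step in range(5):
    a_y, b_y = dodge(a_y, b_y), dodge(b_y, a_y)   # simultaneous decisions
    print(f"step {step}: lateral gap = {abs(a_y - b_y):.0f}")
# The gap stays 0 at every step: both cars dodge the same way, every time.
```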

“There has already been some research into the behavior of multi-ADAS systems,” Schirrmeister said. “It is an area of continuing interest.”

Such questions will almost certainly involve regulatory agencies in North America and the European Union in the design of ADAS algorithms at some level. Schirrmeister speculated that in developing countries, where cities can spring up and create all-new infrastructure as they go, there may be a move to coordinate ADAS evolution with the development of smart highways.

In any case, it is clear that verification of these systems will involve a significant degree of full-system, and perhaps multisystem, modeling. These will be huge tasks, going well beyond the experience of most system-design teams outside the military-aerospace community.

We have traced the evolution of one automotive system, ADAS, from a set of isolated control loops to a centralized sensor-fusing system. Other systems in the car will follow the same evolutionary path. Then the systems will begin to merge: ADAS, for example, working with the engine-control and traction systems, can bypass the driver altogether and maneuver the car away from trouble. The endpoint is an autonomous vehicle—and a network of intelligent control systems of stunning complexity built around a centralized model of the outside world.

 
