This year’s InCabin Europe conference in Barcelona signaled a pivotal transformation in how the automotive industry perceives and safeguards the vehicle interior. The conversation around driver and occupant monitoring systems (DMS/OMS) has moved from exploring whether they will become standard to examining how quickly automakers can deploy production-ready technologies to meet Euro NCAP’s 2026 requirements and other global safety mandates.
Beyond regulatory compliance, OEMs are now seeking integrated, efficient ways to leverage in-cabin sensing data to enhance both safety and user experience. InCabin Europe, a niche and highly technical event dedicated to the in-cabin sensing ecosystem, offered a concentrated glimpse into the future of interior intelligence, from vital sign and presence detection to gesture-based interfaces and adaptive comfort systems. Within this landscape, HTEC, in collaboration with Tobii, D3 Embedded, and Texas Instruments, unveiled a breakthrough solution in single-camera and radar fusion.


The in-cabin monitoring solution developed by HTEC, Tobii, D3 Embedded, and Texas Instruments, featured in car #3 at the conference.
Technologies shaping in-cabin intelligence this year
The camera and radar fusion technology drew considerable attention from industry leaders at the conference, signaling its potential to become the next generation of in-cabin sensing. Still in a nascent stage, this approach was demonstrated by only a few companies at InCabin. Among them, the Tobii–HTEC–D3 Embedded–Texas Instruments solution stood out as the most mature and production-ready, showcasing how high-performance sensing can now be achieved on a cost-effective chipset.
By combining the strengths of both modalities, camera–radar fusion improves system robustness and reliability across key safety and HMI functions, including drowsiness detection, presence detection, micro-motion tracking (such as breathing), gesture recognition, and occupant monitoring. It enables reliable sensing even in low-light, obstructed, or complex cabin conditions where camera-only approaches often struggle.

Camera-radar fusion solution debuted at the conference.
Meanwhile, Edge AI is establishing itself as the default architecture for in-cabin intelligence, enabling real-time processing that addresses latency, privacy, and reliability without cloud dependency.
Radar and camera in-cabin fusion by Tobii, HTEC, D3 Embedded and Texas Instruments
The joint Tobii–HTEC–D3 Embedded–Texas Instruments technology unveiled at InCabin Europe marked a significant milestone in the evolution of interior sensing. It represents the first integration of single-camera and radar for full-cabin monitoring on Texas Instruments’ automotive-grade platform. Designed to help automakers meet upcoming Euro NCAP and global safety requirements, the system delivers real-time driver and occupant monitoring with a level of precision and reliability that sets a new benchmark for in-cabin sensing.

Panel discussion at InCabin with the experts from the four companies that created the solution: Texas Instruments, HTEC, D3 Embedded and Tobii
At its core is Texas Instruments’ TDA4VEN processor, part of the Jacinto™ 7 family, a platform built specifically for demanding automotive AI workloads. Its architecture combines dedicated AI accelerators (up to 4 TOPS) with ARM® CPUs, vision accelerators, and DSPs, enabling complex deep learning models to run efficiently under strict performance and latency constraints. This ensures ultra-low-latency decision-making even in edge conditions, a requirement for safety-critical in-cabin applications.
The visual sensing layer, powered by the STMicroelectronics VG5761 image sensor, captures detailed imagery with a wide dynamic range and reliable performance in low-light conditions. Complementing this is the Texas Instruments AWRL6844 mmWave radar sensor, operating in the 60 GHz band, which detects occupant presence, micro-movements, and vital signs even in complete darkness or when the line of sight is obstructed.
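To make the radar’s contribution concrete, the sketch below shows the general principle behind radar-based breathing detection: periodic chest motion appears as a small oscillation in the phase of a radar range bin, and the dominant frequency of that oscillation maps to a breathing rate. This is a minimal, hypothetical Python illustration, not the algorithm running on the AWRL6844; the function name, the 0.1–0.6 Hz band, and the simulated signal are assumptions for demonstration only.

```python
import numpy as np

def estimate_breathing_rate_bpm(range_bin_phase: np.ndarray, frame_rate_hz: float) -> float:
    """Estimate breaths per minute from the unwrapped phase of one radar range bin.
    Illustrative only: production pipelines add clutter removal, body-motion
    rejection, per-seat zoning, and tracking across frames."""
    phase = np.unwrap(range_bin_phase)
    phase = phase - phase.mean()                    # keep only the micro-motion component
    spectrum = np.abs(np.fft.rfft(phase))
    freqs = np.fft.rfftfreq(phase.size, d=1.0 / frame_rate_hz)
    band = (freqs >= 0.1) & (freqs <= 0.6)          # typical adult breathing band (assumed)
    if not band.any():
        raise ValueError("capture window too short to resolve the breathing band")
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0

# 30 s of simulated chest motion at 0.3 Hz (18 breaths/min), 20 radar frames per second
t = np.arange(0, 30, 1 / 20)
simulated = 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)
print(estimate_breathing_rate_bpm(simulated, frame_rate_hz=20))  # ≈ 18 breaths/min
```

The same frequency-domain idea scales to per-seat zones on the radar point cloud, which is what allows presence and vital-sign cues to survive darkness or a blocked camera view.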
By fusing camera and radar, the system provides robust coverage of edge cases that current mainstream solutions often miss, including child-left-behind scenarios, intruder alerts, and occupants obscured by in-cabin elements. The radar effectively compensates for visual limitations, while the camera delivers the contextual understanding needed for advanced driver and occupant analysis. Together, they enable a safer, simpler, and more cost-efficient architecture for automakers seeking scalable solutions that bridge safety and comfort.
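How the two signals might be combined can be pictured with a simple decision-level (late) fusion rule for a child-presence check. The sketch below is hypothetical: the type names, fields, and the 0.8 confidence threshold are invented for illustration and are not taken from the joint solution.

```python
from dataclasses import dataclass

@dataclass
class CameraObservation:
    occupant_visible: bool    # person found in the seat's region of interest
    confidence: float         # detector confidence, 0.0 to 1.0
    view_degraded: bool       # low light or occlusion flagged by the camera pipeline

@dataclass
class RadarObservation:
    presence_detected: bool   # energy above the presence threshold in the seat zone
    breathing_detected: bool  # periodic micro-motion in the breathing band

def child_presence_alert(cam: CameraObservation, radar: RadarObservation) -> bool:
    """Decide whether to raise a child-left-behind alert for one rear seat.
    Radar carries the decision when the camera view is dark or blocked;
    the camera confirms presence when its signal is trustworthy."""
    if radar.presence_detected and radar.breathing_detected:
        return True           # living occupant, even with no usable image
    if cam.occupant_visible and cam.confidence > 0.8 and not cam.view_degraded:
        return True           # clean, confident camera detection
    return False

# A covered infant seat at night: the camera sees nothing, but the radar still alerts
print(child_presence_alert(
    CameraObservation(occupant_visible=False, confidence=0.1, view_degraded=True),
    RadarObservation(presence_detected=True, breathing_detected=True),
))  # True
```

Production systems typically fuse earlier, at the feature level, which is part of the early-versus-late fusion trade-off OEMs are weighing across vehicle segments, as discussed below.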
Key takeaways brought home by the HTEC experts
Feedback from our experts at InCabin Europe reflected a dynamic and rapidly evolving interior sensing landscape. One of the strongest impressions was that OEMs are exploring different deployment and fusion strategies for DMS and OMS solutions, depending on the vehicle segment—from high-end models adopting early or complex fusion architectures to mid- and lower-segment vehicles opting for later-stage or simplified integration. At the same time, some DMS/OMS providers remain cautious, waiting to see whether automakers will commit to multi-camera configurations before scaling investment in fusion-based systems.

In front of the Tobii–HTEC–D3 Embedded–Texas Instruments space at the conference
The conference brought together key players and decision-makers across the automotive sensing ecosystem, a valuable opportunity to assess the current state and trajectory of the field. Because InCabin was co-located with AutoSens Europe, there was a strong exchange of ideas between interior sensing, exterior perception, and ADAS development, underscoring how these once-separate domains are now converging toward holistic vehicle intelligence.
Several topics dominated technical and business discussions. The harmonization of global DMS/OMS regulations across Europe, the U.S., and Asia remains a challenge, prompting interest in scalable architectures that can flexibly adapt to regional standards. Validation datasets and independent testing were repeatedly cited as critical differentiators for earning OEM trust and regulatory approval.
Another strong theme was the use of synthetic data and simulation (digital twins) to augment real-world datasets, especially for testing Euro NCAP readiness, edge cases, and rare events that are difficult or unsafe to reproduce physically. Simulation-based validation is now moving from research into practice, showing a level of maturity and standardization not seen in previous years. Meanwhile, touchless HMI concepts, such as digital mirrors powered by DMS signals, illustrated how interior sensing can serve both safety and user experience, blending compliance with comfort.
From sensing to sense-making: The future of interior sensing and in-cabin intelligence
The next phase of in-cabin innovation will be defined by multi-modal fusion: systems that combine vision, radar, and soon audio or thermal inputs to move beyond detection toward interpretation. Instead of simply identifying driver states or occupant presence, vehicles will increasingly be able to understand context, recognizing not only who is inside, but how they are interacting with the environment. This deeper layer of perception will enable the car to respond intelligently, closing the gap between sensing and sense-making.
While today’s production-ready systems already fuse radar and camera inputs for reliable monitoring, the real value for OEMs lies in turning those insights into adaptive cabin features that enhance safety and personalize the driving experience. The industry’s trajectory is clear: toward a holistic interior experience where vehicles no longer just monitor, but continuously adapt comfort, HMI, and safety systems to real-time conditions. This marks a decisive shift from passive monitoring to meaningful in-cabin interaction—a space where HTEC’s expertise in embedded intelligence and system integration will continue to help automakers bridge technology maturity with production-scale deployment.
