One of the clearest shifts in the embedded industry is that the question of whether AI belongs at the edge is no longer up for debate. Products are already proving that it does. This was evident at this year’s Embedded World Conference in Nuremberg, which felt notably quieter on hype and far more focused on working demos.
The conversations on the ground reflected that maturity. AI is becoming increasingly tangible, built directly into robots, sensors, and other devices that operate, sense, and interact with the physical world in real time.
At the same time, the regulatory landscape around connected, AI‑enabled devices is tightening. The EU Cyber Resilience Act introduces mandatory vulnerability reporting and incident notifications for connected devices, with reporting obligations taking effect in September 2026.
Here is what each of these areas looked like on the ground in Nuremberg.
Bringing AI into Embedded Systems
Edge AI and Physical AI dominated conversations at EW26. Yet, behind the demos and hardware announcements, engineering teams are facing a less glamorous reality. Bringing AI into resource-constrained embedded systems comes with hard tradeoffs, and we wanted to know where teams are feeling the most pressure. We asked visitors at our booth about their biggest challenges when bringing AI into embedded systems, and the results were telling.

Overall, the responses suggest that bringing AI into embedded systems is a full-stack challenge, involving hardware constraints, system integration, and model lifecycle considerations.
Robotics: Physical AI in Motion
As Angel Corona, Senior Embedded Software Engineer at HTEC, noted, the surge of robotics at EW26 was hard to miss:
“What stood out wasn’t just the volume of exhibits, but the tone of the conversations around them. Robotics is no longer discussed as a distant possibility. Combined with the maturity of edge AI, it is increasingly treated as something buildable, deployable, and within reach.”
One demo that captured this well was the humanoid robotic head built jointly by Infineon and HTEC. The system combined radar-based spatial awareness, time-of-flight depth sensing, and high-performance digital MEMS microphones for intelligent audio recognition – all processed locally, on the device. Beyond the sensing architecture, the demo highlighted the growing importance of how humanoid robots interact with the physical world. As the market accelerates, these robots hold the potential to become real helping hands across many sectors, from manufacturing and logistics to retail, hospitality, healthcare, agriculture, and even the home.

“What is making this shift possible is the convergence of sensing, compute, and software into increasingly compact, capable systems. Sensors give robots the context awareness they need to emulate human-like perception – vision, depth, hearing. Without that foundation, precise execution would break down. Bringing physical AI to life doesn’t happen in isolation. It requires trusted partnerships between companies that understand the importance of combining hardware and software, thus turning innovative ideas into real-world solutions.” – Nenad Belancic, Director, Head of Application Management Robotics at Infineon
The Infrastructure Opportunity
As AI capabilities accelerate, infrastructure is struggling to keep pace. Cloud-based solutions frequently cannot meet the latency demands of real-time and interactive applications. The gap between what AI can do and what teams can deploy efficiently remains one of the most pressing challenges in the field.
“What came through consistently in conversations on the floor was where the real friction lives: low-level optimization, compiler behavior, and the challenge of deploying models efficiently on hardware that was never designed with AI workloads in mind. We also observed a clear increase in defense-oriented applications, as well-funded startups and established players shift more capacity toward high-reliability, mission-critical use cases.” – Tihomir Andjelic, Director of Engineering and Delivery at HTEC.
Efficient infrastructure isn’t just what makes edge AI work. It’s also what keeps it economically viable at scale. We asked visitors where they see the most innovation happening in embedded systems today. Unsurprisingly, infrastructure and edge AI came out on top.

Standalone Accelerators Are No Longer Sufficient
The era of edge AI and Physical AI has only just begun, but one thing is already evident: standalone accelerators are no longer sufficient, and the need for comprehensive SoC solutions is growing. As AI workloads become more complex – especially with real-time, multimodal processing at the edge – integrating compute, memory, and ML-optimized acceleration on a single chip becomes critical.
At EW26, SiMa.ai and HTEC jointly showcased a demo built on SiMa.ai’s Modalix MLSoC platform, with HTEC delivering the production-grade AI software stack. It illustrated exactly this point: multimodal workloads – voice, text, and live camera in real time – running entirely on-device. The combination of 50 TOPS at 10 watts sparked some of the most engaged conversations at the booth.
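To put the quoted figure in perspective, performance per watt is the metric that decides whether a workload is viable at the edge. A minimal sketch of that arithmetic, using only the 50 TOPS / 10 W numbers from the demo (any other figures would be placeholders, not vendor specs):

```python
def tops_per_watt(tops: float, watts: float) -> float:
    """Compute efficiency as trillions of operations per second per watt."""
    return tops / watts

# Figures quoted for the Modalix MLSoC demo: 50 TOPS at 10 W.
edge_efficiency = tops_per_watt(50.0, 10.0)
print(f"Edge MLSoC: {edge_efficiency:.1f} TOPS/W")  # 5.0 TOPS/W
```

At a fixed power budget – say, a battery-powered robot head – this ratio, not peak TOPS, bounds how much inference the device can sustain.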

“The edge is where efficiency becomes performance. If you can run LLMs and VLMs within tight power and hardware limits, you’re not just optimizing — you’re enabling an entirely new domain of use cases.” – Seif Hanna, Machine Learning and Embedded AI Engineer, SiMa.ai.
Cybersecurity: A Non-Negotiable Design Requirement
As embedded systems become more connected and AI-enabled, the attack surface grows with them. Security can no longer be addressed at the end of the development cycle. The regulatory landscape is now making that official. The EU Cyber Resilience Act introduces mandatory vulnerability reporting and incident notifications for connected devices, with reporting obligations taking effect in September 2026 and full application of the Act required by December 2027.
For product teams, this means security by design is no longer a best practice. It has become a compliance obligation that touches hardware architecture, firmware, software, and lifecycle management alike.
Looking Ahead
EW26 made one thing clear: keeping pace means treating silicon, AI inference, cybersecurity, and system software not as separate workstreams but as a single problem to solve. Physical AI is becoming real. The regulatory clock on security is ticking. And the hardware to support scalable edge deployment is ready – it’s now on software to catch up.
None of this is being figured out in isolation. The most compelling work on display, from AI-enabled robotics to multimodal edge inference, was the product of close partnerships between hardware and software companies that have built genuine trust and shared technical depth. In a space this complex, that kind of collaboration isn’t a nice-to-have. It’s how things actually get built.
Get in touch to explore how HTEC can help you turn embedded complexity into production-ready solutions.