MUSE

Interactions

January 2024 - January 2027

The MUSE project is a collaborative initiative to advance automotive technology through the development of a Multimodal Sensing Environment for mobile applications. It aims to integrate state-of-the-art artificial intelligence (AI) models with digital simulations, including digital twins, under strategic innovation domain 3 (DIS 3), which targets the modernization of technology to create tailored solutions for industrial needs.
At the heart of MUSE's collaborative effort are AISIN Europe, UMONS, MULTITEL and UCLouvain. The partners build on existing resources, such as a driving simulator developed by AISIN, which offers visual and audio modalities to explore human (driver) interactions with semi-autonomous vehicles and to train AI models for decision-making in driving and traffic monitoring.
MUSE's objective is to extend these simulation capabilities to further sensing modalities, such as lidar, radar, and infrared (IR) cameras, and to support the qualification of a wide array of sensors through simulation/validation loops in intelligent car development. The project will test the generality of its approach across three distinct use cases: data from vehicle passengers under various conditions and sensor types, external sensors providing obstacle and driving information, and V2X (vehicle-to-everything) communication data from infrastructure.
Ultimately, MUSE aims to deliver an iterative optimization loop for multimodal sensors, built on a realistic simulator that models both the environment and the sensor signals, together with AI decision-making modules. This deliverable is expected to be versatile enough to apply across different use cases in smart automotive technology. The project will also produce a commercially viable Decision Support Software (DSS) to optimize sensor selection for autonomous vehicles, based on the driving simulator's multimodal signal outputs and tailored by AISIN to their applications.

People