Intel Labs Introduces SPEAR Open-Source Photorealistic Simulator

Intel Labs collaborated with the Computer Vision Center in Spain, Kujiale in China, and the Technical University of Munich to develop the Simulator for Photorealistic Embodied AI Research.

Scenes may be cluttered with objects that can be manipulated individually. Messy room configurations could serve as initial states for a cleaning task. Image courtesy of Intel.


To better serve the embodied artificial intelligence developer community, Intel Labs has collaborated with the Computer Vision Center in Spain, Kujiale in China, and the Technical University of Munich to develop the Simulator for Photorealistic Embodied AI Research (SPEAR). This simulation platform helps developers accelerate the training and validation of embodied agents across a variety of tasks and domains.

With its collection of photorealistic indoor environments, SPEAR applies to a range of household navigation and manipulation tasks. SPEAR aims to drive research and commercial applications in household robotics and manufacturing, including human-robot interaction scenarios and digital twin applications.

To create SPEAR, Intel Labs worked with a team of professional artists for over a year to construct a collection of handcrafted interactive environments. SPEAR features a starter pack of 300 virtual indoor environments with more than 2,500 rooms and 17,000 objects that can be manipulated individually. These interactive training environments use detailed geometry, photorealistic materials, realistic physics and accurate lighting. New content packs targeting industrial and healthcare domains will be released soon.
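
To make the idea of individually manipulable objects concrete, the sketch below shows how such a scene might be driven from a Python interface. This is a hedged illustration: the `spear` package, the `Env` class, the scene ID, and every method shown are assumptions made for the example, not SPEAR's documented API.

```python
# A minimal sketch of loading an interactive scene and perturbing one of its
# objects. NOTE: the "spear" package, the Env class, the scene ID, and every
# method below are hypothetical placeholders, not SPEAR's documented API.
import spear  # hypothetical package

config = spear.get_config(scene_id="example_apartment")  # hypothetical helper
env = spear.Env(config)                                  # hypothetical environment

env.reset()
objects = env.get_manipulable_objects()  # hypothetical query: chairs, mugs, etc.
print(f"Scene exposes {len(objects)} manipulable objects")

# Nudge one object and let the physics engine respond; a messy end state
# could then serve as the initial state for a cleaning task.
env.apply_force(objects[0], force=(0.0, 0.0, 5.0))  # hypothetical call
env.close()
```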

By offering diverse and realistic environments, SPEAR supports the entire development cycle of embodied AI systems and enables agents trained in simulation to operate directly in the real world. SPEAR helps to improve accuracy on many embodied AI tasks, especially traversing and rearranging cluttered indoor environments. SPEAR aims to decrease the time to market for household robotics and smart warehouse applications, and to increase the spatial intelligence of embodied agents.

Challenges in Training AI Systems

In the field of embodied AI, agents learn by interacting with the physical world around them. However, capturing and compiling these interactions into training data can be labor intensive. In response, the embodied AI community has developed interactive simulators, where robots can be trained and validated in simulation before being deployed in the physical world.
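
The interaction pattern such simulators implement is the standard agent-environment loop popularized by OpenAI Gym. The sketch below shows that loop using the Gymnasium library, with the built-in CartPole task standing in for a photorealistic indoor environment and a random action standing in for a learned policy.

```python
# The canonical train-and-validate-in-simulation loop (Gymnasium convention).
# CartPole-v1 is a toy stand-in for a photorealistic indoor environment.
import gymnasium as gym

env = gym.make("CartPole-v1")
obs, info = env.reset(seed=0)

for _ in range(1000):
    action = env.action_space.sample()  # placeholder for a learned policy
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:         # episode over: start a new one
        obs, info = env.reset()

env.close()
```

In a photorealistic simulator the observation would typically be a rendered camera image rather than a low-dimensional state vector, but the loop itself is unchanged.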

While existing simulators have enabled rapid progress on increasingly complex and open-ended real-world tasks such as point-goal and object navigation, object manipulation, and autonomous driving, they have several limitations. Simulators that use artist-created environments typically provide a limited selection of unique scenes, such as a few dozen homes or a few hundred isolated rooms, which can lead to severe overfitting and poor sim-to-real transfer performance. Simulators that use scanned 3D environments, on the other hand, provide larger collections of scenes but offer little or no interactivity with objects.

Overview of SPEAR

SPEAR was designed around three main requirements: (1) support a collection of environments that is as large, diverse, and high-quality as possible; (2) provide enough physical realism to support realistic interactions with a wide range of household objects; and (3) offer as much photorealism as possible while maintaining enough rendering speed to support training complex embodied agent behaviors.
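
Requirement (3) amounts to a quality-versus-throughput trade-off: training favors fast, lower-fidelity rendering, while validation and sim-to-real evaluation favor full photorealism. The sketch below illustrates one way a developer might organize that trade-off; the profile names, keys, and values are assumptions for illustration, not SPEAR's actual configuration schema.

```python
# Illustrative quality-vs-speed rendering profiles. Every key and value here
# is a hypothetical example of the trade-off, not SPEAR's real configuration.
RENDER_PROFILES = {
    # Fast, lower-fidelity settings keep training throughput high.
    "training": {
        "resolution": (320, 240),
        "ray_tracing": False,
        "texture_quality": "low",
        "physics_substeps": 2,
    },
    # Higher-fidelity settings for validation and sim-to-real evaluation.
    "validation": {
        "resolution": (1280, 720),
        "ray_tracing": True,
        "texture_quality": "high",
        "physics_substeps": 8,
    },
}

def render_profile(phase: str) -> dict:
    """Return the rendering settings for a given development phase."""
    return RENDER_PROFILES[phase]

print(render_profile("training")["resolution"])  # (320, 240)
```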

Sources: Press materials received from the company and additional information gleaned from the company’s website.

About the Author

DE Editors

DE’s editors contribute news and new product announcements to Digital Engineering.
Press releases may be sent to them via DE-Editors@digitaleng.news.
