DAIMON Robotics Unleashes World’s Largest Tactile-Rich Dataset to Give Robots a Sense of Touch
Breaking: DAIMON Robotics Releases Daimon-Infinity Dataset
Hong Kong-based DAIMON Robotics has announced the release of Daimon-Infinity, the largest omni-modal robotic dataset for physical AI, featuring ultra-high-resolution tactile sensing. The dataset spans over 80 real-world scenarios and 2,000 human skills, from folding laundry to assembly line tasks.

“This is a pivotal moment for embodied AI,” said Prof. Michael Yu Wang, co-founder and chief scientist at DAIMON Robotics. “We are delivering the tactile intelligence that robots desperately need to interact with the physical world.” The dataset includes millions of hours of multimodal data, with 10,000 hours open-sourced to accelerate global research.
Background: The Tactile Revolution
DAIMON Robotics, founded two and a half years ago, specializes in advanced tactile sensor hardware. Its monochromatic, vision-based tactile sensor packs over 110,000 effective sensing units into a fingertip-sized module, enabling human-like touch resolution.
The company collaborates with Google DeepMind, Northwestern University, and the National University of Singapore. Using a distributed out-of-lab collection network, DAIMON can generate millions of hours of data annually, powering large-scale robot manipulation datasets.
Prof. Wang, a Carnegie Mellon PhD graduate and former editor-in-chief of IEEE Transactions on Automation Science and Engineering, has spent four decades in robotics. He pioneered the Vision-Tactile-Language-Action (VTLA) architecture, which elevates tactile feedback to a primary modality alongside vision.
What This Means for Robotics
The Daimon-Infinity dataset addresses the critical “insensitivity” of current robot manipulation models, which rely heavily on Vision-Language-Action (VLA) architectures that lack tactile input. By integrating high-resolution touch data, robots can achieve dexterous manipulation in unpredictable environments.

“Without tactile sense, robots drop objects, misapply force, and fail at tasks like handling fragile items,” Prof. Wang explained. “Our dataset enables learning how to grip, fold, and assemble with human-like precision.”
The open-source portion of the dataset is expected to lower barriers for startups and academic labs, fostering innovation in tactile sensing and embodied AI. Early applications include hotel service robots, convenience store automation, and factory assembly lines in China.
Industry Impact
- Accelerated R&D: Researchers can now train models on million-hour scale tactile data, reducing the need for expensive real-world trials.
- Real-world deployment: Touch-enabled robots will soon handle tasks like sorting packages, folding laundry, and assembling electronics.
- Collaborative push: Partnerships with Google DeepMind and top universities ensure global validation and adoption.
Prof. Wang summed up the urgency: “We are moving from brittle, vision-only robots to truly intelligent machines that feel their way through the world. The Daimon-Infinity dataset is the fuel for that transformation.”
For more details, visit DAIMON Robotics.