
Modern robots are highly capable in structured environments but struggle with unstructured tasks that require a delicate touch, such as grasping irregular objects or performing fine manipulations. Traditional robotic grippers rely primarily on vision, which is insufficient for dynamic or contact-rich interactions.
This technology introduces an AI-driven tactile intelligence platform coupled with tactile-sensing robotic fingers that can perceive and interpret contact pressure, texture, and shape in real time. By integrating advanced tactile sensors with a foundation model trained on tactile data, the platform enables robots to feel and adapt their actions with human-like precision.
The technology owner is seeking adopters and collaborators such as robotics OEMs, automation system integrators, healthcare robotics developers, and deep-tech companies working on sensors, embedded systems, or AI analytics. Institutes of Higher Learning and research centres specializing in robotics or tactile perception are also key partners. These groups can leverage the platform to enhance robotic dexterity, precision, and safety across industrial, service, assistive, and manufacturing applications—particularly where delicate handling and high-fidelity tactile sensing are critical.
The platform combines compact, non-optical tactile sensors with an AI foundation model for real-time interpretation and autonomous adaptation. This technology provides faster response, greater durability, and AI-driven tactile analytics (rather than fixed feedback) that continuously learn across objects and tasks—delivering smarter, more adaptable robotic manipulation.
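To illustrate the kind of real-time interpretation described above, the sketch below reduces a raw tactile pressure frame to simple contact features (peak pressure, contact area, pressure-weighted centroid) of the sort a tactile model might consume. All names, thresholds, and data shapes here are illustrative assumptions, not the platform's actual API.

```python
# Hypothetical sketch: summarising one tactile pressure frame.
# The threshold and feature set are assumptions for illustration only.

CONTACT_THRESHOLD = 0.05  # normalised pressure below this is treated as noise


def summarise_frame(frame):
    """Reduce a 2-D pressure grid (rows of floats in [0, 1]) to basic
    contact features: peak pressure, contact area, weighted centroid."""
    peak = 0.0
    area = 0
    cx = cy = total = 0.0
    for y, row in enumerate(frame):
        for x, p in enumerate(row):
            peak = max(peak, p)
            if p >= CONTACT_THRESHOLD:
                area += 1
                total += p
                cx += x * p  # pressure-weighted x coordinate
                cy += y * p  # pressure-weighted y coordinate
    centroid = (cx / total, cy / total) if total else None
    return {"peak": peak, "area": area, "centroid": centroid}
```

In a real deployment these low-level features would feed the foundation model rather than be used directly; the sketch only shows the shape of the sensing-to-interpretation step.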
Key Applications
Industrial Robotics
Applications in industrial robotics include automated assembly, sorting, and material handling of fragile or irregular objects.
Relevant products: Smart robotic fingers and grippers; tactile AI control modules for industrial robotic arms.
Healthcare & Assistive Robotics
In healthcare and assistive robotics, the technology supports surgical aids, rehabilitation robots, and prosthetic devices that require safe, compliant, and highly sensitive touch. It enhances patient safety, dexterity, and human–robot interaction in medical environments.
Relevant products: Adaptive prosthetic or rehabilitation devices; smart robotic fingers integrated into assistive tools.
Service Robotics
Service robotics—such as food handling, retail assistance, and hospitality robots—benefit from adaptive gripping capabilities and tactile sensing for safe interaction with diverse objects and customers.
Relevant products: Smart robotic fingers and grippers for food-service robots; tactile AI modules for autonomous service systems.
Logistics & Warehousing
In logistics and warehousing, tactile-enabled manipulation supports efficient pick-and-place automation for e-commerce fulfilment and packaging. The technology improves accuracy when handling varied packaging materials and irregular items.
Relevant products: Smart robotic grippers for parcel handling; tactile AI control modules for automated picking systems.
Research & Education
For research and education, the technology provides tactile perception tools and AI training datasets valuable for advancing human–robot interaction, manipulation research, and foundation model development.
Relevant products: Tactile data foundation model licensing for robotics OEMs; research-grade tactile sensor modules and datasets.
Unlike conventional robotic grippers that rely mainly on vision, this technology provides true tactile sensing and AI-driven interpretation of touch. It allows robots to understand what they are holding — not just detect that they are touching something.
This technology offers real-time tactile feedback for adaptive grasping and slip prevention, powered by an AI foundation model that learns transferable tactile representations across objects and tasks. It is compatible with both rigid and soft robotic systems and, because it needs no cameras or external sensors, operates reliably regardless of lighting conditions. The scalable data platform further enhances performance by continuously improving model accuracy across deployments.
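The slip-prevention behaviour described above can be sketched as a simple feedback loop: if the measured contact pressure drops sharply between samples while an object should be held (a crude slip cue), the controller steps the grip force up, capped at a safe maximum. Every value and function name here is an illustrative assumption, not the platform's real control interface.

```python
# Hypothetical slip-prevention loop. Thresholds, step size, and force
# limits are placeholder assumptions for illustration only.


def regulate_grip(pressure_samples, initial_force=1.0,
                  slip_drop=0.1, step=0.2, max_force=3.0):
    """Return the grip force applied after each pressure sample.

    A drop larger than `slip_drop` between consecutive samples is
    treated as incipient slip and triggers a force increase."""
    force = initial_force
    forces = []
    prev = None
    for p in pressure_samples:
        if prev is not None and prev - p > slip_drop:
            force = min(force + step, max_force)  # tighten, but stay safe
        forces.append(force)
        prev = p
    return forces
```

A production system would replace the single-threshold cue with the learned tactile representations the brief describes; the loop only conveys why real-time tactile feedback enables adaptive grasping at all.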