Large MUAP Models (LMM): AI-powered neural gesture technology enabling personalized, intuitive interactions for the AI and XR era
Wearable Devices Ltd. (the “Company” or “Wearable Devices”) (Nasdaq: WLDS, WLDSW), an award-winning pioneer in artificial intelligence (“AI”)-based wearable gesture control technology, is proud to announce a groundbreaking advancement in human-computer interaction: Large MUAP Models (“LMM”). Building on the success of large language models (LLMs) in natural language processing, Wearable Devices is actively developing LMMs with the goal of revolutionizing how we interact with digital devices by offering personalized, intuitive gesture control powered by neural data.
While still in development, this innovative technology, as previously announced, holds immense potential to redefine human-device interaction.
The LMM Revolution: Decoding the Neural Alphabet
Just as LLMs unlocked the power of language for AI, LMMs aim to unlock the power of neural gestures for seamless, natural interactions. By decoding Motor Unit Action Potentials (MUAPs), the body’s language for communicating with muscles, Wearable Devices has created a new paradigm for gesture control. LMMs harness large-scale neural data to enable devices to understand and predict user intentions with unprecedented speed and precision, making interactions faster and more intuitive than ever before.
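To make the idea concrete, the sketch below shows, in simplified form, what decoding wrist signals into a gesture can look like: a short window of multi-channel sensor data is summarized into features and mapped to a gesture label. The gesture names, feature choices, and linear scorer are illustrative assumptions for this example, not Wearable Devices’ actual LMM architecture or API.

```python
import numpy as np

# Hypothetical gesture labels; the real gesture set is not specified in the announcement.
GESTURES = ["rest", "thumb_swipe", "pinch", "fist"]


def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one (channels, samples) signal window with simple per-channel statistics.

    Mean absolute value, root-mean-square, and zero-crossing count are common
    surface-signal features; whether LMMs use them is an assumption for illustration.
    """
    mav = np.mean(np.abs(window), axis=1)
    rms = np.sqrt(np.mean(window ** 2, axis=1))
    zero_crossings = np.sum(np.diff(np.sign(window), axis=1) != 0, axis=1)
    return np.concatenate([mav, rms, zero_crossings])


def classify(window: np.ndarray, weights: np.ndarray, bias: np.ndarray) -> str:
    """Score each gesture with a linear model (a stand-in for the learned LMM decoder)."""
    scores = weights @ extract_features(window) + bias
    return GESTURES[int(np.argmax(scores))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    channels, samples = 3, 200  # e.g. a small wrist sensor array, one short window
    window = rng.standard_normal((channels, samples))
    weights = rng.standard_normal((len(GESTURES), 3 * channels))  # untrained placeholder
    bias = np.zeros(len(GESTURES))
    print(classify(window, weights, bias))
```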
Personalized Gestures for a Natural User Experience
At the heart of LMMs is personalization. The technology learns from individual users, creating a unique neural profile that will enable gestures tailored to each person’s natural movements. Whether it’s a subtle thumb swipe to select an option or a pinch-to-zoom gesture in augmented reality, LMMs will make interactions feel effortless and intuitive. “With LMMs, we are decoding the neural alphabet, potentially unlocking a strategically vital technology that fuses human neurology with AI. This breakthrough has the potential to create sci-fi-like superhuman abilities, giving a fundamental edge to whoever masters it first,” said Guy Wagner, Chief Scientific Officer of Wearable Devices.
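The personalization described above can be pictured as a short enrollment step: the wearer performs each gesture a few times, and the system stores a per-user profile it later matches live signals against. The minimal sketch below illustrates that idea with per-gesture feature centroids; the function names, gesture labels, and nearest-centroid matching are assumptions for illustration, not the Company’s implementation.

```python
import numpy as np

def build_profile(calibration: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    """Average each gesture's enrollment feature vectors into one per-user centroid."""
    return {gesture: samples.mean(axis=0) for gesture, samples in calibration.items()}


def predict(features: np.ndarray, profile: dict[str, np.ndarray]) -> str:
    """Return the gesture whose stored centroid is closest to the live feature vector."""
    return min(profile, key=lambda gesture: np.linalg.norm(features - profile[gesture]))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_features = 9
    # Three repetitions per gesture during a short enrollment session (synthetic data).
    calibration = {
        "thumb_swipe": rng.normal(0.0, 1.0, size=(3, n_features)),
        "pinch": rng.normal(2.0, 1.0, size=(3, n_features)),
    }
    profile = build_profile(calibration)
    live = rng.normal(2.0, 1.0, size=n_features)  # resembles this user's "pinch"
    print(predict(live, profile))
```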
Wearable Devices’ flagship products, such as the Mudra Band for Apple Watch and the Mudra Link for universal device control, are already demonstrating the power of neural interfaces. These devices allow users to control their digital environments with simple, natural gestures. LMMs have the potential to make this technology personalized to each user, paving the way for a future in which wearable technology is seamlessly integrated into our daily lives.
The Future of AI and XR: Powered by Neural Gestures
As spatial computing becomes the next computing platform, LMMs will provide the intuitive, natural interactions needed to unlock its full potential. Wearable Devices is focused on developing this technology and plans to seek collaboration with leading companies to integrate LMMs into next-generation extended reality (XR) platforms, ensuring that users can interact with their digital environments in ways that feel as natural as moving their hands.
“The future of XR and AI interactions is here, and it starts with your wrist,” added Mr. Wagner. “With LMMs, we are not just imagining the future—we are building it.”