AI: a ring that translates sign language into words

By Dr. Kyle Muller

Thanks to AI, a ring translates sign language into words as the fingers form the letters of the manual alphabet (for now, only the English one).

The main problem for a deaf-mute person is making themselves understood. People with this kind of communicative deficit, while they understand the world around them from an early age, struggle to convey their thoughts to others, and often their only available means is sign language, which is incomprehensible to most people. To address this problem, a research team at Cornell University has developed SpellRing, a ring that combines artificial intelligence and micro-sonar technology to track the fingers in real time as they form the letters of the manual alphabet of American Sign Language (ASL). This innovative device could revolutionize communication for people with hearing and speech impairments by translating simple hand gestures into words.

One ring. Unlike other bulky systems covered with wires and sensors, SpellRing looks like a very light ring worn on the thumb, but inside it houses a microphone and a speaker capable of emitting and receiving imperceptible sound waves and of mapping the hands and fingers.

A small gyroscope records the rotations of the wrist, while a deep-learning algorithm analyzes the sonar images, translating the letters of so-called "fingerspelling" (the manual alphabet) in real time. All this technology is enclosed in a 3D-printed shell the size of a coin. The research team's goal was precisely to create a device that is accurate as well as comfortable and intuitive to use, responding to the community's real needs.
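The pipeline described above can be sketched very roughly in code. This is a minimal, hypothetical illustration, not the Cornell team's implementation: their system runs a trained deep-learning model on sonar images, whereas here a randomly initialized linear layer stands in for the model, and `classify_frame` and the 64-sample echo profile are invented for the example.

```python
import numpy as np

LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"  # the 26 ASL fingerspelling letters

rng = np.random.default_rng(0)
W = rng.normal(size=(len(LETTERS), 64))  # stand-in for trained model weights
b = np.zeros(len(LETTERS))

def classify_frame(echo_profile: np.ndarray) -> str:
    """Map one 64-sample sonar echo profile to a fingerspelled letter."""
    logits = W @ echo_profile + b
    # softmax over the 26 letter classes, then pick the most likely one
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return LETTERS[int(np.argmax(probs))]

frame = rng.normal(size=64)  # fake echo profile standing in for real sonar data
print(classify_frame(frame))  # prints one letter A-Z
```

In the real device this classification step would run continuously on successive sonar frames, fused with the gyroscope's wrist-rotation signal, to stream out letters as the user spells.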

Tests and results. To evaluate its effectiveness, the researchers recruited 20 people, both expert and novice signers, who spelled more than 20,000 words of varying complexity. The results? An accuracy rate between 82% and 92%, comparable to much more complex and bulky systems. Achieving this performance was not easy: the main challenge was training the AI to recognize the 26 finger configurations corresponding to the letters of the English alphabet, taking into account that each user, to make movements more fluid and faster, slightly alters the shapes of the letters and the speed of the gestures depending on habit and context. This flexibility required specific training of the model, which at the end of the learning process had to correctly interpret even the subtlest variations.
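As a side note on what an "accuracy rate" means here, a letter-level accuracy metric can be computed as correct letters divided by total letters. This is a hedged illustration, not the authors' evaluation code, and `letter_accuracy` is a name invented for the example.

```python
def letter_accuracy(predicted: str, reference: str) -> float:
    """Fraction of positions where the recognized letter matches the intended one."""
    assert len(predicted) == len(reference)
    correct = sum(p == r for p, r in zip(predicted, reference))
    return correct / len(reference)

# One wrong letter out of five gives 80% accuracy:
print(letter_accuracy("HELPO", "HELLO"))  # → 0.8
```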

Accessible future. The research team's next goal is to integrate micro-sonar technology into smart glasses, in order to also capture torso movements, facial expressions and head gestures, essential elements of ASL grammar.

A further development could lead to the translation of whole sentences, making communication more fluid and natural, and then, in turn, to training the ring on other sign languages (there are as many as 121 in the world). The researchers themselves admit that the road to a complete translation of sign language is still long, but they are confident that, in the coming years, it will lead to breaking down these linguistic barriers entirely.

About the author
Dr. Kyle Muller
Dr. Kyle Mueller is a Research Analyst at the Harris County Juvenile Probation Department in Houston, Texas. He earned his Ph.D. in Criminal Justice from Texas State University in 2019, where his dissertation was supervised by Dr. Scott Bowman. Dr. Mueller's research focuses on juvenile justice policies and evidence-based interventions aimed at reducing recidivism among youth offenders. His work has been instrumental in shaping data-driven strategies within the juvenile justice system, emphasizing rehabilitation and community engagement.