An IIT Madras research team has developed an Artificial Intelligence (AI) technology that can convert the speech signals of people with speech impairments into language. In addition to helping people with speech impairments, the technology could also be used to interpret signals from nature, such as the photosynthesis process of plants or their response to an external stimulus.
The team was led by Dr. Vishal Nandigana, assistant professor in the Department of Mechanical Engineering, IIT Madras.
Electrical signals, brain signals, or any signals in general are waveforms that are decoded into meaningful information using physical laws or mathematical transformations such as the Fourier Transform or the Laplace Transform. These physical laws and mathematical transformations are science-based languages discovered by renowned scientists such as Sir Isaac Newton and Jean-Baptiste Joseph Fourier.
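To make the idea of "decoding a waveform" concrete, here is a minimal sketch of how a Fourier Transform extracts meaningful information from an electrical signal. The 5 Hz tone and sampling rate are illustrative values chosen for this example, not figures from the IIT Madras study:

```python
import numpy as np

# Synthesize one second of a 5 Hz sine wave sampled at 1 kHz.
# (Illustrative values only, not data from the study.)
fs = 1000                          # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)        # 1 second of sample times
signal = np.sin(2 * np.pi * 5 * t)

# The Fourier Transform turns the waveform into a spectrum;
# the largest spectral peak reveals the dominant frequency.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(dominant)  # -> 5.0
```

The same principle, applied to far noisier biological or ionic signals, is what turns a raw waveform into interpretable information.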
Explaining the research, Dr. Vishal Nandigana, the principal investigator, said: "The result is the ion current, which represents the flow of charged particles. These electrically driven ionic current signals are processed so that they can be interpreted as human language, meaning as speech. This would tell us what the ions are trying to communicate with us. If we succeed, we will get electrophysiological data from neurologists to decode the brain signals of speech-impaired people and learn what they are trying to communicate."
"The other important application of this field of research that we potentially see is that we can interpret signals from nature, such as the photosynthesis process of plants or their reaction to external forces, once we collect their real data signals. We think the data signal will also be a wave-shaped pattern with peaks, bumps and crests, so the major breakthrough would be that we can interpret what plants and nature are trying to communicate with us. This would help predict monsoons, earthquakes, floods, tsunamis and other natural disasters using our Artificial Intelligence and Deep Learning algorithms. If we understand nature's signals well, we can take good care of it, and that is the goal we want to take forward from our laboratory," added Dr. Nandigana.
IIT Madras researchers are working on how these real data signals can be decoded into a human language such as English, so that the signals can be interpreted in simple terms that everyone can understand.
Brain signals are typically electrical signals. These are wave-shaped patterns with peaks, bumps and crests that can be converted into simple human language using Artificial Intelligence and Deep Learning algorithms, meaning that they can be made to "speak". This enabled the researchers to read electrical signals directly from the brain.
They tested this concept by obtaining experimental electrical signals from laboratory experiments on nanofluidic transport within nanopores. The nanopores were filled with a saline solution, and transport was driven by an electric field.