Indian-American Researchers Develop Multisensory Neuron

By siliconindia   |   Friday, 22 September 2023, 00:02 IST

Indian-American professor Saptarshi Das and his team have developed an artificial multisensory neuron that integrates tactile and visual cues, mimicking human sensory fusion and offering potential for advances in AI.

Research led by Indian-American professor Saptarshi Das applied concepts from biological sensing to AI, producing the first artificial multisensory integrated neuron.

Saptarshi Das, assistant professor of engineering science and mechanics at Penn State University, has joint appointments in electrical engineering and materials science and engineering.   

Robots make decisions by assessing their surroundings, but their sensors typically do not communicate with one another. One approach is to route every reading through a central processing unit for collective decision-making, though this is not the most efficient in energy or speed. Das notes that the interconnectedness of senses in the human brain lets one sense influence another, sharpening a person's judgment of a situation. For instance, a car might have one sensor scanning for obstacles while another senses darkness to modulate the intensity of the headlights. Individually, these sensors relay information to a central unit, which instructs the car to brake or adjust the headlights.
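
As a rough illustration of that centralized pattern, here is a minimal sketch in Python; the sensor names, thresholds, and units are hypothetical, not taken from any real system:

```python
# Illustrative only: two independent sensors report to one central unit,
# which makes all decisions; the sensors never talk to each other.

def central_decision_unit(obstacle_distance_m: float, ambient_light: float) -> dict:
    """Combine two independent sensor readings into driving actions."""
    return {
        # Brake if an obstacle is closer than an assumed 5 m safety margin.
        "brake": obstacle_distance_m < 5.0,
        # Raise headlight intensity as ambient light falls (0 = dark, 1 = bright).
        "headlight_level": 1.0 - ambient_light,
    }

# A nearby obstacle on a dark road triggers braking and bright headlights.
print(central_decision_unit(obstacle_distance_m=3.2, ambient_light=0.1))
# {'brake': True, 'headlight_level': 0.9}
```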

This process consumes more energy; letting sensors communicate directly with each other can be more efficient in both energy and speed, especially when the inputs from both are faint. Das points out that biology enables small organisms to thrive in environments with limited resources, minimizing energy consumption in the process.

The demands on different sensors also depend on context. In a dark forest, for instance, a person relies more on hearing than on sight. Yet people rarely base decisions on a single sense; they form a complete picture of their surroundings, and their decisions rest on integrating what they see, hear, touch, or smell. These senses evolved together in biology but remain separate in AI. Das says that he and several other scientists are looking to combine sensors and mimic how the brain works. The team concentrated on integrating a tactile sensor and a visual sensor so that each could influence the other's output through visual memory.

Muhtasim Ul Karim Sadaf, a third-year doctoral student in engineering science and mechanics, states that even a short-lived flash of light can significantly improve the chance of moving successfully through a dark room, because the visual memory of the flash can later influence and assist the tactile responses used for navigation.

If the visual and tactile cortices responded only to their own unimodal cues, however, those chances would be far slimmer. What helps is a photo memory effect, in which the remembered location of the flash stays readily available; the researchers incorporated the same kind of response into a device through a transistor.
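
One way to picture that effect is the sketch below, where a flash leaves an exponentially decaying memory trace that boosts later tactile responses; the time constant and gain are invented for illustration and are not measurements from the device:

```python
import math

# Illustrative sketch of a decaying "photo memory" boosting touch responses.
# tau_s and the gain formula are assumptions, not device parameters.

def photo_memory(t_since_flash_s: float, tau_s: float = 2.0) -> float:
    """Memory trace left by a light flash, decaying exponentially with time."""
    return math.exp(-t_since_flash_s / tau_s)

def tactile_response(touch_strength: float, t_since_flash_s: float) -> float:
    """Tactile output amplified by whatever visual memory remains."""
    gain = 1.0 + photo_memory(t_since_flash_s)
    return gain * touch_strength

# The same weak touch evokes a stronger response just after the flash
# than it does ten seconds later, once the memory has faded.
print(tactile_response(0.5, t_since_flash_s=0.1))   # ~0.98
print(tactile_response(0.5, t_since_flash_s=10.0))  # ~0.50
```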

The researchers fabricated the multisensory neuron by connecting a tactile sensor to a phototransistor based on a monolayer of molybdenum disulfide, a compound that exhibits unique electrical and optical characteristics useful for detecting light and supporting transistors. The sensor produces electrical spikes that resemble how neurons process information, enabling it to combine visual and tactile cues.
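
The device's behavior can be pictured with a toy spiking model. The sketch below uses a generic leaky integrate-and-fire equation with invented constants, not the MoS2 device physics, to show how two weak cues that cannot trigger spikes alone can do so together:

```python
# Toy leaky integrate-and-fire model; an illustration, not the device.
# All constants (leak, threshold, input strengths) are invented.

def count_spikes(visual: float, tactile: float, steps: int = 100) -> int:
    v, threshold, leak = 0.0, 0.75, 0.9
    spikes = 0
    for _ in range(steps):
        v = leak * v + visual + tactile  # integrate both input currents
        if v >= threshold:               # fire and reset on crossing threshold
            spikes += 1
            v = 0.0
    return spikes

print(count_spikes(visual=0.05, tactile=0.0))   # weak light alone: 0 spikes
print(count_spikes(visual=0.0, tactile=0.05))   # weak touch alone: 0 spikes
print(count_spikes(visual=0.05, tactile=0.05))  # both together: 7 spikes
```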

It's akin to seeing the on-light of a stove while simultaneously sensing heat from a burner. Noticing the light doesn't guarantee that the burner is hot at that moment, but a hand needs only a split second of feeling heat before instinctively withdrawing from the danger.

In this instance, the inputs of both light and heat prompted signals that triggered the hand's reaction. The researchers measured the artificial neuron's emulation of this process by observing the signaling outputs it produced in response to visual and tactile input cues. To replicate touch input, the tactile sensor employed the triboelectric effect, in which two layers sliding against each other generate electricity, encoding the touch stimuli as electrical impulses.
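
As a simplified picture of that encoding, the sketch below maps a hypothetical contact-pressure trace to opposite-polarity impulses; it uses a contact-and-release simplification rather than the sliding arrangement described, and the pulse amplitudes are arbitrary:

```python
# Illustrative only: encode touch events as signed electrical impulses,
# one pulse on contact and one of opposite polarity on release.

def triboelectric_pulses(pressure_trace: list, threshold: float = 0.1) -> list:
    pulses, in_contact = [], False
    for p in pressure_trace:
        if not in_contact and p > threshold:
            pulses.append(+1.0)   # contact: charge flows one way
            in_contact = True
        elif in_contact and p <= threshold:
            pulses.append(-1.0)   # release: charge flows back
            in_contact = False
        else:
            pulses.append(0.0)    # no change in contact: no pulse
    return pulses

# Two taps yield two pairs of opposite-polarity impulses.
print(triboelectric_pulses([0, 0.5, 0.5, 0, 0, 0.7, 0]))
# [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, -1.0]
```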