BrainTech Projects

Our scientific activity includes building novel devices and related algorithms that convey different types of sensory experience through our existing senses - such as conveying vision via audition to blind people, audition via touch to deaf people, and even thermal information or "backward vision" via audition or touch to the general population. We also develop special virtual reality environments in which we expose participants to body manipulations (e.g., experiencing the world upside-down) or in which participants can train on our algorithms for novel sensory experiences. Here are several projects we have developed:

The EyeMusic is a sensory substitution device (SSD) that conveys visual information via audition while preserving shape, color, and location. X-axis information is conveyed through time, such that visual details on the left are heard before those on the right. Y-axis information is conveyed via pitch: higher parts of the image are rendered as higher tones than lower parts. Colors are differentiated through musical instruments. The EyeMusic, like other SSDs, has rehabilitative potential for blind individuals, especially in the effort to create a low-resolution, whole, colored image in their brain at a resolution of up to 1,500 pixels.
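The mapping just described can be sketched in a few lines of code. Note that the instrument names, column timing, and two-semitones-per-row spacing below are illustrative assumptions, not the EyeMusic's actual parameters:

```python
# Hypothetical sketch of an EyeMusic-style image-to-sound mapping.
# Column index -> onset time (left is heard before right),
# row index -> pitch (higher image rows -> higher tones),
# color -> musical instrument. All parameters are illustrative.

COLOR_TO_INSTRUMENT = {"white": "choir", "blue": "trumpet", "red": "organ"}

def image_to_events(pixels, col_duration=0.1, base_freq=220.0, semitones_per_row=2):
    """pixels: 2D list of color names ('' = background), row 0 = top of image.
    Returns a list of sound events sorted left-to-right (i.e., by onset time)."""
    n_rows = len(pixels)
    events = []
    for col in range(len(pixels[0])):
        for row in range(n_rows):
            color = pixels[row][col]
            if not color:
                continue
            # Smaller row index = higher in the image = higher pitch.
            steps = (n_rows - 1 - row) * semitones_per_row
            freq = base_freq * 2 ** (steps / 12)      # equal-temperament step
            events.append({
                "time": col * col_duration,           # x-axis -> time
                "freq": round(freq, 1),               # y-axis -> pitch
                "instrument": COLOR_TO_INSTRUMENT.get(color, "piano"),
            })
    return events

demo = [
    ["blue", ""],
    ["", "white"],
]
for event in image_to_events(demo):
    print(event)
```

In this toy 2x2 image, the blue pixel (top-left) is heard first and at a higher pitch than the white pixel (bottom-right), which illustrates the time/pitch/instrument separation of the three visual dimensions.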


In yet another example, crickets have evolved to hear via their legs! Can we do something similar? In our Speech and Music to Touch project we are developing technology and training that improve hearing via touch with as little as one hour of training. This can help the hearing impaired, but also people with normal hearing trying to understand speech in a noisy environment or when the speaker's lips are hidden behind masks (did someone say COVID-19?).


In collaboration with the World Hearing Center in Warsaw, Poland, we developed and implemented a varied array of audio-to-touch sensory substitution devices. The first device delivers vibrotactile stimulation, representing certain features of sound such as the speech signal, to the fingertips - the part of the body with the densest representation of Pacinian corpuscles, tactile receptors coding vibration. We have already shown in 30 individuals that speech-in-noise understanding improves when the auditory signal is complemented with corresponding vibration. The device will be further extended to provide stimulation on other parts of the body, and is compatible with a 3T MRI scanner. Other devices include the TactileGlove, which can convert speech and music information (from fundamental frequency to sound envelope) into touch. We are also working on several new sound-to-touch devices, including a multisensory chair and a multisensory bed for music enhancement and neuro-wellness, and we are in the process of creating a special DOME FOR NEURO-WELLNESS.
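As a rough illustration of the kind of signal chain involved (an assumption for exposition, not our devices' actual algorithm), the sketch below extracts an amplitude envelope and a crude zero-crossing estimate of the fundamental frequency from one audio frame, then maps them to a vibration frequency and intensity:

```python
# Illustrative audio-to-touch sketch: the amplitude envelope drives
# vibration intensity, and a zero-crossing estimate of the fundamental
# frequency (F0) drives vibration frequency, clipped to a range a
# vibrotactile actuator can render. Parameters are illustrative.
import math

def envelope(frame):
    """Mean absolute amplitude of one frame (0..1 for normalized audio)."""
    return sum(abs(s) for s in frame) / len(frame)

def f0_zero_crossings(frame, sample_rate):
    """Crude F0 estimate: zero-crossing count / 2 per second.
    Works only for clean, voiced frames; real systems use robust trackers."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0))
    duration = len(frame) / sample_rate
    return crossings / (2 * duration)

def to_tactile(frame, sample_rate, f_min=50.0, f_max=300.0):
    """Return (vibration_frequency_hz, vibration_intensity 0..1)."""
    f0 = f0_zero_crossings(frame, sample_rate)
    return (min(max(f0, f_min), f_max), min(envelope(frame), 1.0))

# Synthetic 120 Hz "voiced" frame: 0.1 s at an 8 kHz sample rate.
sr = 8000
frame = [0.5 * math.sin(2 * math.pi * 120 * t / sr) for t in range(sr // 10)]
freq, intensity = to_tactile(frame, sr)
print(freq, intensity)
```

For this synthetic frame the estimated vibration frequency lands near the true 120 Hz fundamental, comfortably within the tactile range of the fingertip.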

We are creating a multisensory (visual, touch, audio, movement, and smell) 360-degree dome for neuro-wellness (in collaboration with the Simno company, https://www.simnoa.com, and Joy Ventures, https://www.joyventures.com). We then use all this knowledge and our unique facilities, such as the multisensory ambisonic room, to develop new multisensory technologies - for example, in the fields of rehabilitation and neuro-wellness (e.g., by using tools from certain senses, such as hearing and touch, to produce technologies that lower anxiety).


Finally, snakes have evolved a far-reaching thermal infrared sense. In our ERC project, How Experience Shapes the Human Brain: NovelExperieSENSE, 2019-2023 (https://cordis.europa.eu/project/id/773121), we ask whether humans can develop such superhero-like abilities within weeks. Other ongoing projects for which we are currently recruiting, or which are open for collaboration, include: BEASTSENSE - expanding the auditory range using auditory frequencies imperceptible to humans, from very low (infrasound, <20 Hz) to very high (ultrasound, >20 kHz); these broad frequency ranges are perceptible to animals, including household pets. TIMESENSE - conveying time via sensory (e.g., tactile) cues at set intervals (e.g., 15 sec, 30 sec, 1 min, 5 min, 15 min); here we are expanding upon an ability humans already possess, but aim to sharpen the sensory experience related to it by transforming idiothetic (internal) time cues into allothetic (external) signs. CITYPULSE - in this project a person is provided with information from existing, ever-changing maps of the environment, including parameters such as pollution, crime, or radiation levels. Users will be able to perceive various properties of their surroundings continuously, both as point information at the user's current location and as full spatial maps of their surroundings. This project will enable us to investigate how the brain represents differences stemming only from contextual information (i.e., the same environment with different degrees of pollution).
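To make the CITYPULSE idea concrete, here is a minimal sketch (our illustrative assumption, not the project's actual design) that converts a continuous environmental reading at the user's location, such as an air-quality index, into a tactile pulse rate, so that worse air "ticks" faster on the skin:

```python
# Hypothetical CITYPULSE-style mapping: an air-quality index (AQI, 0..500)
# at the user's location is linearly mapped to a tactile pulse rate in
# pulses per second. The rate bounds and AQI scale are illustrative.

def pollution_to_pulse_rate(aqi, min_rate=0.2, max_rate=4.0, aqi_max=500):
    """Linearly map an air-quality index (clipped to 0..aqi_max) to pulses/sec."""
    aqi = min(max(aqi, 0), aqi_max)
    return min_rate + (max_rate - min_rate) * aqi / aqi_max

# Clean air pulses slowly; hazardous air pulses fast.
for aqi in (0, 150, 500):
    print(aqi, round(pollution_to_pulse_rate(aqi), 2))
```

The same linear mapping could be driven by any of the map layers mentioned above (crime or radiation levels), or sampled across a grid of map cells to render a full spatial "pulse map" of the surroundings.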

In collaboration with the IDC Media Innovation Lab, directed by Dr. Oren Zukerman (http://milab.idc.ac.il).

Concluding remarks for our BrainTech to Brain Imaging projects:

All these technological tools, based on the understanding of human and animal brains, allow us to decipher the mysteries of the brain and map it in an unprecedented way. The main goal of the Baruch Ivcher Institute for Brain, Cognition, and Technology is to use this unique approach to examine age-old basic-science questions about the brain and human civilization, such as: What has a greater impact on us - nature or nurture? Can the human brain develop senses able to perceive far-reaching thermal infrared information, just as snakes did during evolution? Is our brain more elastic or more stable across the lifespan, and can we reverse time and make an older brain more plastic and young again? And which of our senses corresponds to which daily tasks, and how are these tasks represented in maps and areas of the brain?


Other Brain Rehabilitation projects open for collaboration and with open positions

This field of research incorporates projects whose goal is to rehabilitate or improve the residual sensory skills of patients with different kinds of disabilities.

 

1. BODY SONIFICATION

The aim is to provide multisensory feedback on body movement in order to improve proprioception and time perception: muscle activity will be recorded by electromyography (EMG) and translated into sound, light, or vibrations that scale with muscle contraction intensity. The EMG sensing and sensory output will be built into a unique portable device that people can use to train movement and gait after stroke, as well as during workouts or relaxation.
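A minimal sketch of the sonification step described above, assuming a simple rectify-and-smooth EMG envelope and a linear loudness mapping (the window size and scaling are illustrative assumptions, not the device's actual processing):

```python
# Hypothetical body-sonification sketch: rectify and smooth the raw EMG
# signal into a contraction envelope, then scale it to an audio gain so
# that stronger contractions produce a louder tone. Parameters are
# illustrative.

def emg_envelope(samples, window=5):
    """Moving average of the rectified EMG signal (a simple envelope)."""
    rectified = [abs(s) for s in samples]
    out = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        out.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return out

def envelope_to_gain(env, max_amplitude=1.0):
    """Scale the envelope to a 0..1 audio gain (stronger -> louder)."""
    return [min(e / max_amplitude, 1.0) for e in env]

# A short synthetic EMG burst: rest, contraction, release.
emg = [0.0, 0.1, -0.4, 0.8, -0.9, 0.7, -0.2, 0.05]
gains = envelope_to_gain(emg_envelope(emg), max_amplitude=1.0)
print([round(g, 2) for g in gains])
```

The same envelope could equally drive light brightness or vibration amplitude, which is what makes the feedback multisensory rather than tied to one output channel.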

 

2. REAL LIFE SOUNDSCAPES

In the multisensory room we are building real-life sound scenarios, such as a shop, a restaurant, city traffic, a concert, a forest, or the sea. This space can be used, for instance, for rehabilitating speech perception in noise and sound localization in the hearing impaired.

 

3. VIRTUAL REALITY in collaboration with the Advanced Reality Lab directed by Dr. Doron Friedman

https://www.idc.ac.il/en/research/arl/pages/home.aspx

We are using virtual reality to study brain flexibility and to develop novel applications based on insights from our findings in different populations. For instance, we use immersive virtual reality techniques to measure behavioral factors and to train a healthy human population to develop cross-modal links between the auditory and somatosensory systems. Numerous ongoing VR experiences are based on our research on sensory substitution and motor performance. For example, in Temperatures and Sound, we created a VR environment that requires the ability to discriminate temperatures from changing sounds in order to accomplish tasks; this trains the brain to create correspondences between thermal perception and other sensory inputs. We are also using VR immersion to change participants' perceptions of their bodies and their actions. As we can manipulate the presentation of the seen body while tracking its movements, we are able to create new sensorimotor loops that affect body ownership and the sense of embodiment.

 

As part of its broad global approach, the Institute works closely with world-renowned institutions, with leading labs at IDC, and with the new Innovation Center at IDC. Its students come from reputable institutions in their fields, such as the US-based MIT and Italy's IIT, and our graduates work at, or start their own labs in, places like Harvard, Georgetown, and leading Israeli universities, or in the high-tech industry.

Speech and Music to touch project

How experience shapes the brain - Novel Senses / NovelExperiSense

 

Seeing with the ears 

Wake up the visual brain