“Until now, little attention has been paid by the research community to the applications of the Internet of Things (IoT) in the field of interactive sonification: the joint control of the public address system both by a user performing gestures local to the system itself and by one or multiple remote users.”
Visually impaired people cannot process data from classic dashboards or visualizations. infinimesh provides the Internet of Inclusion (IoI) for all people by developing a completely new auditory representation of the Internet of Things. When there are thousands of sensor readings, infinimesh.sonification translates the information into audio clusters that structure the sensors into groups and information clusters. The audio signal contains harmonic structures, sound fields, overtone clusters, dissonance, and rhythmic/polyrhythmic representations. These configurable audio streams allow visually impaired people to read sensor information much faster than they could from dashboards built on graphics and visual representations.
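The grouping step above could be sketched as follows. This is a minimal, hypothetical illustration of clustering many sensor readings into audio groups, where each group gets a base pitch and a level that drives its harmonic content; the sensor-type names and pitch assignments are assumptions for illustration, not part of the infinimesh API.

```python
from statistics import mean

# Illustrative base pitches (MIDI note numbers) per sensor group;
# these values are assumptions, not infinimesh's actual mapping.
BASE_PITCH = {"temperature": 48, "humidity": 55, "vibration": 62}

def cluster_readings(readings):
    """Group (sensor_type, value) pairs into audio clusters.

    Returns {sensor_type: {"pitch": int, "level": float}} where
    `level` is the mean normalized value, which could drive the
    loudness or overtone density of that cluster's audio stream.
    """
    groups = {}
    for sensor_type, value in readings:
        groups.setdefault(sensor_type, []).append(value)
    return {
        t: {"pitch": BASE_PITCH.get(t, 60), "level": mean(vals)}
        for t, vals in groups.items()
    }

clusters = cluster_readings([
    ("temperature", 0.4), ("temperature", 0.6), ("humidity", 0.3),
])
```

With thousands of readings, a listener then tracks a handful of harmonic clusters instead of thousands of individual values.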
Chord structures, modal changes, scales, and overtone/dissonance relationships modulate between the normal and critical states of the sensors. Rhythmic arpeggiation, ostinati, and tempo represent ongoing processes. Even minute changes in sensor input produce polyrhythmic shifts, so a malfunction is easy to locate by ear. Additional sensor bracelets with motorized resonance zones bring further sensory perceptions into the world of visually impaired people.
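One way the state-to-harmony modulation could work is sketched below: a normalized deviation score selects consonant intervals and a calm tempo for normal operation, shifting toward dissonant clusters and urgent rhythm as the sensor approaches a critical state. The thresholds, intervals, and tempi are illustrative assumptions, not infinimesh's actual parameters.

```python
# Hypothetical mapping from a sensor's deviation from its normal range
# (0.0 = nominal, 1.0 = fully critical) to harmonic and rhythmic output.
def sonify_state(deviation):
    """Return (chord_intervals_in_semitones, tempo_bpm) for a 0..1 score."""
    if deviation < 0.3:          # normal: consonant major triad, calm tempo
        return [0, 4, 7], 60
    if deviation < 0.7:          # warning: minor triad, faster tempo
        return [0, 3, 7], 90
    return [0, 1, 6], 140        # critical: dissonant cluster, urgent tempo

print(sonify_state(0.1))  # ([0, 4, 7], 60)
print(sonify_state(0.9))  # ([0, 1, 6], 140)
```

Because harmony and rhythm change together, a listener can notice a drifting sensor long before it reaches a hard alarm threshold.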
Whether you want to compare different Smart Cities by their audio fingerprint or monitor the predictive maintenance of complex industrial processes: any person with such a device can use this technology to navigate the data faster than by looking at a screen. A MIDI matrix offers the possibility to define your own hearing events, along with the weighting and quality for different activator inputs, making it easy to personalize your own listening templates.
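A personal listening template of this kind could look like the sketch below: a per-user matrix that routes each activator input to a MIDI note and applies a weight to its velocity. The field names and structure are assumptions for illustration, not the actual infinimesh schema.

```python
# Hypothetical "MIDI matrix" sketch: a listening template that weights
# each activator input and maps it to a MIDI (note, velocity) event.
def render_events(template, inputs):
    """Turn activator inputs into weighted MIDI events.

    template: {input_name: {"note": int, "weight": float}}
    inputs:   {input_name: float}  (normalized 0..1 readings)
    Returns a list of (note, velocity) pairs with velocity in 0..127.
    """
    events = []
    for name, value in inputs.items():
        slot = template.get(name)
        if slot is None:
            continue  # this input is not part of the user's template
        velocity = min(127, int(value * slot["weight"] * 127))
        events.append((slot["note"], velocity))
    return events

template = {"pump_pressure": {"note": 52, "weight": 1.0},
            "flow_rate": {"note": 59, "weight": 0.5}}
print(render_events(template, {"pump_pressure": 0.8, "flow_rate": 1.0}))
# [(52, 101), (59, 63)]
```

Swapping templates changes which inputs are audible and how prominently, which is what lets each user personalize their own hearing events.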