In the four-day workshop Live Data Visualisation we had the opportunity to visualise live sensor data on a website, routed through an Arduino and Node-RED.
Instead of visualising the sensor data, I decided to use the sensors as controls to interact with the computer. Since I am generally interested in typography, I found the idea of manipulating typographic parameters with the help of different sensors exciting.
I connected several sensors to the computer via an Arduino. The data streams are forwarded to a website via Node-RED and a WebSocket. There I use the sensor data to manipulate individual type parameters such as width, size and style.
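The browser side of this pipeline can be sketched roughly as follows. The WebSocket URL, the JSON message shape and the axis ranges are my assumptions, not the exact setup from the workshop; the core idea is a small clamp-and-scale helper that maps raw sensor readings onto variable-font axes.

```javascript
// Clamp-and-scale helper: maps `value` from [inMin, inMax] to [outMin, outMax].
function mapRange(value, inMin, inMax, outMin, outMax) {
  const t = Math.min(Math.max((value - inMin) / (inMax - inMin), 0), 1);
  return outMin + t * (outMax - outMin);
}

// Browser-only wiring, guarded so the helper also runs outside a browser.
// The endpoint and message fields are hypothetical.
if (typeof WebSocket !== "undefined" && typeof document !== "undefined") {
  const ws = new WebSocket("ws://localhost:1880/ws/sensors"); // Node-RED websocket-out node
  const el = document.querySelector("#headline");
  ws.onmessage = (event) => {
    const data = JSON.parse(event.data); // e.g. { light: 512 }
    const weight = mapRange(data.light, 0, 1023, 100, 900);
    el.style.fontVariationSettings = `"wght" ${weight}`;
  };
}
```

Keeping the mapping in one pure function makes it trivial to swap a sensor's raw range or invert a mapping without touching the transport code.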
The data from the light sensor changes the weight of the font – a nod to the CSS property font-weight. The more light, the lighter the type.
By moving the desk lamp, I was able to change the weight of the font with astonishing precision. In this way, a physical object with no connection to the computer quickly becomes a control element.
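The pun only works if the mapping is inverted: brighter readings must produce a lower weight. A minimal sketch, assuming the 10-bit range (0–1023) of an Arduino analog pin and the common 100–900 span of a variable font's wght axis:

```javascript
// Inverted mapping: more light → lighter (thinner) type.
// Assumes a 10-bit analog reading (0–1023) and a wght axis of 100–900.
function lightToWeight(light) {
  const t = Math.min(Math.max(light / 1023, 0), 1);
  return 900 - t * (900 - 100); // dark room → bold, bright room → thin
}
```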
A gyroscope provides the rotation along the X, Y and Z axes. Due to a sensor error, I used only the rotation values of a single axis.
With the sensor in my hand, I was able to manipulate the font very intuitively and precisely – a real alternative to purely digital sliders.
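One plausible way to use that single axis, assuming it arrives as a tilt angle in degrees (roughly −90 to 90), is to map it onto the wdth axis of a variable font; both the incoming range and the target axis are my assumptions here:

```javascript
// Hypothetical mapping: tilt angle (−90°…90°) → variable-font "wdth" axis (50%…150%).
function tiltToWidth(angleDeg) {
  const t = Math.min(Math.max((angleDeg + 90) / 180, 0), 1);
  return 50 + t * (150 - 50); // level sensor → normal width
}
```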
I mapped the distance between my hand and an ultrasonic sensor onto the font size. The further away I am from the sensor, the larger the font appears.
For hand control, my palm turned out to be too small for the sensor to detect reliably. Instead, I used a book as the control element.
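The distance-to-size mapping can be sketched like this, assuming readings in centimetres from an HC-SR04-style ultrasonic sensor (roughly 2–200 cm is dependable for a flat target like a book) and a pixel range I chose for illustration:

```javascript
// Further away → larger type. Assumes readings in cm (2–200 usable range).
function distanceToFontSize(cm) {
  const t = Math.min(Math.max((cm - 2) / (200 - 2), 0), 1);
  return 16 + t * (160 - 16); // 16px up close, 160px at arm's length and beyond
}
```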
In the end, mice, trackpads and keyboards are nothing more than sensors whose data streams connect the physical world with the digital one.
By now, interacting with these devices feels so natural that one completely forgets the underlying concept. Only when you take the familiar sensors out of their housings and replace them with other, more unconventional ones does this connection become clearly visible.