Haptic Typography

Live Sensor Data Processing

Are there other ways to manipulate typographic parameters on the computer besides conventional on-screen controls? I explored this question experimentally in the workshop Live Data Visualisation as part of the lab week at the HfG.

Context

Idea behind the Workshop

In the four-day workshop Live Data Visualisation, we had the opportunity to visualise live data from different sensors on a website via an Arduino and Node-RED.

My Approach

Instead of visualising the sensor data, I decided to use the sensors as controls to interact with the computer. Since I am generally interested in typography, I found the idea of manipulating typographic parameters with the help of different sensors exciting.

Setup

Hardware/Software

I connected several sensors to the computer via an Arduino. Node-RED forwards the data streams to a website over a WebSocket, where the sensor data manipulates individual type parameters such as width, size and style.
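The browser side of this pipeline can be sketched roughly as follows. The WebSocket path and the `{ sensor, value }` message shape are assumptions for illustration, not the original setup; the core idea is a small mapping function that clamps each sensor range onto a typographic range.

```javascript
// Linearly map a sensor reading from its input range to a target range,
// clamping so noisy readings cannot push the style out of bounds.
function mapRange(value, inMin, inMax, outMin, outMax) {
  const t = Math.min(1, Math.max(0, (value - inMin) / (inMax - inMin)));
  return outMin + t * (outMax - outMin);
}

// Browser-only wiring, guarded so the mapping stays testable outside a page.
// The endpoint and element id are hypothetical.
if (typeof WebSocket !== "undefined" && typeof document !== "undefined") {
  const socket = new WebSocket("ws://localhost:1880/ws/sensors");
  const specimen = document.querySelector("#specimen");
  socket.onmessage = (event) => {
    const { sensor, value } = JSON.parse(event.data);
    if (sensor === "light") {
      // 10-bit ADC reading → font-weight range of a variable font.
      specimen.style.fontWeight = Math.round(mapRange(value, 0, 1023, 900, 100));
    }
  };
}
```

Clamping in `mapRange` is the design choice that matters here: raw sensor streams jitter past their nominal range, and without the clamp the type would briefly jump to invalid weights.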

Interactions

Brightness

The data from the light sensor changes the weight of the font – a nod to the CSS property font-weight. The more light, the lighter the font.

By moving the desk lamp, I was able to change the weight of the font with astonishing precision. In this way, a physical object with no connection to the computer quickly becomes a control element.
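The inverted mapping behind the pun can be written as a single function. The 0–1023 input range (a 10-bit Arduino ADC) and the 100–900 weight range are assumptions for illustration.

```javascript
// More light → lighter font: invert the normalised brightness reading.
// Input range 0–1023 (10-bit ADC) and weight range 100–900 are assumed.
function lightToWeight(reading) {
  const t = Math.min(1, Math.max(0, reading / 1023));
  // Darkness → 900 (black), full brightness → 100 (light).
  return Math.round(900 - t * 800);
}
```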

Rotation

A gyroscope provides rotation along the X, Y and Z axes. Due to a sensor error, I used only the rotation values from one axis.

With the sensor in my hand, I was able to manipulate the font very intuitively and precisely – a real alternative to purely digital sliders.
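A single rotation axis maps naturally onto a single type axis. The sketch below is a hypothetical version of that mapping: a tilt of ±90° drives a slant range of ±15°, with a simple exponential moving average to tame gyroscope jitter. The ranges and the smoothing factor are assumptions.

```javascript
// Exponential moving average: each new value is blended with the previous
// smoothed state, damping sensor jitter (alpha controls responsiveness).
function makeSmoother(alpha) {
  let state = null;
  return (value) => {
    state = state === null ? value : alpha * value + (1 - alpha) * state;
    return state;
  };
}

// Map one gyro axis (±90° tilt, assumed) onto a slant of ±15°.
function rotationToSlant(deg) {
  const t = Math.min(1, Math.max(-1, deg / 90));
  return t * 15;
}

const smooth = makeSmoother(0.2);
// In the page, each incoming reading would pass through both:
// element.style.fontStyle = `oblique ${rotationToSlant(smooth(reading))}deg`;
```

Smoothing is what makes hand control feel precise rather than twitchy: without it, every tremor of the hand lands directly in the letterforms.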

Distance

I mapped the distance between my hand and an ultrasonic sensor onto the font size: the further my hand moves from the sensor, the larger the font appears.

For hand control, the palm of my hand turned out to be too small for the sensor to detect reliably. Instead, I used a book as the control element.
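For an HC-SR04-style ultrasonic sensor, the raw reading is an echo pulse duration, which first has to be converted to a distance before it can drive the font size. The sensor model, the 2–200 cm working range and the 12–160 px size range below are assumptions for illustration.

```javascript
// Echo pulse duration (µs) → distance: sound travels ~0.0343 cm/µs,
// and the pulse covers the distance twice (out and back).
function pulseToCm(durationUs) {
  return (durationUs * 0.0343) / 2;
}

// Distance → font size: further away → larger type.
// Working range 2–200 cm and size range 12–160 px are assumed.
function distanceToFontSize(cm) {
  const t = Math.min(1, Math.max(0, (cm - 2) / (200 - 2)));
  return Math.round(12 + t * (160 - 12));
}
```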

Reflection

Connected through sensors & data

In the end, mice, trackpads and keyboards are nothing more than sensors whose data streams connect the physical world with the digital one.

By now, interacting with these devices feels so natural that one completely forgets the underlying concept. Only when you take the familiar sensors out of their housing and replace them with other, more unconventional ones does this connection become clearly visible again.

Project

  • Live Data Visualisation
  • Lab week WS 2020/21
  • HfG Schwäbisch Gmünd

Duration

  • November 2020 (1 Week)

Supervision

  • Prof. Hartmut Bohnacker
  • Luca Stetter
