Tactile Data

towards a phenomenology of human-data relations.

November 29, 2013
by trevorhogan

The CO2 project

The aim of this project is to explore people’s use of data-driven objects that represent environmental data from their immediate surroundings. Each object will not only represent indoor ambient CO2 levels; it will also include a sensor that captures the CO2 levels in the cube’s immediate environment and represent these through a different modality. One cube will use numbers, the second vibro-tactile feedback, and the third sound to represent the real-time CO2 levels in the space where the cubes are situated.
To view, feel or listen to the latest values, people must shake the cubes. If they want a cube to represent the optimum level they can knock on one side of the cube, while knocking on the opposite side will reveal the value described as unhealthy in the latest health & safety guidelines.
Below are a set of visualizations of the cubes, which are presently being built. Each cube will be made from natural wood and contain a COZIR Ambient CO2 sensor. The actuators inside the cubes include a 7-segment LCD display, vibration motors and an 8 Ohm speaker. These components will communicate with each other through an Arduino Fio, which will also be used to send data about the use of the cubes to a server.
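The shake/knock interaction described above can be sketched as a simple mode selector. This is an illustrative sketch only, not the cubes’ actual firmware: the gesture names, the “optimum” level and the “unhealthy” threshold are all assumptions.

```cpp
#include <cassert>

// Hypothetical gestures detected by the cube; the names and the two
// guideline CO2 values below are assumptions for illustration.
enum Gesture { SHAKE, KNOCK_SIDE_A, KNOCK_SIDE_B };

const int OPTIMUM_PPM = 600;     // assumed "optimum" CO2 level (ppm)
const int UNHEALTHY_PPM = 1500;  // assumed "unhealthy" guideline value (ppm)

// Shake -> latest sensor reading; knock on one side -> optimum level;
// knock on the opposite side -> the unhealthy threshold.
int valueToRepresent(Gesture g, int latestReadingPpm) {
    switch (g) {
        case SHAKE:        return latestReadingPpm;
        case KNOCK_SIDE_A: return OPTIMUM_PPM;
        case KNOCK_SIDE_B: return UNHEALTHY_PPM;
    }
    return latestReadingPpm;
}
```

Whichever value is selected would then be rendered in the cube’s own modality: digits, vibration patterns or sound.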

Final Version

Version 1

Version 1 (with tops removed)

Version 2 (without electronics)

Version 2 (no electronics)


August 14, 2012
by trevorhogan

DIS’12 Poster “Data Modality and the Repertory Grid Technique”

We describe a study that adapted the Repertory Grid Technique to examine personal constructs, elicited during a group session, from three data driven artifacts. This work is part of a wider research project that aims to better understand users’ affective responses when experiencing data represented through different levels of modality.

DIS'12 Poster, Newcastle, UK, June 2012


May 10, 2012
by trevorhogan

How Does Representation Modality Affect User-Experience of Data Artifacts?

Over the last number of months I conducted a study that explored people’s affective responses when experiencing data represented through different modalities. In particular, I was interested in investigating how data representations that address haptic/tactile and sonic perception are experienced. As part of this study I created a number of data-driven artifacts (DataBox, SonicData and a Bar Graph) that all represented the same dataset. Taking a phenomenological approach to the analysis, I used the Repertory Grid Technique (RGT) during a group session to elicit participants’ personal constructs, which are used to describe and compare these artifacts. Below are some images and videos of the artifacts as well as the RGT study…


DataBox (Construction)

DataBox being used during the RGT Study

RGT Study Group Session

SonicData being used during the RGT Study

October 20, 2011
by trevorhogan



Data: ‘urban outdoor air pollution monitoring’ from almost 1100 cities in 91 countries. For the purposes of this project, data from six countries was chosen to be represented (Ireland, Greece, Turkey, India, Egypt and the United Kingdom). The selection of these countries was based on the date of data collection (all were collected in 2009) as well as the variation in results: the six countries range from 15 to 138 annual mean PM10 ug/m3.

Experiencing Data Through Different Modalities

October 13, 2011
by trevorhogan

The aim of this project is to evaluate how we experience data through different modalities. The modalities used for the data representations will be visual, tactile and haptic. The source data will be the same for each representation: the most recent global air pollution figures collected by the World Health Organisation. This database contains results of ‘urban outdoor air pollution monitoring’ from almost 1100 cities in 91 countries. For the purposes of this project, data from six countries was chosen to be represented (Ireland, Greece, Turkey, India, Egypt and the United Kingdom). The selection of these countries was based on the date of data collection (all were collected in 2009) as well as the variation in results: the six countries range from 15 to 138 annual mean PM10 ug/m3.

Country Annual mean PM10 ug/m3
Ireland 15
India 109
Greece 44
Egypt 138
Turkey 66
United Kingdom 23

This project will involve collecting and collating a dataset, designing and implementing three separate data representations, evaluating a group of research participants’ experience while using the artefacts, and finally analysing and publishing the findings of this study. The three data representations that will be produced include two bespoke artefacts, DATA STACKER (tactile) and DATABOX (haptic), both of which are discussed in detail below. The other representation will use a traditional method of representing data, i.e. a bar graph. The method of evaluation will be the Repertory Grid Technique. Further details of this study will be presented soon.

Bar Graph (Visual)

Bar Graph (Visual)

DATA STACKER (tactile)

DATABOX (haptic)

DATABOX is an interactive, responsive object that triggers haptic feedback, in the form of ‘knocking’, to represent elements within a dataset. Each face carries a printed code, which is linked to a field within the dataset. When you scan a face, DATABOX responds immediately by knocking on the interior of the box. The user hears the sound as well as feeling the vibrations created by this knocking. The frequency of the knocks is predetermined by the data. DATABOX can store the data from all six countries at any one time (one country per face). When the code on a face is scanned, the name of the country is displayed on an LCD monitor and the internal electronic mechanism begins to knock. The frequency of knocking is determined by the rate of pollution, i.e. Ireland knocks 15 times per minute, representing its annual mean PM10 ug/m3. This knocking continues until you scan a different face.
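The knock timing follows directly from the rule above: one knock per minute for each unit of annual mean PM10. A minimal sketch of the rate-to-interval conversion (the function name is mine, not from the build):

```cpp
#include <cassert>

// One knock per unit of annual mean PM10 (ug/m3) per minute, so
// Ireland (15) knocks 15 times per minute. Returns the delay in
// milliseconds between successive solenoid knocks.
long knockIntervalMs(int annualMeanPM10) {
    return 60000L / annualMeanPM10;  // 60 000 ms per minute
}
```

For Ireland this gives one knock every 4 seconds; for Egypt (138), roughly one every 435 ms.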

DATABOX (haptic)

System Design
The DATABOX will incorporate many different technologies in order to function. There will be three main elements: a physical cube, a scanning station and a base station. The cube will incorporate a knocking mechanism and QR codes that will be read at the scanning station, and all the functionality will be synchronized using a laptop.



The cube may be 3D printed in polypropylene or constructed from wood. The knocking mechanism will be made using a 12v solenoid controlled by an Arduino, which will communicate wirelessly with the laptop. The scanning station will incorporate a standard PC webcam and a 12-character LCD display to inform the user of the country choice. The station will also use a proximity sensor so that it only scans for QR codes when the cube is below the webcam. Once a code has been detected, the laptop communicates the same information to the Arduino in the cube and to the Arduino controlling the LCD display. The knocking mechanism then begins to knock at the appropriate rate and the LCD displays the name of the country.
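The station logic can be sketched as a lookup from QR payload to country, plus a proximity gate on the webcam. The payload strings and the 20 cm threshold are assumptions; only the six countries and their PM10 values come from the dataset above.

```cpp
#include <cassert>
#include <map>
#include <string>

struct Country {
    std::string name;
    int pm10;  // annual mean PM10 (ug/m3), which sets the knock rate
};

// Hypothetical QR payload on each face -> country and its PM10 value.
const std::map<std::string, Country> FACES = {
    {"IE", {"Ireland", 15}},   {"IN", {"India", 109}},
    {"GR", {"Greece", 44}},    {"EG", {"Egypt", 138}},
    {"TR", {"Turkey", 66}},    {"UK", {"United Kingdom", 23}},
};

// The proximity sensor gates the webcam: only attempt a QR decode
// when the cube is actually under the camera (threshold assumed).
bool cubeUnderCamera(int distanceCm) {
    return distanceCm < 20;
}
```

Once a payload is decoded, the same record would be sent over the wireless link to the cube’s Arduino (to set the knock rate) and to the Arduino driving the LCD (to show the country name).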

Parts: 2 Arduino boards, 1K resistor, 3K resistor, TIP120 transistor, 1N4004 diode, external battery supply, 12v solenoid, 2 XBees, LCD display and an infrared proximity sensor.

This piece was inspired by the tactile quality of children’s stacking rings. DATA STACKER includes six blocks which can be stacked on one another in any order the user wishes. Each block represents the air pollution figures of one of the following countries: Ireland, Greece, Turkey, India, Egypt and the United Kingdom. The blocks may be easily disassembled and compared with one another.

Inspired by a children’s stacker

DATA STACKER (tactile)

DATA STACKER uses the length and width of each block to represent the data from each country. All other elements are identical in each block, including the material (all blocks are made from MDF with a transparent varnish); the height is also common across all the blocks.
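A minimal sketch of the mapping from pollution figure to block size; the 2 mm-per-unit scale and the fixed height are assumptions, since the actual dimensions are not given above.

```cpp
#include <cassert>

const int HEIGHT_MM = 30;  // assumed height, common to every block

// Length and width scale linearly with the annual mean PM10 figure;
// the 2 mm-per-unit factor is illustrative only.
int sideLengthMm(int annualMeanPM10) {
    return annualMeanPM10 * 2;
}
```

Under this scale Ireland’s block would be 30 mm square and Egypt’s 276 mm, making the roughly ninefold difference in pollution directly graspable by hand.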

DATA STACKER (from side)

DATA STACKER (from top)

DATA STACKER (from perspective)

October 3, 2011
by trevorhogan

Plagiarism Visualized

In March 2011, Karl-Theodor zu Guttenberg, Germany’s defense minister, resigned after admitting that he had plagiarized his PhD dissertation.

Gregor Aisch visualized Guttenberg’s dissertation, highlighting the plagiarized portions.
Large bars = normal text line, small bars = footnote line. Dark red bars stand for full rip-offs and masked rip-offs; red bars represent other plagiarism categories (German).

Source: Guttenplag-Wiki.

On: vis4.net
Via: Flowing Data

September 28, 2011
by trevorhogan

iHCI Conference 2011

I presented a poster and paper at the Irish Human-Computer Interaction Conference, 8th & 9th September 2011. The title of my paper was “Human-Data Relations and the Lifeworld”. This minor study was conducted to help me classify various data visualization technologies as well as to contextualize my own work. Please see below for the full paper & poster; click here to download a PDF version.

Human-Data Relations and the Lifeworld


This paper introduces a new approach to classifying the way in which data visualizations mediate our experience of the world.  In doing so we will use the concept of human-technology relations as developed by Don Ihde, a phenomenological philosopher of technology.  Following a synopsis of Ihde’s four human-technology relations, each is then developed in the context of specific data visualization technologies/artefacts.

Keywords: Data Visualization, Design, Experience, Theory.

In contemporary society, data visualisations, in the form of demographic statistics, financial reports, economic trends and others, are now being disseminated through many forms of media, which compete for people’s attention, contemplation and comprehension. To date, research has mainly focused on effectiveness and efficiency as the main factors for assessing the value of data visualisations. The purpose of this paper is to look beyond these criteria and focus on evaluating the way in which our world is mediated by data visualisations. In order to do this, we have used the concept of human-technology relations as developed by Don Ihde in his book Technology and the Lifeworld: from garden to earth [1].

Ihde asserts that when we think about how our everyday experience is mediated by technology we can characterize this by placing four unique relations along a continuum of human-technology relations, each of which positions us in a slightly different relation to the technology.  He classifies these as: Embodiment Relations, Hermeneutic Relations, Alterity Relations and Background Relations.

According to Ihde, Embodiment Relations are characterized by a “partial symbiosis” of a person and a technology, during which the technology in use is “embodied” and becomes “perceptually transparent” [1]. An example given by Ihde is eyeglasses or a telescope, where one looks through rather than at the technology. Hermeneutic Relations involve reading and the interpretation of the technology. Although one might be focused on the technology, what one actually sees – immediately and simultaneously – is not the technology itself but rather the world it refers to. An example is a thermometer; we must interpret the output on the display before we can apply it to the world it refers to. Ihde calls the third type Alterity Relations. In this case, technology is experienced as a being that is otherwise, or as Ihde describes, a “quasi-other”. An example would be an intelligent robot. Ihde argues that the first three relations differ from the last one in that they involve technologies requiring direct and focal attention. The final category is located at the periphery of human attention. Background Relations are understood as “present absence”: something not directly experienced yet giving structure to direct experiences. For example, an automated home heating system does not require daily attention; however, it continues to shape the inhabitants’ experience by providing a warm and comfortable environment [1]. Now that we have briefly summarized Ihde’s four human-technology relations, the next sections will focus on these relations in the context of data visualizations, presenting a specific real-world example for each.

Human-Data: Embodiment Relations
According to Ihde, an embodiment relationship with technology involves the technology being transparent or withdrawing from our perceptual awareness. The focus of the human is not on the technology but on the content that it refers to. To describe this further using a real-world example we have chosen SenseTable [2], as shown in Figure 1. Designed as a learning application, SenseTable utilizes physical objects, projections and sensory feedback in order to visualize a set of complex phenomena that would otherwise be difficult to comprehend using other modalities such as mathematical descriptions and formulas. When students interact with SenseTable they see through the physical and virtual objects that make up the interface to what is being visualized: the principles of System Dynamics and Chemistry. It is important to note, however, that the degree of perceptual transparency one experiences depends on a number of factors, including familiarity with the application and domain.

Figure 1. SenseTable (with kind permission of James Patten)

For experienced users, SenseTable offers direct experience of the phenomena: by manipulating the physical interface, they embody these objects and their focus is on the results of their actions (the visualisation). It is acknowledged that some people may see SenseTable and other such applications as being on the periphery of what is generally recognised as a data visualisation. Arguably such applications may be defined as simulations rather than visualisations. However, it was decided to include SenseTable as an example of embodiment relations as it possesses many features that are typically associated with data visualisations. This open issue will be addressed further in future research.
Human-Data: Hermeneutical Relations
Arguably, the predominant relationship that humans have with data visualisations is a hermeneutical one. If we broadly define data visualisations as artefacts that represent data in a certain modality and which require interpretation in order to form some insight into the data, then perhaps we maintain a hermeneutical relationship with all data visualisations. As one of the aims of this research is to help contextualise the practical work developed in conjunction with it, we will use one example, Vessels of Ireland’s National Debt (1910-2010), as shown in Figure 2, to discuss hermeneutical relations with data [3]. Each of these vessels was created by inputting data representing the national debt of Ireland since 1910 into an algorithm, which processed this data and output 3D hollow vessels.

Figure 2. Vessels of Ireland’s National Debt (1910-2010)

These were then printed using a 3D printer. The purpose of creating vessels from the dataset was to encourage people to reflect on the economic, social and cultural implications surrounding the dataset in an interesting and unique manner. It was intended that the vessels themselves would be the immediate object of interest; however, as the audience touch and caress the uneven and pointed edges of the vessels, they would think beyond these to the topic of national debt. This process of mediation typifies the hermeneutical relations described by Ihde.
Human-Data: Alterity Relations
As noted by Ihde, alterity relations emerge in a wide range of computer technologies that display a quasi-otherness within the limits of linguistics and, more particularly, of logical behaviours [1]. Arguably, no other technology exemplifies these characteristics more clearly than in-car satellite navigation systems (SatNav). Once you have programmed a SatNav it becomes the centre of attention, as a quasi-other, to which we relate by obeying intelligent directions verbalised by the device. When describing alterity relations, Ihde also discusses the fascination humans have always had with the quasi-autonomy of technology. This fascination is very evident with the SatNav; however, with it also comes a degree of trust. When this breaks down (we reach a dead end), the fascination and trust turn into frustration and even rage, not with oneself but with the quasi-other.
Human-Data: Background Relations
Ihde states that “background technologies, no less than the other focal ones, transform the gestalts of human experience and, precisely because they are absent presences, may exert more subtle indirect effects upon the way the world is experienced” [1]. This account may also be used to describe the concept of ambient visualizations or ambient displays. These technologies are generally defined as a category of data visualizations that convey time-varying data in the periphery of human awareness. One such example that occupies a background relation to its near audience is the eCLOUD, as shown in Figure 3. The eCLOUD [4] is an ambient data visualization sculpture inspired by the volume and behaviour of an idealized cloud. On permanent show at the San Jose International Airport, the patterns of the artwork are transformed periodically by real-time weather data from around the world. Within the environment in which it is placed (5 meters above the floor), eCLOUD has a background role: it does not occupy focal attention but nevertheless, as a piece of architectural art, it still conditions the context of its environment.

Figure 3. eCLOUD, San Jose International Airport, USA.

This paper presented Don Ihde’s four human-technology relations and developed these in the context of data visualisation technology. Due to space restrictions only one example was presented for each relation. However, these examples not only show that our Lifeworld is mediated by data visualisations in education, public art and driving; they also demonstrate that data visualisations occupy each region of the continuum of human-technology relations as described by Don Ihde.
[1] Ihde, D. (1990). Technology and the Lifeworld (p. 99). Indiana University Press.
[2] Patten, J., Ishii, H., Hines, J., & Pangaro, G. (2001). SenseTable: A Wireless Object Tracking Platform for Tangible User Interfaces.
[3] Hogan, T., (2011). Vessels of Ireland’s National Debt (1910-2010) http://tactiledata.net/?p=23, accessed: June 2011
[4] Goods, D., Hafermaans, N., Koblin, A. (2010). eCLOUD, http://www.ecloudproject.com/, accessed: April 2011