The Final Reflections of this Course

In this last blog entry, I will start by reflecting on my experiences with designing with various materials and how we have explored them. Then I will reflect on how the texts helped support our process towards the final outcome. Finally, I will elaborate on my overall impressions of this course and how it has given me knowledge that I can use in other contexts.

The trinity of forms
For me, this course offered a completely new way of approaching the design process, with its focus on designing as a “reflective conversation with the materials of a design situation” (Schön, 1992).

From earlier courses, I have learned to use various approaches based on user experience. Because of this, the course was initially challenging and presented a completely new way of understanding and managing design. To explore the different materials and computations, I found it helpful to use what Vallgårda (2014) presents as the trinity of forms to unfold the interaction design. It helped me manage the different components and explore them in different directions while still keeping in mind how the final outcome could turn out, through the ongoing negotiation of the temporal form, the physical form and the interaction gestalt. It also helped us overcome the challenge of trying out too many materials and ideas at once, and eventually shape all these components into a whole.

Furthermore, it aided us in managing the design of the interactive artefact by letting us engage with one form at a time. For the physical form, we continuously explored materials by moulding and building up different possibilities for what the jellyfish (I2SEA) could act out. The narrow focus on the temporal form engaged us in explorations of the code: how the light of the LED strip should behave and how to create light feedback that gives the impression of the living creature we wanted I2SEA to be. Finally, the focus on the interaction gestalt supported constant experiments with how the physical form and temporal form acted as a whole, to gain knowledge of how users might engage with the artefact. This process continued until the very last day, when we chose to remove one of the functional feedbacks, a red light that appeared if the artefact was swung too hard by its string, because it constrained the users' movements and made them too careful when exploring the design.

Speaking the language of interaction design
Another important part of this process was learning the vocabulary of interaction design. Here, the book ‘The Design of Everyday Things’ by Norman (2013) and the article “Interaction frogger: a design framework to couple action and function through feedback and feedforward” by Wensveen, Djajadiningrat, & Overbeeke (2004) were very beneficial for learning how to approach and manage all parts of the design. On one hand, Norman's (2013) design principles of affordances, conceptual models and signifiers gave me an understanding of how an artefact and its different components can be perceived and approached. On the other hand, Wensveen, Djajadiningrat, & Overbeeke (2004) contributed the terms feedback and feedforward. The latter helped me approach the design process by asking what information the feedforward provides before a user's action, and what information the feedback of a product or artefact returns to the user afterwards. These terms were especially useful for thinking about how the feedforward could provide just the right amount of information to the user, while creating feedback that makes an impact on the user as well.

This course therefore gave me an understanding of how to explore design through the materials rather than by focusing on the user, as I had done prior to this course. Furthermore, I now feel more able to create a tangible outcome rather than a screen-based one. This was previously rather challenging for me, so it has encouraged me to learn how to design interactive artefacts within a very compressed period. I will carry these learnings with me into the next step of my specialisation, as they will most certainly strengthen my personal skillset. Furthermore, knowing how to manage a design in many possible contexts is very applicable to future endeavours in my professional career.

Here is a small representation of how the jellyfish I2SEA was experienced at the exhibition:

References
Norman, Donald A. (2013). The Design of Everyday Things: Revised and Expanded Edition. Basic Books. Ch. 1, 2, 3.

Schön, Donald A. (1992). Designing as reflective conversation with the materials of a design situation. Research in Engineering Design 3.3, 131-147.

Wensveen, Stephan AG, Tom Djajadiningrat, and C. J. Overbeeke. (2004). Interaction frogger: a design framework to couple action and function through feedback and feedforward. Proceedings of DIS2004. ACM.

The Exhibition of I2SEA

This week we focused on the exhibition, as this was our final opportunity to show what we have been creating this semester. I will therefore elaborate on our experience at the exhibition, the obstacles that occurred along the way, and how we managed them to arrive at an outcome closer to the one we intended.

In the days leading up to the exhibition, the jellyfish was named I2SEA and given a female personality, so we referred to her by name at the exhibition, where the creature I2SEA finally came to life. For the exhibition, the temporal form of the lights was set up so that, when untouched, the light would go into a ‘snooze’ state, lighting up slightly in random patterns to give a playful character that invites people to pick her up by the string and start playing. When picked up by the string and swung around on the elastic band, she brightens up with a very strong light through the lively fade function explained earlier. However, when swung too hard she would give the functional feedback of lighting up in a red light for 5 seconds; the reason was that we wanted the user to feel that she is a creature with emotions and should be handled carefully.
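For readers curious about how this kind of feedback can be implemented, the sketch below is a minimal illustration of a threshold check of the sort described above, assuming an analog three-axis accelerometer; the pin numbers, the centring value and the threshold are assumptions made for illustration, not the exact values used in I2SEA.

// Illustrative only: red feedback when the artefact is swung too hard.
// Pins, centring value and threshold are assumptions, not I2SEA's actual values.
const int xPin = A0, yPin = A1, zPin = A2;   // assumed analog accelerometer axes
const long swingThreshold = 300;             // assumed limit for "swung too hard"

unsigned long redUntil = 0;                  // time (ms) until which the red light stays on

void setup() {
  // LED strip initialisation would go here
}

void loop() {
  long x = analogRead(xPin) - 512;           // centre the raw readings around zero
  long y = analogRead(yPin) - 512;
  long z = analogRead(zPin) - 512;
  long magnitude = x * x + y * y + z * z;    // squared magnitude avoids a sqrt()

  if (magnitude > swingThreshold * swingThreshold) {
    redUntil = millis() + 5000;              // hold the red feedback for 5 seconds
  }

  if (millis() < redUntil) {
    // set the whole strip to red here (the "handle me carefully" feedback)
  } else {
    // otherwise run the normal snooze / fade behaviour described above
  }
}

Removing the red feedback at the exhibition, as described below, then simply corresponds to dropping the threshold branch.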

However, we experienced some trouble with the playfulness, as the red light created constraints for the users: they did not want to hurt I2SEA. They were therefore very careful when swinging the artefact around, which made the movements of the jellyfish less playful. This might be because the accelerometer triggered the output too early, or because the sensor was not the right one for that particular part (as seen in the video below). Either way, the interaction seemed to be interrupted by the red light appearing.

We therefore chose to quickly change the temporal form of the lights and remove the red-light feedback. This gave the users more room to explore the interaction gestalt, as they could swing her around without feeling they had to be too careful with I2SEA. The video below shows an example of how playful it suddenly became, using only the functional feedback of the bright light when swung around on the elastic band.

At the exhibition, it was great to finally explore how the physical form, temporal form and interaction gestalt actually played out as a whole, and to see how a simple change of light in the temporal form had created constraints for the users, making I2SEA a less playful creature than we had imagined. This challenge became a way for us to explore further what actually constrained the users when playing with the artefact, and how quickly it could be overcome. In the end, the design reached a livelier state for the users and created the playful interaction we had hoped for.

Exploration of Materials

To explore the materials further, we focused especially on the physical form and the temporal form this week, to see how our ideas about the look and the computational states of the jellyfish could be realised in practice.

Physical form
At first we had to figure out how to build up the jellyfish. To do so, we found a lid from a juice container at the canteen that could be used as the top of the jellyfish and hold the lights. From there, a skeleton was built up around it using plexiglass cut on the laser cutter.

To accommodate the look of the jellyfish in combination with the LED lights, we tried out different materials to figure out which one was right for our design. The final material, however, turned out to be something completely different, found by coincidence. But to get to that point, we first had to mould a silicone base with the lights around this shape somehow.

At first we tried to mould different silicones and latex on top of the shape, leaving room for the lights inside the top. However, when we tried to light these materials up with the LED strip, they did not give the impression of being almost alive; they looked too mechanical. We then looked around in the lab, where a round silicone shape with a hole in it was lying on a table, so we tried inserting the LED strip, and it worked magically: the lights finally came to life (as seen in the first picture). Exploring the different materials in this way gave us a good result in finding the right material to let the LED strip shine through. Furthermore, to create the tentacles of the jellyfish, a regular silicone-based joint filler was spread on the surface of a balloon, to see whether the tentacles could also be silicone-based or whether we had to use fabric instead to create the aesthetic movements of the jellyfish.

Temporal form
To give the impression of the jellyfish coming alive, the lights had a very important function, as they could help give the impression of a pulsating, living creature. After researching a few plausible ways forward, we found that sine and cosine could help create this feature. In order to get a continuous and natural wave of light, we chose to play around with the sine and cosine functions: since they are shifted in phase relative to each other, this could be exploited by dividing the lights into an upper and a lower part, where the lower part follows the sine wave and the upper part the cosine (as seen in the image below).

However, actually using these functions in the code proved challenging, as they somehow had to be translated into the following fade function.

Fade function
We wanted the fade function to run in a constant loop, hence it is called from the Arduino's void loop().

void fade() {

      // map() only deals with integers, so sin() is scaled by a factor of 1000

      // Note: sin() works with radians (2π radians = 360°)

    // Lower pixels

As we wanted the two parts to exploit the phase shift between sine and cosine, the lower part was set to fade the light in and out using the sine function, hence the comment ‘Lower pixels’ for this part of the LED strip.

    for (int i = 0; i < 8; i++) {

The for loop is used to set the first 8 pixels of the strip: as long as i is below 8, the loop continues and increments i by one.

      brightness = map(sin(singleCount - pi/10) * 1000, -1000, 1000, -10, 100);

// normal toLow and toHigh is -10 and 100

Sine and cosine give values between -1 and 1; it is the angle inside the function that rotates over time. We multiply by 1000 because map() only works with integers (as the comment above notes). We then use the map function to scale this into the range -10 to 100, so that the light can be completely off at its lowest and at maximum brightness at 100.

      if (brightness > 0) {

        setColor(i, brightness); // If the brightness is positive call the set Color function with the brightness value

      }

This check makes sure that a negative brightness value is turned into 0, ensuring the light is simply turned off rather than risking the value being interpreted as something else.

      else {

        setColor(i, 0); // If the brightness is negative call the setColor function with the value 0 (turn the pixels off)

      }

    }

The same kind of loop is then used for the upper pixels, which follow the cosine function instead, creating the wave of light from pixel 8 up to the 14th (NUMPIXELS).

  // Upper pixels

    for (int i = 8; i < NUMPIXELS; i++) {     

      brightness = map(cos(singleCount) * 1000, -1000, 1000, -10, 100); // Using cos here, which is phase-shifted relative to sin

      if (brightness > 0) {

        setColor(i, brightness); // If the brightness is positive call the setColor function with the brightness value

      }

      else {

        setColor(i, 0); // If the brightness is negative call the setColor function with the value 0 (turn the pixels off)

      }
    }


    singleCount = singleCount + countStep; // Advance the phase so the wave keeps moving
}
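For completeness, the snippets above rely on a few declarations and the setColor() helper that are not shown in this post. The following is a minimal sketch of how they might look; the library, pin, pixel count, step size and colour are assumptions made to keep the example self-contained, not necessarily the exact values we used.

#include <Adafruit_NeoPixel.h>

#define NUMPIXELS 14                      // 8 lower pixels + 6 upper pixels (assumed)
#define LED_PIN 6                         // assumed data pin for the strip

Adafruit_NeoPixel strip(NUMPIXELS, LED_PIN, NEO_GRB + NEO_KHZ800);

const float pi = 3.14159;
float singleCount = 0;                    // phase of the light wave
const float countStep = 0.05;             // how far the phase advances per fade() call (assumed)
int brightness;

// Scale an assumed base colour by the brightness value (0-100) and write it to pixel i
void setColor(int i, int value) {
  strip.setPixelColor(i, strip.Color(0, value, value));
  strip.show();
}

void setup() {
  strip.begin();                          // initialise the NeoPixel strip
}

void loop() {
  fade();                                 // the fade() function walked through above
}

With declarations like these in place, calling fade() from loop() produces the continuous pulsating wave described above.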

Through this fade function, the LED strip in combination with the silicone-moulded top created a light that somehow came to life, making the jellyfish look like a real creature.

Interaction gestalt
Exploring the different materials that could support the appearance of the physical form finally paid off, bringing us close to the outcome we intended. The appearance is, however, supported by the temporal form as part of the whole experience of the artefact, which only took shape after playing around with all three elements. Overcoming the challenge of giving the light the right expression in the temporal form gave a complete impression of the whole. Hopefully this can carry us towards the exhibition next week, where we can see what the interaction gestalt becomes when the artefact is actually in use, rather than relying on our own imagination of how it could affect people and what affordances the object itself offers. We imagine people picking it up by its string, but will they? The only way to find out is at the exhibition, because so far we have only explored and imagined it ourselves rather than seeing what users see in it, which I look forward to.

An Aesthetic Interaction

The readings for today's lecture gave insights into how aesthetic interactions can play out, which I will reflect on in this post and afterwards relate to our design subject, the interactive jellyfish.

Petersen et al. (2004) emphasise three aspects of aesthetics when designing towards an aesthetic interaction: 1) the socio-cultural approach to aesthetics, which concerns the intuitive assessment of an object's aesthetics; the assessment, however, does not lie within the object itself (such as a chair) but in the human appropriation of it. 2) Designing for mind and body, as the aesthetic experience involves both bodily sensation and intellectual challenge. 3) The instrumentality of aesthetics: aesthetics is something that emerges in use, in understanding the interactive system and its potential use, and hence is not something that can simply be added on. I find these notions valuable for capturing what an aesthetic interaction can involve, but it remains difficult to grasp what actually creates an aesthetic interaction. Petersen et al. (2004) describe it as follows.

“Aesthetic Interaction is not about conveying meaning and direction through uniform models; it is about triggering imagination, it is thought-provoking and encourages people to think differently about the encountered interactive systems” (Petersen et al., 2004, p. 271).

This particular notion is something we can draw on in the project of shaping the “Jellyfish”, as we would like people to think about the aesthetics of the interaction rather than it merely being something beautiful to look at. Another take on aesthetics comes from Djajadiningrat et al. (2000), who argue that one should not think of beauty in appearance but of beauty in interaction: the aesthetic happens within the interaction itself, hence the instrumental part. This led us to sketch possible prototypes of the jellyfish, as presented below, to imagine how the temporal forms could play out.

The project we are working on at the moment has changed in terms of the interaction: the idea is now to make it inviting to pick up by a string rather than picking up the jellyfish itself. To do so, we would like the artefact not to invite people to touch it directly, by making it look like it would feel like touching a slimy jellyfish, and thereby invite people to pick it up by the string, as seen in the videos below showing possible materials and a quick paper prototype of the structure.

In this way, we hope people will feel the invitation of the aesthetic interaction we propose and form their own assumptions based on their subjective experience with the artefact, as they experience the movement-based interaction of the jellyfish floating with them, along with the light feedback following the user's movements of the artefact.

References
Petersen, Marianne Graves, Ole Sejer Iversen, Peter Gall Krogh, and Martin Ludvigsen (2004). Aesthetic interaction: a pragmatist’s aesthetics of interactive systems. In Proceedings of DIS2004 (pp. 269-276). ACM.

Djajadiningrat, J. P., Overbeeke, C. J., and Wensveen, S. A. G. (2000). Augmenting Fun and Beauty: A Pamphlet. In Proceedings of DARE’2000. ACM Press, pp. 131-134.

The Exploration of Ideas

This week we focused on continuous exploration through the trinity of the temporal form, the interaction gestalt and the physical form. We wanted to figure out what our steps towards shaping the jellyfish could be before trying out the different possible materials. These three forms should not be treated completely apart, however, as there is an ongoing negotiation between them (Vallgårda, 2014).

Physical form
To start exploring the physical form, the first question that came to mind was: what material should we use? And how do we make the user pick it up by a string instead of picking the jellyfish up by hand? Should the jellyfish itself be uncomfortable to touch? As presented in the sketch below, we explored ideas of it being pointy, sharp or even having teeth.

Then it occurred to us: what if the jellyfish could have the same slimy-looking physical appearance as an actual jellyfish? How to achieve that is the biggest challenge. So far the explorations have mainly been through paper prototyping. At the same time, I would like the jellyfish to have the same elegance of movement as a real one, so how do we create this movement? Is it by building it up with a ‘skeleton’ as presented in the last post, or could the right materials give it that graceful movement? I have attached a video below to show what I mean by graceful; the jellyfish is a very ambivalent experience to me, as how can something slimy be so graceful at the same time?

It might also be the combination of the lights inside the jellyfish and the elegant movements that creates this almost hypnotising way of floating through the water. The challenge is: how do we create these lively states of light while still making it beautiful in its movements?

Temporal form
For the temporal form, we mainly explored the states of the lights and how these could create feedback that engages the user when interacting with the jellyfish. To create more powerful light, we chose an LED strip, as it is visible even in daylight. Creating this smooth light without using delay() turned out to be particularly challenging, but once the interval was tuned, it produced a very smooth light. The reason we want to avoid delay() is that it would block the rest of the running code, so by not using it the interruption is avoided.

We chose an interval of 20 milliseconds, as this gave a very smooth output from the LED lights; if the interval was set to, for instance, 50 milliseconds, the output became very stuttering. Figuring out what the interval should be was a bit challenging, as it was a matter of trial and error.

Furthermore, the code below uses the millis() function, which returns the number of milliseconds since the board began running the current sketch. By comparing this value against the 20 ms interval, the code decides when to light up or blink the LED lights, running the light routine defined later in the code.
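Since the code itself only appears as an image in this post, the sketch below is a minimal illustration of the kind of millis()-based pattern described, using the 20 ms interval mentioned above; the updateLights() helper is a hypothetical stand-in for the routine that actually steps the LED strip.

const unsigned long interval = 20;   // 20 ms between light updates, as described above
unsigned long previousMillis = 0;

// Hypothetical stand-in for the routine that advances the LED strip animation one step
void updateLights() {
  // step the fade / blink pattern of the LED strip here
}

void setup() {
  // LED strip / pin initialisation would go here
}

void loop() {
  // millis() returns how long the current sketch has been running, in milliseconds.
  // When at least `interval` ms have passed, advance the lights one step.
  if (millis() - previousMillis >= interval) {
    previousMillis = millis();
    updateLights();
  }
  // No delay() here, so the rest of the code keeps running between updates.
}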

Interaction gestalt
How can we create the inherent feedforward that gives the impression of only being able to pick it up by the string? The interaction gestalt concerns the performance of movements the user makes in relation to the thing or the environment (Vallgårda, 2014). Therefore, particularly for the interaction gestalt, we have to put ourselves through all possible kinds of scenarios and play around with whatever materials we can come up with that might be useful in this project; if we want it to be playful, we have to explore the materials in a playful manner as well.

Exploring the interaction design through the trinity of the temporal form, the interaction gestalt and the physical form has been very helpful in deciding what to explore next in this process. It also made me think of all the constraints, both in terms of the materials and the build-up of the artefact: whatever materials we go through, the result has to keep its gracefulness while at the same time not being something you would like to touch, so is that even possible?

References
Vallgårda, Anna (2014). Giving form to computational things: developing a practice of interaction design. Personal and Ubiquitous Computing 18.3: 577-592.

Show and Tell

Last week we explored and prepared for this week's show and tell of our initial project idea. Our idea is inspired by anthropomorphism: we would like the artefact to represent emotions and have human traits in some way, and thereby create sympathy in the users. Furthermore, we were inspired by movement and how to create movement-based interaction through the use of an artefact (Hummels et al., 2007). But how could we create an artefact that affords people to interact with it in certain ways? We would like the changing light of the ‘jellyfish’ to create an interaction between people, where they want to hear and see the artefact change, and thereby create a very playful interaction. Through this interaction between people, it could become a dialogue around the jellyfish rather than just an action-reaction based interaction. Getting to this point was a challenge, so how did we actually get there? According to Schön (1992), the situation a designer deals with when designing is challenging, unpredictable and mainly tacit, but this is central to the design process. Through the direct interplay of sketching, writing down ideas and building upon them, reflection-in-action happens, where the outcome can end up entirely unpredicted and is improvised in situ (as seen in the pictures above).

So what happens next? The hardest step, in my opinion, is going from loose ideas to a more concrete one, thinking through what materials to explore and what constraints there could be. To do so, it is important to articulate your thoughts to each other. This is what Schön (1987) calls reflection-on-action: being able to reflect upon action, especially with others, and to try to understand what has been created so far. Through this we realised that the jellyfish should be able to express its emotions through the functional feedback of light and sound: when you touch it, it should mimic pain by lighting up in red, and when you throw it to one another, it should turn blue and be ‘set free’. When not in motion, the jellyfish could whine, to make people aware that it is ‘in pain’ and wants to be set free. The most fascinating property of the jellyfish that we would like to mimic is its movement itself, which we however expect to be challenging to achieve. The process of reflection-in-action and reflection-on-action happens continuously throughout a design process, and it led us to the following paper prototype of the idea (see figure below).

After the show and tell, we were given feedback to explore the movements further, as the lecturer said, “if you are going to throw it to each other, why not just make a ball?”. We therefore explored how to create another movement-based interaction with the jellyfish prototype, thinking about how to make it an extension of the body while still being playful for the user, and ended up attaching a string to it (see video below).

Even in this short period of time, the constant act of seeing-moving-seeing has already brought us to a completely different outcome than the initial one (Schön, 1992). The simple act of attaching a string to the jellyfish creates new affordances, making us explore new ideas of how to create this creature full of emotions while still keeping it playful for the user.

References
Hummels, Caroline, Kees CJ Overbeeke, and Sietske Klooster (2007). Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction. Personal and Ubiquitous Computing 11.8: 677-690.

Schön, Donald A. (1992). Designing as reflective conversation with the materials of a design situation. Research in Engineering Design 3.3, 131-147.

Schön, Donald A. (1987). Educating the reflective practitioner: Toward a new design for teaching and learning in the professions. Jossey-Bass, San Francisco.

A bodily experience

In this post, I reflect upon what it means to design for the body by discussing the texts we read for this week and the design we did in groups. Specifically, I reflect upon how to create bodily extensions and what this might entail.

According to Svanaes and Solheim (2016), digital technologies are currently moving closer to the user's body through e.g. smartwatches and wearable computing, which creates new challenges for interaction designers and for HCI as a research field. Their research was inspired by Merleau-Ponty's concept of the lived body: “Designing for the body with the first-person perspective of Merleau-Ponty makes us aware of how technology is incorporated into the experienced body, and how it thus changes us.” This made me realise just how close technologies are to the bodily experience in our lives, and how big an effect this might have on our everyday. In their research, they ended up designing a wearable tail that actors could use on stage, controlled externally, and movable ears controlled by the actor himself through a glove. These are good examples of direct extensions of the body, but one might question whether a bodily extension is only an extension if you can somehow wear it. When kicking a ball, making it move and passing it to another person, does the ball not become an extension of the body that lets you interact with another person through it?

In the course we were asked to design an interactive light that is close to the body, and to do so we had to gather inspiration. We did this through an extensive brainstorm, which produced ideas ranging from a chair to a long tube you could crawl through: looking in, you would see only good things and hear nice sounds, but after crawling inside you would experience darkness, with eyes lighting up and following you and creepy sounds, to make you reflect on things not always being what they appear to be.

However, the latter idea was not feasible to actually build, so we tried to draw inspiration from other elements of it instead. The playfulness of creating an artefact that feels alive was something that intrigued us, so we started thinking about how to mimic and simulate emotions in a prop, while keeping in mind that light should be incorporated somehow. After scrolling through Pinterest for some time, we stumbled over the jellyfish, as this animal actually glows as if light is coming out of it (see image below).

This led us towards a more critical design approach, as we recalled how children sometimes pick these animals up at the beach and throw them to each other, as if they were playful toys to fool around with. Since this is not the case, we would like to mimic the emotions of the jellyfish when it is thrown from one person to another, through the use of light: it could turn red when someone is holding on to it, making the noise of a person suffering, and when thrown it is ‘set free’ and therefore turns green/blue to mimic freedom, with the sounds of a happy person, unfolding as if swimming through the water. In this way, the jellyfish could extend the bodily communication between the users, as they would have to engage with the artefact in order to ‘set the jellyfish free’ and thereby save it together as a team. This is, however, a challenge to build, so we played around with paper prototypes and tried to create some of these effects with the Arduino's sensors (as seen in the videos below).

References
Svanaes, Dag, and Solheim, Martin (2016). “Wag Your Tail and Flap Your Ears: The Kinesthetic User Experience of Extending Your Body.” Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. ACM, 2016.

 

Movement-based Interaction

In this post, I will first reflect on how movement-based interaction can entail specific movements and interactions, and secondly on how the simplicity of gestures can influence the design of movement-based interaction. The VCR and the MP3 player are two good examples of how technological objects were built in the 80s and 90s, as these consisted exclusively of labels and buttons to afford interaction. We are now moving away from the keen focus on cognition towards an increased focus on the person as a whole, rather than on the object itself (Hummels et al., 2007). The person can now be involved with mind, body and heart, which leaves us with endless interactions to explore. One of these aspects is movement-based interaction, where one can for example draw upon choreography of interaction to create something that prompts movement through a product and thereby entails specific interactions, as seen in the picture below: “a vase that has a circular shape and curved grooves that fit the whirl of two turning arms and hence the sliding inwards of flowers with a fluent rhythm, one after another”. Creating something from existing interactions is something I find particularly interesting, as one might never have created this if the person had not been motivated to make the perfect flower arrangement in the first place.

Another aspect of movement is gestures, for instance the hand gestures we make in our everyday lives when talking to someone, showing a direction or simply opening a door; all of these evoke some kind of gesture. Gestures can convey emotions and expression, so we might be able to use gestures as part of movement-based interactions. With this mindset, a design can be created through movement (design through movement), which also supports Hummels et al.'s notion of research by doing: creating a design through research-by-doing supports creating a design through movement. These two aspects can help in exploring new movement-based interaction designs, as one might not know how to approach the subject before acting it out, prior to creating a concrete artefact.

These notions helped me form an idea of how movement-based interaction can engage the user towards a specific intended outcome, and of how it can also be shaped by things as simple as everyday gestures.

References
Hummels, Caroline, Kees CJ Overbeeke, and Sietske Klooster (2007). Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction. Personal and Ubiquitous Computing 11.8: 677-690.

Djajadiningrat, Tom, William Gaver and Joep Frens (2000). Interaction relabelling and extreme characters: methods for exploring aesthetic interactions. In Proc. of DIS2000 (66-71). ACM.

Form and Function Follows Interaction?

In this post I reflect upon the interaction-driven approach and how to explore through it, and afterwards set it in relation to the laser-cutting exercise we did at today's lecture, to grasp what the interaction-driven approach could mean.

This course revolves primarily around what Maeng, Lim & Lee (2012) describe as the interaction-driven approach, which is led by exploring opportunities through interaction concepts, with a key focus on movement. Further, interaction is defined as an independent factor that does not rely on the user's needs (Maeng, Lim & Lee, 2012).

The interaction-driven approach provides the opportunity to explore through materials and interactions. A key point of this approach is the statement “Form and function follows interaction” (Maeng, Lim & Lee, 2012), but what does that actually mean? At today's lecture we were asked to try out the laser cutter and create a laser-cut lamp using green paper. For this task we came up with the idea of taking advantage of the colour and creating a lamp consisting of cut-out leaves, gathered together to look like a pile of leaves that the light could shine through. The pile ended up looking as follows (see picture below).

As the picture shows, the idea came to life through the materials, but the leaves were not easy to hold together. If we were to build it again, I believe we should have created a skeleton for the lamp instead of leaving it so fragile that it could be destroyed just by lifting it up. The statement that form and function follows interaction therefore makes sense in this case: because the form is this fragile and the function is to be a lamp, it has to be handled very carefully in the interaction, as it would break if moved without great care. One point of failure in this process might be that we did not think about the interaction before creating the materials and form of the lamp. Alternatively, we could have explored a movement of the interaction with the lamp and drawn inspiration from something that is not as static as a lamp, borrowing elements of other movements, for instance a particular movement one might make when sitting beside the lamp, or an intuitive interaction when creating the input (turning on the lamp) and the output of the light. Through that approach, the lamp could gain more playful objectives, and perhaps affordances that are playful in a different manner than one would imagine a lamp could hold.

References
Maeng, Seungwoo, Youn-kyung Lim, and KunPyo Lee. (2012) “Interaction-driven design: A new approach for interactive product development.” Proc. of DIS. ACM.

The Three Levels of Design: Visceral, Behavioral and Reflective

So I have recently bought a baby mobile as the very first item to decorate our future nursery. However, it is not very practical, as it is so fragile that the baby will most likely break it if it gets into any sort of contact with it. But I liked the way it looks and thought it would look cute in a nursery. So how can this happen; why would I buy something so impractical?

To understand this purchase, and products in general, Don Norman (2004) proposes three aspects of design that interweave emotion and cognition: visceral, behavioural and reflective. These three terms are useful, as they can shed light on why the emotional side of a product might be more critical to its success than its practical elements. Firstly, the visceral level is about the more intuitive response: if you feel something is pretty, that comes from the visceral level, which deals with look, feel and touch (Norman, 2004). Secondly, the behavioural level considers only the use of a product and focuses on satisfying people's needs. One should keep in mind that both visceral and behavioural reactions are subconscious, which makes us unaware of our true reactions and their causes (ibid). Therefore, even though I try to understand the reasoning behind my purchase, it was intuitive and subconscious, so the real cause cannot be uncovered completely, though the terms help me reason about it. Lastly, there is the reflective level, which often determines a person's overall impression of a product. As Norman (2004) states, “The reflective value outweighs the behavioral difficulties” (Norman, 2004, p. 85); it can be hard to judge whether someone would buy a product without a practical reason for it, but why else would someone buy a painting only to hang on the wall? It is in the overall experience that other aspects of the product can outweigh its inadequacies.

In my case, the purchase was mainly driven by the visceral level, as I found the mobile pleasing to look at, but the reflective level must also have played a part, as it to some degree affects my self-image. I would like to have a presentable home when people come to visit, so even though the behavioural level did not play a big role in this purchase, in terms of the functionality of the product or lack thereof, I chose to buy it anyway.

References
Norman, Donald A (2004). Emotional design: Why we love (or hate) everyday things. Basic Civitas Books, chapter 3.

Why, What & How?

For the fourth lecture we read about designing for experiences and how experiences are inherently subjective, which made me wonder: how is it possible to design for experiences when an experience is something intangible that changes over time? According to Hassenzahl (2013), an Experience is intangible, subjective, happens in the moment and changes over time; it cannot be materialised and therefore differs from the tangible object itself. However, it is also underlined that “Things are not the opposite of experiences, but create and substantially shape them.” The materialised things thus take part in shaping an experience, which could mean that the subjective experience can be influenced by design. There are different types of experience: User Experience revolves around creating a meaningful experience through a device and should be understood as a sub-category of Experience dealing with experiences shaped by interactive products, while Experience Design is the attempt to deliberately create experiences.

Experiencing something can vary a lot from person to person, so it can often be difficult to create experiences that fit all individuals. This is something Harrison et al. (2012) investigate in “Unlocking the Expressivity of Point Lights”, where they study the different experiences of small LED lights used as indicators on different devices. Their study points towards people having similar interpretations of how a light should act in different contexts, for instance when turning a device on or receiving a notification. This made me wonder: if experience is always subjective, how can they all have almost the same interpretation of how the LED lights act? On one hand, it could be that the behaviour of the lights is based on the reptilian part of the brain, speaking directly to all human beings in the same manner. On the other, it could be that the people who took part in the experiment came from the same cultural background, because if it were that easy to create an experience that fits all individuals, wouldn't we all do it? To investigate this, they conducted interviews with 256 male and female participants, using simple animated GIFs to observe their reactions. As this investigation points towards people having similar reactions to the behaviour of the LED, it makes me wonder: how can Experience Design then account for people's experiences, and how can you make sure that people will actually make use of the artefact?

Source: https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/user-experience-and-experience-design

To address this, Hassenzahl (2013) proposes three levels to consider when designing for technology-mediated experiences, as visualised in the figure above: Why, What and How. The first level, Why, concerns personal needs and emotions, the meaning of the experience, which can vary a lot; listening to music, for instance, can be about the feeling you get from it or about helping your concentration. What addresses what people do when interacting with the product and relates to the functional aspects, such as listening to music. The third and last level, How, concerns the more operational aspects, such as the button you have to press to turn on the radio and listen to the music. However, we can only design for experiences; the experience itself cannot be designed. One must therefore always remember that the experience remains the subjective experience of the individual.

References
Harrison, C., Horstman, J., Hsieh, G., & Hudson, S. (2012). Unlocking the expressivity of point lights. In Proceedings of CHI2012 (pp. 1683-1692). ACM.

Hassenzahl, M. (2013) “User experience and experience design.” The Encyclopedia of Human-Computer Interaction.

Creating Something Else

In this post I will first go through the coding exercise from today's lecture, followed by reflections on the terms feedback and feedforward and how they can shape a design that is not only functional but also attractive to its user(s).

At today's lecture, we went through exercises on building a simple button that could switch an LED on and off, as presented at the last intro lecture on the Arduino. This was followed by adding an extra LED and creating opposite outputs for the two: if one light was switched on, the other was switched off. We then added a small speaker and finally built an interaction that included sound from the speaker. In the final exercise we ended up with a sound that changes when pressing the button (as seen in the video below). Creating these small, simple interactions is intriguing, as it makes you wonder how simply a keyboard could actually be built. The sound change was achieved simply by changing the frequency (Hz) inside a loop; I did this with two different outcomes, so there would be a little dynamic: every second time the button is pushed, the tone switches between 1000 Hz and 500 Hz (as seen in the image and video below, and sketched just after this paragraph).
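As a rough illustration of that final exercise, the sketch below shows one way the press-to-toggle tone could be wired up; the pin numbers and the debounce delay are assumptions, and the exact code we wrote in class may have differed.

// Each button press toggles the speaker between 1000 Hz and 500 Hz.
// Pin numbers and the debounce delay are assumptions, not our exact wiring.
const int buttonPin = 2;
const int speakerPin = 8;

bool highTone = true;          // which of the two tones to play next
int lastButtonState = HIGH;    // with INPUT_PULLUP the button reads HIGH when released

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);   // button wired between the pin and GND
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int buttonState = digitalRead(buttonPin);

  // Detect the moment the button is pressed (HIGH -> LOW)
  if (lastButtonState == HIGH && buttonState == LOW) {
    tone(speakerPin, highTone ? 1000 : 500);   // play the current frequency
    highTone = !highTone;                      // so every second press changes the tone
    delay(50);                                 // crude debounce
  }
  lastButtonState = buttonState;
}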

The latter exercise of creating a simple sound made me think: how hard can it then be to build a simple keyboard that you can play music on (as seen in the video below)?

However, after considering all the possibilities that lie within the different kinds of feedback and feedforward presented in the article “Interaction frogger: a design framework to couple action and function through feedback and feedforward” by Wensveen, Djajadiningrat, & Overbeeke (2004), I came to realise that it is a whole different story. Compared to this simple function built with the Arduino, which has the simple inherent feedforward of a button and the functional feedback of a simple sound coming through the speaker, the construction of a keyboard is far more complicated. For instance, the inherent feedback of pressing down a key is about as far from that of a push button as it could be: a key has to have just the right amount of resistance when pressed, so you can quickly move on to the next note in the song.

Furthermore, when it comes to the functional feedback, the sound of a piano is not just any sound: it has to sound like an actual piano, and the output has to come through high-quality speakers to achieve the most realistic sound possible, otherwise why play the piano at all? Moreover, a keyboard like the one shown in this post can have all kinds of augmented feedforward, in the form of written words or pictograms, and inherent feedback, for example all the different buttons that can be pushed and turned. The actions that can occur are therefore close to infinite, which gives the piano player/user the opportunity to express themselves in countless ways, if they find it interesting to figure out what all these signifiers could offer. Learning these terms made me think about design in a whole new light, making me aware of what an interactive design can actually entail for its user(s).

References
Wensveen, Stephan AG, Tom Djajadiningrat, and C. J. Overbeeke. (2004). Interaction frogger: a design framework to couple action and function through feedback and feedforward. Proceedings of DIS2004. ACM.

The Design of Technologies

So far I have written about my expectations for this course, but how should I practise interaction design, and what does it really mean? According to Don Norman (2013), interaction design can be understood as follows: “Interaction design: The focus is upon how people interact with technology. The goal is to enhance people’s understanding of what can be done, what is happening, and what has just occurred.” (Norman, 2013, p. 5). To focus on how people interact with technology, we should also be able to unfold how people might interact with it. This is done through discoverability, which results from the application of the psychological concepts of affordances, signifiers, constraints, mappings and feedback, and lastly the sixth principle, the conceptual model, which provides the true understanding (Norman, 2013).

To investigate the above concepts, I will go through an example of a challenge I have had when trying to watch Netflix using Apple TV versus Google Chromecast, using Norman's terms. Firstly, both Apple TV and Chromecast take advantage of the affordances my iPad offers: the possibility to use the touch screen and a few buttons. When entering the Netflix application and trying to connect, the buttons that signify connecting are very similar to one another, so in the beginning it was hard for me to differentiate between them. Where Apple TV does not work very well, in my opinion, is in its lack of feedback. The lack of feedback occurs when the connection fails to work immediately; the Apple TV then has its constraints, as there is no other option than waiting for it to work, and there is no sign of it working at all if the connection is slow. With Chromecast, on the other hand, the feedback is immediate, which is roughly what modern systems try to provide: the user gets feedback within 0.1 seconds of any operation, to create reassurance that the request has been received and that the input produces an output (Norman, 2013). Whether it is the conceptual model of the Chromecast that offers a better explanation of how it works, or the lack of feedback from the Apple TV that sometimes made me wonder if it was broken, is hard to say. That said, I chose to go forward with the Chromecast, as I never get the feeling that it is broken. As Norman (2013) states, “Technology offers the potential to make life easier and more enjoyable; each new technology provides increased benefits. At the same time, added complexities increase our difficulty and frustration with technology” (Norman, 2013, p. 32). If technologies are made more enjoyable and easier by removing some of the complexities, this might come at the cost of feedback. I therefore believe a designer should also keep in mind the failure points that could occur in the process and create a more natural mapping, even when the problem is just a poor Internet connection.

References

Norman, Donald A. (2013). The Design of Everyday Things: Revised and Expanded Edition. Basic Books. Ch. 1, 2, 3.

Designing Interactive Artifacts

Until now, when designing artifacts or concepts, I have mainly practised exploring new designs from a user-centred perspective: the user's perspective or behaviour was explored prior to or throughout the process, and from that the conceptualisation or findings emerged. The approach of learning primarily through theory and the practice of building an interactive design is therefore something I look forward to experiencing in the course Designing Interactive Artifacts. I expect to gain new knowledge and skills in how to explore future designs, and I hope to explore new design spaces through a ‘hands-on’ method, both with regard to the materials, the theoretical perspectives and the inspirational sources that can support this.

While reading for the first lectures, I came across Ehn's (1998) perspective on the virtuality and fluidity of rooms, which I find valuable when thinking of how to explore spaces and boundaries without taking a user-centred approach. In the article “Manifesto for a Digital Bauhaus”, Ehn (1998) states that “A room is no longer only material and solid, but also virtual and fluid.” Through this understanding, a room can be explored in many possible ways, as the boundaries of time and space have changed with the possibilities of digital communication and information (Ehn, 1998). I find this valuable, as thinking through this concept might create the potential for new boundaries or new approaches to exploring possibilities. One might, however, question how to approach a design only through practical methods and theory. Here I find Dalsgaard, Dindler, and Fritsch's (2013) notion of arguing through design rather than for design beneficial. In this way, new or undiscovered possibilities can be created, as the design can unfold through an approach that does not have to construct a single purpose and meaning, but can hold several impressions and understandings for the user, since the design object or concept can be the catalyst for exploring concepts, methods and theory (Dalsgaard, Dindler, and Fritsch, 2013).

References
Ehn, P. (1998). Manifesto for a Digital Bauhaus. Digital Creativity, 9(4)

Dalsgaard, Peter, Christian Dindler, and Jonas Fritsch (2013). Design argumentation in academic design education. Nordes 1.5.