Saturday, 26 September 2015

Week 9 - Testing with Makey Makey

The second interactive prototype requires us to incorporate Makey Makey into our designs. I’ve been testing out different ways to use it and familiarising myself with the technology. The plan for this prototype is that the Makey Makey will be connected to a bag with a zipper, and as the user unzips/zips the bag, the movement will also be represented in the companion digital prototype (which sets off the alarm accordingly).


Since the main physical interface of ZipperBan is a zipper, I decided to test out different interactions using the Makey Makey connected to the zip of a spare pair of jeans I had at the time. I hooked up the ground to my other hand and inserted a wire as close as possible to the zipper rail, connected to the space bar input on the Makey Makey. The challenge was making sure the pin on the wire makes contact with the zipper as it passes, in order to complete the circuit.


I even tried playing some music on Spotify, zipping up/down to play/pause the music. A video demo can be seen here:





How to incorporate this with ZipperBan?


Problems/challenges:
  • The motion of a zipper is continuous, yet there is no ‘continuous input’ option on the Makey Makey, i.e. each input is either ‘pressed’ or ‘not pressed’ - how do we use these values to measure the displacement of the zipper?
  • If all the areas of the zipper rail were connected to a single input, e.g. the space bar, the system would know to act whenever the space bar is triggered, e.g. if the zipper moves and makes contact with a connected area, move the zipper on the screen to the right. There is, however, no information about the placement of the zipper along the rail - the system has no state indicating exactly where the slider is. For instance, if the user moves the zipper forward and it makes contact with connected areas 3 times, the on-screen zipper will move to the right 3 times. But if the user zips backwards, the connected areas will still trigger the same action. There is no way to distinguish between the connected areas.


Solution: Divide the zipper rail into 6 points, attach a wire to each of these points and connect each wire to one of the 6 letter inputs on the Makey Makey. As the zipper touches each point, the system then has information about its location and can better determine the behaviour. This solution still has discrete steps, and the movement will be staggered, however a smooth motion between points is not the focus of this prototype.


Algorithm: The zipper passes the point connected to letter ‘a’. A ‘point-is-reached’ event is triggered. If ‘a’ is the point reached, move the on-screen zipper to location (x, y).
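A minimal sketch of this in AS3, assuming the six wires go to the Makey Makey’s letter outputs (w, a, s, d, f, g) and using placeholder coordinates in place of the ones recorded from the digital prototype:

// Hypothetical mapping from letter key codes to points along the on-screen rail.
var zipperPoints:Object = {
    87: {x: 100, y: 250},   // 'w' - point 1 (placeholder coordinates)
    65: {x: 150, y: 220},   // 'a' - point 2
    83: {x: 200, y: 200},   // 's' - point 3
    68: {x: 250, y: 200},   // 'd' - point 4
    70: {x: 300, y: 220},   // 'f' - point 5
    71: {x: 350, y: 250}    // 'g' - point 6
};

stage.addEventListener(KeyboardEvent.KEY_DOWN, onPointReached);

function onPointReached(e:KeyboardEvent):void {
    var p:Object = zipperPoints[e.keyCode];
    if (p) {                // ignore keys other than the six connected letters
        zipper.x = p.x;     // move the on-screen zipper element to that point
        zipper.y = p.y;
    }
}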


Here is a sketch of how the system will be connected:


The pictures below illustrate the iterations of producing the physical interface:


Positioning the wires:

Sticking foil on the zipper to increase chances of making contact with wires:

Sticking the wires into place:

Covering the wire ends in foil to increase surface area and hold the wires in place more securely. Previously the wire pins kept falling out when they were simply threaded through the fabric of the pencil case. They are attached as close as possible to the zipper rail.

The main focus of testing for this prototype will be to see how different levels of sensitivity affect the alarm system and the behaviour of the thief. There will be three levels of sensitivity: weak, medium and high. Weak sensitivity means the alarm is only set off once the zipper has reached the end of the rail. High sensitivity means the alarm is set off as soon as the second point is reached. Medium sets off the alarm at point 4. Testing will examine the ability of the thief to retrieve belongings from the bag at each sensitivity level.


The red circles indicate ‘active spots’ - points on the zipper rail that will trigger an alarm.


Steps for incorporating into the digital prototype:
  1. Divide the zipper rail in the digital prototype into 6 points. Position the zipper element on each point and record the corresponding coordinates.
  2. Create 3 functions in main that set off the alarm at different points, i.e. at points 2, 4 and 6.
  3. Add a switch statement in the main constructor, tied to the sensitivity button, that executes the appropriate function, e.g. for sensitivity = high, the alarm is set off at point 2.
  4. Create buttons for the sensitivity levels.


In terms of the differences between the previous digital prototype and the digital prototype that accompanies the Makey Makey, most of the functionality that requires interfacing with the user on screen (e.g. clicking and dragging the zipper) has been removed, as this interfacing is now handled by the Makey Makey prototype. Also, for this prototype the user chooses to be either the owner or a thief; this option is no longer randomised. The basic idea is:
  1. The user chooses a sensitivity level and a ‘thief status’. The defaults are sensitivity = weak and thief = false.
  2. The system listens for a keyboard event.
  3. Based on the key that was pressed, the zipper object is moved to the corresponding location on the zipper rail.
  4. Depending on the sensitivity level chosen, the alarm is set off at the corresponding point.


First we create and add event listeners to the ‘toggleSensitivity’ and ‘toggleThief’ buttons. We also add an event listener to the stage to listen for keyboard events, which will trigger the default action, zipperMoveWeak:
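In outline, this setup looks roughly as follows - toggleSensitivity, toggleThief and zipperMoveWeak are the names used in the prototype, while the thief flag and the handler names onToggleSensitivity/onToggleThief are placeholders of my own (onToggleSensitivity and the zipperMove functions are sketched further down):

var thief:Boolean = false;    // default thief status

// Button listeners for switching sensitivity and owner/thief status.
toggleSensitivity.addEventListener(MouseEvent.CLICK, onToggleSensitivity);
toggleThief.addEventListener(MouseEvent.CLICK, onToggleThief);

// Default keyboard behaviour until the sensitivity is changed.
stage.addEventListener(KeyboardEvent.KEY_DOWN, zipperMoveWeak);

function onToggleThief(e:MouseEvent):void {
    thief = !thief;           // switch between owner (false) and thief (true)
}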


The toggleSensitivity button is implemented next - it cycles through the sensitivity settings and attaches the keyboard handler for the appropriate level:
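A sketch of what this cycling could look like, assuming a sensitivity string variable; zipperMoveWeak and zipperMoveStrong are named in the prototype, while zipperMoveMedium is an assumed name for the third handler:

var sensitivity:String = "weak";   // default level

function onToggleSensitivity(e:MouseEvent):void {
    // Detach whichever keyboard handler is currently active.
    stage.removeEventListener(KeyboardEvent.KEY_DOWN, zipperMoveWeak);
    stage.removeEventListener(KeyboardEvent.KEY_DOWN, zipperMoveMedium);
    stage.removeEventListener(KeyboardEvent.KEY_DOWN, zipperMoveStrong);

    // Cycle weak -> medium -> high -> weak and attach the matching handler.
    switch (sensitivity) {
        case "weak":
            sensitivity = "medium";
            stage.addEventListener(KeyboardEvent.KEY_DOWN, zipperMoveMedium);
            break;
        case "medium":
            sensitivity = "high";
            stage.addEventListener(KeyboardEvent.KEY_DOWN, zipperMoveStrong);
            break;
        default:
            sensitivity = "weak";
            stage.addEventListener(KeyboardEvent.KEY_DOWN, zipperMoveWeak);
    }
}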



Depending on the sensitivity level, different actions are taken. Below is a snippet of the zipperMoveStrong function, which sets off the alarm as soon as the second point is reached. As can be seen in the code, each point also sets the zipper element’s new coordinates to represent its movement along the zipper rail:
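A sketch along these lines, using the same placeholder coordinates as before; ‘setOffAlarm’ and isAlarming come from the prototype, the rest is assumed:

var isAlarming:Boolean = false;   // true while the alarm is sounding

function zipperMoveStrong(e:KeyboardEvent):void {
    switch (e.keyCode) {
        case 87:                  // 'w' - point 1: just move the zipper
            zipper.x = 100; zipper.y = 250;
            break;
        case 65:                  // 'a' - point 2: move and set off the alarm
            zipper.x = 150; zipper.y = 220;
            if (!isAlarming) dispatchEvent(new Event("setOffAlarm"));
            break;
        case 83:                  // 's' - point 3 onwards: keep alarming
            zipper.x = 200; zipper.y = 200;
            if (!isAlarming) dispatchEvent(new Event("setOffAlarm"));
            break;
        // ... points 4-6 follow the same pattern
    }
}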




The reason it dispatches an event at every point after the second, rather than just at the second point, is the volatility of the physical prototype. In reality the zipper may not make contact with every point, so the alarm should still be set off at any point past the second.


Previously, the code did not include the ‘if (!isAlarming)’ checks around each dispatch. This raised an error, because multiple alarm events were being dispatched while the alarm was already active. The dispatches were therefore wrapped in this check, to ensure an alarm is only set off if one is not already sounding.

Overall it seems like the prototype is progressing quite well, despite a few challenges and hurdles along the way. Next week it will be interesting to test the prototype and see how it responds to different users’ input.

Friday, 18 September 2015

Week 8 - Basic interface testing

Exercise:
What is the existing experience (Restaurant Dining)? From different stakeholder P.O.V.?
Sit down with friends/family, scan menu, order food, wait for the food, eat the food, ask/wait for the bill and pay.

What external/internal factors impact on the experience?
External: weather, financial situation, social situation, education level etc
Internal: the wait for the food, the menu and how easy it is to read/understand, the ambience

What aspects of the existing experience could be enhanced/augmented/supported with technology?
Pressing a button on the table to seek attention from restaurant staff (this is already implemented in some restaurants), interactive menus, technology to enhance splitting bills more effectively.

How would introducing technology into this context change the experience?
It would make the process of ordering->eating->paying for food more efficient. For instance, being able to split the bill with technology and reduce payment time and ease financial pressure (people won’t have to owe others money).

What experience scenarios might you test with the technology?
Get a group of friends to eat at a restaurant together, order different items each and make the order complicated and not easy to split.

Continuation of basic interface prototype:
Following on from last week, I also managed to implement the alarm system for the prototype. This is outlined below, with a rough sketch of the flashing logic after the list:
  1. When the zipper reaches the end of the zip, this triggers an alarm event. Before it alarms, it decides whether the person is the owner or a thief. For the purpose of this prototype, this decision is made randomly.
  2. Once the event is triggered (setOffAlarm), it initialises a timer for 0.5 seconds, along with other variables such as the beeping sound, and then listens for when this timer ends.
  3. Every 0.5 seconds the light object alternates between black and red (to simulate flashing), with a beep sound every time the light is red.
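A rough sketch of how this could look, with the light and beep assets, the event wiring and the owner/thief decision as placeholders standing in for the actual prototype code:

var isThief:Boolean;                        // decided randomly when the alarm event fires
var beep:Sound = new BeepSound();           // beep sound exported from the library (assumed name)
var flashTimer:Timer = new Timer(500);      // fires every 0.5 seconds
var lightIsRed:Boolean = false;

addEventListener("setOffAlarm", setOffAlarm);   // custom event dispatched when the zipper reaches the end

function setOffAlarm(e:Event):void {
    isThief = Math.random() < 0.5;          // randomly decide owner or thief
    if (isThief) {
        flashTimer.addEventListener(TimerEvent.TIMER, flashLight);
        flashTimer.start();
    }
}

function flashLight(e:TimerEvent):void {
    // Alternate the light between red and black, beeping while it is red.
    lightIsRed = !lightIsRed;
    var tint:ColorTransform = new ColorTransform();
    tint.color = lightIsRed ? 0xFF0000 : 0x000000;
    light.transform.colorTransform = tint;
    if (lightIsRed) beep.play();
}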
Working demo of alarm system and the final prototype:


This week we tested our prototypes in the workshops to gain some feedback. The testing involved getting users to interact with the prototype, directly observing their behaviour and asking questions afterwards. Through direct observation, it was noted that people interact with the zipper differently: some dragged the zipper quite quickly and others more slowly. This reflects the different ways people open physical zips, and means that a physical prototype would need to detect zipper movement at different speeds. The alarm was also quite loud, and I observed that participants were slightly shocked when it sounded - this is a desired outcome, as the point is to attract attention and perhaps even surprise.

Some of the questions that were asked at the end were:
How easy did you find it to use the prototype?
Do you think it effectively encompasses the product’s interactions?
How could the prototype be improved?
Overall the feedback indicated that the prototype was intuitive and easy to understand, that it would deter a thief, and that a companion app works well with the rest of the system. Some suggested making the alert more piercing to further deter thieves, however this should also be tested in-situ.

Friday, 11 September 2015

Week 7 - Basic Interface cont.

Exercise:
Looking back on the previous user testing session, a lot of the feedback received was more qualitative than quantitative, and was not particularly specific or objective. One of the questions asked was whether they understood the concept of ZipperBan through watching the video, and although many answered ‘yes’, it was unclear to what extent they understood it. Perhaps they thought they understood the concept, but their idea of it differed from how I wanted to portray it. The feedback would need to be more specific so it could be quantified and then acted upon for improvement, e.g. “on a scale of 1 - 10, how easy was the concept to grasp?”


This week I continued working on the prototype and simulating the zipper motion in Flash. At the moment the biggest challenge is to create a slider in AS3 that allows the user to drag an element along an arc, to simulate the arc of a backpack zipper. A demo of what I want to achieve can be found here: http://evolve.reintroducing.com/_source/flas/as3/DragAlongArc/


The concept is similar to drag-and-drop, so I researched how to click and drag an element around the screen. The challenge is to constrain the motion so that it follows a defined path no matter where the cursor is. Basic drag and drop of the zipper element:


// Start dragging when the zipper is pressed; listening on the stage for
// MOUSE_UP means the drag still stops if the cursor leaves the zipper.
zipper.addEventListener(MouseEvent.MOUSE_DOWN, startDragging);
stage.addEventListener(MouseEvent.MOUSE_UP, stopDragging);

function startDragging(e:MouseEvent):void {
    zipper.startDrag();
}

function stopDragging(e:MouseEvent):void {
    zipper.stopDrag();
}

After some more research, I found some external libraries that seem to do what I need for the prototype. The result of this can be found at http://snorkl.tv/dev/pathDrag/:




The problem with this is that I would need to use an external library and I would prefer to try to recreate this without the use of third-party plugins.


I started to look deeper into how a motion like this would work: as the user drags an element left to right, its vertical displacement should follow a defined path - as element.x changes, element.y is a predefined function of x. This led me to think about using an x² function to create a parabolic shape. I had to revise basic parabola maths for this:


y = ax² + bx + c


First I sketched where the arc (the zipper path) would sit in relation to the rest of the screen, so I could read off pixel values to plug into the equation and solve for the coefficients. It takes 3 points to uniquely define a parabola. Note that because the origin is in the top-left corner, the y-values are negative:


A sketch of using parabolas as a solution is shown below:






The equation found from the second sketch is y = -0.01x² + 6x - 1000. This was incorporated into the prototype, although the positions of the elements were adjusted slightly, which varied the equation. To incorporate this in AS3, I had to create nested listeners: one on the zipper element to listen for a mouse press, which then adds another on the stage to track the mouse coordinates.
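A sketch of what these nested listeners might look like, using the equation from the sketch as the path function (the actual prototype used slightly adjusted constants, and the x bounds of the rail here are placeholders):

// Path function: vertical position as a function of horizontal position.
function zipperPathY(x:Number):Number {
    return -0.01 * x * x + 6 * x - 1000;
}

zipper.addEventListener(MouseEvent.MOUSE_DOWN, startArcDrag);

function startArcDrag(e:MouseEvent):void {
    // Nested listeners: once the zipper is pressed, track the mouse on the stage.
    stage.addEventListener(MouseEvent.MOUSE_MOVE, followArc);
    stage.addEventListener(MouseEvent.MOUSE_UP, stopArcDrag);
}

function followArc(e:MouseEvent):void {
    // Clamp x to the ends of the rail (placeholder bounds), then derive y
    // from the parabola so the zipper always stays on the arc.
    var x:Number = Math.max(100, Math.min(500, stage.mouseX));
    zipper.x = x;
    zipper.y = zipperPathY(x);
}

function stopArcDrag(e:MouseEvent):void {
    stage.removeEventListener(MouseEvent.MOUSE_MOVE, followArc);
    stage.removeEventListener(MouseEvent.MOUSE_UP, stopArcDrag);
}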


The result:



Friday, 4 September 2015

Week 6 - Exercise/Basic Interface

Exercise:
Create CRC cards for a scenario (or 2) you want to test in future prototypes. What is a testing scenario that you want for your concept?

The user tries to open the bag, the camera detects that the user is not the rightful owner, and the alarm sounds. The opposite case, where access is granted, should also be tested.

What are the nouns (objects) in the scenario?
The user, the bag

What are the verbs (responsibilities)?
Scanning, stealing

Who are the collaborators?
The user collaborates with the scanning system, which in turn collaborates with the intruder’s information - face, identity etc.

Basic Interface:

For the first interactive prototype, the design is split into two sections: the detecting interface (the bag) and the mobile interface, which further alerts the user of an intrusion and acts as a companion to the main system. The prototype will first display a graphical representation of the bag and prompt the user to unzip it. The user then unzips the bag by clicking and dragging a zipper element on the screen, following an arc motion, as is the case for backpacks. The disturbance of the zipper triggers the face scanning system (not implemented in this prototype), which then sounds an alarm with a flashing light if it was an intruder (or grants access otherwise).

The main point of this prototype is to demonstrate the interaction of unzipping and receiving the feedback of the alarm system. The final idea (described above) does not contain facial recognition and depicts an interaction between a single user and the system. The storyboard below illustrates a previous iteration of the idea, and involves the user playing both roles of owner and thief:
  1. User ‘uploads’ their face
  2. User opens the bag and sees it’s a match
  3. Now user plays thief and is told to try to unzip the bag
  4. It is not a match and the system sets off an alarm
  5. This is also displayed on the phone, which the user can dismiss, and can also press a button which automatically takes a photo from the bag. Perhaps there is a timer, and if the user clicks the button to ‘turn around’ within a given timeframe, the thief can be caught
  6. A screen displays if the thief was caught (and a picture if one was taken) or had gotten away.




This is probably too complicated and too technical for this prototype. It should focus on and test a single interaction rather than incorporating too many, and facial recognition could be omitted. How, then, can a thief's face be detected? Perhaps the decision could be randomised.