Wednesday, March 15, 2006

Testing the system

Last Wednesday we took Mike Sharkey around the Pronto supermarket in the Guild to test our system. It is important to note that Mike is unfamiliar with the layout of this shop, so the results of the guidance system test could not be skewed by him already knowing his way around.

Before he arrived for our prototyping session I walked around the store and planned a list of items to be bought, along with a route that would take him efficiently past their locations. This was done in advance because, in the completed system, the user would download a shopping list to the device, and an optimal route for those items would be computed automatically once the shop's layout is downloaded to the device on entering the shop.
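
(As a rough sketch of how that route computation might work, here is a greedy nearest-neighbour ordering over item locations; the function name, coordinates and entrance position are my own illustration, not the actual implementation.)

    from math import hypot

    def plan_route(shopping_list, item_locations, entrance=(0.0, 0.0)):
        # Order the shopping list so the user walks roughly the shortest path.
        # item_locations maps item name -> (x, y) position in the downloaded shop layout.
        # Greedy nearest-neighbour ordering; the real device might use a proper
        # route optimiser over the aisle graph instead.
        remaining = list(shopping_list)
        here, route = entrance, []
        while remaining:
            nxt = min(remaining,
                      key=lambda item: hypot(item_locations[item][0] - here[0],
                                             item_locations[item][1] - here[1]))
            route.append(nxt)
            here = item_locations[nxt]
            remaining.remove(nxt)
        return route

    # Example with the four items from the test session (made-up coordinates)
    layout = {"BLT sandwich": (2, 10), "McCoy's crisps": (8, 4),
              "Evian water": (8, 12), "Jaffa Cakes": (14, 6)}
    print(plan_route(layout.keys(), layout))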

For the purposes of the experiment I took on the role of the system, giving Mike only as much information as the system would give in normal usage, and only when requested by the user (so I wasn't giving him unnecessary help). For this experiment Mike asked me verbally for information rather than pressing buttons on the smart watch. Mike was allowed to query Dan and Smirf for advice on how to use the system, so we could help him while still drawing a distinction between what the system (me) said and the advice given by Dan and Smirf.

The items chosen were:

  • a Ginsters bacon lettuce and tomato sandwich
  • a pack of McCoy's salt and vinegar crisps
  • Evian mineral water
  • a tube of Jaffa Cakes


The first thing we noticed on entering the store was the (admittedly unusual) turnstile at the entrance, which Mike didn't know was there and which the system didn't warn him about, leading to him walking into it.

Mike requested the first item on the shopping list, which was read to him as 'Bacon, lettuce and tomato sandwich' followed by an instruction that the item was ahead of him, simulating the directional guidance given by the pressure pins on the back of the smart watch.

Guiding him to the correct aisle was pleasantly straightforward, as Mike was able to use the prompts from the system to aid him in navigating in his usual way with his cane. My voice prompts were given only at key points of the guidance, such as corners between aisles, but the pressure-pin system would be able to give continual guidance, which should avoid any confusion arising from incorrectly-timed instructions. An example of this occurred when locating the McCoy's crisps: I told him the item was on his right about two paces too early, so he stopped short. The system would have been more accurate than me anyway thanks to the localised GPS, but even if Mike had stopped too early to reach the crisps, the pressure pins would still have been telling him that the crisps were ahead of him, so this confusion would have been avoided.
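
(To illustrate why the continual pin guidance is more forgiving than my timed voice prompts, here is a sketch of how the watch could repeatedly map the user's position and heading onto one of four pins; the names and the simple four-way split are my own assumptions, not the real firmware.)

    from math import atan2, degrees

    def pin_direction(user_pos, user_heading_deg, target_pos):
        # Return which pressure pin to raise: 'ahead', 'right', 'behind' or 'left'.
        # Because this is re-evaluated continuously, stopping a couple of paces
        # short just leaves the 'ahead' pin raised instead of confusing the user.
        dx = target_pos[0] - user_pos[0]
        dy = target_pos[1] - user_pos[1]
        bearing = degrees(atan2(dx, dy))            # 0 degrees = straight up the y axis
        relative = (bearing - user_heading_deg) % 360
        if relative >= 315 or relative < 45:
            return "ahead"
        elif relative < 135:
            return "right"
        elif relative < 225:
            return "behind"
        else:
            return "left"

    # e.g. facing 'north' (0 degrees) with the crisps two paces ahead and slightly right
    print(pin_direction((8, 2), 0, (8.5, 4)))       # -> 'ahead'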

Mike initially struggled to locate the items using the 'barcode scanner' (me reading the names of items as he pointed at them) once he'd reached the correct area of shelves. We realised that our previously-discussed idea of using RFID tags may still come in useful here, not to broadcast from a tag on each item individually but to identify to the system at a glance which items are on a particular shelf.

One of the items (the mineral water) was on special offer in Pronto. The system would be aware of this thanks to the database of product information accessed by the barcode reader, and so once he'd scanned the mineral water I announced to Mike that 'there is a special offer on this item'. He then, without prompting, asked for more information (equivalent to pressing a button on the smart watch), which suggests the system was intuitive enough for him to know what to do in that situation. The ability to announce special offers in this way should benefit both the store and the customer: the store is more likely to sell more items if the customer's attention is drawn to the offer, and the customer gets to save some money.
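
(A small sketch of what the barcode lookup and offer announcement could look like in the finished system; the barcode, field names and prices below are placeholders of my own, not the store's actual database.)

    # Hypothetical product records keyed by barcode, as the store's database might supply them
    PRODUCTS = {
        "5000111222333": {"name": "Evian mineral water 50cl", "price": "89p",
                          "offer": "2 for 1.50", "sell_by": "2006-06-01"},
    }

    def scan(barcode):
        # Spoken when the user points the barcode scanner at an item:
        # the item name first, then a brief flag if there is an offer.
        item = PRODUCTS.get(barcode)
        if item is None:
            return "Item not recognised"
        announcement = item["name"]
        if item.get("offer"):
            announcement += ". There is a special offer on this item"
        return announcement

    def more_info(barcode):
        # Spoken only when the user presses the 'more information' button on the watch.
        item = PRODUCTS[barcode]
        return "%s, price %s, offer %s, sell by %s" % (
            item["name"], item["price"], item["offer"], item["sell_by"])

    print(scan("5000111222333"))
    print(more_info("5000111222333"))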

On the topic of information that the system announces to the user, we had a hard time striking a balance over what should be spoken to Mike when he 'scanned the barcode' of an item and requested detailed information. At first he told us it would be useful to know the sell-by date of an item of food, which was a good point, so I added this to my product summary. He didn't seem to want other data I offered, such as the weight of the packet of crisps. To be fair, though, the inclusion of this data in the product database is up to the shop's stock manager, so we would only be able to make recommendations as to what data should be included. Supermarkets could do their own research to find out what data their customers most want for individual items.

After the final item was located and collected, Mike requested directions to the checkout from the system. This wouldn't strictly be necessary in practice: on selecting the 'next item' menu option from the smart watch when the shopping list is exhausted, the system would give him directions to the checkout anyway.

In total, the trip from entering the shop to leaving it took just under 4.5 minutes for four items of shopping. This may seem slow for an average sighted person in a small, uncrowded supermarket, but we were not aiming to match the speed of a sighted shopper. Mike mentioned at the end of the session that he'd actually quite often have to wait at least that long before someone would be available to help him if he requested assistance, so the efficiency of the system appears to be a very positive point.

2 Comments:

At 28 March, 2006 10:23, Blogger ToxicFire said...

I'm going to be very critical of the testing, even at this late stage and in hindsight. I expressed these views in group discussions prior to these posts; I'm now putting them in written form to cover points that were obviously missed. During testing it was quite obvious that the audio and navigation system was glossed over, in that we assumed the user would be in the automated shopping-list mode and would not have to deal with navigating through the menu system which underpins the device. From the testing I did by myself, I discovered that it is not as easy as one would think to navigate through an audio menu, because the constant announcement of each menu title delays transition through the menu, making using the device incredibly time-consuming. Much more effective is a system that lets you scroll through the menu architecture and then request an audio announcement of your current location.

 
At 28 March, 2006 11:01, Blogger Mark Rowan said...

I know the audio announcements will cause delay, and I didn't miss it during testing; this issue was brought up right at the start of this particular idea.

When I designed this part of the system it was always intended that menu items would only be spoken after the user has stopped scrolling for half a second, not every time they move to a new item in the menu. While scrolling, the user just hears a short click like the iPod click-wheel sound; when they stop, the current item is spoken. This was all covered in http://hci2msd.blogspot.com/2006/03/menu-system.html as well as in earlier posts.
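
(For anyone who hasn't read that post, a sketch of the 'speak only after the scrolling pauses' behaviour; the speak and click callbacks here are placeholders, not our actual audio code.)

    import threading

    class MenuSpeaker:
        # Speak a menu item only after the user has stopped scrolling for `delay` seconds.
        def __init__(self, speak, click, delay=0.5):
            self.speak, self.click, self.delay = speak, click, delay
            self._timer = None

        def on_scroll(self, current_item):
            self.click()                     # short iPod-style click on every step
            if self._timer is not None:
                self._timer.cancel()         # still scrolling, so don't speak yet
            self._timer = threading.Timer(self.delay, self.speak, args=[current_item])
            self._timer.start()

    # Usage: speaker = MenuSpeaker(speak=tts.say, click=audio.click)
    # then call speaker.on_scroll(menu.current_item()) on every scroll step.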

Requesting an audio announcement with an additional button, to confirm that the user is where they think they are in the menu, is a useful idea if we have the space for the extra button on the watch face. However, while learning the menu layouts users would probably get annoyed at having to constantly press the button to find out where they are. With this in mind I still think the original design is the most effective.

 
