Sunday, March 26, 2006

Conclusion (Evaluation)

This is the final post for this project. This post has contributions from Dan and Mark.


Following the User-Centred Design (UCD) principles as closely as possible, we have found them to be an effective way of creating new products of any sort, allowing changes to be made easily and without the hindrances of typical engineering approaches such as the waterfall method.


During the project we sought stake-holders and:

* Held a meeting with one of them (as this was the maximum number we could reach) and discussed the feasibility of the project

* Mapped out (and continually updated) the requirements of the device based on stake-holder input

* Designed the prototype

After designing the prototype we returned to our stake-holder with a presentation of it, and went from this to conduct usability testing out in the field where the device would be used. This testing shed light on our design's good points, its bad points and the points that needed to be changed, as shown in previous posts.

After making the necessary changes identified in our design, we attempted contact with a more general audience. We found it very difficult to gain their participation, with only one group replying. We felt it best to get a general overview of our idea from the groups, and the feedback was very positive; the only issue raised was that, should this device come to market, it would need to be at an affordable price.

In conclusion, we have experienced both the pitfalls and the successes of working as a team following the UCD method to design a system that helps a specific audience complete a specific task. The project did not always proceed as smoothly as hoped, due to delays in contacting and waiting for responses from our stake-holders. We found the UCD method very effective for finding out exactly what our target users require of the system, and some lateral thinking within the group quickly resolved all the issues that arose from the evaluations performed. The final product met with approval from genuine 'real-world' potential users and, given the funding, we believe it could realistically be developed still further into a real system.

Friday, March 24, 2006

Success!

After contacting a small number of blind groups, a member of the Birmingham Focus on Blindness group was kind enough to reply to our email.

As mentioned previously, the email I sent out was very much the same as the email Mark sent to Peter White. Posted below is the reply I received; all contact details have been removed (and replaced with a [cut]) for the purposes of upholding user privacy.

The reply is from Stuart James


Hi,

Like the sound of the project very much. There are some well thought out
solutions. Particularly like the watch and earpiece idea. Have you thought of
the military applications that could make your fortune?

If you would like to arrange a demo to a selected audience of blind or visually
impaired people then please contact us again - either myself or Will Thornton.
Email us at [cut] or [cut] or ring [cut] and ask.

One word of warning though, beware the cost! The majority of VI people are
unemployed and simply won't be able to afford high cost technology.

Regards

Stuart

-----Original Message-----
From: [cut]
Sent: 22 March 2006 21:10
To: Stuart James
Subject: Requesting your help on a future access technology


Dear Stuart,

My name is Daniel Trimm and I'm a student at the University of Birmingham,
studying a course in Computer Science.

I am in a team of three students who have taken up the challenge of
developing an electronic personal guidance and assistance device for blind
and visually-impaired people. We have almost completed the design of our
device but we require feedback from those who are most likely to find it
useful, and so we decided to get in touch with you.

In brief the device takes the form of a smart watch and bluetooth headset,
both connected wirelessly to a pocket-sized processing unit. We decided to
tackle the tasks of guiding a user around an outdoor environment such as an
unfamiliar town centre, as well as localised indoor environments such as
supermarkets, using a GPS system such as the Galileo system being launched
by Europe at the moment.

We designed the device to be controlled using the smart watch, which has a
rotating bezel to scroll up and down menu items in a similar way to the
iPod click wheel, and buttons to go back and forward in menus. Menu items
are spoken through the headset. Shopping lists for a supermarket or travel
itineraries for a day in town can be created on the user's PC and
downloaded to the central processing unit, although there is no requirement
for pre-planning your day as you can use the smart watch to locate specific
places or items once you're out in the field too.

The smart watch guides the user using a ring of pressure pins underneath,
pressed against the user's skin, corresponding to points of the compass. As
the user moves around, the active (pressed) pins also move to guide the
user around corners to the destination. Voice prompts can be given through
the headset to assist in this too.

GPS overlays giving more information about the user's surrounding area,
such as the location of aisles and checkouts in a supermarket, or bus stops
and other public locations in the town, are downloaded automatically by the
device to give more information to the GPS route finding system. Items in
the supermarket can be identified by firstly going to the correct aisle
using the directions from the smart watch GPS system, which makes use of an
overlay provided by the supermarket containing details of which aisles
contain which items. A hand-held barcode scanner provided by the
supermarket then gives the user information on the product they are
currently holding, so they can determine if it's the product they need.


The journal of our development process, including all our thoughts on the
device, can be seen at http://hci2msd.blogspot.com. If you could be so kind
as to let us know any thoughts or comments you have on our device design,
that would be appreciated very much and would help us to refine the design
for future users.

Thank you for your time.
Regards,

Daniel Trimm

Wednesday, March 22, 2006

A last attempt to get people to evaluate our project!

Just a quick post,

This evening I have made a final attempt (on my part) to try to get people to evaluate our project...

I found a website (http://www.echurch-uk.org/links/societies.php) and I have contacted several societies on it in an attempt to get them to help us out with evaluating our product.

Hopefully some of them will help us out!

Wrapping this post up, just thought I'd say I used a slightly altered version of Mark's brilliant email to Peter White to contact these groups.

Help! We need some bodies

One of THE most difficult parts of this project has been finding people to help us design it; now it's proving even harder to find people to help us evaluate it! If you know anyone who could help, it would be appreciated if you could send them our way!

Tuesday, March 21, 2006

Summing up our new findings 2

Carrying on with the story...

My second thought for the device concerns the 'Use By' and 'Best Before' dates on products. Admittedly we never thought about this in the original design of the menu and information systems (shocking really, since I used to work in a supermarket and had to keep the legal documents for this bit!), but I have been thinking about it today and I can quickly see a problem: this information isn't exactly readable by a barcode scanner!

I have thought of a way around this. Products that require this information currently have the date printed on them. In supermarkets, items tagged with barcodes printed in-store, whose prices change with each individual item (such as reduced items), tend to have the price hidden within the barcode: once you see enough of them, you soon recognise which sections of these barcodes state that the item is reduced, which department reduced it, and its new price. The same approach could easily be applied to a small barcode located close to the main barcode, containing the product's 'Use By' or 'Best Before' date. Products that don't require this sort of information could be given a 'NULL' tag.
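To make the idea concrete, here is a quick sketch of how the Brain might decode such an in-store barcode. This is purely illustrative: the digit layout, the flag value and the NULL convention are my own invention for this sketch, not any real barcode standard.

```python
from datetime import date

# Hypothetical in-store barcode layout (an assumption for illustration):
# 2 flag digits, a 7-digit product code, then the 'Use By' date as YYMMDD,
# with '000000' acting as the NULL tag for products carrying no date.
FLAG_DATED = "21"

def parse_instore_barcode(code):
    """Return (product_code, use_by), where use_by is None for the NULL tag."""
    flag, product, raw_date = code[:2], code[2:9], code[9:15]
    if flag != FLAG_DATED or raw_date == "000000":
        return product, None
    year = 2000 + int(raw_date[0:2])
    return product, date(year, int(raw_date[2:4]), int(raw_date[4:6]))

product, use_by = parse_instore_barcode("210001234060402")
# the device could then speak the decoded date through the headset
```

The NULL tag simply short-circuits the date parsing, so the device can still identify the product without announcing a meaningless date.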

This would cure the problem with ease and would cost hardly anything to implement. My final comment on barcodes concerns their location on products. I personally believe it would be simple to replace current barcodes with a barcode 'ring' which repeats all the way around the product. This could either replace current barcodes or be developed as a special barcode recognised only by the Brain. I believe manufacturers would not mind implementing these ideas, given that all of them would be making the same modifications to their packaging, and it could increase sales: blind users would have easier access to finding out about new products, thanks to the proposed methods of product location and identification.

Summing up our new findings

As promised, today I will make a few posts.

This post is dedicated to summing up our new findings when we tested with Mike. I won't regurgitate what Mark posted but instead add my own thoughts to this.

The first thing that stands out in my mind as an issue is the level of detail the system gives to the user. We quickly discovered that what we thought would be 'the norm' information for users isn't always going to be what they want, so we should revise how this section of the system works. Personally I believe the system should be revised as follows:

Within the settings menu there should be a set of fields which can be turned on and off, covering topics of information such as "Product Weight" and "Nutritional Information", toggled as the user requires. This settings menu should be specific to the peripheral device, not to the store/location, because users will tend to want the same settings whenever they use a particular device, regardless of where they are: peripheral devices will tend to be activity-oriented, so the barcode reader settings used at one supermarket will almost certainly be wanted at other supermarkets too.

Further to what I have just said, there will need to be a 'standard' set of information that all supermarkets provide to the system for this information section. This information is already largely standard across supermarket firms: they all hold much the same information in their barcode databases, thanks to the standardisation of barcodes (they follow the Universal Product Code; see http://www.av1611.org/666/barcode.html for more information) and to the software for such systems being a generic industry setup rather than tailored for each individual company. The system could include a number of fields specific to the company/store being visited, but I would limit this (to, say, 4) as companies may go 'overboard' if given too much leg room.

Another feature this section of the system will need is the recognition of different types of store. For example, nutritional information is hardly useful when you are looking for a new door bell at your local DIY store (a British term for a hardware store, if DIY confuses you), while at such a store more information about products (such as product dimensions) is very important. All these different fields together would make a settings menu list, laid out the same way as the iPod list, huge and extremely difficult to navigate. I therefore suggest that, using an idea taken from barcodes, different types of store are standardised in a store type ID database built into the Brain. When a device (such as the barcode scanner) pairs with the Brain, the Brain requests what type of store it is in, and from this it can customise the information settings menu. With a finite number of store types, the Brain can store each individual menu setup: after the initial configuration of each menu, every time it encounters that store type it simply restores the saved settings for the user to customise as they see fit. Finally, when linking the Brain with a PC, the computer software for the device can let the user see all the settings at once, so they can quickly set up all the menus on their new device before they leave the house.
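As a rough sketch of how the Brain might keep these per-store-type information menus (the store type IDs and field names below are invented examples, not a real database):

```python
# Default information fields per store type; IDs and fields are
# hypothetical examples for illustration only.
STORE_TYPE_FIELDS = {
    "supermarket": ["Product Weight", "Nutritional Information", "Use By Date"],
    "diy":         ["Product Dimensions", "Material", "Fitting Instructions"],
}

class Brain:
    def __init__(self):
        self.saved = {}  # store type -> {field name: on/off}

    def menu_for(self, store_type):
        """Restore the saved settings menu for this store type, or build
        defaults (everything on) the first time the type is encountered."""
        if store_type not in self.saved:
            self.saved[store_type] = {f: True for f in STORE_TYPE_FIELDS[store_type]}
        return self.saved[store_type]

brain = Brain()
menu = brain.menu_for("supermarket")
menu["Product Weight"] = False  # the user toggles a field off
# a later visit to any supermarket restores the same customised settings
assert brain.menu_for("supermarket")["Product Weight"] is False
```

Because the settings are keyed by store type rather than individual store, a change made in one supermarket carries over to every other supermarket, matching the behaviour argued for above.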

At this point I will start a new post...

Monday, March 20, 2006

Copy of a letter sent to Peter White of In Touch, a Radio 4 show for disabled listeners

Dear Peter and the In Touch team,
My name is Mark Rowan and I'm a student at the University of Birmingham, studying a course in Computer Science.

I am in a team of three students who have taken up the challenge of developing an electronic personal guidance and assistance device for blind and visually-impaired people. We have almost completed the design of our device but we require feedback from those who are most likely to find it useful, and so we decided to get in touch with you.

Briefly the device takes the form of a smart watch and bluetooth headset, both connected wirelessly to a pocket-sized processing unit. We decided to tackle the tasks of guiding a user around an outdoor environment such as an unfamiliar town centre, as well as localised indoor environments such as supermarkets, using a GPS system such as the Galileo system being launched by Europe at the moment.

We designed the device to be controlled using the smart watch, which has a rotating bezel to scroll up and down menu items in a similar way to the iPod click wheel, and buttons to go back and forward in menus. Menu items are spoken through the headset. Shopping lists for a supermarket or travel itineraries for a day in town can be created on the user's PC and downloaded to the central processing unit, although there is no requirement for pre-planning your day as you can use the smart watch to locate specific places or items once you're out in the field too.

The smart watch guides the user using a ring of pressure pins underneath, pressed against the user's skin, corresponding to points of the compass. As the user moves around, the active (pressed) pins also move to guide the user around corners to the destination. Voice prompts can be given through the headset to assist in this too.

GPS overlays giving more information about the user's surrounding area, such as the location of aisles and checkouts in a supermarket, or bus stops and other public locations in the town, are downloaded automatically by the device to give more information to the GPS route finding system. Items in the supermarket can be identified by firstly going to the correct aisle using the directions from the smart watch GPS system, which makes use of an overlay provided by the supermarket containing details of which aisles contain which items. A hand-held barcode scanner provided by the supermarket then gives the user information on the product they are currently holding, so they can determine if it's the product they need.


The journal of our development process, including all our thoughts on the device, can be seen at http://hci2msd.blogspot.com. If you could be so kind as to let us know any thoughts or comments you have on our device design, that would be appreciated very much and would help us to refine the design for future users.

Thank you for your time.
Regards,
Mark Rowan.

Sunday, March 19, 2006

Explaining the lack of posts

Sorry to all for the lack of posts this week but dissertations have got in the way... My next post will definitely be on Tuesday when I add my bit to the evaluation of the test we conducted with Mike and further Mark's suggestions of what needs to be changed in our current prototype. I will also look into evaluating further the technologies that are aimed at driving this device.

Though I have been busy this week, I have made some progress on this. I have contacted the RNIB business development group to seek their help, and I have sought out online forums in the hope that both may be able to help us with feedback on this device.

For now I leave you with something interesting I found, Linux being ported for use by blind users at: http://leb.net/blinux/

Thursday, March 16, 2006

RFID virus concerns

The BBC is running a report on its technology site about RFID tag viruses. The report describes how some RFID tags, with their embedded microprocessors, could be used to run malicious code and spread via radio to other nearby tags, and so on.

Of course, being aimed at the non-technically-literate, the report warns of the impending doom of an uprising of new breeds of mutated virii (or 'viruses' as the BBC calls them), in a similar way to the mutating H5N1 human-bird-flu pandemic that we're all supposed to be expecting, that will cause havoc across the nation... when in reality the original research was actually an attempt to warn RFID makers to secure their systems.

Still, now that RFID tags are back on the cards for our system, it would be worth bearing in mind that with current less-secure tags, you may not always be able to trust the data that you're receiving from them (and if you're blind and can't tell the difference between two items without the RFID being correct, this is a real problem).

Changes to be made

What follows is a very brief summary of the aspects of the system our testing showed need to be changed, or that should be added:

The ability to warn the user using voice prompts in the headset about upcoming hazards eg. the turnstile at the entrance to Pronto. These can be included on the local area overlays as 'hazard areas' along with the text which will be spoken by the Jaws software on the brain of the device when the user enters one of these areas.

Guidance to specific shelves using RFID tags embedded into each shelf containing a description of what items are on this shelf. The device brain could read the RFID tags to tell the user which shelf (numbered sequentially from the floor or from the top, for example) the desired item is on.

Currently the user has to pick up the item and turn it around, looking for a barcode to scan to obtain information on what the item is. This is somewhat cumbersome for a blind user, and wasn't tested in this session (the product details were read straight to him when Mike pointed at an item). One way around this would be to use larger barcodes, such as those on Aldi branded packaging; however, manufacturers are unlikely to adopt this industry-wide as it can be seen to detract from the aesthetics of the packaging. Another possible remedy is for the item barcodes to also be stuck to the shelf in front of where the item is stocked, so that they can easily be read by the barcode reader.
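The shelf-tag idea above could be sketched like this. The tag contents and lookup are invented for illustration; in the real system the Brain would read them over RFID rather than from a hard-coded list.

```python
# Hypothetical shelf RFID tags: each carries the shelf number (counted up
# from the floor, as suggested above) and a description of its items.
shelf_tags = [
    {"shelf": 1, "items": {"baked beans", "tinned tomatoes"}},
    {"shelf": 2, "items": {"salt and vinegar crisps", "ready salted crisps"}},
    {"shelf": 3, "items": {"jaffa cakes", "digestives"}},
]

def shelf_prompt(wanted, tags):
    """Return the voice prompt the Brain would speak through the headset."""
    for tag in tags:
        if wanted in tag["items"]:
            return f"{wanted} is on shelf {tag['shelf']}, counting up from the floor"
    return f"{wanted} was not found on these shelves"
```

This keeps one tag per shelf rather than one per item, which is exactly the cheaper compromise discussed in the testing post.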

Wednesday, March 15, 2006

Testing the system

Last Wednesday we took Mike Sharkey around the Pronto supermarket in the Guild to test our system. It is important to note that Mike is unfamiliar with the layout of this shop, so we believe that there is no chance of the results of the guidance system test being skewed by him already knowing his way around.

Before he arrived for our prototyping session I walked around the store and planned a list of items to be bought, along with a route that would efficiently take him past their locations. This was done in advance as in the completed system a shopping list would be downloaded to the device by the user and then an optimal route for these items automatically computed once the layout of the shop is downloaded to the device upon entering the shop.
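To illustrate the kind of route computation meant here, a minimal sketch follows. The item coordinates and the nearest-neighbour heuristic are assumptions for illustration; a real system would optimise properly over the shop-layout overlay downloaded on entry.

```python
# Order a shopping list by a simple nearest-neighbour walk over the shop
# floor, using Manhattan distance between (aisle, position) coordinates.
def plan_route(shopping_list, locations, start=(0, 0)):
    remaining = list(shopping_list)
    route, here = [], start
    while remaining:
        nxt = min(remaining,
                  key=lambda item: abs(locations[item][0] - here[0])
                                 + abs(locations[item][1] - here[1]))
        route.append(nxt)
        here = locations[nxt]
        remaining.remove(nxt)
    return route

# Invented example coordinates for the four test items:
locations = {"sandwich": (1, 0), "crisps": (1, 3),
             "water": (4, 3), "jaffa cakes": (4, 0)}
route = plan_route(["water", "jaffa cakes", "sandwich", "crisps"], locations)
```

Whatever order the user entered the list in, the computed route starts with whichever item is nearest the entrance and works outward from there.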

For the purposes of the experiment I took on the role of being the system, giving out only as much information to Mike as the system would allow in normal usage, and only when requested by the user (so I wasn't just giving him unnecessary help). For this experiment Mike asked me verbally for information rather than pressing buttons on the smart watch. Mike was allowed to query Dan and Smirf for advice on how to use the system, so we could help him whilst still drawing a distinction between what the system (me) said and the advice given by Dan and Smirf.

The items chosen were:

  • a Ginsters bacon lettuce and tomato sandwich
  • a pack of McCoy's salt and vinegar crisps
  • Evian mineral water
  • a tube of Jaffa Cakes


The first thing that we noticed on entering the store was the (admittedly unusual) turnstile in the entrance, which Mike didn't know was present and the system didn't warn him about, leading to him banging into it.

Mike requested the first item on the shopping list, which was read to him as 'Bacon, lettuce and tomato sandwich' followed by an instruction that the item was ahead of him, simulating the directional guidance given by the pressure pins on the back of the smart watch.

Guiding him to the correct aisle was pleasantly straightforward, as Mike was able to use the prompts from the system to aid him in navigating in his usual way with his cane. My voice prompts were given only at key points of the guidance, such as corners between aisles, but the pressure pin system would be able to give continual guidance, which should avoid any confusion arising from incorrectly-timed instructions. An example of this arose when locating the McCoy's crisps: I told him the item was on his right about two paces too early, so he stopped. The system would have been more accurate than me anyway, thanks to the localised GPS, but even if Mike had stopped too early to reach the crisps, the pressure pin system would still have been telling him that the crisps were ahead of him, so this confusion would have been avoided.
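The continual pin behaviour described here can be sketched as a mapping from the bearing to the destination onto one pin of the ring. The 8-pin ring is an assumption for the sketch; the real watch could use a different count.

```python
# Map the target bearing (relative to the direction the user is facing)
# onto one of the ring's pressure pins. Pin 0 is 'straight ahead' and
# pins are numbered clockwise around the wrist.
N_PINS = 8

def active_pin(bearing_to_target, user_heading):
    relative = (bearing_to_target - user_heading) % 360
    return round(relative / (360 / N_PINS)) % N_PINS

# user facing north (0 deg), target due east (90 deg) -> the right-hand pin
assert active_pin(90, 0) == 2
# as the user turns to face the target, the pressed pin moves to 'ahead'
assert active_pin(90, 90) == 0
```

Because the active pin is recomputed continuously from heading and position, the guidance cannot be 'mistimed' the way my spoken prompts were.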

Mike initially struggled to locate the items using the 'barcode scanner' (me reading the names of items as he pointed at them) once he'd reached the correct area of shelves. We realised that our previously-discussed idea of using RFID tags may still come in useful here, but to identify to the system at a glance what items are on a particular shelf rather than broadcasting from a tag for each item individually.

One of the items (the mineral water) was on special offer in Pronto. The system would be aware of this thanks to the database of product information accessed by the barcode reader, and so once he'd scanned the mineral water I announced to Mike that 'there is a special offer on this item'. He then, without prompting, asked for more information (equivalent to pressing a button on the smart watch), which shows that the system seemed intuitive enough for him to know what to do in that situation. The ability to announce special offers in this way should benefit both the store and the customer: the store is more likely to sell more items if the customer's attention is drawn to the offer, and the customer gets to save some money.

On the topic of information that the system announces to the user, we had a hard time trying to strike a balance of what information should be spoken to Mike when he 'scanned the barcode' of an item and requested detailed information. At first he told us it would be useful to know the sell-by date of an item of food, which was a good point, so I added this into my product summary. He didn't seem to want other data I offered, such as the weight of the packet of crisps. However, to be fair, inclusion of this data in the product database is all up to the shop stock manager, so we would only be able to make recommendations as to what data should be included. Supermarkets could do their own research to find out what data their customers most require from individual items.

After the final item was located and collected, Mike requested directions to the checkout from the system. This wouldn't strictly be necessary in practice as by selecting the 'next item' menu option from the smart watch, if the shopping list is exhausted it would give him directions to the checkout anyway.

In total, the trip from entering the shop to leaving it took just under 4.5 minutes for four items of shopping. This may seem slow in an uncrowded small supermarket for an average sighted person, but we were not aiming to match the speed of a sighted shopper. Mike mentioned to us at the end of the session that he would actually quite often have to wait at least that long before someone was available to help him if he requested assistance, so the efficiency of the system appears to be a very positive point.

Friday, March 10, 2006

Video of prototyping

Just a quick note to point out that a video of the prototyping session with Mike is available at http://www.fukmi.co.uk/~mark/100_0116.avi

There is also a longer audio-only recording from the dictaphone at http://www.fukmi.co.uk/~dan/hci2/hci2.mp3 including Mike's responses and feedback at the end of the session.

More analysis will follow.

The ways we have gone about things, why we made the decisions we did, and the problems we have encountered

As I sit here at 1:55 AM after spending hours tinkering with my Final Year Project (and making another discovery) I have two sets of thoughts churning through my head which I am trying to clear up so I can go to bed:

a) My sister and Dad (my Mum isn't going due to her dislike of long-haul flights) are flying to New Zealand for a few weeks at 6:00am for my cousin's wedding... Why didn't I take a gap year?!? I guess I'll just have to make up for this next year on my travels.

b) The problems we have discovered doing this HCI project.

As the latter is much more relevant, I shall concentrate on it.

First off, I think it's important to explain why we have done some things. As computer scientists (and researchers) it may seem that we have focused quite heavily on the core technologies of our idea and not on the abstract side of the problem, and there is a reason for this. When we set out to do this task, one of the first decisions we made (unconsciously at first; it hit home after our first interview with Mike) was to model this project on existing tried-and-tested technologies. We had three reasons for doing this:

  • We could never have a full idea of the needs of blind users, so using technology they already understood and liked was a must. We also quickly discovered that training in a new technology is a slow process for many blind people, so avoiding this was a high priority.
  • Our ideas from the beginning focused on making new and different uses for existing technologies. This helped because our idea was so big and general that building everything out of 'concept' ideas would have made it an impossible task in the time allocated.
  • A lot of existing technologies are either at, or are quickly reaching, the levels of efficiency and ability where this sort of project will be feasible in the near future
The abstract components of our project can be seen to be encapsulated in our technology work through what needs to be different and how we think tasks should be approached.

You may also notice we do not have any spectacular drawings of menus and uses, these would be pretty useless in a system meant for the blind! Our work concentrates on their environment and what works for them.

So what about the problems we have encountered? Well, the biggest has been the testing audience. Mike has been absolutely amazing in helping us out, but sadly sourcing a big pool of volunteers who could help us in a way useful for this task is very hard, so we have had to rely almost solely on Mike. So far, our attempts to contact people who could help us, including those who have publicly said such a system is needed, have ended in no replies, without them ever talking to us, meeting us or seeing our work. Hopefully we can get a little more luck now, after the live demo we conducted on Wednesday (we'll blog what happened as soon as we have collated it all).

Have we been hit with other problems? Well, yes. Trying to anticipate blind users' needs in a concept system is quite difficult, and without throwing lots of money at organisations set up to help the blind, getting their easy-to-understand guidance is very hard. So apart from Mike's feedback on our ideas, and the ideas we had from previous work, we have been developing a lot of the project's design thoughts in the dark, with no major feedback mechanism to tell us when we're getting it right and when we're absolutely wrong.

Well, it's almost 2:30 now. I'll get another telling off from my medic housemate for being up late and getting up early (whilst recovering from illness). She is right to do so, though...

Wednesday, March 08, 2006

Companies get help building websites for disabled users

This has been a long time coming, but now it's here perhaps companies that build websites will start using their heads, and force the web design community, which has spent far too much of the last 10 years making websites flashy rather than practical, to wake up and give everyone access.

So what is it? Well, it's called PAS 78: a set of guidelines developed to help companies comply with laws developed in the UK in 1999. The law states that UK companies with a publicly accessible face on the Internet are obliged to provide access to everyone, including disabled users.

PAS 78 has been developed to help anyone looking to build a disabled-user-friendly website, covering the key things they need to get their site working with pretty much all the assistive equipment available for computers.

More information is available at: http://news.bbc.co.uk/1/hi/technology/4783686.stm

Prototyping the system

Clearly, given our target audience, we can't produce a visual prototype to show potential users and ask for feedback. After a bit of thinking we've come up with an alternative involving us simulating the device's voice prompts whilst a user requests information from us as they would do with the device. This can then be recorded using a video camera and analysed later.

Mike Sharkey has agreed to be our guinea pig for this test. Firstly we will obtain a shopping list of his items in advance so we can actually go to the store and plan a route around the shop floor (analogous to the route finder in the system producing an itinerary from a pre-defined shopping list).

Once permission has been obtained from the store (most likely Pronto in the Guild or University Centre) the following test plan will take place:

Testing to see how well the suggested menu system with rotating bezel works

  • give Mike an overview of what's available in the menu system in terms of options
  • see if he's comfortable clicking round and selecting items from it with us providing voice feedback of the current menu item.
  • record audio feedback from him about how he found it.


Once he's happy (or not) we can move onto simulating the actual tasks without the "clicking", just him telling us what he wants to achieve eg. "go to checkout" and we will then guide him.

Guidance
Clearly we can't simulate the pressure ring on the back of the smart watch, so we will use voice commands, eg. "left", "stop". Afterwards we will ask if he thinks the pressure system specified would be better than our improvised voice prompts.

Shopping - repeat:

  • Mike should ask us what item he's looking for from his list (which we hold).
  • tell him (by voice) compass directions to the item's shelf, and when to stop.
  • Mike should then emulate using the barcode scanner to check the (imagined) barcodes on the shelves.
  • he should move his finger, pointing from the top to the bottom of the shelves, with us reading out the product description as each shelf is reached. This is equivalent to the user scanning down the shelves with the barcode reader.
  • he should say "that's it" or something similar when he's happy with the choice, then retrieve the next item from us (the list) by saying "next item"
  • at the end of the list we say "end of list"


Product description should give a succinct summary of manufacturer (eg. Heinz), item name (baked beans), weight/contents if necessary or multipack etc. In practice this will all be set by the supermarket but we should be careful not to give too much (due to taking too long and the user getting impatient) or too little information.
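As a rough sketch of the kind of succinct summary we have in mind (the function, field names, and sample data are invented purely for illustration; in practice the supermarket would set this):

```python
def describe(product, detailed=False):
    """Build a spoken product summary: manufacturer and item name,
    plus weight/multipack details only when asked for, so the short
    form stays quick enough for shelf scanning."""
    parts = [product["manufacturer"], product["name"]]
    if detailed:
        if "weight" in product:
            parts.append(product["weight"])
        if product.get("multipack"):
            parts.append("multipack of %d" % product["multipack"])
    return ", ".join(parts)

beans = {"manufacturer": "Heinz", "name": "baked beans",
         "weight": "415 g", "multipack": 4}
short_form = describe(beans)          # spoken while sweeping the shelf
long_form = describe(beans, True)     # spoken on explicit request
```

The point of the two forms is exactly the balance mentioned above: enough to identify the product quickly, with the longer details held back until requested.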

I suggest we choose just three or four items for Mike to locate using the simulated system and record him on the video camera while he does this, to see if it seems straightforward enough for him to use and whether his use of the system improves by the end of the test.

Finally we need to ask him to discuss with us how he found the overall experience, recording it as audio on the camera. We should specifically cover ease of navigation - "does it get you to what you want quickly enough?"

Tuesday, March 07, 2006

Ethics

Many of the ethical issues attached to this project are the same as those attached to a supermarket loyalty card, in that the store can obtain details of your shopping habits and use them to aggressively market products at you. With a loyalty card you can usually opt out with little impact, but the device we have designed depends far more heavily on this data, which makes an opt-out policy quite difficult. That leaves two options: look for ways to legally enforce data protection for users of the device, or design the system so that the supermarket cannot actually access the shopping list data itself, though this adds complications to the device.

Expanding on the barcode reader

Glad I actually put this off until we had that talk on Skype, because it brought some more points to the surface.

In other posts we've discussed some of the main constraints on the barcode scanner, chiefly that it would be best if the scanner was owned and maintained by the supermarket. This is primarily because most supermarkets would not like their complete stock data being publicly accessible without some control on their part.

This leaves us with a very narrow avenue for how the barcode reader could be used, and it also imposes memory and processing constraints. Ideally the device should contain only the portions of the stock database that pertain to the person's shopping list. This adds another constraint: the shopping list needs to interact with the barcode reader before any shopping actually takes place, so that the correct parts of the stock database can be selected. If the shopping list is prepared online in advance of the trip, things flow more simply: the list is handed off to the supermarket, which can prepare the barcode reader file in advance as well, allowing it to be picked up quickly and efficiently at the desk. This also lets the supermarket deal with out-of-stock items and tag special offers, alternatives, and specific product information such as weight and nutritional values. There is an ethical question of what the store could do to abuse this information, but that is for another post.
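A minimal sketch of how the supermarket might prepare such a reader file from a pre-submitted list (the function, schema, and sample barcodes here are entirely our own invention, not a real store system):

```python
def prepare_reader_file(stock_db, shopping_list):
    """Select only the portion of the store's stock database that the
    shopper's list needs, and flag wanted items with no stock so the
    store can suggest alternatives at the desk.
    stock_db maps barcode -> product record (hypothetical schema)."""
    subset, unavailable = {}, []
    for wanted in shopping_list:
        matches = {bc: p for bc, p in stock_db.items()
                   if p["category"] == wanted and p["in_stock"]}
        if matches:
            subset.update(matches)
        else:
            unavailable.append(wanted)
    return subset, unavailable

stock = {
    "5000157024": {"category": "baked beans", "in_stock": True},
    "5000157031": {"category": "baked beans", "in_stock": True},
    "5010044001": {"category": "custard", "in_stock": False},
}
subset, missing = prepare_reader_file(stock, ["baked beans", "custard"])
```

The returned subset is small enough to sit within the device's memory constraints, while the `missing` list is what the store would use to handle out-of-stock cases before the trip starts.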

As the barcode reader isn't part of the primary device and is only needed when locating specific items, it can be placed in a basket or trolley to free the user's hands. Initially RF tags were discussed as a way of identifying products, but it was clear that the device could easily receive multiple signals and confuse products. Since an existing system (barcodes) is already in place on pretty much every product in existence, it soon became clear that a barcode reader was the best way to go, as it would identify most if not all products. The only problems occur with loose products such as vegetables, and these could easily be remedied by placing barcodes on the front of shelves. That actually solves another problem too: the user can simply scan up and down the shelf after the in-store navigation has brought them to the general location of the product, find the specific product, and use a thumb button to select more detailed info on the last scanned product.

Identifying the products would be as simple as the paired barcode reader returning a raw text string to JAWS for conversion to audio. For product names this should happen continuously: hold the scan button, sweep a shelf, and it will queue all the products it scans and then convert them to audio in turn. More detailed info requires you to stop at a product and press the barcode reader's detailed-info button.
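As a rough sketch of that sweep-and-queue behaviour (the class and the callback are our own invention; the real speech output would go through JAWS):

```python
from collections import deque

class ScanQueue:
    """Queue the raw text strings arriving from the barcode reader while
    the scan button is held down, then hand them to a speech callback in
    scan order. The speech engine itself is stubbed as a callable."""
    def __init__(self, speak):
        self.pending = deque()
        self.speak = speak

    def on_scan(self, description):
        # Called once per barcode passed during the sweep.
        self.pending.append(description)

    def flush(self):
        # Called when the button is released: read out everything queued.
        while self.pending:
            self.speak(self.pending.popleft())
```

Queueing rather than speaking immediately means a fast sweep of a whole shelf never drops a product, even if speech output is slower than scanning.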

Monday, March 06, 2006

The problems are in the genes!

Scientists in New York have found that two genes together account for almost 75% of cases of Age-related Macular Degeneration, the world's most common cause of blindness.

The scientists expanded on previous research, which had identified only one of the genes and so accounted for only a third of cases; by combining research results on both genes they were able to discover that the two work in tandem.

This could be the first step towards producing a treatment to prevent this disability from occurring. This line of research will be interesting to follow over the next few years.

Source: http://news.bbc.co.uk/1/hi/health/4771608.stm

Saturday, March 04, 2006

Communicating with local information points

In addition to my earlier post about information overlays, increasing amounts of localised information are being provided by local councils.

In Birmingham the WayFinder system is already in the testing stage to allow blind and partially-sighted users to retrieve spoken local information from devices installed on street furniture.

With our system, this local information could be not only spoken out of a loudspeaker, but funnelled via Bluetooth to the Brain and then to the Smart Headphone so the user can have a bit more privacy about their dealings.

Train stations and bus stops in Birmingham now give details of the next arrivals and their times, destinations, and future stops. Again this information could be sent to the Brain via Bluetooth in a localised area around the bus stop or train platform so that the user can obtain this information, which is already present and displayed to sighted users, with very little modification or extra cost to existing systems.

The PC software

I'm not going to go into much detail about this as it's only a peripheral part of the system that could theoretically be provided by any 3rd party as long as it's compatible with the Brain and can connect to it using the Bluetooth system.

PC software should not even be required for the use of the system, although its use would be beneficial.

I see a useful piece of software for our system as containing the following features:


  • Interface with a speech synthesiser such as Jaws for all relevant operations.
  • Create a shopping list on the user's home PC for downloading into the Brain.
  • Produce a day's itinerary eg. for shops to visit, meetings to attend and their locations.
  • Upload new overlays for the GPS system from the Internet, either for a specific area or perhaps nationwide artifacts such as National Parks.


It would be useful to allow more customisation of the system's settings using the PC software too, but just like any well-designed VCR or DVD player the system should still function more than adequately without the 'remote control'!

Overlays of local information

There are two things that we'd like the system to do for our blind users which current GPS systems are not very good at doing (if at all):

1). Guide the user around relatively dynamic indoor environments (eg. supermarket displays that may change once a week) as opposed to street-level GPS which maybe changes significantly enough to warrant updating the device once every year at most. This point has already been elaborated on in my previous post.

2). Allow the user to access far more detailed local information at the street level than is supplied in a standard GPS package that is just designed to get the user from A to B where a postcode, street name, or town is known.

This is an interesting problem. A blind user will want to know how to find items of interest such as the nearest bus stop, train station, library, pub, specifically-named shop, even names and addresses of friends at specific house numbers, etc. To hold data for the entire country at this level would be technically infeasible, not to mention it would constantly require updating.

A suggested solution for both of these problems lies in the idea of overlays. In brief, the Brain should store a standard country-wide GPS map that holds major points of interest such as towns, roads, train stations, etc. Even at this very basic level a user should still be able to guide themselves around an unfamiliar town centre if they know, for example, which street their destination is on.

But imagine now that a blind user is taking a trip to Manchester to attend a concert. They'll need to know the locations of train stations, local bus stops, the theatre itself, and maybe a pub or restaurant to drop into on the way home afterwards. Our aim is to allow this person to do all of this without relying on anyone passing by in the street (ref: the BBC Ouch! article I mentioned earlier).

Solution: before the trip to Manchester the user downloads onto their Brain device an overlay or three for the area of Manchester. Potentially these could be supplied by wi-fi in major public areas upon arrival, such as the central train station, but for now the idea of using a home PC to achieve this seems fairly plausible.

One overlay may contain a reference to all the eating points in the Manchester area so that the user can set any one of these as the GPS destination and be guided there. Another overlay could contain all bus stops. There could be an additional set of data (perhaps provided by the local council in XML format) containing details of bus routes so the user can be guided to the nearest bus stop with a fast bus to the theatre at the touch of a few buttons on the Smart Watch.

Clearly the same principle should work in a supermarket. Upon arrival at the shop the Brain can download an overlay from the shop server via wi-fi containing details of where aisles, displays, trolley parks, checkouts etc. are located in the shop. Further overlays could add detail to the aisles, eg. to specify that biscuits are stored at the end of aisle 3 (the user will still need to use the barcode reader to accurately pick up the desired brand however, as the GPS is not going to be accurate enough for that!).

These overlays can be added and removed at will using the home PC software (see my next post) or wi-fi downloads when the user is in the environment, to give a potentially unlimited wealth of local information to the user so that they can always make their own way around an unfamiliar environment.
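As a sketch of the overlay idea (the class, method names, and data layout are entirely our own illustration, not a real GPS data format), the Brain's overlay handling might look like:

```python
class OverlayStore:
    """A base country-wide map plus named, removable overlays of points
    of interest. Lookups check overlays before the base map, so a shop
    or city overlay can add detail the base map lacks."""
    def __init__(self, base):
        self.base = dict(base)      # poi name -> location
        self.overlays = {}          # overlay name -> {poi name: location}

    def add(self, name, pois):
        # eg. downloaded at home over the Internet, or via in-store wi-fi
        self.overlays[name] = dict(pois)

    def remove(self, name):
        # free memory once the trip is over
        self.overlays.pop(name, None)

    def lookup(self, poi):
        for pois in reversed(list(self.overlays.values())):
            if poi in pois:
                return pois[poi]
        return self.base.get(poi)

store = OverlayStore({"New Street Station": (52.477, -1.898)})
store.add("manchester-food", {"Curry Cafe": (53.48, -2.24)})
```

The key property is that overlays come and go without touching the base map, which matches the Manchester-trip scenario above: download before travelling, discard afterwards.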

Route planning

One of the major components of the system is its ability to guide the user to specified locations in two environments: outdoors using standard GPS, and indoors in theatres and shops etc. using a localised GPS system.

Clearly some form of route planning will be required as users can't be expected to cope with being told to "Go North" by the pressure actuators on the smart watch, if directly north of the user is a large building. Modern GPS units allow the system to plan a route around obstacles and there seems to be no reason we can't include this functionality into the Brain as it is designed to have the same sort of memory and processor capacity as the PDAs that these GPS systems work well in real-time on. However whilst this works fine for outdoor navigation at street level where this kind of local information is already known, when indoors the user will require much greater accuracy and knowledge of the local environment on the part of the system.

For example in a supermarket, the user may be told that the fruit and veg aisle is slightly to the left of straight-ahead (west-north-west). However due to the layouts of aisles in a supermarket, it's entirely plausible that the user would first have to turn right, reach the end of the aisle, turn left to the next aisle, and turn left again, effectively now walking in the opposite direction to reach the destination.

The system will need to be able to do this on a local level as well as coping with the ever-changing layouts inside a supermarket or theatre, such as item displays, trolley parks, refreshment stalls etc., so that users are not directed to walk straight into them and become confused.
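As an illustration of the kind of local planning needed, here is a minimal breadth-first search over a grid of walkable cells. This is our own stand-in for the idea, not a claim about how commercial GPS units actually plan routes:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest walking route on a grid where grid[y][x] is True for
    walkable floor and False for aisles, displays, trolley parks etc.
    Returns a list of (x, y) cells from start to goal, or None."""
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        x, y = path[-1]
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= ny < len(grid) and 0 <= nx < len(grid[0])
                    and grid[ny][nx] and (nx, ny) not in seen):
                seen.add((nx, ny))
                frontier.append(path + [(nx, ny)])
    return None   # destination unreachable

# An aisle (the False row) blocks the direct path, forcing the detour
# described in the fruit-and-veg example above.
shop = [[True, True, True],
        [False, False, True],
        [True, True, True]]
route = plan_route(shop, (0, 0), (0, 2))
```

When the shop's overlay changes (a new display appears), only the grid changes; replanning is just another search, which is how the system could cope with weekly layout shuffles.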

Now that I've outlined this need, my next post will describe a possible solution to this problem as well as facilitating a related extra level of functionality.

Friday, March 03, 2006

Device pairing

There is one thing we haven't talked about so far which is absolutely important - device pairing.

Many devices will pair with our system via the Brain. The Smart Watch and the Smart Headphone will almost always be paired with the Brain; beyond these there will be many other devices that can be paired with it, the most obvious of which is the supermarket scanner.

So what does all this mean? Well, first off, both the Smart Watch and Smart Headphone are wireless devices, so they need to communicate wirelessly with the Brain. The most obvious and simplest way to do this is with simple Radio Frequency (RF) over a short distance. However, if we used standard RF like that found in an analogue house phone we would quickly run into difficulties: RF interference, devices using the same frequencies/channels clashing with one another, and unprotected transmissions with no authentication or encryption that could let intruders into the system to satisfy their hunger for malice. Therefore a few features of the wireless communication system are essential:

*It must not interfere with other objects/devices

*It must have a unique identification system so that devices of the exact same type in close proximity, used by other users, do not interfere with one another (e.g. your Smart Watch doesn't start giving instructions to your friend's Brain)

*All transmissions must be encrypted, with the device ID as part of the authentication process, so that the system knows a message came from an already-authenticated paired device (and not some hacker), and the encryption stops messages falling into the wrong hands.

Two other important features of the pairing system are that the wireless communication needs to have low power consumption and a rather small radius around the user (on average between 1 and 3 metres), and of course the communication channel must be able to handle all communications effectively and efficiently.

To achieve all of this, enter Personal Area Networks (PAN) and more specifically, Bluetooth.

Bluetooth has all the features we need to achieve our pairing and wireless communication ideas. I shall explain all the bits that are useful to us here, and finish with the alterations we would need to make to a standard Bluetooth system to suit our needs.

The first useful part of Bluetooth for us is its power consumption and range. Being a PAN technology it is meant to be used over small radiuses and to consume little power, and Bluetooth achieves exactly this: a 1 metre radius can be achieved with just over a milliwatt of power, and roughly 2.5 milliwatts produces a 10 metre signal radius, which more than covers our needs.

As of Bluetooth version 1.2 there has been Adaptive Frequency-Hopping spread spectrum (AFH), which can be used to prevent two devices of the same type from interfering with one another (as mentioned above).

For security, Bluetooth uses the SAFER+ algorithm, which more than covers our needs.

Of course, on top of all this, Bluetooth allows device pairing, one of the most important features of our wireless communication system. However there would need to be differences between our 'Bluetooth' pairing and that of the current standard Bluetooth, as detailed below:

Our system would need to be able to work without the need for a passkey. This means that the only device which can have more than one slave is the Brain; all other devices that work with the Brain will have a maximum of one slave. To go along with this, slave devices will not be allowed to request or try to force pairing with the master (the Brain); only the Brain can make such requests of slave devices. The encryption process would also need to be changed to reflect how this part of our system works.

Further, there needs to be support for more than the seven active slave devices a standard Bluetooth piconet allows, as we cannot guarantee that seven slots are enough.

Finally there needs to be a two-tier device pairing system, similar to DHCP server software, in which the master keeps a two-tier device list. The first tier holds never-expiring entries for devices such as the Smart Watch and Smart Headphone; should these devices ever be replaced, pairing the newer device with the Brain replaces the older one in this section of the list. The second tier holds entries that can expire, typically after 30 minutes. This allows devices such as barcode scanners, which are owned by the supermarket and not the user, to be automatically de-paired from the Brain after the user has finished shopping, just in case the user forgets to do so.
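A minimal sketch of that two-tier list (the class, the explicit timestamps, and everything beyond the 30-minute lease itself are our own illustration):

```python
import time

class PairingTable:
    """Two-tier pairing list: permanent entries (Smart Watch, Smart
    Headphone) never expire and are replaced on re-pair; temporary
    entries (eg. a store's barcode reader) lapse after a lease,
    DHCP-style, in case the user forgets to de-pair."""
    LEASE = 30 * 60   # seconds

    def __init__(self):
        self.permanent = {}   # role (eg. "watch") -> device id
        self.temporary = {}   # device id -> expiry timestamp

    def pair_permanent(self, role, device_id):
        # Pairing a replacement watch overwrites the old entry.
        self.permanent[role] = device_id

    def pair_temporary(self, device_id, now=None):
        now = time.time() if now is None else now
        self.temporary[device_id] = now + self.LEASE

    def active(self, now=None):
        # Drop lapsed leases, then report every currently paired device.
        now = time.time() if now is None else now
        self.temporary = {d: t for d, t in self.temporary.items() if t > now}
        return set(self.permanent.values()) | set(self.temporary)
```

Passing timestamps in explicitly just makes the expiry behaviour easy to exercise; on the real device the clock would be read directly.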

If I think of anything more that the pairing needs to do, I'll make a second post as this one is already a mile long!

Thursday, March 02, 2006

Menu system

This is the prototype menu structure that will be accessed primarily using the rotating bezel (to scroll up and down menus) and the select/back buttons on the side of the watch.

Pressing select will open the main menu. As previously discussed, turning the bezel will scroll through menu options (with an audible and tactile 'click' feedback so the user knows how far they have scrolled). If the user pauses for more than half a second after scrolling and without pressing a button, the current menu item is spoken in the earpiece.
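The pause-before-speaking behaviour could be sketched like this (a toy model; the class and everything beyond the half-second pause are our own):

```python
class BezelMenu:
    """Scroll a menu with bezel 'clicks'; if no further input arrives
    within half a second, speak the currently selected item. Time is
    passed in as a plain number so the sketch needs no real clock."""
    PAUSE = 0.5

    def __init__(self, items, speak):
        self.items = items
        self.index = 0
        self.speak = speak
        self.last_input = None

    def click(self, direction, t):
        # One bezel notch: +1 clockwise, -1 anticlockwise, wrapping round.
        self.index = (self.index + direction) % len(self.items)
        self.last_input = t

    def tick(self, t):
        # Polled regularly: speak once the user has paused long enough.
        if self.last_input is not None and t - self.last_input >= self.PAUSE:
            self.speak(self.items[self.index])
            self.last_input = None
```

Speaking only after the pause means a user who spins the bezel quickly hears one announcement, not a stream of half-finished ones.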



Main Menu
|- 1.Next location (when outdoor mode selected) / Next item (if indoors)
|
|- 2a.Navigation (outdoor mode selected)
| |- Today's itinerary
| | |- (List of items)
| | |- Move task up
| | |- Move task down
| | |- Delete task
| |
| |- Where am I? (locational information, can tie in with Bham
| | wayfinder system or other similar systems)
| |- Find nearest...
| | |- Station (these must be supported in the GPS map
| | |- Bus stop but the user can download new overlays
| | |- Taxi rank for different cities or extra landmarks
| | |- Library using the PC software at home)
| | |- Supermarket, etc.
| |
| |- Breadcrumbs
| |- Remember current location
| |- (List of items)
| |- Move item up
| |- Move item down
| |- Delete item
|
|- 2b.Navigation (indoor mode selected)
| |
| |- Go to (this entire menu is dynamically created by instore
| | | server depending on services and items available)
| | |
| | |- Checkout
| | |- Exit
| | |- Customer services
| | |- Specific item
| | | |- Fruit and veg
| | | |- Hygiene products
| | | |- ...
| | | |- Seat number (if in theatre, etc)
| | |- Special offers
| | |- Places to rest
| | |- ...
| |
| |- Shopping lists
| | |- (List of named, pre-programmed shopping lists)
| | |- Shop for these items
| | |- Delete list
| |
| |- Pair a device
| |- (List of available help devices eg. barcode reader)
| |- Pair with this device
|
|- 3.Use indoor navigation / Use outdoor navigation
|
|- 4.Settings
|- Voice navigation on/off
|- Tactile navigation on/off
|- Choose shortcut button function