exercise five: sarah hates and loves…

THE OBJECT I HATE

this horrible mess is my residence’s “home” phone. the visibility of possible actions is mucked up, with speed dial buttons mashed together with ‘tips’ and ‘notices’. the icons are small, pixelated, and mostly hard-to-make-out images that confuse rather than help at first glance. there is a message-waiting light that blinks when there are messages in your voicemail, which is great visibility; however, there is no information anywhere on how to actually access that voicemail, so the light has been blinking since september. no one else in my unit understands it either.

semantic mapping is used in that each button has a label, but text and images mix together in places, which just makes things more irritating and confusing. there are also buttons for things that aren’t necessary, like pita pit and primus. those bottom five buttons hog space that could be used to more adequately space out the buttons and text that actually matter.

you get the tactile feedback of pushing buttons and the sound of the keys being pressed, but no visual feedback is provided (aside from the voicemail indicator). the old phone-in-a-cradle model affords picking the phone up to use it and placing it back down when you are finished: a simple, tried-and-true method. the physical constraint of the phone cord means you can only walk so far from the base before the base crashes down onto the floor in an attempt to follow. all in all, it is an ugly, annoying piece of junk.

THE OBJECT I LOVE

the object i love would have to be my ipod. it’s sleek and simple. with so few buttons on the face of the product, there is little to figure out: touch the wheel and the ipod turns on, or if it doesn’t, toggle the hold switch. from there it’s smooth sailing: the menu button brings you to a menu with easy-to-navigate sections.

feedback is provided through visual cues like the screen changing as you scroll, aural cues like the clicking as you move through a list, and the tactile feedback of pushing the buttons, all of which correspond with one another.

there are visual constraints in scrolling through the album covers or lists, a physical constraint in the rotary motion of the scroll wheel, and semantic constraints in the play/next/back buttons. everything works in harmony, and i can operate it while it’s in my pocket, no eyes required.