Attempted live blogging. Notes will be rough until I have a chance to clean them up later.
James Landay, an associate professor of computer science at the University of Washington, gave a talk today on “Digital Simplicity through Activity-Based Computing.” Landay is also a past director of Intel Research Seattle.
Design Use and Build of Interactive Systems (DUB) – left a tenured position at Berkeley to be a part of this group
Intel Research – mission: develop & evaluate new usage models, applications & underlying technology for ubiquitous computing … 15 full-time researchers (growing to 20), a community of 35 including interns, visitors & campus collaborators … collaborate with other exploratory labs (Berkeley, Pittsburgh, Hillsboro PaPR)
tech complicates our lives – rejection: half of returns due to “complexity” of device … avoidance: turning off devices to reduce intrusion … underutilization: devices do not work together … complexity of computers may become the biggest contributor to the digital divide (Wall Street Journal, 6/13/05)
activity-based computing simplifies lives – social, natural interactions, always at hand
video prototype of possible activity-based computing devices … alert status photo frame and PDA/cell phone message interface, television phone, sensor bracelet, object-based sensors trigger calls
tested in homes of elders’ children, Wizard of Oz usability
Activity Theory (early 20th-century Russian psychology) – used in the CSCW community, not well understood
* high-level activity – “let mom live a healthy & independent life” … “mom eats regularly” … “mom exercises regularly” – tool-subject-object-community-rules-division of labor
* actions – eat breakfast … eat lunch … biking stretch
* operations – get on bike … pedal
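The activity/action/operation hierarchy above maps naturally onto a small tree structure. A minimal sketch in Python, using the talk's "mom eats regularly" example (the class and field names here are my own, not from the talk):

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """A concrete action (e.g. "eat breakfast") made of low-level operations."""
    name: str
    operations: list[str] = field(default_factory=list)

@dataclass
class Activity:
    """A high-level activity tied to a goal, composed of actions."""
    goal: str
    actions: list[Action] = field(default_factory=list)

eating = Activity(
    goal="mom eats regularly",
    actions=[
        Action("eat breakfast", ["get bowl", "pour cereal"]),
        Action("eat lunch"),
    ],
)

# Walk the hierarchy: activity -> actions -> operations
for action in eating.actions:
    print(action.name, action.operations)
```

The point of the hierarchy is that sensors observe operations, systems infer actions, and people care about the high-level activity.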
challenges:
* physical actions are tedious to record & manage (people tend to start well, but stop doing it) – can build applications using action inference
* social relationships are complex & delicate – use social inference to inform human actions (ask if notification is desired)
* natural interactions are ambiguous – improve disambiguation using dynamic context
* must study in situ over extended periods – use new tools to improve data collection/analysis
Examples
UbiFIT (ubiquitous fitness-influencing technologies) – cross-cutting app using action inference
* problem: overweight and obesity is a global epidemic (1 billion+, $100B+ cost in the US), busy people
* challenges: fitness is long-term activity, don’t be annoying, credit for everything they do (biking, running, difficulty not figured into pedometer steps), use social support without violating norms
* iteration: automatic capture of common physical actions (Mobile Sensing Platform with 7 sensors), self-awareness using natural & familiar interaction (ambient garden, smart gym), in situ studies to track long-term behavior changes
* feedback: grass populated with flowers over the course of the week, get a butterfly if you meet your weekly goal (no negative, like grass wilting)
* testing: pilot and 3-month study with 20 participants
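The garden feedback described above is easy to sketch: flowers accumulate with logged activities, a butterfly appears when the weekly goal is met, and there is deliberately no negative state. A hedged sketch (the function name, goal threshold, and return shape are my assumptions, not UbiFit's actual design):

```python
def garden_state(activities_this_week, weekly_goal=5):
    """UbiFit-style ambient feedback: one flower per logged activity,
    a butterfly once the weekly goal is reached, and never any
    negative imagery (the grass never wilts)."""
    flowers = len(activities_this_week)
    butterfly = flowers >= weekly_goal
    return {"flowers": flowers, "butterfly": butterfly}

print(garden_state(["bike", "run", "walk"]))  # below the weekly goal
print(garden_state(["bike"] * 5))             # goal met, butterfly appears
```

Note how the design only ever adds to the display, matching the "don't be annoying / no negative feedback" constraint from the challenges list.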
What people use is the key to recognizing many actions
* reject immature visual recognition technology in favor of RFID (cheap, widely used)
* detected with near-field reader in bracelet – code temporal order of objects as activity
* without bracelet – auto-wisp??
* testing: early trials showed 85% inference accuracy
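"Code temporal order of objects as activity" suggests matching an observed sequence of RFID-tagged objects against per-activity object sequences. A toy sketch of that idea, assuming hand-written templates (the real system used learned models, which is where the 85% accuracy figure comes from):

```python
# Hypothetical templates: the ordered objects that characterize an activity.
TEMPLATES = {
    "make tea": ["kettle", "mug", "teabag"],
    "brush teeth": ["toothbrush", "toothpaste"],
}

def infer_activity(touched_objects):
    """Score each activity by how much of its object template appears,
    in order, within the observed sequence of touched objects."""
    def ordered_overlap(template, observed):
        i = 0
        for obj in observed:
            if i < len(template) and obj == template[i]:
                i += 1
        return i / len(template)

    scores = {act: ordered_overlap(tpl, touched_objects)
              for act, tpl in TEMPLATES.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

print(infer_activity(["kettle", "spoon", "mug", "teabag"]))
```

Extraneous objects (the spoon above) don't hurt the match, which is why object-touch sequences tolerate noisy real-world behavior.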
inferring physical activity with mobile sensing platform
* automatically track physical activity throughout the day – many sensors and a processor in a pedometer-sized box
Tools for study:
* MyExperience (context-triggered ESM tool)
* Activity Prism
* Activity Designer (video prototype) – visual interface designer, storyboard transitions, visual language – add data as scenes, with video/images to describe – a prototyping tool, using components to create function, for testing on the computer