Shape-changing materials development strives to generate motion in new ways, extending the traditional motor-based and mechanism-based techniques. We present a system that uses continuous magnetic force to create movement in a range of soft materials such as textile, foam, paper, or silicone. The system has two parts: a control platform and a multi-material layer. The control platform is an array of electromagnets controlled by a microcontroller. The multi-material layer is made of a soft material with embedded ferromagnetic elements. The subtle electromagnetic force manipulates the embedded ferromagnetic material, resulting in continuous, organic-like movement in the material layer. We hope our system can empower designers to generate expressive movement with a broad range of materials. In our own work, we aim to extend our Empathy Objects research with soft materials, creating physical objects that convey emotion through expressive movement.
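The control platform described above can be illustrated with a short simulation sketch. This is purely a hypothetical illustration, not the authors' firmware: it assumes an array of PWM-driven electromagnets, and the function name, coil count, and period are invented for the example. Phase-shifting each coil's duty cycle sweeps the point of strongest attraction across the array, one plausible way to produce the continuous, wave-like motion in the ferromagnetic layer.

```python
# Hypothetical sketch (not the authors' implementation): phase-shifted
# PWM duty cycles for an N-coil electromagnet array, so the peak of
# attraction travels smoothly from coil 0 to coil N-1 once per period.
import math

def coil_duty_cycles(n_coils, t, period=2.0):
    """Return a 0..1 PWM duty cycle per coil at time t (seconds).

    Each coil's strength follows a raised cosine, phase-shifted so the
    peak sweeps across the array once every `period` seconds.
    """
    duties = []
    for i in range(n_coils):
        phase = 2 * math.pi * (t / period - i / n_coils)
        duties.append(0.5 * (1 + math.cos(phase)))  # peak when phase == 0
    return duties

# At t = 0 the peak sits on coil 0; half a period later it has moved
# to the middle of a 4-coil array.
duties_now = coil_duty_cycles(4, 0.0)
```

On a real microcontroller these values would be written to the PWM outputs driving each coil on every timer tick; the simulation only shows the intended duty-cycle pattern.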
With domestic technology on the rise, the quantity and complexity of smart-home devices are becoming an important interaction design challenge. We present a novel design for a home control interface in the form of a social robot, commanded via tangible icons and giving feedback through expressive gestures. We experimentally compare the robot to three common smart-home interfaces: a voice-control loudspeaker; a wall-mounted touchscreen; and a mobile application. Our findings suggest that interfaces that rate higher on flow rate lower on usability, and vice versa. Participants’ sense of control is highest with familiar interfaces, and lowest with voice control. Situation awareness is highest with the robot, and again lowest with voice control. These findings raise questions about voice control as a smart-home interface, and suggest that embodied social robots could provide an engaging interface with high situation awareness, though their usability remains a considerable design challenge.
Haptic interfaces are ideal in situations where visual or auditory attention is impossible, unsafe, or socially unacceptable. However, conventional (vibrotactile) wearable interfaces often possess limited bandwidth for expressing information. We explore a novel form of tactile stimulation through brushing, and demonstrate BrushTouch, a wearable prototype for brushing haptics. We also present schemes for conveying information such as time and direction through multi-tactor wrist-worn haptic interfaces. To evaluate BrushTouch, we ran two user studies comparing it to a conventional vibrotactile wristband across a number of tasks in both lab and mobile conditions. We show that for certain cues brushing can be more accurately recognized than vibration, enabling more effective spatial schemes for presenting information through haptic means. We then show that BrushTouch is capable of greater information transfer using such cues. We believe that brushing, like other non-vibrotactile haptic techniques, merits further investigation as a potential vehicle for richer haptic feedback.
This paper presents CataKit, a construction kit for children inspired by catapults, Rube Goldberg chain-reaction machines, and mechanical automata. We set out to promote children’s initiative, positive risk-taking, and procedural thinking, all in the context of their bedrooms. Our motivation is to contrast the rising smart-home movement in industry, which we fear may decrease children’s initiative if children’s bedrooms become too automated. We describe our design research process with six children, followed by a prototype implementation and evaluation. We present a qualitative analysis of children’s reactions and experiences with the prototype, and show that playing with CataKit encouraged children’s systematic exploration of mechanical concepts, initiative, and positive risk-taking. We hope that construction kits like CataKit will empower kids to develop curiosity about the mechanical world around them, to think about risk-taking as a potentially positive experience, and to think more critically about initiative in the smart-home era.
As drones become ubiquitous, it is important to understand how cultural differences impact human-drone interaction. A previous elicitation study performed in the United States illustrated how users would intuitively interact with drones. We replicated this study in China to gain insight into how these user-defined interactions vary across the two cultures. We found that, as in the US study, Chinese participants chose to interact primarily using gesture. However, Chinese participants used multi-modal interactions more than their US counterparts. Agreement for many proposed interactions was high within each culture. Across cultures, there were notable differences despite similarities in interaction modality preferences. For instance, culturally specific gestures emerged in China, such as a T-shape gesture for stopping the drone. Participants from both cultures anthropomorphized the drone and welcomed it into their personal space. We describe the implications of these findings for designing culturally aware and intuitive human-drone interaction.
Making activities for children often take place in informal learning environments. In this context, parents may join their children for a co-making activity. It has been shown that this type of activity can be facilitated by educators who serve as mentors. In this paper we explore parent-child interaction in the context of a co-making activity at home. Toward that end, we developed a dedicated kit that couples automata-building with paper circuits. We also designed five activity cards as scaffolding for parents, to raise their awareness of mentoring principles. We present our design process, evaluation, and findings from eight parent-child co-making activities. Our qualitative analysis indicates the challenges and opportunities for parents as mentors in a co-making activity. We propose a two-dimensional scale that can help designers and makerspace practitioners better understand the different parental roles during a parent-child co-making activity, and the need for better tools and support materials for parents in that context.
We present the design process of Scratch Nodes, a sensor-based prototype designed to enable kids to augment physical outdoor play in a creative and meaningful way. Scratch Nodes was designed for kids aged 8-12, with the goal of encouraging physical play with special emphasis on social interaction and coding. Our contribution is a new prototype that extends prior work by combining three design principles: outdoor play focused on Heads-up Games (HUG), co-located social interaction, and changing the rules through coding. We argue that the combination of these three principles strikes the right balance between kids’ intrinsic motivation (play & measure, collaborate & compete, define their own rules) and contemporary social/cultural needs (decrease screen time, increase physical activity, increase creative/systematic exploration). We present
our iterative design & implementation process and insights generated from qualitative analysis of an evaluation with 20 kids who tested the prototype at various stages.
The design of tangible and embedded assistive technologies poses unique challenges. We describe the challenges we encountered during the design of “DataSpoon”, explain how we overcame them, and suggest design guidelines. DataSpoon is an instrumented spoon that monitors movement kinematics during self-feeding. Children with motor disorders often have difficulty mastering self-feeding. To treat them effectively, professional caregivers need to assess their movement kinematics. Currently, assessment is performed through observations and questionnaires. DataSpoon adds sensor-based data to this process. A validation study showed that data obtained from DataSpoon and from a six-camera 3D motion capture system were similar. Our experience yielded three design guidelines: the needs of both caregivers and children should be considered; distractions from direct caregiver-child interaction should be minimized; and familiar-looking devices may alleviate concerns associated with unfamiliar technology.
We present the design and initial evaluation of Kip3, a social robotic device for students with ADHD that provides immediate feedback on inattention or impulsivity events. We designed a research platform comprising a tablet-based Continuous Performance Test (CPT) used to assess inattention and impulsivity, and a socially expressive robotic device (Kip3) that provides feedback. We evaluated our platform with 10 students with ADHD in a within-subject user study. Nine out of 10 participants felt that Kip3 helped them regain focus, but wondered whether it would be effective over time and how it would identify inattention in more complex situations outside the lab.
We describe the design process of “Vyo”, a personal assistant serving as a centralized interface for smart-home devices. Building on the concepts of ubiquitous and engaging computing in the domestic environment, we identified five design goals for the home robot: engaging, unobtrusive, device-like, respectful, and reassuring. These goals led our design process, which included simultaneous iterative development of the robot’s morphology, nonverbal behavior, and interaction schemas. We continued with user-centered design research using puppet prototypes of the robot to assess and refine our design choices. The resulting robot, Vyo, straddles the boundary between a monitoring device and a socially expressive agent, and presents a number of novel design outcomes: the combination of TUI “phicons” with social robotics; gesture-related screen exposure; and a non-anthropomorphic monocular expressive face. We discuss how our design goals are expressed in the elements of the robot’s final design.
Many children with cerebral palsy (CP) encounter great difficulties mastering self-feeding. We set out to assess the self-feeding skills of young children with CP via a novel instrumented spoon that monitors upper extremity biomechanics involved in eating. We describe the initial stages of an iterative design process, consisting of a focus group with domain experts, and rapid-prototyping. We discuss the physical, assessment and safety requirements for the spoon. In addition, we explain the potential of tangible interfaces to provide professional caregivers with valuable information regarding each child.
We present the design, implementation, and evaluation of a peripheral empathy-evoking robotic conversation companion, Kip1. The robot’s function is to increase people’s awareness of the effect of their behavior on others, potentially leading to behavior change. Specifically, Kip1 is designed to promote nonaggressive conversation between people. It monitors the conversation’s nonverbal aspects and maintains an emotional model of its reaction to the conversation. If the conversation seems calm, Kip1 responds with a gesture designed to communicate curious interest. If the conversation seems aggressive, Kip1 responds with a gesture designed to communicate fear. We describe the design process of Kip1, guided by the principles of peripheral and evocative design. We detail its hardware and software systems, and a study evaluating the effects of the robot’s autonomous behavior on couples’ conversations. We find support for our design goals. A conversation companion reacting to the conversation led to more gaze attention, but not more verbal distraction, compared to a robot that moves but does not react to the conversation. This suggests that robotic devices could be designed as companions to human-human interaction without compromising the natural communication flow between people. Participants also rated the reacting robot as having significantly more social human character traits and as being significantly more similar to themselves. This points to the robot’s potential to elicit people’s empathy.
Gamification of fitness applications opens the door to cheating by exploiting inherent limitations of sensing, in order to advance in the game without performing the required physical activity. While this type of behavior is usually conceptualized negatively, we propose it could actually be beneficial for encouraging physical activity. We integrate prior work on cheating in online games with prior work on embracing non-normative behavior, and suggest design opportunities for embracing cheating in gamified fitness applications.
We present the notion of Empathy Objects, ambient robotic devices accompanying human-human interaction. Empathy Objects respond to human behavior with physical gestures as nonverbal expressions of their “emotional states”. The goal is to increase people’s awareness of the emotional state of others, leading to behavior change. We demonstrate an Empathy Object prototype, Kip1, a conversation companion designed to promote non-aggressive conversation between people.
Children with Attention Deficit and Hyperactivity Disorder (ADHD) experience a deficit in cognitive processes responsible for goal-directed behaviors, known as executive functioning (EF). In an effort to assist them, we developed TangiPlan – a prototype of a tangible assistive technology intended to improve EF during morning routines. TangiPlan was designed based on the following guidelines: implement intervention techniques recommended by experts; reduce conflicts with caregivers; avoid intrusion; support flexibility and autonomy. These design guidelines were implemented in a prototype consisting of six tangible objects, each representing a task that needs to be completed during a child’s morning routine, and a tablet application for planning tasks and matching them with objects. An initial evaluation of the prototype with two case studies resulted in improved organization and time management, increased satisfaction, and fewer conflicts with parents during morning routines.
College students in the social sciences are required to learn quantitative research methods and statistics. Unfortunately, many fail to see the relevance of these courses, and are often anxious about them. In an effort to increase students’ engagement in the research process, we developed Ruzo – a mobile scientific inquiry platform. Ruzo enables instructors and students to create research projects as custom mobile apps, collect data on the go, and visualize the data using a web-based interactive tool. Ruzo was designed based on five guidelines, derived from interviews with domain experts: guide students through all stages of research; reduce anxiety; encourage active learning; connect to students’ everyday lives; and adapt the system to the needs of the instructor. A user study showed that Ruzo was easy to use, and students expressed interest in research, thereby demonstrating the potential of mobile technology to scaffold scientific inquiry.
We present Objects for Change (OFC), a set of design considerations based on established behavior change techniques that can serve designers of Tangible User Interfaces (TUI). We highlight empirical findings from behavior change literature, and show how to apply them to inherent TUI properties: (1) visibility and persistency, (2) locality, (3) tangible representation, and (4) affordances. We demonstrate how we applied OFC in the design of a TUI prototype aimed to promote behavior change in planning and organization tasks among youth diagnosed with ADHD.
Game design elements are often implemented in persuasive systems aimed to promote physical activity, a process called “gamification.” Gamification is believed to motivate users to become more active, and is commonly implemented in commercial products. However, relatively few studies have rigorously evaluated the effectiveness of gamification, and those that have yielded contradictory findings. We set out to evaluate the effectiveness of virtual rewards and social comparison, two game elements prevalent in persuasive systems. We developed a research prototype, called “StepByStep,” aimed to promote routine walking. We created different versions of StepByStep, implemented as an application on Android-based mobile devices, and compared their effectiveness in two field studies. Study 1 showed that a quantified version of the application, offering continuous measurement of walking time, a daily goal, and real-time feedback on progress toward this goal, facilitated reflection on activity and significantly increased walking time over the baseline level. Study 2 showed that gamified versions offering virtual rewards and social comparison were only as effective as the quantified version. Thus, we advise designers to facilitate reflection on meaningful aspects of physical activity by developing novel ubiquitous measures. Furthermore, our findings highlight the importance of systematic comparisons between quantified and gamified elements for better understanding their motivational affordances.
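The quantified version's core loop (accumulating walking time and comparing it to a daily goal for real-time feedback) can be sketched in a few lines. This is a hypothetical illustration, not the StepByStep source; the function name, goal value, and message format are assumptions made for the example:

```python
# Hypothetical sketch of a quantified walking-goal feedback loop:
# sum measured walking bouts, compare to a daily goal, and produce
# a progress message for real-time feedback.

def progress_toward_goal(walk_bouts_min, daily_goal_min=30):
    """Sum walking bouts (minutes) and return (total, fraction of goal, capped at 1.0)."""
    total = sum(walk_bouts_min)
    return total, min(total / daily_goal_min, 1.0)

# Three walking bouts measured so far today: 5, 12, and 4 minutes.
total, frac = progress_toward_goal([5, 12, 4], daily_goal_min=30)
message = f"{total} of 30 minutes walked ({frac:.0%} of today's goal)"
```

In a deployed application, the bout durations would come from continuous activity sensing on the phone, and the message (or a progress bar) would update as new bouts are detected.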
Typically developing children usually master self-feeding by the age of three. However, children with cerebral palsy and other developmental disabilities encounter great difficulties acquiring this instrumental ability. In an effort to motivate young eaters in the process of acquiring self-feeding abilities, we set out to develop ExciteTray – a customized self-feeding assistive technology. We describe the initial stages of an iterative design process consisting of interviews with domain experts, rapid prototyping, and evaluations with children. Based on our findings, we formulated preliminary design principles for a self-feeding assistive technology: draw attention without causing distraction; motivate the child during the various stages of self-feeding; facilitate face-to-face interaction between caregiver and child; and adapt feedback to the cognitive and motor ability of each child. We explain how these principles were implemented in a prototype, discuss safety considerations, and describe future work.
Long car rides can become a source of boredom for children, consequently causing tension inside the car. Common solutions to boredom include entertainment devices suitable for in-car use. Such devices often disengage children from other family members inside the car, as well as from the outside world. We set out to create a novel in-car game that connects children with their family and their environment, rather than only with their entertainment devices. The game, called Mileys, integrates location-based information, augmented reality, and virtual characters. We developed Mileys in an iterative process: findings from the first round of prototyping and evaluation guided the design of a second-generation prototype and led to additional evaluations. In this paper we discuss lessons learned during the development and evaluation of Mileys, present challenges for location-based in-car game design, and suggest potential solutions for promoting interactions inside and outside the car.
Children with Attention Deficit and Hyperactivity Disorder (ADHD) experience a deficit in cognitive processes responsible for purposeful goal-directed behaviors, known as executive functioning (EF). In an effort to improve EF, we are developing TangiPlan – a set of tangible connected objects that represent tasks children perform during their morning routine. We describe the initial stages of a user-centered design process, consisting of interviews with both domain experts and potential users, followed by paper prototyping. Based on our findings, we formulated preliminary design principles for EF assistive technology: facilitate organization, time management, and planning; involve caregivers in the process, but strive to reduce conflict; implement intervention techniques suggested by experts; avoid distraction by mobile phones; and avoid intrusion. We discuss the benefits of implementing these principles with a tangible interface, present our prototype design, and describe future directions.
Family car rides can become a source of boredom for child passengers, and consequently cause tension inside the car. In an attempt to overcome this problem, we developed Mileys – a novel in-car game that integrates location-based information, augmented reality, and virtual characters. It aims to make car rides more interesting for child passengers, strengthen the bond between family members, encourage safe and ecological driving, and connect children with their environment instead of their entertainment devices. We evaluated Mileys in a six-week field study, which revealed differences between children and parents regarding their desired in-car experience. Children wish to play enjoyable games, whereas parents view car rides as an opportunity for strengthening the bond between family members and for educating their children. Based on our findings, we identify five key challenges for in-car game design for children: different expectations of parents and children, undesired detachment, short interaction span, poor GPS reception, and motion sickness.
Tangible user interfaces (TUIs) are often compared to graphical user interfaces (GUIs). However, the existing literature is unable to demonstrate clear advantages for either interface, as empirical studies yielded different findings, sometimes even contradicting ones. The current study set out to conduct an in-depth analysis of the strengths and weaknesses of both interfaces, based on a comparison between similar TUI and GUI versions of a modeling and simulation system called “FlowBlocks”. Results showed most users preferred the TUI version over the GUI version. This is a surprising finding, considering both versions were equivalent in regard to most performance parameters, and the TUI version was even perceived as inferior to the GUI version in regard to usability. Interviews with users revealed this preference stemmed from high levels of stimulation and enjoyment, derived from three TUI properties: physical interaction, rich feedback, and high levels of realism. Potential underlying mechanisms for these findings and practical implications are discussed.