
Explorascope: stimulation of language and communicative skills of multiple-handicapped children through an interactive, adaptive educational toy

C. Hummels¹, A. van der Helm², B. Hengeveld², R. Luxen², R. Voort³, H. van Balkom³ and J. de Moor⁴

¹Technische Universiteit Eindhoven, ²Delft University of Technology, ³Viataal-Research, Development & Support (RDS), ⁴Radboud University Nijmegen, the Netherlands

C.C.M.Hummels@tue.nl, A.J.C.vanderHelm@tudelft.nl, B.J.Hengeveld@tudelft.nl, R.F.Luxen@tudelft.nl, R.Voort@viataal.nl, H.vBalkom@viataal.nl, J.deMoor@ped.kun.nl

Abstract

Very young non- or hardly speaking children with severe disabilities need active guidance to stimulate interaction with their environment in order to develop their communicative and linguistic skills. Augmentative and Alternative Communication (AAC) systems can help this process, provided that they are tuned to this specific user group. LinguaBytes is a research programme, which aims at developing an interactive and adaptive educational toy that stimulates the language and communicative skills of multiple-handicapped children with a developmental age of 1 – 4 years. In this article we show which guidelines we consider essential for developing this tool. We have developed several concepts based on these guidelines, of which we elucidate Explorascope (E-scope). E-scope consists of a tangible toy-like interface that is adaptable to an individual child with respect to his or her cognitive, linguistic, emotional and perceptual-motor skills. A first user test shows that E-scope is promising and useful for this user group.

Keywords: tangible interaction, computer assisted learning, toy design, multi-handicapped children, AAC

1. Introduction

Problems in the development of language and communicative skills can have grave repercussions for the psychological development of young children, especially with respect to social and emotional maturation and the ability to be self-supporting. According to Heim (2001), early parent-child interaction and the communicative development during the first years of a child's life lay the foundations of language acquisition. In the event of severe disabilities, this interaction does not start or progress normally, which further delays communicative and linguistic development. It is therefore of great importance to enhance and optimise the interaction between parents and non-speaking children and to stimulate communication and language use.

Several tools use multimedia techniques to support multi-handicapped children and train their cognitive, perceptual-motor, language and/or communicative skills, such as BioBytes (Voort et al. 1995), IntelliKeys (IntelliTools) and Tom (Platus Learning Systems). Despite their success, these and other augmentative and alternative communication (AAC) systems are not tailored to very young non- or hardly speaking children with severely impaired motor and cognitive skills. The cognitive load of these systems is high, i.e. most AAC devices are not organised in ways that reflect how young children think (Shook & Coker 2006). Moreover, the current generation of AAC systems is not particularly appealing to young children in comparison with toys (Light & Drager 2004). They resemble PCs in structure (menus and decision trees), input (mostly button-like) and output (often screen-based display) (Hummels, Overbeeke & Klooster 2006). Despite the useful endeavour to develop a variety of special input devices, one could wonder why young children are placed behind a desktop computer, which was originally designed for office work. Moreover, PCs are typically designed for exclusive use, i.e. one person sitting behind the screen with input devices, which is far from ideal for enhancing interaction between parent and child. In addition, these systems do not capitalise on current technology and multimedia possibilities (Shook & Coker 2006; van Balkom, de Moor & Voort 2002; Hummels, Overbeeke & Klooster 2006), aspects that could enhance adaptation to an individual child and offer the possibility to use a variety of strategies to improve language and communicative skills.

A study by van Balkom, de Moor & Voort (2002) shows the necessity of an interactive training system for this specific user group. The study resulted in a three-year research programme called LinguaBytes¹, which aims at developing an interactive and adaptive educational toy that stimulates the language and communicative skills of multiple-handicapped children with a developmental age of 1-4 years.


show what kind of educational toys we are aiming for. Next, we clarify what E-scope is and does, how it is used and how therapists and children perceived it during a user study.

2. Guidelines

After an extensive literature study, we formulated eight main guidelines for the development of our new educational toy:

• Learn: Our goal is to enable children to improve their language and communication skills. We try to achieve this by providing them with opportunities to experience and learn the meaning of new lexical concepts, and with possibilities to practise in various ways and at different linguistic levels, matched to their personal level.

• Adaptability & personalisation: Given the diversity of children with respect to their cognitive, linguistic and perceptual-motor skills and limitations, and their needs and interests, it is beneficial to create adaptive and personalised products. Products that resonate with these young children optimise the learning setting and avoid frustration.

• Technology: Technological developments such as miniaturisation, embedded intelligence, sensor technology and rich media offer a new scope of possibilities for innovative adaptive designs.

• Challenge: This key element of motivation engages children by stimulating them to reach for the boundaries of their skills and to take initiative for interaction (Csikszentmihalyi 1990). We aim to challenge multi-handicapped children to capitalise not only on their cognitive and linguistic skills, but also on their perceptual-motor skills.

• Playful, tangible interaction: Since we are physical beings, we learn to interact with the world through our physical skills. Therefore, especially with young children, who explore the world through play, we focus on playful, tangible interaction, in accordance with current trends in toys.

• Independence: Feeling independent is an essential part of children's motivation while learning. Independence gives a feeling of being in control, which enhances the satisfaction of reaching goals and builds self-confidence.

• Social interaction and monitoring: We aim at an educational toy that evokes communicative and social interaction between children and parents, therapists and other children. Moreover, recording and displaying a child's performance data in a user model enables therapists and parents to gain a clear understanding of that child's progress and to adjust their strategy accordingly, where the adaptive system has not already done so automatically.

• Frame of reference: We want to base our design on the children's way of living and the way they perceive their environment, in order to inspire them and stimulate their imagination and curiosity.

3. Our approach

Based on these guidelines, we are designing several concepts through a research-through-design approach, i.e. we generate scientific knowledge through the act of designing and subsequently testing experiential prototypes in real-life settings (Pasman, Stappers, Hekkert & Keyson 2005). In this case we generate knowledge about novel, adaptive, tangible AAC systems for improving language and communication skills, which should result in a theoretically and experientially sound product.

The iterative design process of LinguaBytes consists mainly of two phases: conceptualisation and specification. During the conceptualisation phase, we explore the scope of the new tangible tool by building and testing a comprehensive set of concepts/prototypes. The most appropriate concept will be further developed, completed, built and extensively tested during the specification phase. The first conceptualisation round of LinguaBytes started in 2004 with E-scope, which will be further explained in this paper. We have just finished the second round of the conceptualisation phase, which consisted of building and testing four simple concepts based on interactive books, prints and tangible objects. We are currently in the third round of this phase, in which we are building and testing interactive tangible sketches of educational toys (see Figure 1). This phase will end with extensive user testing of a working prototype of the final concept. It goes beyond the scope of this paper to elaborate on all rounds; we focus only on the development of E-scope.

4. E-scope: an adaptive tangible controller

E-scope is a ring-shaped tangible controller that a child operates by turning the upper ring and pushing buttons. The prototype of E-scope consists of a wooden ring-shaped toy with sensors and actuators, a computer with a wireless station and a screen (see Figure 2).

Figure 1 Left: InteractBook plays a sound file after turning the page (short story/sentence) or touching a PCS symbol (related word); Middle: PictoCatcher is an interactive marble course where each marble with an enclosed PCS symbol is related to a story and triggers a movie when entering the wooden box; Right: ObjectSlider starts or alters an animation by placing the related tangible objects next to the screen.

Figure 2 E-scope consists of a wooden toy, a computer with a wireless station and an optional separate monitor (left). The upper and lower ring of E-scope communicate with the computer through radio transceivers (right). All sensors, actuators and batteries are built into the layers of E-scope.

E-scope is adaptable to a child and can be used in different configurations (see Figure 3). A child can listen to stories or play educational games by rolling E-scope over pictures lying on the floor. Every picture triggers a corresponding story; the buttons can be used to deepen the concept further. E-scope can also be used at the table. By turning the upper ring and pushing the buttons on the wooden ring, a child can interact with stories shown on an integrated or a separate screen, depending on the ergonomic and social requirements. If necessary, E-scope can also be attached to alternative input devices, e.g. a single button or eye-movement interaction. In this last case, the upper ring is rotated by a motor.

The different configurations require different levels of motor skills and have different social impact. Moving E-scope over the floor is suitable for children with gross motor difficulties who can still move their entire body by crawling and gliding. It is a very playful and informal way of interacting, especially when the parent or therapist is also sitting on the floor. Children with gross motor difficulties who have less control over their entire body can use the table set-up and push an additional big button. Turning the upper ring requires more control and coordination between both hands, and pushing buttons requires fairly fine motor skills of at least one hand. Besides its ergonomic aspects, the position of the screen also has clear social implications, because it determines how the parent or therapist and the child are positioned in relation to each other.


Figure 3 E-scope configurations: using pictures (upper left); integrated LCD screen (lower left); with separate screen (centre); or with a variety of input devices making it possible for a severely motor-handicapped person to operate it (right).

Figure 4. E-scope can be used with or without symbols

5. Multi-media content

The stories that E-scope offers aim to be rich and engaging. They therefore use a variety of visual and auditory output, such as photos, drawings, movies, symbols, spoken stories, sounds, songs and written text. The graphical style aims to be realistic enough for optimal recognition of the concepts to be learned, but with enough freedom to stimulate the imagination of the children (Figure 5). A small user study with two children indicated a preference for photos over drawings.

For children who have problems with attachment and recognition, such as autistic children, child-related photos can be imported and voice and sound recordings can be made with E-scope. Therapists and parents can also use this option to focus on a specific or topical subject, such as telling the story of the day with pictures of the child's own environment, or the birthday of a family member.


This brings the user to the next menu levels, which can be browsed through in the same way. By turning the ring to the right, one goes deeper into the menu, and by turning to the left one goes back up again.

Figure 5. Different graphical styles: photos and drawings

Figure 6 By turning E-scope’s upper ring, one moves through the different menu screens
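The ring-based menu navigation described above (turning right descends a level, turning left ascends) can be sketched as a cursor over a menu tree. This is our own minimal illustration, not the actual E-scope software; sibling selection within a level is omitted, and all class names are hypothetical:

```python
class MenuNode:
    """A node in a hierarchical menu of stories and games (hypothetical model)."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []
        self.parent = None
        for child in self.children:
            child.parent = self  # link children back to this menu level

class RingNavigator:
    """Maps ring rotation onto menu traversal: right = deeper, left = back up."""
    def __init__(self, root):
        self.current = root

    def turn_right(self):
        # Descend into the first child; at a leaf, stay put
        if self.current.children:
            self.current = self.current.children[0]
        return self.current.label

    def turn_left(self):
        # Ascend to the parent menu level; at the top, stay put
        if self.current.parent:
            self.current = self.current.parent
        return self.current.label
```

In this sketch the same two gestures suffice at every depth, which matches the paper's point that all menu levels can be browsed in the same way.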

6. E-scope prototype

To develop a working prototype of E-scope, we designed a software architecture with three subsystems, briefly discussed in the next sections and illustrated in Figure 7.

6.1 The content store

The content store is a collection of stories and games. The scripts for the narratives are developed on the basis of literature on linguistic development in the first years of a child's life (van Balkom, de Moor & Voort 2002). From these scripts, all elementary content material, such as photos, movies, drawings and spoken text, is created and imported into the content store.

The next step involves linking these elements into narrative structures, like arranging the pages of a story. Thereupon the conditions for page transitions are defined; these transitions are coupled to specific actions on the tangible controller, e.g. a button press. The conditions can also be seen as educational targets: the reward for pressing the right button is getting to the next page.

For example, within the story "Jitte goes to bed", the first page consists of:

• A picture of Jitte who is tired (see Figure 5)
• A sound file with a voice saying: "Jitte is tired, she wants to go to sleep"


Figure 7 System diagram

The second page consists of:

• A PCS icon of the word 'sleep'
• A sound file with a voice saying "sleep"
• A picture of the word 'sleep' in text
• The condition: wait for a press of the button showing the PCS icon 'sleep'

Once these files are placed in the content store and the links and conditions are assigned, the final story running on E-scope plays out as follows:

E-scope starts by saying, "Jitte is tired, she wants to go to sleep", while showing the photo of Jitte lying on the couch. The system then automatically starts page 2: the voice says "sleep" while the PCS icon and the written word 'sleep' are shown. Now the child has to push the 'sleep' icon on the ring, or any button when no PCS icons are used, or no button at all when E-scope is running in automatic mode. The word 'sleep' is repeated, and the following scenes appear until the entire nine-scene story has been played.
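The page-and-condition structure of such a story can be modelled roughly as follows. The class and field names are our own illustration of the idea, not the actual LinguaBytes implementation:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Page:
    """One page of a story: media elements plus a page-transition condition."""
    media: List[str]                 # e.g. ["photo:jitte_tired", "audio:jitte_is_tired"]
    condition: Optional[str] = None  # e.g. "press:PCS_sleep"; None = advance automatically

@dataclass
class Story:
    title: str
    pages: List[Page] = field(default_factory=list)

    def next_page(self, index: int, action: Optional[str]) -> int:
        """Advance when the controller action satisfies the current condition."""
        cond = self.pages[index].condition
        if cond is None or action == cond:
            return index + 1         # the educational reward: the next page
        return index                 # wrong or no action: stay on this page
```

Page 2 of "Jitte goes to bed" would then carry a condition such as "press:PCS_sleep", so the story only advances after the child presses the button showing the PCS icon 'sleep'; in automatic mode the condition is simply omitted.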

Although the majority of stories will be imported beforehand, parents and therapists can add stories to the content store that are tailored to the needs of an individual child. The current version has two stories, 'Jitte goes to bed' and 'Children's farm'. The stories and games will be developed further only after the different concepts (e.g. E-scope, PictoCatcher and ObjectSlider) have been tested and a final concept has been chosen.

6.2 The user model

E-scope can adapt content browsing to the physical capabilities, knowledge level or concerns of the child through the user model subsystem.

The system can also record performance data in the user model to give the therapist insight into a child's progress. The first time a child uses E-scope, a profile is created, stating the level of motor skills (ability to push buttons, use of specialised input devices, preference for a certain place of use, etc.), the cognitive and linguistic level (vocabulary, pronunciation, mental development, mastery of certain games, etc.), attitude (level of independence, social interaction, level of communication, etc.) and personal concerns (preferred graphical style, hobbies, preferred subjects, etc.).
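The profile dimensions listed above suggest a user-model record along these lines. This is a sketch under our own assumptions; every field name is illustrative, not taken from the LinguaBytes system:

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class ChildProfile:
    # Perceptual-motor skills
    can_push_buttons: bool = False
    input_device: str = "buttons"    # e.g. "buttons", "single_switch", "eye_movement"
    # Cognitive and linguistic level
    vocabulary: Set[str] = field(default_factory=set)
    linguistic_level: int = 1        # coarse scale tied to developmental age
    # Attitude and personal concerns
    independence: str = "low"
    preferred_style: str = "photos"  # photos vs. drawings
    # Performance history recorded during sessions
    history: List[dict] = field(default_factory=list)

    def record(self, story: str, correct: bool, response_time: float) -> None:
        """Append one interaction result so therapists can review progress."""
        self.history.append({"story": story, "correct": correct,
                             "time_s": response_time})
```

A record like this supports both uses named in the text: the adaptive system can read the skill fields to configure a session, and therapists can inspect the accumulated history to adjust their strategy.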

6.3 The E-scope player

The E-scope player subsystem is tightly coupled to the content store and the user model. It interfaces with the tangible controller, detects the configured page-transition conditions, schedules the playback of content on the next page and updates the user model to reflect the progress made. For example, if a child has a fairly low linguistic developmental level but rather fine motor skills, the E-scope player can decide to use the coloured buttons and show the next page when an arbitrary button is pushed. When the child does not respond and no button is pushed within 5-10 seconds, E-scope can give a hint on the screen or through speech.
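The player's transition-and-hint behaviour could look like the event-loop sketch below. The 5-10 second hint window comes from the text; the function names and polling structure are our own assumptions, not the actual implementation:

```python
import time

HINT_TIMEOUT_S = 7.0  # within the 5-10 second window mentioned in the text

def run_page(page, read_controller, play, give_hint, clock=time.monotonic):
    """Play one page, then wait for the action that satisfies its condition.

    `read_controller` polls the tangible controller and returns an action
    string (e.g. "press:PCS_sleep") or None; `play` and `give_hint` produce
    the screen or speech output. All of these callables are hypothetical.
    """
    play(page.media)
    if page.condition is None:
        return True                      # automatic mode: advance immediately
    deadline = clock() + HINT_TIMEOUT_S
    while True:
        action = read_controller()
        if action == page.condition:
            return True                  # condition met: schedule the next page
        if clock() > deadline:
            give_hint(page)              # no correct response yet: give a hint
            deadline = clock() + HINT_TIMEOUT_S
```

Separating `read_controller`, `play` and `give_hint` as injected callables mirrors the coupling described in the text: the player mediates between the tangible controller and the content store without either side knowing about the other.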


The current version of E-scope is not a fully implemented product, but a prototype built to gather feedback for further development. This first version was tested with three children and three therapists at the Rehabilitation Centre St. Maartenskliniek in Nijmegen, the Netherlands. E-scope was explained to each therapist before the actual therapy began. Child 1 interacted with 'Jitte goes to bed' with help from her therapist, who pushed the buttons, while she looked at the story on a separate screen placed behind E-scope on the table (Figure 8, left). Child 2 played with 'Children's farm', listening to short stories by rolling E-scope over drawings lying on the floor (Figure 8, middle). Child 3 looked at 'Jitte goes to bed' on a separate screen standing on the table (Figure 8, right). Each session took half an hour and was conducted during a regular speech-therapy session. The sessions were annotated by an observer on the spot and videotaped for further analysis. Afterwards, the therapists were interviewed about their general impression, the feasibility of the concept, challenge, playfulness, support for learning, tangibility, suitability for teaching, and graphical style.

Figure 8 Therapists tested E-scope with three children

The test indicated that the overall concept is promising and useful. The therapists were very positive about the toy-like appearance and its playful, engaging and sensorial character. They were enthusiastic about the diversity of interaction styles and multimedia output, which supports different strategies and skills, something that is hard to achieve with the currently used methods and tools. The therapists also suggested further adjustments to improve the personal fit. For example, the therapist of child 1 would like to use E-scope with an integrated motor that can be operated through eye movements. Working with a novel toy and two observers was a bit too much for the autistic child 2; his therapist advised us to enable his family to create their own, familiar content (images from home and speech from his parents). Moreover, if child 2 could get familiar with E-scope by using it frequently as a regular toy, e.g. as a musical instrument, it could help the speech therapy with E-scope. The therapist of child 3 wanted an integrated screen to enhance social interaction, so that therapist and child can sit opposite each other with E-scope in the middle.

The children's behaviour gave a similar impression. The two girls were very enthusiastic. Child 1 was clearly excited by the stories and graphics, seemed to enjoy the interaction (laughing frequently) and had a proud look on her face after selecting the correct PCS symbol several times. The toy amazed and captivated child 3, who had her eyes wide open and showed deep concentration. The tangibility of the toy challenged her to push the buttons frequently, and she was delighted to receive the movies as a kind of reward. Child 2 was very restless and overwhelmed by this deviation from his regular routines. Nevertheless, he immediately understood how E-scope worked and interacted with it frequently.

E-scope will be further developed in the near and mid-term future, and we plan to test an improved version with these and other children. The findings of these studies are used to enhance the development of other concepts for an interactive, adaptive educational toy. The overall conceptualisation phase will conclude with an extensive user study with one or more working prototypes.

Notes

¹ LinguaBytes is a cooperation between Viataal-Research, Development & Support (RDS), Radboud University Nijmegen, Technische Universiteit Eindhoven and Delft University of Technology. The Phelps Foundation and seven other Dutch funds sponsor the project.

Acknowledgements


References

Csikszentmihalyi, M. (1990) Flow: the psychology of optimal experience, Harper & Row, New York.

Heim, M. (2001) ‘Nauwelijks sprekend, veel te zeggen. Een studie naar de effecten van het COCP-programma’, PhD, Netherlands Graduate School of Linguistics, Utrecht.

Hummels, C., Overbeeke, C.J. & Klooster, S. (2006) 'Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction', Personal and Ubiquitous Computing, [online], http://dx.doi.org/10.1007/s00779-006-0135-y, accessed November 2006.

IntelliTools, [online], http://www.intellitools.com, accessed July 1 2006.

Light, J.C., Drager, K. & Nemser, J. (2004) ‘Enhancing the appeal of AAC technologies for young children: lessons from the toy manufacturers’, Augmentative and Alternative Communication, vol. 20, no 3, pp. 137-149.

Pasman, G., Stappers, P.J., Hekkert, P. & Keyson, D. (2005) 'The ID-StudioLab 2000-2005', Proceedings of Design Research in the Netherlands 2005, Eindhoven, pp. 193-204.

Platus Learning Systems, [online], http://www.sgs.at/mayr/tomstart/index.htm, accessed August 1 2006.

Shook, J. & Coker, W. (2006) 'Increasing the appeal of AAC technologies using VSD's in preschool language intervention', Proceedings of the 22nd Annual International Technology and Persons with Disabilities Conference, Los Angeles, [online], http://www.csun.edu/cod/conf/2006/proceedings/csun06.htm, accessed August 22 2006.

Van Balkom, H., de Moor, J. & Voort, R. (2002) LinguaBytes. Een studie naar de ontwikkeling van een computerprogramma voor niet- of nauwelijks sprekende peuters met een motorische beperking, Expertisecentrum Atypische Communicatie (EAC), Nijmegen.

Voort, R., de Moor, J. & Waesberghe, B. (1995) Bio Bytes. Een computerprogramma voor peuters met een motorische handicap: handleiding, Foundation Edupro, Ridderkerk.

Information about the Authors

Caroline Hummels is an associate professor in the Designing Quality in Interaction group at the department of Industrial Design, TU Eindhoven, and a designer/researcher in her own company ID-dock. Through building and testing experiential products and interactive tangible multimedia installations, her work questions what evokes resonance during interaction.

Aadjan van der Helm is a creative software engineer. He is involved in research and education at the ID-StudioLab, an institute of the Industrial Design Faculty at Delft University of Technology. He is mostly active in the fields of early prototyping and tangible interaction. He has 15 years of experience working with computer technology in a scientific context in the fields of computer graphics, interactive design and art.

Bart Hengeveld studied Industrial Design Engineering at Delft University of Technology, where he graduated in 2001. Between 2001 and 2006 he worked as a multimedia designer and freelance product designer, specialising in design for kids. In 2006 he started his PhD project on Adaptive Toys for Learning Language at ID-StudioLab, Delft University of Technology.

Rob Luxen is an engineer in electronics who builds prototypes for research and educational purposes in the ID-StudioLab, an institute of the Industrial Design Faculty at Delft University of Technology. He has 35 years of experience in analogue and digital electronics and microcontroller programming.

Riny Voort is an educational psychologist. From 1987 to 2006, she worked at LCN where she developed

Jan de Moor is Professor of Child Rehabilitation at Radboud University Nijmegen. His research focuses on reducing disturbances and limitations in learning, behaviour, self-care and communication, and on realising optimal participation in society.

Hans van Balkom is head of Viataal’s Research, Development and Support Group in St. Michielsgestel and
