Intelligent User Interfaces for Ubiquitous Computing

Prof. Dr. Rainer Malaka
Bremen University, Faculty 3, Computer Science
P.O. Box, D Bremen
Tel.: ++49 (0), Fax: ++49 (0)

To appear in: Handbook of Research on Ubiquitous Computing Technology for Real Time Enterprises, edited by M. Mühlhäuser and I. Gurevych.

ABSTRACT

Designing user interfaces for ubiquitous computing applications is a challenging task. In this chapter we discuss how to build intelligent interfaces. The foundations are usability criteria that are valid for all computer products. There are a number of established methods for the design process that can help to meet these goals. In particular, participatory and iterative approaches, so-called human-centered design, are important for interfaces in ubiquitous computing. The question of how to make interfaces more intelligent is not trivial, and there are multiple approaches to enhance either the intelligence of the system or that of the user. Novel interface approaches follow the idea of embodied interaction and put particular emphasis on the situated use of a system and the mental models humans develop in their real-world environment.

Keywords: usability engineering, user interface, interaction design, intelligent user interfaces

User interfaces for computational devices can be challenging for both their users and their designers. Even such simple things as VCRs or TV sets feature interfaces that many people find too difficult to understand. Reviews and tests of consumer electronic devices very often rank bad usability even higher than technical aspects and the originally intended main function of the devices or features. Moreover, for most modern appliances there is not much technical difference in their core functions. For instance, TV sets differ less in the quality of display and sound and more in the way the user interacts with the device. This already shows why user interface design is crucial for any successful product.

However, we want to extend the question of user interface design in two directions: the user interface should become more intelligent and adaptive, and we want more suitable interfaces for Ubiquitous Computing scenarios. The first aspect seems clear at first sight: intelligent user interfaces are just what we want, and nobody will deny the need for smart, clever, and intelligent technology. But it becomes more difficult if we strip away the buzzwords and dig a bit deeper into the question of what an intelligent user interface actually should do and how it would differ from an ordinary interface. Would the standard interface then be a stupid one? The second aspect introduces a new level of complexity: an interface is by definition a clear boundary between two entities. A user interface resides between human and machine; other interfaces mediate, for instance, between networks and computers. In Ubiquitous Computing we have the problem that there might not be a clear boundary any more. Computers are no longer visible and, in the end, they can disappear from the user's conscious perception. We will, therefore, face the challenge of building an interface for something that is rather shapeless.

In the following, we will go through these questions in more detail and introduce some general approaches for designing user interfaces. We will see that we can learn from good interface design for other, classical devices, and that we can apply many of those user interface design principles to Ubiquitous Computing as well. A central aspect will be the design process that helps to find the right sequence of steps in building a good user interface. After discussing these general aspects of user interface design, we will focus on the specific needs of Ubiquitous Computing scenarios and finally on how to build intelligent user interfaces or, to be less euphemistic, how to avoid stupid interfaces.

BUILDING GOOD USER INTERFACES

The design of a good user interface is an art which has been ignored for a long time in the information and communication technology (ICT) business. Many software developers just implemented whatever they found useful for themselves and assumed it would also be beneficial for the respective users. However, most users are not software developers, and their way of interacting with technology is very different. Sometimes the result is technology that is highly functional and useful for a small group of people, namely the developers of the system, and highly inefficient, frustrating or even unusable for most other people. Some of the highlights of this dilemma can be found in the communication with the user when something goes wrong: an error message notifying the user that "an error occurred, code 127" might be of some use for the developer and help in his efforts in debugging the system, but a user will hardly be able to understand what went wrong.

Today usability plays a much bigger role, and many systems (including computer systems) are now designed with more care for easy and safe usage. On the one hand, this is due to legal constraints demanding accessibility, but it is also due to the fact that many systems do not differ much in their technical details, so vendors have to diversify their products solely in terms of their look and feel. We now have a wealth of methods, tools, and guidelines, which all help to develop a good user interface (Dix et al., 1998; Mayhew, 1999). However, there is not one single recipe whose application guarantees 100% success. The essence of usability engineering is to work iteratively in order to achieve the goal of better usability. Let us briefly go through these steps and summarize some of the most important issues of usability engineering. For more detailed information, a number of textbooks and research articles can be consulted (Dix et al., 1998; Nielsen, 1993; Shneiderman, 1997).

The first question of usability engineering is the question of what goals we actually want to achieve. The typical list of usability goals contains at least the following five (ISO 9241, 2006):

- Safety and security: Good design should not harm users or other people affected by the use of a product. It should also help to avoid errors made by humans in using the system.
- Effectiveness: A good user interface supports a user in solving a task effectively, i.e., all aspects of a task can actually be handled.
- Efficiency and functionality: A well-designed and usable system should allow for quick and timely work.
- Joy and fun: How enjoyable is it to work (or play) with the system? Is it fun or is it a pain to interact with it?
- Ease of learning and memorizing: How fast can new users interact with the system, and will they remember what they learned?

This list, of course, is not exhaustive, and not all aspects can be fulfilled to the same (high) degree, which is to say that there are classic trade-offs. Some aspects, therefore, might even be in conflict with others, and it is important to identify such conflicts and to decide which aspect to optimize and to what extent. For instance, when designing an interactive game, joy and fun might be more important and effectiveness less important. In contrast, a system for firemen has to be more efficient and can be less fun. Another typical trade-off exists between the need for efficient work and the need for training. One solution can be to provide two modes: an expert mode and a novice mode.

As a general rule, all efforts and goals of usability should be measurable in quantitative or qualitative ways. And since most usability criteria depend on the actual use of a system, there is a need to involve users in the design process. Of course, many human factors have been studied, and psychologists have theories about how people can perceive information and how they can in principle react. But in order to actually find out whether the goals are met, one must try things out with actual users. And the more unknown your application terrain is, the more involvement of users is required, which is of particular importance for Ubiquitous Computing because there is not yet a large set of experience, studies, and guidelines at hand.

The design process that involves users has been named human-centered design (ISO 13407, 1999). Its principle is to develop an application iteratively with evaluations in every cycle. Human-centered design is also regarded as the best approach when design goals are hard to formalize in technical terms. There have been multiple approaches for system design processes that involve the users. Their roots lie in the participatory design idea from Scandinavia that involves workers in the definition and design of their working environment (Olson & Blake, 1981). In contrast to the classical waterfall model in systems engineering (Royce, 1970), which segments the design process into a linear order of clearly separable steps, these models iterate and involve users and evaluations in each cycle. A number of models have been proposed replacing the waterfall scheme by cycles or stars, i.e., the design process is open and decisions can be revised depending on user feedback during development (Gould et al., 1991; Hartson & Hix, 1989; Hix & Hartson, 1993). Since many usability goals are not well-defined and cannot be formally specified beforehand, these models allow for a continuous evolution of the usability of the system (Fig. 1).
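The star model (Fig. 1) and the design steps listed below describe this cycle in terms of activities; as a purely illustrative complement, the following minimal Python sketch shows the same control flow in code. All function names, metrics, and thresholds are hypothetical stand-ins, not part of the chapter's material.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class UsabilityGoal:
    """A measurable usability goal, e.g. a minimum rate of satisfied test users."""
    metric: str        # name of the measured parameter
    threshold: float   # value that must be reached before iteration stops

    def is_met(self, results: Dict[str, float]) -> bool:
        return results.get(self.metric, 0.0) >= self.threshold

def human_centered_design(design: dict,
                          build_prototype: Callable[[dict], object],
                          evaluate: Callable[[object], Dict[str, float]],
                          revise: Callable[[dict, Dict[str, float]], dict],
                          goals: List[UsabilityGoal],
                          max_cycles: int = 10) -> dict:
    """Iterate prototype -> evaluation -> redesign until every usability
    goal is met or the budgeted number of cycles is used up."""
    for _ in range(max_cycles):
        prototype = build_prototype(design)   # sketch, mock-up, or working prototype
        results = evaluate(prototype)         # user test, expert review, field or lab study
        if all(goal.is_met(results) for goal in goals):
            break                             # goals reached: implement, test, maintain
        design = revise(design, results)      # feed evaluation results back into the design
    return design

# Hypothetical usage with trivial stand-ins for the real design activities:
final_design = human_centered_design(
    design={"layout": "v1"},
    build_prototype=lambda d: d,                      # e.g. a paper mock-up
    evaluate=lambda p: {"satisfaction_rate": 0.95},   # e.g. questionnaire results
    revise=lambda d, r: dict(d, layout="revised"),
    goals=[UsabilityGoal(metric="satisfaction_rate", threshold=0.90)],
)
```

The point of the sketch is only the shape of the process: evaluation happens in every cycle, and a measurable stop criterion, rather than the calendar alone, ends the loop.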

Fig. 1: Star model for user-centered design (Hartson & Hix, 1989). Evaluation sits at the center, connected to task analysis/functional analysis, requirements/specification, conceptual design/formal design, prototyping, and implementation.

The design steps in these models are the following:

- Definition of the context: As a first step, designers should consider the context of their envisioned product. This includes defining the way the system will be used, whether it will be used for life-critical or fun purposes, in home or in office environments, as well as the market situation. The latter is important because it tells something about the expectations of users and about who is going to buy the product. In general, not only the target users are involved in deciding about the success (i.e., sales) of a product. Decisions can be made by managers of the users, and they can influence third parties such as the customers or clients of the users.
- Description of the users: Based on the context definition, each group of directly or indirectly affected users must be carefully analyzed. Their physical and cognitive abilities and their cultural and social background may affect the way they interact with the system. Special needs may play a role. Accessibility has become important for IT systems and is demanded by many legal constraints, in particular in working environments.
- Task analysis: Multiple techniques help to derive a rather formal description of the task users want to solve from informal interviews and observations. Most importantly, designers should find out how users actually solve their task currently (not how they think they do it), how they make use of the tools at hand, how they communicate, and how their context influences the course of activities.
- Requirements/specification: This step would have been the first step of the classical software development process. For user-centered design, it is now based on a better understanding of the users, their context and their tasks. Moreover, the specifications can be changed in each iteration, when a better understanding of the system has been gained through evaluations.
- Conceptual design/formal design: The requirements and specifications are translated into system components.
- Prototyping: Instead of "doing it right the first time", we iteratively build prototypes of the system. A prototype can be a very simple design sketch or an almost complete and working system, depending on the stage of iterations in the design process.
- Evaluations: Evaluations are essential for assessing the progress of the design process and for deriving a better understanding of the tasks and the requirements, and thus a better specification and implementation of the system (prototype).
- Implementation, tests, maintenance: Whenever the iterations have reached a stage where the prototype sufficiently fulfills the design goals, the final prototype (product) can be implemented. Of course, tests and maintenance are as important as in classical systems engineering. Moreover, they can help to further improve the system, and in particular user feedback after deployment can be used for defining new development cycles.

These design steps are the building blocks for good user interface design. They are very generic, and they are valid for basically every interactive system. Iterative development, however, is inevitable for the design of human-computer interaction in Ubiquitous Computing, as we enter a domain of interactive systems where we cannot derive system requirements from interaction goals without user involvement. This is mainly due to the fact that interaction in Ubiquitous Computing aims at intuitively usable pervasive IT systems that assist users in their real-world endeavors. Without taking these aspects into account, such systems are subject to failure. Many Ubiquitous Computing prototypes are completely technology-driven. Their developers focus on smart new gadgets, networks and infrastructure, but they do not focus their design efforts on their users. Just for the sake of plausibility, some usage scenarios and users are added to the design. Such systems will not leave the research labs, and they will fail to find their market.

EVALUATIONS, ALTERNATIVES AND PROTOTYPES

The importance of iterations in designing intelligent user interfaces for Ubiquitous Computing has now been emphasized. However, how should that bootstrapping actually happen? Where to start, how to proceed, and when to stop? If we needed a full-fledged prototype in each iteration along with an evaluation with a high number of users, the costs for developing a better Ubiquitous Computing application would rapidly explode. Fortunately, things can be done more efficiently, and some techniques help to manage the process.

Where to start?

In order to get a first impression of how to build a system that actually meets the usability goals, e.g., being an understandable and enjoyable assistant for some user task, we do not need any system at all but can make up a fake system without bothering with how to build a real one. A number of methods can be used (Dix et al., 1998; Shneiderman, 1997):

- Design sketches: Instead of actually building something that looks like a real system, users or usability experts can evaluate early design ideas. First sketches on paper or on a blackboard can already give an impression of the designer's ideas, and feedback can already help to avoid basic mistakes. Moreover, the discussion can facilitate the mutual understanding of the users' world and the prospective system.
- Wizard of Oz experiments: If, however, the users should already get an impression of how the interaction with the system might look, the system can also be simulated. A human operator remote-controls all functions of the environment, and the test users are told they are already interacting with the system. This technique has proven to be extremely fruitful for systems that need data on the interaction in advance. For instance, for systems that are language controlled, Wizard of Oz experiments can be used to collect utterances and language patterns that help to build speech recognizers and grammars for the real system.
- Mock-ups: A mock-up is a model of the system that already exposes the look and feel but does not yet include the real functionality of the intended system. Early mock-ups for graphical interfaces can, for instance, consist of a PowerPoint walkthrough through a system or some Web sites emulating a system.
- Prototypes: In contrast to the mock-up, prototypes include actual functionalities of the target system. They may iteratively evolve into the final system.

Since many applications for Ubiquitous Computing scenarios are embedded into real-world tasks, and many of them are also affected by or affect other objects in the users' surroundings, Wizard of Oz experiments are a cheap and very beneficial first step in system design. They can help to understand how people would interact in an environment that is enhanced by Ubiquitous Computing technology. Moreover, the designers get data that help to design the interaction with the system. For most cases of more natural interaction such as speech or gesture, such data is necessary anyway because the recognizers need training data.

How to proceed?

Evaluation is the core of the above-mentioned star model. Depending on the maturity of the design, the budget and the nature of the system, a great variety of evaluation techniques can be used. Evaluation methods can be classified according to the following dimensions:

- Qualitative vs. quantitative methods: In qualitative methods, feedback in the form of comments, impressions and subjective ratings is collected in interviews or questionnaires. Quantitative methods measure parameters such as error rates, task completion times or movements of users in order to estimate the quality and efficiency of an interface.
- Studies in the field or in the lab: Field studies are conducted under realistic conditions where the systems are actually used, e.g., in the office or home of the users. They usually need more effort than studies in the lab under simulated conditions, but they yield more realistic results.
- User tests or expert evaluations: User studies involve real test users. They are more expensive than expert evaluations, where a few experts judge the system based on their experience with user behavior and the application domain. There are many well-known techniques for both, such as cognitive walkthrough, discount evaluation, and thinking aloud, and in some cases even combinations may be useful.
- System state (sketch, mock-up, prototype): As discussed above, in early evaluations a system does not necessarily have to be fully functional but can rather be a sketch or a mock-up.

It is beyond the scope of this chapter to go into all details of evaluation techniques. We will rather focus on the most important aspects for Ubiquitous Computing interfaces. Even though evaluation is crucial for the design of good interfaces, it should be noted that evaluation techniques do not solve all problems and can even be misleading. One of the main problems of evaluations is that they are always limited snapshot observations, restricted in the time of usage and the complexity of the context. This is important to note, in particular, for the interfaces of Ubiquitous Computing systems. Take, for instance, the famous Ubiquitous Computing scenario of an intelligent refrigerator that keeps track of its contents and can alert a user when she is running out of milk. In an evaluation setting, one could look at users while they are at home or while they are in a supermarket, and one could measure how they react to notifications of the system. A questionnaire reveals whether the users like the system and would like to buy it when it comes onto the market. In a realistic setting, a study would observe some 10 to 20 users, each over a time span of one to two hours of interaction. All would be in the same representative supermarket and in some model kitchen. The results would definitely be interesting, and the study would even go beyond many other evaluations of similar systems. However, it is too limited for multiple reasons:

- No long-term observation: Since users would interact with such a Ubiquitous Computing system not only for a few hours but rather over months or years, the short interaction of a novice user does not reveal much about the user's future interaction.
- Limited frame of context: In order to gain comparable results, all users are set to the same or a similar context. In everyday situations, however, contexts may differ a great deal, and users show a much higher degree of variation in their behavior.
- Additional tasks, people, and devices: As with most Ubiquitous Computing applications, users may not be focused on just one task but may be doing many other things concurrently. They could have other devices with them or be interacting with their colleagues or family members.

These limitations make some evaluation results questionable. However, by using a good and careful evaluation design, some aspects can be counterbalanced. Moreover, keeping the limitations in mind may help to focus on the right questions and avoid overstating the results. And finally: even when evaluations only shed limited light on the usability of a system, this is much better than working in complete darkness without evaluations. As a rule of thumb, evaluations for Ubiquitous Computing interfaces should be made as realistic as possible. Thus field studies are better than lab conditions. Moreover, the designers should have a clear understanding of what they want to achieve with their system in order to know what they want to prove using evaluations.

When to stop?

The development cycle should not be an endless loop. In general, the (re-)design-prototype-evaluation cycle can go on forever, leading to a continuous increase in usability. In practice, either the number of cycles is fixed beforehand or certain measures define when the loop has to be stopped and the final design is achieved. Typically, these measures would quantify the usability goals listed at the beginning of this chapter. Such a goal could be "95% of the test users rate the system as very convenient" or "the task completion rate within 30 minutes is 98%". In some cases the stop criterion is not bound to usability but to other measures such as "we are out of budget" or "the deadline is next week".

SPECIFIC CHALLENGES OF USER INTERFACES FOR UBIQUITOUS COMPUTING

So far we have learned how to design a good user interface. The principles we discussed are rather generic, and they apply of course to designing intelligent user interfaces for Ubiquitous Computing, but they are also valid for other user interfaces such as Web interfaces or interfaces of desktop applications. The general process of human-centered design could even be applied to non-IT products such as cars, coffee machines and other objects of our daily life. It is no accident that we have arrived at such a generic process. On the one hand, good usability is a generic property, and the design process is fairly similar in multiple domains. On the other hand, Ubiquitous Computing is about integrating computing into the objects of our normal life. Thus usability in Ubiquitous Computing has, owing to its very nature, something to do with the usability of everyday things.

Since the early days of Ubiquitous Computing, usability has been in its focus. Mark Weiser's idea of Ubiquitous Computing encompasses invisible interfaces that are so naturally usable that they literally disappear from the user's conscious perception (Weiser, 1999a, b). This notion goes back to the German philosophers Hans-Georg Gadamer and Martin Heidegger, who describe things that we use without conscious awareness as being "ready-to-hand" or part of our "horizon". In this phenomenological view, the meaning of things is actually derived from our interaction with them. Such a view on interactive artifacts has become popular in Ubiquitous Computing and is closely related to the notion of embodiment (Dourish, 2001). This is a fundamental shift from the classical positivist approach in computer science, i.e., modeling the real world in simplistic formal computer programs, to an embodied approach that takes the user in the real world into account. This is relevant for Ubiquitous Computing for multiple reasons. On the one hand, Ubiquitous Computing applications are to be used in complex real-world settings, and their meaning (for the user) will, in fact, only evolve in the course of action. Additionally, if things should become natural extensions of our physical abilities, they must be designed such that they do not need conscious interference from their users.

Given this notion of being invisible, we can see that it does not necessarily mean "not there", but rather "present without conscious interaction". The most basic examples of such physical objects are our body parts. We do not have to think consciously about what we do with our arms; we just do the things we want. When we leave our house, we do not have to remember: "let's take the arm with us, we might need it today." It is there and ready for immediate use. When we throw a ball, we just throw it; we do not think and plan how to make our hand grasp the ball and our arm swing around in order to accelerate the ball. In this sense, our arm is invisible but also very present. Thus, if we speak of Ubiquitous Computing interfaces that are invisible or computers that are disappearing, we actually speak of things that are present and ready-to-hand. However, the artifacts we interact with might not be consciously perceived as computers. A good example of such a ubiquitous technology is present in our homes already: electric light. Whenever we enter a room that is dark, we just find a switch with our hands next to the door and the light goes on. Without thinking, we turn on the light. We do not think of cables that conduct electrons. We do not have to consider how the light bulb works or how electricity is generated at the power plant. We have a very simplistic model of how the thing works, and it is internalized to such a degree that we do not have to think about it when we enter a room.

These mental models of how things work play an important role in designing good user interfaces as well as in designing other everyday things (Norman, 1998). Donald Norman emphasizes that good design is about providing good mappings (Fig. 2):

- The design model must be mapped to the system image.
- Users must be able to map their understanding (mental model) to the system.
- The system must allow the user to map its image to the user's model.

Fig. 2: Mappings of design model, mental model and system image (Norman, 1998). The designer expresses a design model in the system image, and the user builds a mental model of the system from that image.

The question is now: how can a system image support an appropriate mental model on the part of the user? The answer, with our notion of embodiment in mind, must be to bring the meaning of things into the things themselves, so that a user can derive the meaning of something from the interaction with it or from its mere appearance, which may signal properties indicating how to use it. Such properties have been named affordances (Norman, 1998). The idea of affordances is to bring knowledge into the world instead of having it in the mind. Many highly usable things that surround us let us know by their physical appearance alone how we can use them. A chair, for instance, does not need a label or instructions on how to sit on it. We just see and know it is a chair, and we know what to do with it. Similarly, virtual affordances have been defined for computer interfaces, and many metaphors on our computer screens signal functionalities, e.g., mouse pointers and scrollbars. With the advent of Ubiquitous Computing, the term affordance again becomes more literally a property attached to the physical characteristics of things. Many Ubiquitous Computing objects include tactile interfaces or are smart objects with physical and not just virtual properties.

There are a number of consequences arising from this perspective of embodied interaction for Ubiquitous Computing:

- Support mental models: Humans use mental models to understand and to predict how things react to their actions. The system image should support such mental models and make the system easy to understand.
- Respect cognitive economy: Humans re-use their mental models. If there are well-established mental models for similar things, they can be a good basis for an easy understanding of a new artifact.
- Make things visible and transparent: In order to understand the state of an object, it should be obvious what is going on. For instance, a container can indicate whether it is loaded or not.
- Design for errors: Mappings between the user's model and the system sometimes fail. Most human errors are, in fact, mapping errors. Therefore, systems must assist users in finding a solution for their task even if something has gone wrong. There are a number of techniques for doing so, e.g., allowing undo actions or sanity checks on user inputs.
- Internal and external consistency: Things within an application should work consistently. For instance, pushing a red button always means "stop". External consistency refers to expectations users may have from the usage of other applications. If we add some Ubiquitous Computing technology to a cup and turn it into a smart cup, a user will still expect the cup to work as a cup.

With these guidelines and the general design process considerations, we are already well prepared for building very good interfaces for Ubiquitous Computing applications. However, there are a number of further practical considerations and human factors that play a role for Ubiquitous Computing user interfaces. Some of these issues are related to the very nature of these applications being ubiquitous, and some are more related to technical problems in mobile and ubiquitous scenarios. We will briefly highlight some of these aspects. Due to the broad spectrum of possible applications, we cannot go into the details of all possible factors.

Human factors for Ubiquitous Computing

In classical human-computer interaction, we have a well-defined setting. In Ubiquitous Computing, we do not know where the users are, what tasks they are currently doing, or which other persons may be around. This makes it very hard to account for some human factors that can greatly influence the interaction. Depending on time, concurrent tasks, etc., the user's cognitive load, stress level, patience, and mood may vary extremely. Thus an interface can be well-suited in one situation, while in another situation the user is either bored or overloaded. Another problem lies in spatial and temporal constraints. In many Ubiquitous Computing applications, location and time play a crucial role. Users need the right information at the right time and place. In a system that helps a user navigate her vehicle through a city, the information "turn right" only makes sense at a very well-defined point in space and time. A delay in information is not acceptable. Even though space and time are the most prominent context factors in systems today, other context factors may also play a big role (cf. chapter "Context Models and Context Awareness"). An interface can adapt to such context factors and take into account what is going on. In particular, the user might not have the focus of attention on the system but rather might be busy doing something else. But it is not only user-driven activities that can distract the user; other people and events are not the exception but the normal case in many Ubiquitous Computing scenarios. This has a huge effect on interface and dialog design. While in desktop applications the designer can assume that the user is looking at the screen and a system message is (in most cases) likely to be read by the user, in Ubiquitous Computing we must reckon with many signals from the system being ignored by the user.
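As a purely illustrative sketch of how such context-dependent adaptation might be organized in software, the following Python fragment maps sensed context to one of a few distinct interaction modes and adapts the notification strategy accordingly. The context keys, rules, and modes are hypothetical and not part of the chapter's material; the driving and meeting situations anticipate the examples discussed next.

```python
from enum import Enum, auto

class InteractionMode(Enum):
    """A few distinct modes instead of one model per possible situation."""
    FULL_ATTENTION = auto()   # user is focused on the device
    HANDS_BUSY = auto()       # e.g. driving: audio only, no visual dialogs
    DO_NOT_DISTURB = auto()   # e.g. business meeting: defer or stay silent

def select_mode(context: dict) -> InteractionMode:
    """Map sensed context to an interaction mode (illustrative rules only)."""
    if context.get("activity") == "driving":
        return InteractionMode.HANDS_BUSY
    if context.get("calendar") == "business meeting":
        return InteractionMode.DO_NOT_DISTURB
    return InteractionMode.FULL_ATTENTION

def notify(message: str, context: dict) -> str:
    """Adapt the notification strategy to the current mode; a real system
    would trigger speech output, vibration, queued messages, etc."""
    mode = select_mode(context)
    if mode is InteractionMode.HANDS_BUSY:
        return f"speak: {message}"            # keep eyes and hands free
    if mode is InteractionMode.DO_NOT_DISTURB:
        return f"queue for later: {message}"  # stay silent, like the adaptive phone
    return f"display: {message}"

# Hypothetical usage:
print(notify("You are running out of milk", {"activity": "driving"}))
```

The sketch deliberately uses only a handful of modes; as argued below, modeling each and every situation explicitly is not feasible in a literally ubiquitous system.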

The interfaces can try to take the users' tasks into account and thus adapt their strategy for reaching the user's attention. For example, when the user is driving a car, the system might interact in a different way than when the user is in a business meeting. However, when the system is literally ubiquitous, the number of tasks and situations the user might be in can be endless, and it is not feasible to model each and every situation. The system interface might then instead be adaptable to a few distinct modes of interaction.

Who is in charge?

As we mention adaptation and adaptivity, we get to a point where the system behaves differently in different situations. This can be a perfect thing and can significantly increase the ease of use. A mobile phone, for instance, that automatically adapts to the environment and stays silent in a business meeting, but rings in other situations, is rather practical. However, the trade-off is reduced predictability, and as discussed above, many usability goals can be in conflict with each other. The developers and (hopefully) the users have to decide which goal is more important. It is important to know about these conflicts and to decide explicitly how to deal with them. Usability goals in Ubiquitous Computing that typically come into conflict with others are:

- Controllability: Is it the system or the user who controls the situation?
- Support of mental models: How can a user still understand a very complex system?
- Predictability: Humans want to be able to predict the outcome of their actions. If a system is too adaptive and autonomous, users get lost.
- Transparency: If the system adapts to all sorts of context factors, its state becomes less transparent.
- Learnability: A system that learns and behaves differently in new situations can be hard to understand.

The designers have to decide to what degree they want to achieve which level in each of these dimensions and how other aspects such as autonomy or adaptivity may affect them. In general, there are no rules or guidelines that can give clear directions. While in many other IT domains, such as Web systems, established standards may set the stage and good guidelines exist, the designer of a Ubiquitous Computing system will have to derive his own solution on the basis of the goals he wants to achieve. The only way to prove that the solution actually fits these goals is, in turn, evaluation. Therefore, a user-centered design approach is the only way to design Ubiquitous Computing systems that incorporate good user interfaces.

INTELLIGENT AND DUMB INTERFACES FOR UBIQUITOUS COMPUTING

In the last part of this chapter we want to focus on intelligent user interfaces. The term "intelligent user interface" has been debated for a while, and it is not so clear what it means and whether intelligent interfaces are beneficial at all. Even the term "intelligence" is not well-defined and has been used (or misused) in multiple ways. Before going into technical details we should, thus, first discuss what the term means and then look at some techniques that are used for realizing such interfaces. We will finish with a discussion of how much intelligence a good interface actually needs.

Fig. 3: Multiple views on intelligent user interfaces: a) classical interfaces, b) intelligent user interface (classical AI), c) embodied interaction, d) intelligent cooperative interface.

What is an intelligent user interface?

So far we have presented a number of techniques for building good interfaces. We also saw how the view of embodied interaction can be used as a paradigm for Ubiquitous Computing. In general, a technical solution can be called intelligent for two reasons: (i) there is some built-in intelligent computation that solves some otherwise unsolvable problem; (ii) using the system, a user can solve an otherwise unsolvable problem, even though the system itself does not actually do anything intelligent. Suppose that calculating the logarithm of a number is a hard problem for a human; then a calculator is a good example for case (i), and an abacus would be an example for (ii). The calculator solves the problem for the human, and the abacus empowers the user to solve the problem on her own.

The classical approach of artificial intelligence (AI) is a rationalist one. According to this approach, a system should model the knowledge that human experts have and thus emulate human intelligence. In this sense, the intelligence moves from the user to the system (Fig. 3a, 3b). This approach is valid for many cases, e.g., if expert knowledge is rare and non-experts should also be able to work with a system. As discussed above, the embodied interaction view would rather try to make the interaction more intelligent (Fig. 3c). This fits many new trends in AI where embodied intelligence is viewed as a property that emerges from the interaction of an intelligent agent with its environment. In this view, even simple and lightweight agents can exhibit intelligent behavior without full reflective and conscious knowledge of the world. With respect to this definition, all of the above-mentioned material already describes how to build an intelligent interface. Because the processes for designing human-centered systems are just the right techniques for designing intelligent interactive systems, we have already defined to a great extent how to build intelligent user interfaces. Instead of leaving all the intelligence to the system, the user or the interaction, we can also try to get the best of all worlds and combine these techniques into a cooperative system, where both the system and the user contribute their knowledge to solving some task, supported by intelligent interaction techniques (Fig. 3d).

As discussed above, we can make the overall system more intelligent by enhancing the system, the interaction or the user. Intelligent user interface techniques exist for all three aspects. We will briefly list the key methods. Some details on them can be found in other chapters of this volume.

Techniques for enhancing the system's intelligence

A huge number of AI techniques can be used to put more knowledge and reasoning into the system. Besides state-of-the-art IT methods such as databases, expert systems, heuristic search and planning, a number of more recent developments have attracted a good deal of interest from researchers and practitioners in the field:

- World knowledge and ontologies: Semantic technologies and formal models of world knowledge have had a great renaissance in the last couple of years. In the context of the Semantic Web efforts, ontologies have been established as a standard method for capturing complex relations of objects and events in the world. Ontologies (cf. chapter "Ontologies for Scalable Services-Based Ubiquitous Computing") can be successfully used in user interfaces in order to give the system a better understanding of the domain of an interaction. In particular for natural language interaction, ontologies provide resources for better understanding and reasoning.
- User adaptation: User models and techniques of user adaptation allow for individualized interaction (cf. chapter "Adapting to the User"). A number of methods allow for autonomous and user-driven customization of systems, and they are widely used for intelligent user interfaces. In particular for Ubiquitous Computing, user-adapted systems play a big role, since these systems often have to support a great variety of use cases where a single standardized interface is not appropriate.
- Context adaptation: Context plays a crucial role for Ubiquitous Computing (cf. chapter "Context Models and Context Awareness"). As discussed already, context-dependent user interfaces can greatly enhance the usability of these systems. However, context can also be challenging, because it can depend on a huge number of parameters, and it is hard to formalize the meaning of contexts and to learn the relations between them autonomously.
- Service federation: Integrating a variety of services and providing a single interface can be a significant step towards intelligent user interfaces. If users do not have to interact with all sorts of services separately, but can use a single portal, they can work much more efficiently with less cognitive load. However, service integration can be a hard problem, in particular when multiple services that had not originally been designed to be integrated have to be integrated semantically. An intelligent ubiquitous travel assistant could, for instance, integrate maps, events, travel, sights and weather information from different providers and offer the user an integrated trip plan.

Of course, many other techniques are available. In principle, all advanced semantic and AI-based methods related to user interaction can help to make systems smarter and better at understanding what a user might need and want, using ubiquitous information sources.

Techniques for more intelligent interaction

As discussed above, the most important aspect of intelligent interaction is to provide good and working mappings between the user's models of the world and the system's model. These mappings depend highly on the semiotics, i.e., the meaning and perception of the signs and signals established between the user and the system. These can be both actively communicated codes in the form of a language and passive features of the artifact that signal affordances to the user. Both aspects can be supported through intelligent methods that aim at more natural interaction, such that the interaction takes place on the premise of human communication rather than machine languages.

- Multimodal interaction: Multimodal techniques make use of the human ability to combine multiple input and output modalities for semantically rich, robust and efficient communication. In many Ubiquitous Computing systems, language, gestures, graphics, and text are combined into multimodal systems (cf. chapters "Multimodal and Federated Interaction" and "Multimodal Software Engineering"). Multimodality is on the one hand more natural, and on the other hand it also allows for more flexible adaptation to different usage situations.
- Cross-media adaptation: Ubiquitous Computing systems often use a number of different media, devices and channels for communicating with the user. A user can, in one situation, carry a PDA with a tiny display and, in another situation, interact with a wall-sized display in public, or even use no display but just earphones. Intelligent interaction can support media transcoding that presents content on different media adapted to the situation.
- Direct interaction: Humans are very good at multimodal communication, but for many tasks we are even better at direct interaction. It is, for instance, much easier to drive a car using a steering wheel than to tell the car to which degree it should steer to the left or to the right. For many activities, direct interaction is superior to other forms of human-computer interaction.
- Embodied conversational agents: Since humans are used to communicating with humans (and not with machines), anthropomorphic interfaces presenting animated characters can in some circumstances be very useful. In particular in entertainment systems, so-called avatars have become quite popular. However, there is also some debate about these interfaces, and some people dislike this form of interaction.

Again, the list of techniques could be much longer. Here we just highlight some of the most important trends in the field. More ideas have been proposed, and there will be more to come.

Techniques for amplifying the user's intelligence

In principle, we can try to build better interface techniques, but we will not be able to change the user's intelligence, leaving aside e-learning and tutorial systems that might explicitly have teaching or training purposes. But even if we do not affect the user's intelligence, we can still do a lot about her chances to make use of it. In the scientific community there has been a controversy over the last couple of years about the goal of intelligent user interfaces and about where to put how much intelligence (Fig. 3). And even though it might be counterintuitive, an intelligent interface can sometimes be the one that leaves the intelligence on the part of the user rather than putting it into the system (Fig. 3a, 3b). In the debate about these approaches, the slogan "Intelligence Amplification (IA) instead of Artificial Intelligence (AI)" was coined. The idea is that a really intelligent interface leaves intelligent decisions to the user and does not take away all intelligent work from the user by modeling it in the system. The question is: how can the system support users in acting intelligently? The answers have been given already when we discussed usability goals: a system that is easy to learn, where users have control and understand what is going on, and where mental models are applicable is more likely to let people act intelligently. In contrast, a system that only leaves minor steps to the user, that does not provide information about its states and how it got there, and for which the users do not have appropriate mental models will in the long run bore its users and decrease their creativity and enthusiasm. It should be noted that both types of systems can be packed with AI and can be extremely smart or very dumb things. It is rather the kind of interaction design that either facilitates human intelligence or does not.

HOW MUCH INTELLIGENCE?

Intelligent user interfaces for Ubiquitous Computing will be a necessary thing in the future. However, there are multiple competing views and philosophies. In general, three things could be intelligent: the user, the system, or the way in which they interact. Most researchers focus on enhancing the system's intelligence, and the assumption is that this will lead to better usability. This is often the case, but not always. A different approach is to say that users are the most intelligent agents and their intelligence should be enhanced rather than replaced by artificial intelligence (IA instead of AI). In practice, however, we should do all of this together in order to make the interaction as easy and efficient as possible. But each decision should be made carefully, keeping in mind that the overall goal of an intelligent user interface should still be defined by the usability goals. And as with all good things, less is sometimes more, and some simple things often are more enjoyable and easier to understand than highly complex and automated devices.

CONCLUSIONS

This chapter introduced aspects of designing user interfaces for Ubiquitous Computing in general and intelligent interfaces in particular. The basics for building intelligent interfaces are techniques for building good interfaces. Consequently, we first presented an up-to-date introduction to methods of human-centered design. A central aspect of this approach is to iteratively design systems with repeated evaluations and user feedback. This approach is especially important for Ubiquitous Computing systems, since they lack clear guidelines and decades of experience, and thus iterations are crucial in order to approach the desired design goals.

Obviously, many of these basic techniques are also valid for many other systems. However, Ubiquitous Computing introduces some more specific issues, such as a high variety of contexts, the lack of a single dedicated interface device and, by its very nature, ubiquitous interaction at any time and location. Therefore, Ubiquitous Computing interfaces must place even more emphasis on good mappings to mental models and provide good affordances. The view of embodied interaction gives us a good theoretical idea of the way we should think of and model interaction in such systems. With these prerequisites, designers can build very good interfaces and can take many usability aspects into consideration. However, up to that point the intelligence of the interfaces had not been discussed. We did that in the last part of the chapter and presented a modern and sometimes controversial view on intelligent user interfaces. There are different paradigms that may contradict each other. The main question can be formulated as: AI or IA, Artificial Intelligence or Intelligence Amplification? We discussed these design philosophies and presented some ideas on how to combine the best of both worlds. We also presented a number of current trends in the field that can be found in modern Ubiquitous Computing systems.

FUTURE RESEARCH DIRECTIONS

As the field of Ubiquitous Computing matures, its user interface techniques will also undergo an evolutionary process, and some best practices will be established, making things much easier. We currently see this happening for Web applications, where developers can choose from established interaction techniques that are well-known to the users and guarantee efficient interaction. However, Ubiquitous Computing might never reach that point, since the ambition to support users in every situation at any time and place, which is its ultimate goal, requires interfaces rich enough to cope with the complexity of the users' entire lives. This might be good news for researchers in the field, because they will stay busy searching for better and more intelligent interfaces. The main challenges for future research will lie in the problem of extensibility and scalability of intelligent user interfaces. How can a system that has been designed for a user A in situation S be extended to support thousands of users in a hundred different situations?

REFERENCES

Dix, A., Finley, J., Abowd, G. & Beale, R. (1998). Human-computer interaction. Upper Saddle River/NJ, USA: Prentice-Hall.
Dourish, P. (2001). Where the action is: the foundations of embodied interaction. Massachusetts, USA: MIT Press.
Gould, J. D., Boies, S. J. & Lewis, C. (1991). Making usable, useful, productivity-enhancing computer applications. Communications of the ACM, 34(1).
Hartson, H. R. & Hix, D. (1989). Toward empirically derived methodologies and tools for human-computer interface development. International Journal of Man-Machine Studies, 31(4).
Hix, D. & Hartson, H. R. (1993). Developing user interfaces: Ensuring usability through product and process. New York, USA: John Wiley and Sons.
ISO 13407 (1999). Human-centred design processes for interactive systems. International Organization for Standardization.
ISO 9241 (2006). Ergonomics of human-system interaction. International Organization for Standardization.
Mayhew, D. J. (1999). The usability engineering lifecycle. Burlington/MA, USA: Morgan Kaufmann.
Nielsen, J. (1993). Usability engineering. Boston/MA, USA: Academic Press.
Norman, D. A. (1998). The design of everyday things. Massachusetts, USA: MIT Press.
Olson, M. H. & Blake, I. (1981). User involvement in system design: an empirical test of alternative approaches. New York, USA: Stern School of Business.
Royce, W. W. (1970). Managing the development of large software systems: concepts and techniques. Technical Papers of Western Electronic Show and Convention (WesCon), August 25-28, Los Angeles, USA.
Shneiderman, B. (1997). Designing the user interface: strategies for effective human-computer interaction. Boston/MA, USA: Addison Wesley.
Weiser, M. (1999a). The computer for the 21st century. ACM SIGMOBILE Mobile Computing and Communications Review, 3(3). Special issue dedicated to Mark Weiser.
Weiser, M. (1999b). Some computer science issues in ubiquitous computing. ACM SIGMOBILE Mobile Computing and Communications Review, 3(3). Special issue dedicated to Mark Weiser.

FURTHER READING

Cooper, A. & Reimann, R. M. (2003). About Face 2.0: The essentials of interaction design. New York/NY, USA: John Wiley.
Dautenhahn, K. (1996). Embodied cognition in animals and artifacts. In Embodied Action and Cognition: Papers from the AAAI 1996 Fall Symposium in Boston.
Hornecker, E. & Buur, J. (2006). Getting a grip on tangible interaction: a framework on physical space and social interaction. Conference on Human Factors in Computing Systems. New York/NY, USA: ACM Press.
Preece, J., Rogers, Y. & Sharp, H. (2002). Interaction design. New York/NY, USA: John Wiley.
Sharkey, N. & Ziemke, T. (2000). Life, mind and robots: the ins and outs of embodied cognition. In S. Wermter & R. Sun (Eds.), Symbolic and Neural Net Hybrids. Massachusetts, USA: MIT Press. URL: citeseer.ist.psu.edu/sharkey99life.html
Winograd, T. & Flores, F. (1987). Understanding computers and cognition: a new foundation for design. Boston/MA, USA: Addison Wesley.


More information

ISO ISO is the standard for procedures and methods on User Centered Design of interactive systems.

ISO ISO is the standard for procedures and methods on User Centered Design of interactive systems. ISO 13407 ISO 13407 is the standard for procedures and methods on User Centered Design of interactive systems. Phases Identify need for user-centered design Why we need to use this methods? Users can determine

More information

Joining Forces University of Art and Design Helsinki September 22-24, 2005

Joining Forces University of Art and Design Helsinki September 22-24, 2005 APPLIED RESEARCH AND INNOVATION FRAMEWORK Vesna Popovic, Queensland University of Technology, Australia Abstract This paper explores industrial (product) design domain and the artifact s contribution to

More information

Digitisation A Quantitative and Qualitative Market Research Elicitation

Digitisation A Quantitative and Qualitative Market Research Elicitation www.pwc.de Digitisation A Quantitative and Qualitative Market Research Elicitation Examining German digitisation needs, fears and expectations 1. Introduction Digitisation a topic that has been prominent

More information

The essential role of. mental models in HCI: Card, Moran and Newell

The essential role of. mental models in HCI: Card, Moran and Newell 1 The essential role of mental models in HCI: Card, Moran and Newell Kate Ehrlich IBM Research, Cambridge MA, USA Introduction In the formative years of HCI in the early1980s, researchers explored the

More information

CSE 190: 3D User Interaction. Lecture #17: 3D UI Evaluation Jürgen P. Schulze, Ph.D.

CSE 190: 3D User Interaction. Lecture #17: 3D UI Evaluation Jürgen P. Schulze, Ph.D. CSE 190: 3D User Interaction Lecture #17: 3D UI Evaluation Jürgen P. Schulze, Ph.D. 2 Announcements Final Exam Tuesday, March 19 th, 11:30am-2:30pm, CSE 2154 Sid s office hours in lab 260 this week CAPE

More information

Human Computer Interaction (HCI, HCC)

Human Computer Interaction (HCI, HCC) Human Computer Interaction (HCI, HCC) AN INTRODUCTION Human Computer Interaction Why are we here? It may seem trite, but user interfaces matter: For efficiency, for convenience, for accuracy, for success,

More information

Keywords: Human-Building Interaction, Metaphor, Human-Computer Interaction, Interactive Architecture

Keywords: Human-Building Interaction, Metaphor, Human-Computer Interaction, Interactive Architecture Metaphor Metaphor: A tool for designing the next generation of human-building interaction Jingoog Kim 1, Mary Lou Maher 2, John Gero 3, Eric Sauda 4 1,2,3,4 University of North Carolina at Charlotte, USA

More information

MANAGING HUMAN-CENTERED DESIGN ARTIFACTS IN DISTRIBUTED DEVELOPMENT ENVIRONMENT WITH KNOWLEDGE STORAGE

MANAGING HUMAN-CENTERED DESIGN ARTIFACTS IN DISTRIBUTED DEVELOPMENT ENVIRONMENT WITH KNOWLEDGE STORAGE MANAGING HUMAN-CENTERED DESIGN ARTIFACTS IN DISTRIBUTED DEVELOPMENT ENVIRONMENT WITH KNOWLEDGE STORAGE Marko Nieminen Email: Marko.Nieminen@hut.fi Helsinki University of Technology, Department of Computer

More information

Auto und Umwelt - das Auto als Plattform für Interaktive

Auto und Umwelt - das Auto als Plattform für Interaktive Der Fahrer im Dialog mit Auto und Umwelt - das Auto als Plattform für Interaktive Anwendungen Prof. Dr. Albrecht Schmidt Pervasive Computing University Duisburg-Essen http://www.pervasive.wiwi.uni-due.de/

More information

GUIDE TO SPEAKING POINTS:

GUIDE TO SPEAKING POINTS: GUIDE TO SPEAKING POINTS: The following presentation includes a set of speaking points that directly follow the text in the slide. The deck and speaking points can be used in two ways. As a learning tool

More information

The Application of Human-Computer Interaction Idea in Computer Aided Industrial Design

The Application of Human-Computer Interaction Idea in Computer Aided Industrial Design The Application of Human-Computer Interaction Idea in Computer Aided Industrial Design Zhang Liang e-mail: 76201691@qq.com Zhao Jian e-mail: 84310626@qq.com Zheng Li-nan e-mail: 1021090387@qq.com Li Nan

More information

FP7 ICT Call 6: Cognitive Systems and Robotics

FP7 ICT Call 6: Cognitive Systems and Robotics FP7 ICT Call 6: Cognitive Systems and Robotics Information day Luxembourg, January 14, 2010 Libor Král, Head of Unit Unit E5 - Cognitive Systems, Interaction, Robotics DG Information Society and Media

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Lecture 01 - Introduction Edirlei Soares de Lima What is Artificial Intelligence? Artificial intelligence is about making computers able to perform the

More information

CSTA K- 12 Computer Science Standards: Mapped to STEM, Common Core, and Partnership for the 21 st Century Standards

CSTA K- 12 Computer Science Standards: Mapped to STEM, Common Core, and Partnership for the 21 st Century Standards CSTA K- 12 Computer Science s: Mapped to STEM, Common Core, and Partnership for the 21 st Century s STEM Cluster Topics Common Core State s CT.L2-01 CT: Computational Use the basic steps in algorithmic

More information

Human Computer Interaction Lecture 04 [ Paradigms ]

Human Computer Interaction Lecture 04 [ Paradigms ] Human Computer Interaction Lecture 04 [ Paradigms ] Imran Ihsan Assistant Professor www.imranihsan.com imranihsan.com HCIS1404 - Paradigms 1 why study paradigms Concerns how can an interactive system be

More information

DESIGN FOR INTERACTION IN INSTRUMENTED ENVIRONMENTS. Lucia Terrenghi*

DESIGN FOR INTERACTION IN INSTRUMENTED ENVIRONMENTS. Lucia Terrenghi* DESIGN FOR INTERACTION IN INSTRUMENTED ENVIRONMENTS Lucia Terrenghi* Abstract Embedding technologies into everyday life generates new contexts of mixed-reality. My research focuses on interaction techniques

More information

how many digital displays have rconneyou seen today?

how many digital displays have rconneyou seen today? Displays Everywhere (only) a First Step Towards Interacting with Information in the real World Talk@NEC, Heidelberg, July 23, 2009 Prof. Dr. Albrecht Schmidt Pervasive Computing University Duisburg-Essen

More information

An Integrated Expert User with End User in Technology Acceptance Model for Actual Evaluation

An Integrated Expert User with End User in Technology Acceptance Model for Actual Evaluation Computer and Information Science; Vol. 9, No. 1; 2016 ISSN 1913-8989 E-ISSN 1913-8997 Published by Canadian Center of Science and Education An Integrated Expert User with End User in Technology Acceptance

More information

6 Ubiquitous User Interfaces

6 Ubiquitous User Interfaces 6 Ubiquitous User Interfaces Viktoria Pammer-Schindler May 3, 2016 Ubiquitous User Interfaces 1 Days and Topics March 1 March 8 March 15 April 12 April 26 (10-13) April 28 (9-14) May 3 May 10 Administrative

More information

Introducing Evaluation

Introducing Evaluation Projektas Informatikos ir programų sistemų studijų programų kokybės gerinimas ( VP1-2.2-ŠMM-07-K-02-039) Introducing Evaluation Lecture 13 Dr Kristina Lapin Outline The types of evaluation Evaluation case

More information

Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática. Human-Computer Interaction

Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática. Human-Computer Interaction Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática Human-Computer Interaction Beatriz Sousa Santos, 2016/2017 Outline Introduction Course Information Lectures and lab classes

More information

Intelligent Systems. Lecture 1 - Introduction

Intelligent Systems. Lecture 1 - Introduction Intelligent Systems Lecture 1 - Introduction In which we try to explain why we consider artificial intelligence to be a subject most worthy of study, and in which we try to decide what exactly it is Dr.

More information

in the New Zealand Curriculum

in the New Zealand Curriculum Technology in the New Zealand Curriculum We ve revised the Technology learning area to strengthen the positioning of digital technologies in the New Zealand Curriculum. The goal of this change is to ensure

More information

EXTENDED TABLE OF CONTENTS

EXTENDED TABLE OF CONTENTS EXTENDED TABLE OF CONTENTS Preface OUTLINE AND SUBJECT OF THIS BOOK DEFINING UC THE SIGNIFICANCE OF UC THE CHALLENGES OF UC THE FOCUS ON REAL TIME ENTERPRISES THE S.C.A.L.E. CLASSIFICATION USED IN THIS

More information

Impediments to designing and developing for accessibility, accommodation and high quality interaction

Impediments to designing and developing for accessibility, accommodation and high quality interaction Impediments to designing and developing for accessibility, accommodation and high quality interaction D. Akoumianakis and C. Stephanidis Institute of Computer Science Foundation for Research and Technology-Hellas

More information

CS 350 COMPUTER/HUMAN INTERACTION

CS 350 COMPUTER/HUMAN INTERACTION CS 350 COMPUTER/HUMAN INTERACTION Lecture 23 Includes selected slides from the companion website for Hartson & Pyla, The UX Book, 2012. MKP, All rights reserved. Used with permission. Notes Swapping project

More information

Development and Integration of Artificial Intelligence Technologies for Innovation Acceleration

Development and Integration of Artificial Intelligence Technologies for Innovation Acceleration Development and Integration of Artificial Intelligence Technologies for Innovation Acceleration Research Supervisor: Minoru Etoh (Professor, Open and Transdisciplinary Research Initiatives, Osaka University)

More information

Human Computer Interaction. What is it all about... Fons J. Verbeek LIACS, Imagery & Media

Human Computer Interaction. What is it all about... Fons J. Verbeek LIACS, Imagery & Media Human Computer Interaction What is it all about... Fons J. Verbeek LIACS, Imagery & Media September 4, 2017 LECTURE 1 INTRODUCTION TO HCI PRINCIPLES & KEY CONCEPTS 2 3 Content What is HCI Historical context

More information

Interaction Design -ID. Unit 6

Interaction Design -ID. Unit 6 Interaction Design -ID Unit 6 Learning outcomes Understand what ID is Understand and apply PACT analysis Understand the basic step of the user-centred design 2012-2013 Human-Computer Interaction 2 What

More information

THE MECA SAPIENS ARCHITECTURE

THE MECA SAPIENS ARCHITECTURE THE MECA SAPIENS ARCHITECTURE J E Tardy Systems Analyst Sysjet inc. jetardy@sysjet.com The Meca Sapiens Architecture describes how to transform autonomous agents into conscious synthetic entities. It follows

More information

THE FUTURE OF DATA AND INTELLIGENCE IN TRANSPORT

THE FUTURE OF DATA AND INTELLIGENCE IN TRANSPORT THE FUTURE OF DATA AND INTELLIGENCE IN TRANSPORT Humanity s ability to use data and intelligence has increased dramatically People have always used data and intelligence to aid their journeys. In ancient

More information

CSC 550: Introduction to Artificial Intelligence. Fall 2004

CSC 550: Introduction to Artificial Intelligence. Fall 2004 CSC 550: Introduction to Artificial Intelligence Fall 2004 See online syllabus at: http://www.creighton.edu/~davereed/csc550 Course goals: survey the field of Artificial Intelligence, including major areas

More information

Who are these people? Introduction to HCI

Who are these people? Introduction to HCI Who are these people? Introduction to HCI Doug Bowman Qing Li CS 3724 Fall 2005 (C) 2005 Doug Bowman, Virginia Tech CS 2 First things first... Why are you taking this class? (be honest) What do you expect

More information

TANGIBLE IDEATION: HOW DIGITAL FABRICATION ACTS AS A CATALYST IN THE EARLY STEPS OF PRODUCT DEVELOPMENT

TANGIBLE IDEATION: HOW DIGITAL FABRICATION ACTS AS A CATALYST IN THE EARLY STEPS OF PRODUCT DEVELOPMENT INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 5 & 6 SEPTEMBER 2013, DUBLIN INSTITUTE OF TECHNOLOGY, DUBLIN, IRELAND TANGIBLE IDEATION: HOW DIGITAL FABRICATION ACTS AS A CATALYST

More information

Agent Smith: An Application of Neural Networks to Directing Intelligent Agents in a Game Environment

Agent Smith: An Application of Neural Networks to Directing Intelligent Agents in a Game Environment Agent Smith: An Application of Neural Networks to Directing Intelligent Agents in a Game Environment Jonathan Wolf Tyler Haugen Dr. Antonette Logar South Dakota School of Mines and Technology Math and

More information

Natural Interaction with Social Robots

Natural Interaction with Social Robots Workshop: Natural Interaction with Social Robots Part of the Topig Group with the same name. http://homepages.stca.herts.ac.uk/~comqkd/tg-naturalinteractionwithsocialrobots.html organized by Kerstin Dautenhahn,

More information

CPE/CSC 580: Intelligent Agents

CPE/CSC 580: Intelligent Agents CPE/CSC 580: Intelligent Agents Franz J. Kurfess Computer Science Department California Polytechnic State University San Luis Obispo, CA, U.S.A. 1 Course Overview Introduction Intelligent Agent, Multi-Agent

More information

Indiana K-12 Computer Science Standards

Indiana K-12 Computer Science Standards Indiana K-12 Computer Science Standards What is Computer Science? Computer science is the study of computers and algorithmic processes, including their principles, their hardware and software designs,

More information

A DIALOGUE-BASED APPROACH TO MULTI-ROBOT TEAM CONTROL

A DIALOGUE-BASED APPROACH TO MULTI-ROBOT TEAM CONTROL A DIALOGUE-BASED APPROACH TO MULTI-ROBOT TEAM CONTROL Nathanael Chambers, James Allen, Lucian Galescu and Hyuckchul Jung Institute for Human and Machine Cognition 40 S. Alcaniz Street Pensacola, FL 32502

More information

Narrative Guidance. Tinsley A. Galyean. MIT Media Lab Cambridge, MA

Narrative Guidance. Tinsley A. Galyean. MIT Media Lab Cambridge, MA Narrative Guidance Tinsley A. Galyean MIT Media Lab Cambridge, MA. 02139 tag@media.mit.edu INTRODUCTION To date most interactive narratives have put the emphasis on the word "interactive." In other words,

More information

Empirical Research on Systems Thinking and Practice in the Engineering Enterprise

Empirical Research on Systems Thinking and Practice in the Engineering Enterprise Empirical Research on Systems Thinking and Practice in the Engineering Enterprise Donna H. Rhodes Caroline T. Lamb Deborah J. Nightingale Massachusetts Institute of Technology April 2008 Topics Research

More information

SM 3511 Interface Design. Introduction

SM 3511 Interface Design. Introduction SM 3511 Interface Design Introduction Classes, class deliverables, holidays, project groups, etc. refer to http://kowym.com/index.php/teaching/ Inter-face: a point where two systems, subjects, organizations,

More information

Selecting Photos for Sharing

Selecting Photos for Sharing MHCI Team Ben Elgart Saara Kamppari Bridget Lewis Ajay Prasad Yong Woo Rhee Lalatendu Satpathy Microsoft Live Labs Steven Drucker Selecting Photos for Sharing Client-Sponsored MHCI Capstone Project Ben

More information

! The architecture of the robot control system! Also maybe some aspects of its body/motors/sensors

! The architecture of the robot control system! Also maybe some aspects of its body/motors/sensors Towards the more concrete end of the Alife spectrum is robotics. Alife -- because it is the attempt to synthesise -- at some level -- 'lifelike behaviour. AI is often associated with a particular style

More information

ANALYSIS AND EVALUATION OF COGNITIVE BEHAVIOR IN SOFTWARE INTERFACES USING AN EXPERT SYSTEM

ANALYSIS AND EVALUATION OF COGNITIVE BEHAVIOR IN SOFTWARE INTERFACES USING AN EXPERT SYSTEM ANALYSIS AND EVALUATION OF COGNITIVE BEHAVIOR IN SOFTWARE INTERFACES USING AN EXPERT SYSTEM Saad Masood Butt & Wan Fatimah Wan Ahmad Computer and Information Sciences Department, Universiti Teknologi PETRONAS,

More information

UNIT VIII SYSTEM METHODOLOGY 2014

UNIT VIII SYSTEM METHODOLOGY 2014 SYSTEM METHODOLOGY: UNIT VIII SYSTEM METHODOLOGY 2014 The need for a Systems Methodology was perceived in the second half of the 20th Century, to show how and why systems engineering worked and was so

More information

Charting Past, Present, and Future Research in Ubiquitous Computing

Charting Past, Present, and Future Research in Ubiquitous Computing Charting Past, Present, and Future Research in Ubiquitous Computing Gregory D. Abowd and Elizabeth D. Mynatt Sajid Sadi MAS.961 Introduction Mark Wieser outlined the basic tenets of ubicomp in 1991 The

More information

Assignment 1 IN5480: interaction with AI s

Assignment 1 IN5480: interaction with AI s Assignment 1 IN5480: interaction with AI s Artificial Intelligence definitions 1. Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work

More information

With a New Helper Comes New Tasks

With a New Helper Comes New Tasks With a New Helper Comes New Tasks Mixed-Initiative Interaction for Robot-Assisted Shopping Anders Green 1 Helge Hüttenrauch 1 Cristian Bogdan 1 Kerstin Severinson Eklundh 1 1 School of Computer Science

More information

preface Motivation Figure 1. Reality-virtuality continuum (Milgram & Kishino, 1994) Mixed.Reality Augmented. Virtuality Real...

preface Motivation Figure 1. Reality-virtuality continuum (Milgram & Kishino, 1994) Mixed.Reality Augmented. Virtuality Real... v preface Motivation Augmented reality (AR) research aims to develop technologies that allow the real-time fusion of computer-generated digital content with the real world. Unlike virtual reality (VR)

More information

ServDes Service Design Proof of Concept

ServDes Service Design Proof of Concept ServDes.2018 - Service Design Proof of Concept Call for Papers Politecnico di Milano, Milano 18 th -20 th, June 2018 http://www.servdes.org/ We are pleased to announce that the call for papers for the

More information

Chapter 7 Information Redux

Chapter 7 Information Redux Chapter 7 Information Redux Information exists at the core of human activities such as observing, reasoning, and communicating. Information serves a foundational role in these areas, similar to the role

More information

Assessment of Smart Machines and Manufacturing Competence Centre (SMACC) Scientific Advisory Board Site Visit April 2018.

Assessment of Smart Machines and Manufacturing Competence Centre (SMACC) Scientific Advisory Board Site Visit April 2018. Assessment of Smart Machines and Manufacturing Competence Centre (SMACC) Scientific Advisory Board Site Visit 25-27 April 2018 Assessment Report 1. Scientific ambition, quality and impact Rating: 3.5 The

More information

Ethics in Artificial Intelligence

Ethics in Artificial Intelligence Ethics in Artificial Intelligence By Jugal Kalita, PhD Professor of Computer Science Daniels Fund Ethics Initiative Ethics Fellow Sponsored by: This material was developed by Jugal Kalita, MPA, and is

More information

Research about Technological Innovation with Deep Civil-Military Integration

Research about Technological Innovation with Deep Civil-Military Integration International Conference on Social Science and Technology Education (ICSSTE 2015) Research about Technological Innovation with Deep Civil-Military Integration Liang JIANG 1 1 Institute of Economics Management

More information

Context Sensitive Interactive Systems Design: A Framework for Representation of contexts

Context Sensitive Interactive Systems Design: A Framework for Representation of contexts Context Sensitive Interactive Systems Design: A Framework for Representation of contexts Keiichi Sato Illinois Institute of Technology 350 N. LaSalle Street Chicago, Illinois 60610 USA sato@id.iit.edu

More information

Interaction Design. Beyond Human - Computer Interaction. 3rd Edition

Interaction Design. Beyond Human - Computer Interaction. 3rd Edition Brochure More information from http://www.researchandmarkets.com/reports/2241999/ Interaction Design. Beyond Human - Computer Interaction. 3rd Edition Description: A revision of the #1 text in the Human

More information

Evaluating Socio-Technical Systems with Heuristics a Feasible Approach?

Evaluating Socio-Technical Systems with Heuristics a Feasible Approach? Evaluating Socio-Technical Systems with Heuristics a Feasible Approach? Abstract. In the digital world, human centered technologies are becoming more and more complex socio-technical systems (STS) than

More information

The Chatty Environment Providing Everyday Independence to the Visually Impaired

The Chatty Environment Providing Everyday Independence to the Visually Impaired The Chatty Environment Providing Everyday Independence to the Visually Impaired Vlad Coroamă and Felix Röthenbacher Distributed Systems Group Institute for Pervasive Computing Swiss Federal Institute of

More information

A User-Friendly Interface for Rules Composition in Intelligent Environments

A User-Friendly Interface for Rules Composition in Intelligent Environments A User-Friendly Interface for Rules Composition in Intelligent Environments Dario Bonino, Fulvio Corno, Luigi De Russis Abstract In the domain of rule-based automation and intelligence most efforts concentrate

More information

Design Science Research Methods. Prof. Dr. Roel Wieringa University of Twente, The Netherlands

Design Science Research Methods. Prof. Dr. Roel Wieringa University of Twente, The Netherlands Design Science Research Methods Prof. Dr. Roel Wieringa University of Twente, The Netherlands www.cs.utwente.nl/~roelw UFPE 26 sept 2016 R.J. Wieringa 1 Research methodology accross the disciplines Do

More information

Moving Path Planning Forward

Moving Path Planning Forward Moving Path Planning Forward Nathan R. Sturtevant Department of Computer Science University of Denver Denver, CO, USA sturtevant@cs.du.edu Abstract. Path planning technologies have rapidly improved over

More information

User Experience Questionnaire Handbook

User Experience Questionnaire Handbook User Experience Questionnaire Handbook All you need to know to apply the UEQ successfully in your projects Author: Dr. Martin Schrepp 21.09.2015 Introduction The knowledge required to apply the User Experience

More information

User interface for remote control robot

User interface for remote control robot User interface for remote control robot Gi-Oh Kim*, and Jae-Wook Jeon ** * Department of Electronic and Electric Engineering, SungKyunKwan University, Suwon, Korea (Tel : +8--0-737; E-mail: gurugio@ece.skku.ac.kr)

More information

Industry 4.0. Advanced and integrated SAFETY tools for tecnhical plants

Industry 4.0. Advanced and integrated SAFETY tools for tecnhical plants Industry 4.0 Advanced and integrated SAFETY tools for tecnhical plants Industry 4.0 Industry 4.0 is the digital transformation of manufacturing; leverages technologies, such as Big Data and Internet of

More information

CS 3724 Introduction to HCI

CS 3724 Introduction to HCI CS 3724 Introduction to HCI Jacob Somervell McBryde 104C jsomerve@vt.edu Who are these people? Jacob Somervell (instructor) PhD candidate in computer science interested in large screen displays as notification

More information

2. Publishable summary

2. Publishable summary 2. Publishable summary CogLaboration (Successful real World Human-Robot Collaboration: from the cognition of human-human collaboration to fluent human-robot collaboration) is a specific targeted research

More information

Introduction to Foresight

Introduction to Foresight Introduction to Foresight Prepared for the project INNOVATIVE FORESIGHT PLANNING FOR BUSINESS DEVELOPMENT INTERREG IVb North Sea Programme By NIBR - Norwegian Institute for Urban and Regional Research

More information

A New Trend of Knowledge Management: A Study of Mobile Knowledge Management

A New Trend of Knowledge Management: A Study of Mobile Knowledge Management Management Science and Engineering Vol. 8, No. 4, 2014, pp. 1-5 DOI: 10.3968/5786 ISSN 1913-0341 [Print] ISSN 1913-035X [Online] www.cscanada.net www.cscanada.org A New Trend of Knowledge Management: A

More information

Human Computer Interaction

Human Computer Interaction Unit 23: Human Computer Interaction Unit code: QCF Level 3: Credit value: 10 Guided learning hours: 60 Aim and purpose T/601/7326 BTEC National The aim of this unit is to ensure learners know the impact

More information

Augmented Home. Integrating a Virtual World Game in a Physical Environment. Serge Offermans and Jun Hu

Augmented Home. Integrating a Virtual World Game in a Physical Environment. Serge Offermans and Jun Hu Augmented Home Integrating a Virtual World Game in a Physical Environment Serge Offermans and Jun Hu Eindhoven University of Technology Department of Industrial Design The Netherlands {s.a.m.offermans,j.hu}@tue.nl

More information

Introductions. Characterizing Knowledge Management Tools

Introductions. Characterizing Knowledge Management Tools Characterizing Knowledge Management Tools Half-day Tutorial Developed by Kurt W. Conrad, Brian (Bo) Newman, and Dr. Art Murray Presented by Kurt W. Conrad conrad@sagebrushgroup.com Based on A ramework

More information

{ Open House } * by Mark Weiser

{ Open House } * by Mark Weiser { Open House } * by Mark Weiser Principal Scientist, Xerox PARC March 1996 A few years ago I found myself on a stage at the MIT Media Lab, arguing with Nicholas Negroponte in front of 700 people. Nick

More information

First day quiz Introduction to HCI

First day quiz Introduction to HCI First day quiz Introduction to HCI CS 3724 Doug A. Bowman You are on a team tasked with developing new order tracking and management software for amazon.com. Your goal is to deliver a high quality piece

More information

Human Computer Interaction

Human Computer Interaction Human Computer Interaction What is it all about... Fons J. Verbeek LIACS, Imagery & Media September 3 rd, 2018 LECTURE 1 INTRODUCTION TO HCI & IV PRINCIPLES & KEY CONCEPTS 2 HCI & IV 2018, Lecture 1 1

More information

Artificial Intelligence and Asymmetric Information Theory. Tshilidzi Marwala and Evan Hurwitz. University of Johannesburg.

Artificial Intelligence and Asymmetric Information Theory. Tshilidzi Marwala and Evan Hurwitz. University of Johannesburg. Artificial Intelligence and Asymmetric Information Theory Tshilidzi Marwala and Evan Hurwitz University of Johannesburg Abstract When human agents come together to make decisions it is often the case that

More information

AMIMaS: Model of architecture based on Multi-Agent Systems for the development of applications and services on AmI spaces

AMIMaS: Model of architecture based on Multi-Agent Systems for the development of applications and services on AmI spaces AMIMaS: Model of architecture based on Multi-Agent Systems for the development of applications and services on AmI spaces G. Ibáñez, J.P. Lázaro Health & Wellbeing Technologies ITACA Institute (TSB-ITACA),

More information