The goal of this work is to facilitate the
development of Ambient Intelligence (AmI)
applications capable of moving from one
environment to another, while their user
interface keeps adapting itself,
autonomously, to the variable environment
conditions and the available interaction
resources.
AmI applications are expected to interact
with users naturally and transparently;
therefore, most of their interaction relies on
embedded devices that obtain information
from the user and the environment. This work
implements a framework for AmI systems
that elevates those embedded devices to
the class of interaction resources. It does so
by providing a new level of abstraction that
decouples applications, conceptually and
physically, from the different specific
interaction resources available and their
underlying heterogeneous technologies.
In order to drive the adaptation process to
environment changes, the system obtains a
set of interaction device requirements from
models that describe the system UI, the
user and the environment conditions. These
requirements are then used to query a
device repository, where AI techniques are
used to select the most suitable devices
among those available.
Abstract
Gervasio Varela
Integrated Group for Engineering Research, Universidade da Coruña, Spain
Autonomous Adaptation of User Interfaces to
Support Mobility in Ambient Intelligence Systems
Contact information: Gervasio Varela – gervasio.varela@udc.es – http://www.gii.udc.es
Context-Adaptive Distributed User Interfaces
The application of model-driven engineering
techniques combined with interaction resource
selection algorithms seems a very promising
approach to alleviate the problems of
developing user interfaces that require the
integration and utilization of many different
technologies and devices.
The current implementation of Dandelion
includes: a device abstraction technology that
decouples applications from the hardware
technology of their interaction devices; a
component migration system that physically
moves HI3 components from one platform to
another; and a distributed UI system that allows
applications to operate distributed and
decoupled from their interaction resources.
1. Dadlani, P., Peregrin Emparanza, J., & Markopoulos, P.
Distributed User Interfaces in Ambient Intelligent
Environments: A Tale of Three Studies. Proc. 1st DUI, 2011,
101-104.
2. Collignon, B., Vanderdonckt, J., & Calvary, G. Model-Driven
Engineering of Multi-target Plastic User Interfaces. Proc.
4th ICAS, 2008, 7-14.
3. Blumendorf, M., Lehmann, G., & Albayrak, S. Bridging
models and systems at runtime to build adaptive user
interfaces. Proc. 2nd EICS, 2010, 9-18.
4. Balme, L., Demeure, A., Barralon, N., Coutaz, J., & Calvary,
G. Cameleon-RT: A software architecture reference model
for distributed, migratable, and plastic user interfaces. Proc.
EUSAI 2004, 291-302.
Conclusions
References
User Interaction in
Ambient Intelligence
Dandelion
Devices
The operation of an Ambient Intelligence (AmI) system is
quite different from a classic software system. An AmI
system is expected to behave proactively and transparently,
interacting with the users and their environment in the most
natural way available. Therefore, the success of an AmI
system depends to a great extent on its capacity to use the
devices available in each environment to achieve
that kind of behaviour.
AmI systems rely on sensing/actuation devices and
appliances to interact with the user and the environment.
As a result, they are exposed to a complex world
populated by a wide range of different technologies and
devices. Furthermore, as users move, the environment
conditions, the available devices, and the users themselves
change. Predicting this variability at design time is difficult,
and because of that, the majority of AmI systems are
designed for a specific combination of environment, user,
and devices.
Dandelion provides AmI developers with an abstraction
layer that isolates them from the location and specific
characteristics of the user interaction devices used to
implement the user interface.
A distributed UI management system that connects that
abstraction layer to the end devices, combined with an
autonomous and context-sensitive device selection
system, allows the development of AmI systems more
easily adaptable to environment changes or new
scenarios.
Dandelion uses a model-driven approach to build user interfaces: developers specify a
series of high-level declarative models which, at runtime, are transformed into a set of
distributed interaction devices. Applications are decoupled from the specific
characteristics and technologies of these devices by a distributed agent
communication protocol called the General Interaction Protocol (GIP), which is
implemented by the devices and abstracts them as interaction resources.
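As a hedged illustration of this decoupling (every identifier below is invented for the sketch, not Dandelion's actual API), the abstract UI can be thought of as plain data that is bound to concrete interaction resources only at runtime:

```python
# Illustrative sketch only: the abstract UI as data, loosely inspired by
# UsiXML AUI elements. Names and structure are assumptions for this example.

abstract_ui = [
    {"id": "room_temperature", "facet": "output", "type": "number"},
    {"id": "light_control",    "facet": "input",  "type": "boolean"},
]

# At runtime a selection step binds each abstract element to whatever
# interaction resource is available in the current environment.
bindings = {
    "room_temperature": "wall_display_kitchen",  # could also be speech output
    "light_control": "wall_switch_kitchen",      # or a smartphone widget
}

def realize(elements, bindings):
    """Pair every abstract element with its selected concrete resource."""
    return [(element["id"], bindings[element["id"]]) for element in elements]
```

The application only ever refers to the abstract element ids; which physical device stands behind each id can change every time the user enters a new environment.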
Dandelion facilitates the use of
sensing/actuation devices and appliances as
the interaction resources of an AmI system.
Because of this, it is especially well suited to AmI
applications that require multi-modal interaction
using everyday objects.
Dandelion uses a model-driven approach in
which a series of models and device selection
algorithms are used to build, at runtime, as the
user moves from one place to another, a UI
appropriate for the user's characteristics
and preferences, the environment conditions,
and the devices available in each location.
Developers are only required to:
• Describe the UI at an abstract level using the
abstract UI model of UsiXML.
• Associate application data/action objects
with AUI elements.
An application controller connects data/action
objects to the remote interaction resources.
When the application modifies a data object,
the controller redirects the change to the UI,
and vice versa.
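A minimal sketch of this controller pattern (the class and method names are assumptions, not Dandelion's real interfaces) could look like:

```python
# Hypothetical sketch of the application controller: an observable data
# object whose changes are forwarded to the remote UI, and which the UI
# can update in return. All names are invented for illustration.

class DataObject:
    def __init__(self, value=None):
        self._value = value
        self._observers = []

    def observe(self, callback):
        self._observers.append(callback)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        self._value = new_value
        for callback in self._observers:
            callback(new_value)          # notify the UI side

class Controller:
    """Keeps a data object and its remote interaction resource in sync."""
    def __init__(self, data_object, send_to_ui):
        self.data_object = data_object
        data_object.observe(send_to_ui)  # application change -> UI

    def on_ui_input(self, new_value):
        # UI change -> application; write the backing field directly
        # so the update is not echoed back to the UI.
        self.data_object._value = new_value

temperature = DataObject(20)
controller = Controller(temperature, send_to_ui=lambda v: print("UI <-", v))
temperature.value = 22      # forwarded to the remote UI
controller.on_ui_input(19)  # user acted on a device; app data updated
```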
The final implementation of the UI is provided
by a set of mappings between the AUI
elements and the Final Interaction Objects
(FIOs).
FIOs are abstractions of devices and
appliances capable of input/output to the user,
like switches, presence sensors, alarm
systems, or even higher level interaction
resources, like gesture or voice recognition
software. They implement the concrete logic
required to interact with a device, and provide a
generic view of the device as an interaction
resource.
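A rough sketch of the FIO idea, assuming a hypothetical wall-switch driver (none of these identifiers come from Dandelion itself):

```python
# Hedged sketch: a FIO wraps one concrete device behind a generic
# input/output interface. The driver API below is invented for illustration.

from abc import ABC, abstractmethod

class FinalInteractionObject(ABC):
    """Generic view of a device as an interaction resource."""

    @abstractmethod
    def render_output(self, data):
        """Present data to the user through the device."""

    @abstractmethod
    def set_input_handler(self, handler):
        """Register a callback for input events from the device."""

class SwitchFIO(FinalInteractionObject):
    """Exposes a hypothetical wall switch as a boolean interaction resource."""

    def __init__(self, switch_driver):
        self.driver = switch_driver          # concrete, technology-specific API
        self.handler = None
        switch_driver.on_toggle(self._toggled)

    def render_output(self, data):
        self.driver.set_led(bool(data))      # e.g. a feedback LED on the switch

    def set_input_handler(self, handler):
        self.handler = handler

    def _toggled(self, state):
        if self.handler:
            self.handler(state)              # forward as a generic input event
```

The concrete driver logic stays inside the FIO; everything above it only sees the generic input/output interface.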
FIOs are connected to the system by using the
General Interaction Protocol (GIP) interface,
which decouples the system from the
underlying interaction technologies.
The GIP is an event-based multi-agent
communication protocol that provides a
common interface for interaction resources.
The set of events defined is inspired by the I/O
actions supported by the AUI model of UsiXML.
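In the spirit of that description, a minimal event-bus sketch might look as follows; the event names are invented here, merely echoing the input/output flavour of AUI actions:

```python
# Minimal sketch of an event-based protocol in the spirit of the GIP.
# All names are assumptions; the real GIP event set is not reproduced here.

DATA_OUTPUT  = "data_output"   # application -> resource: present a value
DATA_INPUT   = "data_input"    # resource -> application: user supplied a value
ACTION_INPUT = "action_input"  # resource -> application: user triggered an action

class GipBus:
    """Routes events between the application and the FIOs that currently
    implement its abstract UI elements."""

    def __init__(self):
        self._subscribers = {}   # (element_id, event_type) -> callbacks

    def subscribe(self, element_id, event_type, callback):
        key = (element_id, event_type)
        self._subscribers.setdefault(key, []).append(callback)

    def publish(self, element_id, event_type, payload=None):
        for callback in self._subscribers.get((element_id, event_type), []):
            callback(payload)

# Both sides speak only this interface: the application never sees the
# concrete device, and the FIO never sees application code.
bus = GipBus()
bus.subscribe("alarm_indicator", DATA_OUTPUT, lambda v: print("FIO renders:", v))
bus.publish("alarm_indicator", DATA_OUTPUT, True)
```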
Adaptation to context changes is achieved
by a FIO selection engine that relies on a
FIO similarity measure to select, among the
available FIOs, those most similar to a
set of requirements generated from the AUI,
the user, and the environment models. Different
techniques can be used to generate the
requirements; the current approach uses
fuzzy rules.
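As an illustration of similarity-based selection (the attributes and matching rule below are assumptions; the poster only states that model-derived requirements are matched against the available FIOs):

```python
# Illustrative sketch of similarity-based FIO selection. The attribute
# names and the simple exact-match measure are invented for this example.

def similarity(requirements, fio):
    """Fraction of required attributes the FIO satisfies."""
    matches = sum(1 for key, wanted in requirements.items()
                  if fio.get(key) == wanted)
    return matches / len(requirements)

def select_fio(requirements, available_fios):
    """Pick the available FIO most similar to the requirements."""
    return max(available_fios, key=lambda fio: similarity(requirements, fio))

# Requirements derived (hypothetically) from the AUI, user and environment
# models: a visual output resource located where the user currently is.
requirements = {"facet": "output", "modality": "visual", "location": "kitchen"}
fios = [
    {"id": "speaker_hall", "facet": "output",
     "modality": "audio", "location": "hall"},
    {"id": "display_kitchen", "facet": "output",
     "modality": "visual", "location": "kitchen"},
]
best = select_fio(requirements, fios)
```

A fuzzy-rule front end, as in the current approach, would sit before this step, turning model facts into the requirement set that is then matched.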
VARIETY
HETEROGENEITY
COMPLEXITY