The Database Society of Japan

dbjapan Mailing List Archive (2009)

[dbjapan] CFP: PUC Special Issue on Multimodal Systems, Services and Interfaces for Ubiquitous Computing


Apologies if you receive multiple copies
+-------------------------------------------------------------------------------------+
 
 Special Issue on Multimodal Systems, Services and Interfaces for Ubiquitous Computing                          
 
              ACM/Springer Journal of Personal and Ubiquitous Computing
 
       Guest Editors: Zhiwen Yu, Frode Eika Sandnes, Kenji Mase, Fabio Pianesi
 
+-------------------------------------------------------------------------------------+
 
Activity recognition and implicit interaction are central themes in ubiquitous computing.
These environments usually encompass a variety of modalities (e.g., speech, gesture,
and handwriting), collecting rich and complex information by integrating multiple devices
and different kinds of sensors. In such settings, multimodal recognition achieves robust
and reliable results by leveraging different recognition mechanisms and making the most
of each individual channel's characteristics. Multimodal interaction, in turn, enables
natural and implicit interaction in ubiquitous computing contexts by providing varied
and flexible interfaces that adapt to the environment. Multimodal approaches are thus
key to accomplishing these tasks, as they overcome the recognition and interaction
limitations of any single modality. Hence multimodal systems, services and interfaces
are crucial ingredients of ubiquitous computing, and have attracted much interest in
both industry and academia over the last decade.
 
This special issue aims to further scientific research in the field of
multimodal interaction, services and systems for ubiquitous computing.
It will accept original research papers that report the latest results and advances
in this area. It will also invite review articles on the state of the art in
multimodal concepts and systems, highlighting trends and challenges. All papers
will be peer reviewed and selected on the basis of their quality and relevance
to the topic of this special issue.
 
 
----------
 Topics
----------
 
Topics include (but are not limited to):
 
- Multimodal sensing in smart environments
- Multimodal fusion techniques
- Multimodal activity recognition
- Multimodal mobility understanding
- Multimodal user modeling
- Multimodal content access and adaptation
- Intelligent user interfaces
- Multimodal support for social interaction
- Virtual and augmented multimodal interfaces
- Distributed and collaborative multimodal interfaces
- Architectures and tools for multimodal application development
- Applications such as smart homes, healthcare, and meeting spaces
- Evaluation of multimodal systems and interfaces
 
 
-------------------
 Important Dates
-------------------
 
Full manuscript due:                         May 31, 2009
Notification of the first review process:    Aug. 15, 2009
Final acceptance notification:               Oct. 20, 2009
Final manuscript due:                        Oct. 31, 2009
Publication date:                            Spring 2010 (Tentative)
              
 
--------------------
 Paper Submission
--------------------
 
Submissions should be prepared according to the author instructions available on
the journal homepage, http://www.springer.com/computer/user+interfaces/journal/779.
Manuscripts must be submitted as a PDF file to the corresponding editor,
Zhiwen Yu (zhiweny [at] gmail.com). Information about the manuscript (title,
full list of authors, corresponding author's contact details, abstract, and keywords)
must be included in the submission email.
 

----------------
 Guest Editors
----------------
 
Zhiwen Yu, Northwestern Polytechnical University, P. R. China, Email: zhiwenyu [at] nwpu.edu.cn
Frode Eika Sandnes, Oslo University College, Norway, Email: Frode-Eika.Sandnes [at] iu.hio.no
Kenji Mase, Nagoya University, Japan, Email: mase [at] nagoya-u.jp
Fabio Pianesi, FBK-irst, Italy, Email: pianesi [at] fbk.eu
 
------------------------------------------The end-----------------------------------------