Multimodal human-machine interface devices in the cloud

J. Multimodal User Interfaces(2017)

Abstract
In an increasingly connected and multidisciplinary world, we propose a new paradigm of web application development in which input of any modality uses the same interface to connect to applications. This essentially frees web application programmers and end users from the need to physically handle data input devices when building a multimodal system: the same application can be used with a whole range of physically different peripherals that are similar from the logical point of view of data entry. This paper discusses the implementation of a pilot project, currently running in a local network environment, where all the devices in the LAN are identified and described in an interface server. Users in the local network may, upon request, make use of these devices. Communication between these peripherals and the web applications is carried out by a network of modules running on websocket technology. The whole process of communication and connection establishment is automatic and guided by the configurations held in the interface server. The entire platform follows an SOA strategy and is fully scalable and configurable. Its use is not limited to games; it has much wider possibilities, such as interactivity in teaching, accessibility for people with special needs, and adaptation of web applications for novice users.
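The abstract does not specify the message format exchanged between peripheral modules and the interface server, but the idea of logically equivalent devices sharing one interface can be sketched as a common message shape sent over the websocket channel. The function names and JSON fields below are assumptions for illustration, not the paper's actual protocol:

```python
import json


def make_registration(device_id: str, modality: str, capabilities: list) -> str:
    """Hypothetical JSON message a peripheral module might send to the
    interface server to register a device and describe its capabilities."""
    return json.dumps({
        "type": "register",
        "device": device_id,
        "modality": modality,        # e.g. "gamepad", "voice", "keyboard"
        "capabilities": capabilities,
    })


def make_input_event(device_id: str, action: str, value) -> str:
    """Hypothetical normalized input event: physically different devices
    (gamepad button, spoken command, key press) produce the same logical
    message shape, so the web application stays device-agnostic."""
    return json.dumps({
        "type": "input",
        "device": device_id,
        "action": action,
        "value": value,
    })
```

Under this sketch, a web application would only ever parse the normalized `"input"` messages, regardless of which physical peripheral produced them; the interface server would use the `"register"` messages to populate its device descriptions.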
Key words
Multimodal, Interface, SOA, Websocket, Virtual interface, Accessibility, MVCI