TouchTalkInteractive
This repository contains the source code of TouchTalkInteractive, a system designed to support collaborative sensemaking on a wall-sized display via touch gestures and speech commands.
For more details, please check out the paper.
To run the system on your machine, you will need npm
installed (tested with version 9.5.0).
First, download the source code by cloning this repository:
git clone git@gitlab.inria.fr:aviz/TouchTalkInteractive.git
Below, we explain how to run the server and client applications, which are distributed across the repository's folders.
Server
The server:
- sends synchronization messages to the display nodes of the wall-sized display
- receives and forwards touch input messages from the touch overlay
- processes search requests
- forwards speech recognition messages from the voice apps to the main app (a rough sketch of this relay logic follows the list)
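At its core, the server is therefore a message relay between the different applications. The snippet below is only a rough sketch of that idea under assumptions of ours, not the repository's actual implementation: the ws dependency, the port, the role names, and the routing rules are all invented for illustration.

```ts
// Illustrative relay sketch only: clients identify themselves with a "role"
// ("display", "touch", "voice", "main") and the server forwards messages from
// input sources to the display nodes or to the main app. All names, the port,
// and the routing rules are assumptions, not the repository's real protocol.
import { WebSocketServer, WebSocket } from 'ws';

type Role = 'display' | 'touch' | 'voice' | 'main';
const clients = new Map<WebSocket, Role>();

const wss = new WebSocketServer({ port: 8080 }); // port is an assumption

wss.on('connection', (socket, request) => {
  // Expect the role in the query string, e.g. ws://host:8080/?role=display
  const role = new URL(request.url ?? '/', 'http://localhost')
    .searchParams.get('role') as Role | null;
  clients.set(socket, role ?? 'display');

  socket.on('message', (data) => {
    const sender = clients.get(socket);
    for (const [other, otherRole] of clients) {
      if (other === socket || other.readyState !== WebSocket.OPEN) continue;
      // Touch and synchronization messages go to every display node;
      // speech recognition messages from the voice apps go to the main app.
      if (sender === 'voice' ? otherRole === 'main' : otherRole === 'display') {
        other.send(data);
      }
    }
  });

  socket.on('close', () => clients.delete(socket));
});
```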
How to compile
In the server folder, run
npm install
if this is the first time you are running the server. Then compile with:
npm run build
How to run
In the server folder, start the synchronization server with
npm start
Client
The client:
- displays the main application on the 'leader' computer and on all the display nodes of the wall-sized display
How to compile
In the client folder, run
npm install
Then, run
npm run compile
If everything goes well, a message will appear saying that webpack compiled successfully.
How to run
First, start the HTTP server in a separate command-line window:
npm run http
Then go to the script folder and run the following script:
./locallink.sh
The Leader URL will appear in the script's output; use it to get a full overview of what is displayed on the wall.
The client URL shows an example of what is displayed on a single node of the clustered wall-sized display.
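To make the leader/node distinction concrete, here is a hypothetical illustration (not taken from the client code) of how a node view could restrict itself to one tile of the wall while the leader view shows everything; the query parameters, tile resolution, and wall dimensions are all assumptions.

```ts
// Hypothetical illustration of the leader/node distinction (not the actual
// client code): a leader view shows the whole scene, while a node view uses
// query parameters to show only its own tile of the wall.
const params = new URLSearchParams(window.location.search);
const isLeader = params.get('leader') === 'true'; // assumed parameter
const col = Number(params.get('col') ?? 0);       // assumed parameter
const row = Number(params.get('row') ?? 0);       // assumed parameter
const tileWidth = 1920;                           // assumed node resolution
const tileHeight = 1080;

const viewport = isLeader
  ? { x: 0, y: 0, width: tileWidth * 5, height: tileHeight * 3 } // assumed 5x3 wall
  : { x: col * tileWidth, y: row * tileHeight, width: tileWidth, height: tileHeight };

console.log('Rendering viewport', viewport);
```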
Voice
The Svelte application that recognizes speech commands is in the /voice folder.
The source code for the Svelte application is based on this Spiegel article, combined with the Voice notes app by Nikhil Karkra.
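For orientation, the snippet below shows the typical Web Speech API pattern that such articles use, together with a hypothetical forwarding of recognized phrases to the synchronization server; the WebSocket address, the role query parameter, and the message shape are assumptions, not the app's actual protocol.

```ts
// Typical Web Speech API pattern (as in the cited articles); the WebSocket
// address and message format are assumptions, not the app's actual protocol.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.continuous = true;      // keep listening between utterances
recognition.interimResults = false; // only report finalized phrases
recognition.lang = 'en-US';

const socket = new WebSocket('ws://localhost:8080/?role=voice'); // assumed address

recognition.onresult = (event: any) => {
  // Forward the last recognized phrase; the server relays it to the main app.
  const phrase = event.results[event.results.length - 1][0].transcript.trim();
  socket.send(JSON.stringify({ type: 'speech', phrase })); // assumed message shape
};

recognition.onend = () => recognition.start(); // restart if recognition stops

socket.addEventListener('open', () => recognition.start());
```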
How to run
cd voice
npm install
npm run dev