Mixed Reality Systems
Lab V – Interaction and Collaboration
Christoph Anthes
Overview
• Introduction
• Medieval Town with an application base
• Interaction
• Recombination of interaction techniques
• Development of your own input device
• Development of your own cursor transformation model
• Network Communication
• Creating own events
• Handling events
• Concurrent Object Manipulation
• Using mergers
Mixed Reality Systems
2
Lab
Introduction
• If you take a look at your code and run the example you will find yourself in the Medieval
Town application
• The code base has become significantly smaller since the application makes use of the
OpenSGApplicationBase
• Skybox, height and collision maps, as well as Avatars are additionally registered, the rest
of the code is generic
• This could be a good example to start your own application
• Let’s inspect the application base version of the Medieval Town
Introduction
• The class is derived from OpenSGApplicationBase
• Required variables are initialised, constructor and destructor are defined
Introduction
• The initialize method
• Establishes the connection to the scenegraph interface
• Height maps are loaded or generated if not found
• Skybox is set up
Introduction
• A root node is set
• It is filled with an anonymous group core
• The data from the world database and the skybox are attached to the node
• The starting transformation is taken from the world database and set as the initial
transformation for the player
Introduction
• Camera and Avatar are requested from the local user object of the user database
• Local pointers to these objects are established if the system reports that they are
available
• The display of the Avatar is set to false
Introduction
• In the display method the skybox is updated according to the current camera
transformation
• An updateAvatar() method is implemented in order to update Avatar transformations
based on tracking data
• An empty cleanup method is provided since it is necessary to implement one
Introduction
• Callbacks register the required modifiers, the avatar, and support for VRPN and trackD
input devices
Introduction
• Main creates an application object
• It simply triggers the start function afterwards
• This results in a subsequent initialisation process
• Once the application returns it is deleted
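The lifecycle above can be sketched as follows. `MedievalTownApp` and `runApplication` are hypothetical stand-ins, not the real OpenSGApplicationBase API; the sketch only shows the create/start/delete pattern the slides describe.

```cpp
#include <cassert>

// Stand-in for the application class; the real one derives from
// OpenSGApplicationBase and start() enters the render loop until the user quits.
class MedievalTownApp {
public:
    bool started;
    MedievalTownApp() : started(false) {}
    bool start() {
        started = true;   // triggers the subsequent initialisation process
        // ... render/event loop would run here ...
        return true;      // returns once the application is closed
    }
};

int runApplication() {
    MedievalTownApp* app = new MedievalTownApp();  // main creates an application object
    bool ok = app->start();                        // start() blocks until the application returns
    delete app;                                    // once it returns, the object is deleted
    return ok ? 0 : 1;
}
```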
Interaction
• Development of your own interaction techniques
• Often highly desirable
• Can be very specific depending on input and output devices
• Can be closely related to the scene which is represented in the application
• Several approaches possible
• Development from scratch
• Redesign of the inVRs interaction module
• Use of existing interaction module and recombination and extension of available techniques
• Implementation of new techniques by creation of transition and action models
Interaction
• Let us remember the application from our first inVRs lab, the Medieval Town
• In a first step we want to modify the interaction of the Medieval Town
• Let's have a look at the interaction state machine of inVRs again
Interaction
• Recombination of interaction techniques
• Our first approach is to use different predefined parts of interaction techniques
• We first switch from the HOMER interaction technique to a virtual hand interaction technique
• For this we have to alter two configuration files
• Let's start with the interaction.xml stored under config/modules/interaction
• These models have to be exchanged:
• manipulationActionModel
• selectionChangeModel
• unselectionChangeModel
• These models can stay the same:
• selectionActionModel – objects should still be highlighted when selected
• manipulationConfirmationModel – confirmation is still triggered by a button
• manipulationTerminationModel – termination is as well triggered by a button
Interaction
• Recombination of interaction techniques
• The setup of an interaction technique is stored inside the interaction.xml file in
\config\modules\interaction
• We have to replace the manipulationActionModel inside the stateActionModels tag with the
following snippet
Snippet 1-1
• And we have to replace the selectionChangeModel and the unselectionChangeModel inside the
stateTransitionModel tag with the following snippet
Snippet 1-2
• To use our newly configured models we need sensor input as well, thus we have to change
our abstract input device of the input interface
Interaction
• By inserting the following snippet in the controller.xml in the directory
\config\inputinterface\controllermanager we replace the MouseKeybController with
the MouseKeybSensorController
Snippet 1-3
• Now we can use the emulation of a sensor as previously introduced in the Going
Immersive tutorial
• We additionally have to insert a cursor transformation model which can work with
sensors
Snippet 1-4
• Since sensor emulation is a pretty bad method for input we have to come up with an
alternative
Interaction
• In this step we actually implement a new input device as described inside the Going
Immersive part of the inVRs tutorial
• This device will make use of ARToolKit as a tracking library
• It shall provide the functionality of sensor transformation and absence or visibility of
markers
• The marker orientation as provided by ARToolKit is set as the sensor orientation
• The sensor translation is implemented by an additional emulator function in a cursor transformation
model
• If a marker is visible the button of the device is set to true; if it is not detected it is set to false
• Details
• We inherit from the InputDevice class
• We implement the required functions by using parts of the code provided in the Augmented Reality lab
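The marker-to-sensor mapping described above can be sketched as follows. All names here (`MarkerData`, `ARToolkitInputDeviceSketch`, `handleMarker`) are simplified stand-ins for illustration, not the actual inVRs InputDevice or ARToolKit types:

```cpp
#include <cassert>

// Simplified marker record; the real MarkerData helper stores ARToolKit data.
struct MarkerData {
    int id;
    bool visible;            // was the marker detected in the current frame?
    float orientation[4];    // orientation reported by the tracking library (quaternion)
};

class ARToolkitInputDeviceSketch {
public:
    bool buttonPressed = false;
    float sensorOrientation[4] = {0, 0, 0, 1};

    // Called once per frame after detection: a visible marker presses the
    // button and copies the marker orientation to the sensor; an undetected
    // marker releases the button.
    void handleMarker(const MarkerData& marker) {
        buttonPressed = marker.visible;
        if (marker.visible)
            for (int i = 0; i < 4; ++i)
                sensorOrientation[i] = marker.orientation[i];
    }
};
```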
Interaction
• What methods are we going to use?
• virtual void update(); - called from inVRs to update the device state
• void initARToolkit(std::string cameraCalibrationFile, std::string videoConfigurationFile); - initialisation
• void loadMarker(std::string markersConfigurationFile); - loading markers
• void addMarker(const XmlElement* marker); - adding markers
• void startCapturing(); - wrapper to start capturing of the video
• bool captureFrame(); - capture a single frame from the video stream
• void detectAndHandleMarker(); - detect markers and update sensor information from the most recently captured image
• void cleanupARToolkit(); - clean up the device
Interaction
• What variables and helper classes are we going to use?
• Variables
• static XmlConfigurationLoader xmlConfigLoader; - the XML configuration loader
• bool initialized - a boolean to check whether initialisation has completed
• ARUint8 *imageData - the captured image
• float threshold - a threshold for binarisation
• std::vector<MarkerData*> markers - a list of markers
• Helpers
• MarkerData – Used to store data about a single marker
• ARToolkitInputDeviceFactory – used to help with the creation of ARToolKitDevice objects
Interaction
• Let's start with the constructor
• It takes configuration arguments, triggers ARToolKit initialisation, and loads the marker definitions
• Afterwards it starts the capturing process
Interaction
• The destructor calls the ARToolKit cleanup and empties the marker list
• In the update method the frame capture is triggered and a subsequent method for marker
detection and handling is called
Interaction
• The startCapturing method validates that everything is initialised and triggers the
ARToolKit call for starting the video capture stream
• In the actual captureFrame method the video image is stored inside an image data
structure which will be used for future processing and detection
• The method returns whether the capture was successful
Interaction
• This is the most important function for device processing and updating
• The first step is to detect markers and build a detected marker structure
• ARToolKit marker processing is performed
Interaction
• The transformations of the marker on the device are set
Interaction
• The cleanup method, which is called in the destructor, empties our currently captured
image data structure
• It stops video capturing
• And closes the video stream
• Finally we create a factory which allows us to generate our device
• The factory takes a set of configuration parameters as well as the name of the device
• These configuration parameters are passed to the constructor of the device, which is then returned as
a generic input device
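The factory pattern described above can be sketched like this. The class and method names are illustrative stand-ins (the real factory is ARToolkitInputDeviceFactory registered with the inVRs input interface); only the name check and constructor forwarding are shown:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Stand-in for the generic input device base class.
struct InputDeviceStub { virtual ~InputDeviceStub() {} };

// Stand-in for the concrete device; configuration arguments go to the constructor.
struct ARToolkitDeviceStub : InputDeviceStub {
    std::vector<std::string> args;
    explicit ARToolkitDeviceStub(const std::vector<std::string>& a) : args(a) {}
};

struct ARToolkitInputDeviceFactorySketch {
    // Takes the device name plus configuration parameters; returns the device
    // as a generic input device pointer, or null for unknown device names so
    // that other registered factories can be tried.
    InputDeviceStub* create(const std::string& name,
                            const std::vector<std::string>& args) {
        if (name != "ARToolkitInputDevice") return 0;
        return new ARToolkitDeviceStub(args);
    }
};
```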
Interaction
• This is the implementation of the device factory used for the creation of
ARToolkitInputDevice objects
Interaction
• Cursor Transformation Models
• Cursor transformation models belong to the user object in the user database
• They are used to determine the user's cursor, which can be relevant for object selection
• inVRs provides three standard cursor models
• VirtualHandCursorModel
• HomerCursorModel
• GoGoCursorModel
• We also have to implement a cursor transformation model
• We only want to take the orientation of our marker into account for object manipulation
Interaction
• This is the header file of our cursor transformation model
Interaction
• The model is derived from a generic cursor transformation model
• The constructor is kept empty and passes the configuration data up to the super class
• The getName() function returns the name of the model
Interaction
• The cursor transformation is calculated in the generateCursorTransformation method
• Position, orientation and scale of the received sensor data can be taken into account
• To emulate a normal virtual hand model all of these values would have to be set to 1
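An orientation-only model, as described above, can be sketched as follows. `TransformationData` and the function signature are simplified stand-ins for the inVRs types; the point is that only the sensor orientation is transferred to the cursor, while position and scale are left untouched:

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };

// Simplified stand-in for the inVRs TransformationData structure.
struct TransformationData {
    Vec3 position;
    Quat orientation;
    Vec3 scale;
};

// Sketch of generateCursorTransformation: take only the marker/sensor
// orientation into account; keep the cursor's current position and scale.
TransformationData generateCursorTransformation(const TransformationData& sensor,
                                                TransformationData cursor) {
    cursor.orientation = sensor.orientation;
    return cursor;
}
```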
Interaction
• As with many inVRs components a factory is needed for the generation of cursor
transformation models
• An argument vector can be passed to the factory, which returns an initialised model
Interaction
• Now we have to configure our ARToolKit device
• These parameters are going to be used by our device
• A calibration file is passed, describing the correction of the intrinsic parameters
• A camera and a marker configuration file are passed
• A threshold is passed for binarisation
• The coordinate systems are mapped on inVRs coordinate systems
Interaction
• And finally we have to switch the controller again in the controller.xml file, set the paths
and update the cursor
Snippet 2-1
Snippet 2-4
• Besides the includes we only have to alter two lines in our application
Snippet 2-5
• In the UserDatabase we have to register our cursor transformation model
Snippet 2-2
• In the InputInterface we have to register our device
Snippet 2-3
• If we compile and execute our application now we will be able to manipulate the
scene with our markers
Network Communication
• In the recent Medieval Town tutorial we implemented our own animation, which was
executed only locally
• This could be resolved by transmitting information over the network
• Development of network communication can be achieved in several ways
• Definition of own messages
• Definition of own events and modifiers
• Rewrite of the inVRs network module
• The most common approach is sending messages or writing own events
• We will now have a look at writing own events
Network Communication
• Writing own events
• For communication we often have to develop our own events
• Let’s take a look at the implementation of such an event
Network Communication
• Writing own events
• First we implement the constructor and the destructor
• One option is to use an empty constructor which automatically sets an empty payload
• The second option uses the constructor of the superclass and takes a string as message payload
Network Communication
• Writing own events
• Then we have to implement three functions
• Two are used for serialisation, deserialisation and network communication
• A third is used for execution at the event's final location
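The shape of such an event can be sketched as follows. `TextEventSketch` and the use of a plain string as the wire buffer are illustrative assumptions; the real inVRs event serialises into a network message object rather than a string:

```cpp
#include <cassert>
#include <string>

class TextEventSketch {
public:
    std::string payload;
    bool executed = false;

    TextEventSketch() {}                                    // empty constructor: empty payload
    explicit TextEventSketch(const std::string& msg)        // superclass-style constructor:
        : payload(msg) {}                                   // takes a string as message payload

    // Serialisation: write the payload into the outgoing buffer.
    void encode(std::string& buffer) const { buffer = payload; }

    // Deserialisation: restore the payload from the incoming buffer.
    void decode(const std::string& buffer) { payload = buffer; }

    // Execution at the event's final location (e.g. print the payload).
    void execute() { executed = true; }
};
```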
Network Communication
• To use network communication we have to connect with the network module
• The event type has to be registered as a callback at the initialisation of the event
manager
Snippet 3-1
• An event pipe has to be defined
• And it is initialised with empty values
Snippet 3-2
Snippet 3-3
Snippet 3-4
Network Communication
• The event has to be polled in constant intervals
• The pipe to the text module which has been defined has to be requested
• If the pipe is not empty the current events have to be removed from the front
• Once fetched their execute method is called
• Afterwards the event is deleted
Snippet 3-5
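The polling steps above can be sketched as follows, with a `std::deque` standing in for the inVRs event pipe (the real pipe is requested from the event manager):

```cpp
#include <cassert>
#include <deque>

// Stand-in event; execute() flips a flag so the effect is observable.
struct EventStub {
    bool* executedFlag;
    void execute() { *executedFlag = true; }
};

// Drain the pipe: remove events from the front, execute them, then delete them.
int pollEvents(std::deque<EventStub*>& pipe) {
    int handled = 0;
    while (!pipe.empty()) {
        EventStub* ev = pipe.front();  // fetch the oldest event
        pipe.pop_front();              // remove it from the front of the pipe
        ev->execute();                 // call its execute method
        delete ev;                     // afterwards the event is deleted
        ++handled;
    }
    return handled;
}
```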
Network Communication
• This event polling has to be triggered at a given location
• Usually, and in our case, it is done once per frame in the display method
Snippet 3-6
• The issuing of the event has to be triggered, thus a GLUT callback has to be defined
which is automatically registered
Snippet 3-7
• If you compile and execute your code now you should be able to send events by pressing
the key 'e'
• Try to interconnect to a remote user by passing a server IP + port
• Console output at the interconnected remote participants will be provided once you press 'e'
Concurrent Object Manipulation
• Only a few MR applications provide the possibility to manipulate objects concurrently by
multiple users
• inVRs is the only MR framework which supports this type of interaction as an out-of-the-box
feature
• Concurrent object manipulation is implemented by the use of so-called mergers
• These mergers are implemented as modifiers in the transformation manager
• Once a pipe is opened on an object (e.g. an entity) which already has an open pipe, and a
merger is defined for such behaviour, the merger is executed and processes the data from
both pipes
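Conceptually, a merger combines the transformations arriving from both pipes into a single result for the shared object. The sketch below averages the positions of two users; this averaging strategy is just one illustrative possibility, not the exact behaviour of the inVRs merger implementation:

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

// Combine the position contributions of two pipes writing to the same entity.
// Averaging gives both users equal influence over the shared object.
Vec3 mergePositions(const Vec3& a, const Vec3& b) {
    return Vec3{ (a.x + b.x) * 0.5f,
                 (a.y + b.y) * 0.5f,
                 (a.z + b.z) * 0.5f };
}
```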
Concurrent Object Manipulation
• Mergers
• As with any other modifier, mergers have to have a source id as well as a destination id
• Each merger contains these as well as the identifiers for pipeType, objectClass, objectID, and
fromNetwork
• These key attributes have to be used for an input and an output pipe
• It is equipped with an id which will be used later on to establish it in the appropriate pipe sections
Snippet 4-1
Concurrent Object Manipulation
• Mergers
• To finally install the merger we have to activate it in our desired pipes
• In our case the merger should affect concurrent interaction
• Thus the following snippet has to be inserted twice: in the local interaction pipe as well as in the
remote interaction pipe (fromNetwork = "0" and fromNetwork = "1")
Snippet 4-2
• When we execute our application now it should be possible for two networked users
to collaboratively manipulate the same entity
Things to do at home
• Thing to do now!!! – Add another marker and map the orientation of the marker onto the sails
of the windmill
• Extend the OpenSGApplicationBase to an ARToolKitOpenSGApplicationBase and
combine the rendering from video stream with your input device
• Develop your own interaction technique
• Try to implement your own events in an application, for example to change the colour of
objects on pressing a mouse button
• Improve your merging strategies and have a look at the implementation of the merger
Useful Links
• ARToolKit
• http://artoolkit.sourceforge.net/ - Sourceforge Entry page
• http://www.hitl.washington.edu/artoolkit/ - GPL version
• http://www.artoolworks.com/ - commercial version
• ARToolKit Plus Web Page
• http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php
• OpenSG Web Page
• http://www.opensg.org/
• inVRs Web Page
• http://www.invrs.org/
• http://doxygen.invrs.org/ - inVRs Doxygen Page
Thank You !