Joint Austrian Computer Vision and Robotics Workshop 2020
Page 23
To enable accurate tele-operation, the LIDARs were used in combination with appropriate software (see section 2.4) to generate a 2D and 3D map of the environment, giving the user insights into the environment from Robbie's point of view (POV), as shown in figure 3 (8) (7). To include the environment behind Robbie, which cannot be captured by the 3D LIDAR, an RGB-Depth (RGB-D) camera, the Intel D435 [18], has been mounted slightly facing downwards on the sensor tower of the mobile robot. An additional Intel D435 was mounted on the base plate of the EEF to facilitate tele-operation of the EEF and Image-Based Visual Servoing (IBVS) [4]. In addition, a Phidgets Spatial Inertial Measurement Unit (IMU) was mounted on the base platform to improve localization using sensor fusion such as Extended Kalman Filters and to provide input for tip-over control [5, 20, 23]. To enable an elevated POV, a universal camera mount with an attached uEye UI-3240LE [17] camera was mounted on the sensor rig. Finally, to enable radioactive and nuclear (RN) detection, the robot was equipped with a radiometer SSM1+ [29]. Processing this sensor data is a computationally complex task, so an industrial computer with the following specifications was also installed on Robbie's base platform:

  • 1 × Intel Core(TM) i7-7700T (4 cores, 8 threads) @ 2.90 GHz
  • 1 × GeForce GTX 1050 Ti
  • 2 × 16 GB DDR4 2133 MHz

The visualization of these different sensor readings is a difficult task. Therefore an intuitive GUI for Robbie was developed, which is discussed in the next section, followed by the implemented software.

2.3. User Interface

The user interface is the essential component for promoting situational awareness [24]. To underline this statement, the reader's attention is drawn to the fact that an S&R robot was rejected in the tragedy of 9/11 because of a too complex user interface [24]. Figure 3 shows the operator station of the UAS Technikum Vienna, with the associated user interface and control panel.
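The sensor fusion mentioned above can be illustrated with a minimal sketch. The paper does not show the actual filter implementation, so the following is a hypothetical one-dimensional Kalman filter fusing an IMU yaw rate (prediction) with an absolute heading measurement from, e.g., odometry (update); a real Extended Kalman Filter for localization would track the full pose and use nonlinear motion and measurement models.

```python
class SimpleHeadingKF:
    """Illustrative 1-D Kalman filter for the robot's heading.

    predict() propagates the heading with the IMU yaw rate;
    update() fuses an absolute heading measurement.
    Hypothetical sketch, not the filter used on Robbie.
    """

    def __init__(self, heading=0.0, var=1.0):
        self.x = heading  # heading estimate [rad]
        self.P = var      # estimate variance

    def predict(self, yaw_rate, dt, q=1e-3):
        # Integrate the gyro yaw rate; q models process noise.
        self.x += yaw_rate * dt
        self.P += q

    def update(self, measured_heading, r=1e-2):
        # Fuse an absolute heading measurement with variance r.
        k = self.P / (self.P + r)              # Kalman gain
        self.x += k * (measured_heading - self.x)
        self.P *= (1.0 - k)
```

With a large initial variance, the first update pulls the estimate almost entirely toward the measurement, after which the gyro-based prediction dominates between measurements; this is the basic behaviour an EKF-based localization exploits.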
As depicted on the left-hand side of figure 3, the user interface is split into three parts:

1. Log screen / command input (Figure 3 (1)): All log messages of the running software are displayed here; this is a necessity for detecting software system errors. In addition, these terminal windows can be used to start/restart any software module, which allows a maximum level of flexibility.

2. GUI (Figure 3 (2)): The GUI allows the operator to perceive the environment from Robbie's POV, which is a necessity for S&R robots [24]. This is achieved by live streams from the cameras (8). In the default configuration, the internal, forward- and backward-facing cameras of the tracker and the elevated RGB camera are streamed. Furthermore, the 2D map generated by the SLAM approach discussed in section 2.4 is visualized in the middle part of the GUI, as shown in (7), providing a bird's-eye view for the operator. Finally, sensor values (4), such as the internal temperature, battery voltage, estimated time to shutdown, and the detected emitted radiation in counts per second, are displayed. The developed GUI thus visualizes all suggestions for a good user interface evaluated in [24], which further enables the optimization and interoperability of the available resources and accelerates access to the victims [7]. In addition, it is also possible to visualize additional sensor values and information with a simple mouse click (5), such as a 3D map, the additional camera on the gripper, the backwards

Figure 3. Left: complete operator station. Right: GUI. (1) Log screen / command input, (2) GUI depicted in more detail on the right, (3) control panel for teleoperation, (4) sensor readings and emergency off switch, (5) topic visualisation checkbox, (6) additional teleoperation toolbox, (7) map visualisation toolbox, (8) image stream from Robbie's POV.
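The radiation reading shown in the GUI is reported in counts per second. The interface of the SSM1+ radiometer is not described in the text, so the following is a hypothetical sliding-window estimator showing how a stream of timestamped detector pulses could be turned into the counts-per-second value a GUI like this displays.

```python
from collections import deque


class CountsPerSecond:
    """Sliding-window counts-per-second estimator for detector pulses.

    Hypothetical helper for GUI display; the actual SSM1+
    driver and data format are not specified in the paper.
    """

    def __init__(self, window_s=1.0):
        self.window_s = window_s
        self.events = deque()  # timestamps of detected pulses [s]

    def add_pulse(self, t):
        """Register a pulse at time t and evict expired pulses."""
        self.events.append(t)
        while self.events and self.events[0] <= t - self.window_s:
            self.events.popleft()

    def cps(self):
        """Counts per second over the current window."""
        return len(self.events) / self.window_s
```

A longer window smooths the displayed value at the cost of responsiveness; for tele-operation, a window of about one second keeps the reading both readable and timely.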
Title: Joint Austrian Computer Vision and Robotics Workshop 2020
Editor: Graz University of Technology
Location: Graz
Date: 2020
Language: English
License: CC BY 4.0
ISBN: 978-3-85125-752-6
Size: 21.0 x 29.7 cm
Pages: 188
Categories: Informatik, Technik