Vratislav Cmiel, Larisa Chmelikova, Inna Zumberg, Martin Kralik.
Abstract
With the development of light microscopy, it has become increasingly easy to obtain detailed multicolor fluorescence volumetric data, and their appropriate visualization has become an integral part of fluorescence imaging. Virtual reality (VR) technology provides a new way of visualizing multidimensional image data or models so that the entire 3D structure can be observed intuitively, together with different object features or details on or within the object. As more advanced volumetric data are imaged, the demands on controlling virtual object properties increase, especially for multicolor objects obtained by fluorescence microscopy. Existing solutions based on universal VR controllers, or software controllers that require a sufficiently large space for the user to manipulate data in VR, are unusable in many practical applications. We therefore developed a custom gesture-based VR control system with a custom controller connected to the FluoRender visualization environment; a multitouch sensor disk was used for this purpose. Our control system may be a good choice for easier and more comfortable manipulation of virtual objects and their properties, especially with confocal microscopy, which is so far the most widely used technique for acquiring volumetric fluorescence data.
Keywords: confocal microscopy; fluorescence microscopy; immersive visualization; microscopy images; touch control; touch sensor; virtual reality; volumetric data
Year: 2021 PMID: 34960422 PMCID: PMC8703643 DOI: 10.3390/s21248329
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Operation of the complete VR control system with the developed controller. After the z-stack TIFF images with different color channels are loaded into FluoRender (installed on a VR computer workstation), the z-stacks are combined and visualized as one complete multicolor virtual object. The instructions sent from the controller are received by FluoRender and translated into actions applied to the virtual object.
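The channel-merging step described in the caption can be illustrated with a minimal sketch. This is not FluoRender's internal implementation; `combine_channels` is a hypothetical helper, and NumPy stands in for the real volume-rendering pipeline.

```python
import numpy as np

def combine_channels(stacks):
    """Merge per-channel z-stacks, each shaped (z, y, x), into one
    multicolor volume shaped (z, y, x, channels), mirroring how the
    separate color channels become a single virtual object."""
    if len({s.shape for s in stacks}) != 1:
        raise ValueError("all channel stacks must have the same shape")
    return np.stack(stacks, axis=-1)

# Example: a red and a green z-stack of 10 slices, 256x256 pixels each
red = np.zeros((10, 256, 256), dtype=np.uint16)
green = np.ones((10, 256, 256), dtype=np.uint16)
volume = combine_channels([red, green])
print(volume.shape)  # (10, 256, 256, 2)
```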
Figure 2. FluoRender with a visualized zebrafish head. Most of the image parameter controls are located in a large menu around the object.
Figure 3. Example of the control panel attachment: (A) a front view of the touch disk for the fingers of the dominant hand and (B) the rear view with the fingers placed in the appropriate positions, fingertips directly on the buttons.
Figure 4. Example of the designed single-line menu with four function categories.
Figure 5. Example of the controls in the object-manipulation mode. (A) Touching and moving with one finger rotates the object in space (by movement along the x–y axes), and (B) touching and moving with two fingers placed close together translates the object in space (by movement along the x–y axes). (C) When the thumb and index finger touch at a distance greater than the threshold value, moving them toward or away from each other makes the object smaller or larger.
Figure 6. A scheme showing the sequence of the individual operations from Figure 5. After the number of touches and the distance between the fingers are distinguished, either the x and y values are traced (A,B) or the distance between the fingers is measured (C).
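The decision sequence from Figures 5 and 6 can be sketched as a small classifier. The threshold value, coordinate units, and function name are assumptions for illustration; the paper states only that a distance threshold separates the two-finger translate and pinch gestures.

```python
from math import hypot

# Assumed value: the paper mentions a distance threshold but does not
# publish it. Coordinates are taken to be touch-panel units.
PINCH_THRESHOLD = 30.0

def classify_gesture(touches):
    """Map touch-disk contacts to an action, following Figures 5-6:
    one finger -> rotate; two fingers close together -> translate;
    two fingers farther apart than the threshold -> scale (pinch)."""
    if len(touches) == 1:
        return "rotate"  # x-y movement of the finger rotates the object
    if len(touches) == 2:
        (x1, y1), (x2, y2) = touches
        distance = hypot(x2 - x1, y2 - y1)
        return "translate" if distance < PINCH_THRESHOLD else "scale"
    return "none"

print(classify_gesture([(0, 0), (40, 0)]))  # scale
```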
Figure 7. Block diagram of the controller (left): the basic control elements are connected directly to the MCU, while the touch panel is connected via a USB hub; the MCU provides power to the peripherals. The VR controller construction (right): the controller consists of a rear cover assembly with finger buttons, a cover for the electronics board and rechargeable batteries, and a front-mounted touch disk attached to the rear part.
Figure 8. Schematics of the two parts of the control system: (A) the physical controller with its operating software and (B) the FluoRender installation, including two modules for function operation and menu visualization.
Figure 9. Demonstration of the physical 3D printout with the buttons and the front touch panel.
Figure 10. Demonstration of the electronic control board of the controller with a battery holder and cable connectors for connection to the button module.
Figure 11. Example of the tiled menu displayed in stereo mode (for VR). The second mode, for object cropping and color channel control, is active.
Figure 12. Example of four different states of the model reached by steps 1–10 of the prepared scenario. (A) Step 1 of the scenario. (B) Object manipulation, steps 2–4. (C) Object cropping and switching of the red channel, steps 5–8. (D) The result after steps 9–10. For better illustration, a black background was used, and the stereo mode for VR was switched off when the screenshots were prepared. The images were prepared in FluoRender.
Figure 13. Mouse kidney stained with three different dyes (elements of the glomeruli and convoluted tubules labeled by Alexa Fluor® 488; the filamentous actin prevalent in the glomeruli and the brush border labeled by the red fluorescent Alexa Fluor® 568 phalloidin; nuclei labeled by DAPI). Images (A–D) demonstrate processing similar to that in Figure 12; however, for (C), no object cropping was applied.
Operations performed with each control system and their ratings for the two user studies in the C1 and C2 columns.
| No. | Operation | ConfocalVR | C1 | C2 | ExMicroVR | C1 | C2 | Gesture Based | C1 | C2 |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Loading | The load image button was pressed using the trigger button to open the file explorer, which helped to select the partial data. The individual channels were loaded one by one. | 4 | 4 | As in ConfocalVR, the individual channels were loaded one by one using the file explorer, now with the help of the virtual laser pointer. | 5 | 5 | The data were quickly loaded in FluoRender before entering VR. | 0 | 0 |
| 2 | Resizing | The object was grabbed with both hands, which were then moved toward and away from each other (i.e., changing the distance between the controllers). | 2 | 2 | The object was grabbed with both hands, which were then moved toward and away from each other (i.e., changing the distance between the controllers). | 3 | 3 | By placing two fingers (thumb and index finger) at a sufficient distance from each other and then pulling them slightly together, the object was reduced in size. | 5 | 5 |
| 3 | Moving | The object was first grabbed by one hand. The controller was moved laterally (in the x–y axes). | 3 | 3 | The object was first grabbed by one hand. The controller was moved laterally (in the x–y axes). | 5 | 5 | The object was moved by touching the disk with two fingers close together and moving them in the x–y axes. | 5 | 5 |
| 4 | Rotation | The rotation of the object was performed by rotating the wrist while grabbing the object with one hand. | 2 | 2 | The rotation of the object was performed by rotating the wrist while grabbing the object with one hand. | 2 | 2 | Rotation of the object was managed by moving one finger in the x–y axes. | 4 | 4 |
| 5 | Switching off (R) | The unwanted channel was turned off with the left controller, using one of the three virtual buttons (R). | 4 | 4 | The red channel was disabled by unchecking it in the visible column of the checkboxes, pointing the laser pointer at the box and pressing the trigger button. | 3 | 3 | The red channel was disabled using a double tap with one finger on the touch disk 1. | 5 | 5 |
| 6 | Clipping (G) | The channels could not be cropped separately. | 0 | - | The channels could not be cropped separately. | 0 | - | Button 2 pressed 3. The green channel was clipped by touching the disk with the index finger and moving to the right (increasing the x value). | 4 | - |
| 7 | Clipping (B) | | 0 | - | | 0 | - | Button 3 pressed. An identical action was taken for the blue channel. | 4 | - |
| 8 | Clipping (G + B) | | 0 | - | | 0 | - | Button 3 pressed again 2. One tap with three fingers was used to activate the | 4 | - |
| 9 | Visibility (B) | A slider in the color filters section was used to increase the contrast; the slider ball was grabbed and moved to the right. | 3 | 3 | The laser pointer and the trigger button were used to uncheck the appropriate menu checkbox in the focus column (for the red channel). | 4 | 4 | Button 3 pressed 3. Luminance reduction in the blue channel was achieved by placing two fingers on the touch disk and moving them slightly to the left. | 5 | 5 |
| 10 | Contrast (G) | Contrast could not be enhanced in an individual channel. | 0 | 0 | The brightness and opacity properties were changed by sequentially grabbing and sliding these sliders with the laser pointer. | 4 | 4 | Button 2 pressed. The increase in contrast was achieved by placing the thumb and index finger at a specific distance from each other and then pulling them slightly apart. | 5 | 5 |
1 Two or three fingers would be used to hide or make visible the second (G) or third (B) channel. 2 The actions are active again for the whole object, i.e., for all visible channels. 3 Pressing a button means that the subsequently applied functions are focused on a specific channel; for example, if button 3 is pressed, the following actions are applied to the third channel (B). By default, the functions are applied to the whole object (to all displayed channels together). To return to the default mode, the same button that was used to focus on the specific channel must be pressed again.
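The channel-focus behavior described in the table notes (pressing button N focuses later operations on channel N; pressing the same button again returns to the whole object) can be sketched as a small state holder. The class name and the behavior when a different button is pressed while focused are assumptions, since the notes do not cover that case explicitly.

```python
class ChannelFocus:
    """Sketch of the button logic from the table notes: button N
    focuses subsequent operations on channel N; pressing the same
    button again returns to the default (all visible channels)."""

    def __init__(self):
        self.focused = None  # None means "apply to all channels"

    def press(self, button):
        # Same button toggles back to the default; a different
        # button is assumed to switch the focus to that channel.
        self.focused = None if self.focused == button else button

    def target(self):
        return "all channels" if self.focused is None else f"channel {self.focused}"

focus = ChannelFocus()
focus.press(3)
print(focus.target())  # channel 3
focus.press(3)
print(focus.target())  # all channels
```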
Figure 14. Total rating of the control systems for the application cases C1 and C2. The steps that could not be completed were rated 0.
Figure 15. Total rating of the control systems for the application cases C1 and C2. The steps that could not be completed were not rated (the rating was skipped for these steps).
Total score as the sum of the scores for the cases C1 and C2. The first row gives the total scores when steps that could not be executed were rated 0; in the second row, the rating of these steps was skipped, so the totals are lower.
| Rating scheme | ConfocalVR | ExMicroVR | Gesture Based |
|---|---|---|---|
| Unexecuted steps rated 0 | 36 | 52 | 70 |
| Unexecuted steps skipped | 32 | 44 | 58 |