Commit
fix: test text and image alignment
ashleysyg committed Sep 19, 2023
1 parent 4d9ee7a commit 34fdefa
Showing 1 changed file with 7 additions and 6 deletions.
13 changes: 7 additions & 6 deletions blog/vr-volume-viewer.md
@@ -15,26 +15,27 @@ tags:

---

<h1 text-align="center">Building tools for VR scientific visualization – VR Volume viewer</h1>
<h1 style="text-align: center;">Building tools for VR scientific visualization – VR Volume viewer</h1>

### Mission of the 3D visualization team at CCV
CCV has worked in scientific visualization since its inception in 2003, collaborating with multiple departments and faculty members to display 3D datasets on a range of devices: desktop monitors, VR HMDs, mobile devices, browsers, and multi-display CAVE systems. Examples of our work include terrain visualization of the Galle crater on Mars for the planetary science department, high-resolution TIFF imagery and VR poetry for Literary Arts faculty, and 3D representations of human hearts for biomedical engineering classes.
Our objective is to research the best tools for implementing these types of applications, ranging from open-source libraries to free-to-use 3D visualization tools such as Blender, ParaView, and Unity 3D. We also follow software engineering standards to ensure sustainable maintenance and to produce efficient, effective products.
Historically, we have worked with the OpenGL rendering pipeline to place these datasets into 3D scenes. This library is heavily optimized for triangle mesh indexing and for rendering 3D content in interactive real-time simulations. Most of the time, the programmer only needs to implement tools that apply transformations to those triangles and facilitate the user's manipulation of objects.

<center>![image1](/content/images/blog/vr-volume-viewer/image1.png)</center>
<img align="center" src="/content/images/blog/vr-volume-viewer/image1.png"/>
![image1](/content/images/blog/vr-volume-viewer/image1.png)
<center>3D Bunny mesh represented as a set of triangles</center>
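The per-vertex transforms mentioned above can be sketched on the CPU with NumPy (an illustration only; in the OpenGL pipeline this matrix multiply happens in the vertex shader, and the function and matrix names here are made up):

```python
import numpy as np

def transform_triangles(vertices, model_matrix):
    """Apply a 4x4 model matrix to an (N, 3) array of triangle
    vertices, the kind of per-vertex transform a vertex shader runs."""
    n = vertices.shape[0]
    # Promote to homogeneous coordinates (x, y, z, 1) so translation works.
    homogeneous = np.hstack([vertices, np.ones((n, 1))])
    transformed = homogeneous @ model_matrix.T
    return transformed[:, :3]

# A model matrix that translates geometry one unit along +x.
translate_x = np.array([
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
triangle = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
moved = transform_triangles(triangle, translate_x)
```

A mesh like the bunny is just many thousands of such triangles run through the same matrix.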

### Volumetric datasets
However, some of our collaborators reached out to us looking to display 3D volumetric datasets, such as medical images in DICOM format, gas densities in a volumetric space, and ocean temperatures. Rendering this type of data in 3D space is a challenge because of the algorithms needed to map scalar values to voxels (a voxel is the 3D analogue of a 2D pixel).

<center>![image2](/content/images/blog/vr-volume-viewer/image2.png)</center>
![image2](/content/images/blog/vr-volume-viewer/image2.png)
<center>3D cube represented as a set of voxels</center>
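Concretely, a voxel grid is just a 3D array in which each cell holds a scalar, the way each pixel of an image holds a color. A small NumPy illustration (the grid size and the distance-from-center field are made up for the example):

```python
import numpy as np

# A scalar field sampled on a 16x16x16 voxel grid; here each voxel
# stores its distance from the grid center.
grid = np.fromfunction(
    lambda i, j, k: np.sqrt((i - 8)**2 + (j - 8)**2 + (k - 8)**2),
    (16, 16, 16),
)

# Looking up a voxel is plain array indexing.
center_value = grid[8, 8, 8]   # distance 0 at the center
corner_value = grid[0, 0, 0]   # the farthest voxel from the center
```

In practice this array would be filled from a DICOM series or a TIFF stack rather than computed.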

### Volumetric Ray Marching
The most widely used algorithm is the ray marching technique. The data is represented inside a unit cube placed at the center of the scene (position x=0, y=0, z=0). A ray is then cast from the camera position toward the cube, and we walk along the ray in small steps until it reaches the cube. At the ray-cube intersection point we query a value from the data and advance another step along the ray. If we find a different value, we can interpolate it with, or take the maximum of, the previously observed value and the new one. We continue in the same direction until we have traversed the whole cube. The accumulated value is converted into color space and passed to the fragment shader to paint a colored voxel on screen. As you can see, no triangles are needed to render this type of dataset.

<center>![image3](/content/images/blog/vr-volume-viewer/image3.png)</center>
![image3](/content/images/blog/vr-volume-viewer/image3.png)
<center>Graphic representation of ray marching algorithm</center>
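The stepping-and-accumulation loop above can be sketched in Python using the maximum-of-values rule (maximum intensity projection). This is a CPU illustration only, not the shader code our tool uses, and the function name, step size, and NumPy volume layout are assumptions:

```python
import numpy as np

def ray_march_mip(volume, origin, direction, step=0.01, max_steps=400):
    """Walk along a ray in small steps; whenever the current point lies
    inside the unit cube centered at the origin, sample the volume and
    keep the maximum value seen (maximum intensity projection)."""
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)  # unit step direction
    pos = np.asarray(origin, dtype=float)
    shape = np.array(volume.shape)
    accumulated = 0.0
    for _ in range(max_steps):
        pos = pos + step * direction
        if np.all(np.abs(pos) < 0.5):  # inside the unit cube?
            # Map the cube coordinate [-0.5, 0.5) to a voxel index.
            idx = tuple(((pos + 0.5) * shape).astype(int))
            accumulated = max(accumulated, volume[idx])
    return accumulated  # the shader would map this value to a color
```

A real implementation runs this loop per fragment on the GPU and exits early once the ray leaves the cube; this version only illustrates the marching and accumulation steps.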

### Challenges and solutions
@@ -53,11 +54,11 @@ After 2 months of research and implementation, we were able to come up with a vo
### The results
The first prototype was able to load and render TIFF stacks as grayscale volumes. Most of our initial tests gave us a decent frame rate of 60-70 fps. In VR mode we saw some jittering in the volume when rotating the camera, and some z-fighting depending on the point of view. The application also offered little user interactivity beyond camera movement (panning and zoom), and data had to be loaded through command-line arguments. The next iterations of the prototype added new features: a GUI in desktop mode and a world-space UI in VR mode, user interaction via the VR controllers (translating and rotating the model), camera animation for multiple views, and video exporting.

<center>![image4a](/content/images/blog/vr-volume-viewer/image4a.png) ![image4b](/content/images/blog/vr-volume-viewer/img4b.png)</center>
![image4a](/content/images/blog/vr-volume-viewer/image4a.png) ![image4b](/content/images/blog/vr-volume-viewer/img4b.png)

This is a demo of the temperature and salinity of Narragansett Bay, Rhode Island, represented as volume data. Our tool can render mesh and volume data at the same time for a full 3D representation of the terrain plus the surrounding ocean.

<center>![image5](/content/images/blog/vr-volume-viewer/image5.gif)</center>
![image5](/content/images/blog/vr-volume-viewer/image5.gif)

### Conclusion
It is important to research the kinds of tools that can be used to easily visualize 3D data; however, it is vital to understand the rendering techniques and algorithms those tools rely on. Implementing ray marching ourselves helped our team quickly develop a tool that fits our own requirements and that we can maintain following software engineering standards.
