DICOM is certainly the oldest and most widely used imaging standard in the medical sector. While visualizing volume data with real-time techniques makes sense for practitioners, others, namely medical illustrators, may need to adapt it to a more refined level of intelligibility. In this post we focus on rendering DICOM volumetric data with offline CG techniques. The main challenge is, of course, creating a mesh from the ray-marched volume samples.
Real-time render using Render Exposure
Many volume renderers exist, ranging from professional proprietary software suites to open-source projects built by communities. One of the most robust and complete among them is the OsiriX viewer developed by Pixmeo. The free version, OsiriX Lite, has a surface-reconstruction tool for volume images with predefined CT density levels for sorting material: SKIN, BONE and METAL. It is based on the open-source Kitware VTK library, and the tessellation itself neither optimises the mesh nor unifies normals, so polygons may end up with inverted normals or holes. Once you have imported the CT or MRI scan data (DICOM files) into OsiriX, you can choose 3D Surface Rendering in the 3D Viewer menu and export a mesh file in .obj or .stl format, among others.
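Under the hood, this kind of surface reconstruction extracts an isosurface at a chosen density threshold. The idea can be sketched in plain NumPy by binarizing a volume and counting the voxel faces on the solid/empty boundary; this is a much cruder cousin of VTK's marching cubes, and the volume and threshold below are made up for illustration:

```python
import numpy as np

def exposed_faces(volume, threshold):
    """Binarize a volume at `threshold` and count voxel faces sitting on
    the solid/empty boundary -- the faces an isosurface would cover."""
    solid = volume >= threshold
    padded = np.pad(solid, 1, constant_values=False)
    faces = 0
    for axis in range(3):
        a = np.swapaxes(padded, 0, axis)
        faces += np.count_nonzero(a[1:] & ~a[:-1])   # faces entering solid
        faces += np.count_nonzero(~a[1:] & a[:-1])   # faces leaving solid
    return faces

# Synthetic "scan": a dense 4x4x4 block inside an empty 8x8x8 volume.
vol = np.zeros((8, 8, 8))
vol[2:6, 2:6, 2:6] = 500.0  # pretend bone-level density
print(exposed_faces(vol, 400))  # 6 sides x 16 faces = 96
```

A real marching-cubes pass would instead interpolate the surface position between voxels, which is exactly where the noise discussed below comes from when neighbouring tissues have overlapping density values.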
The problem is that brain matter, for instance, ranges from about 35 to 55 in CT density, and interpolation between values leads to collisions and noise in the final mesh. This means a lot of hand work to clean up a scene and, to say the least, a lack of reliability in the data. As we are seeking high resolution and fidelity in our renders, this raises some concerns. Note also that available DICOM scans have an average resolution of 512 by 512 pixels, which works out to a little less than half a millimetre per pixel, while HRCT slice thickness ranges from 0.6 to 1.2 mm.
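The half-millimetre figure follows directly from the slice geometry. Assuming a typical head field of view of roughly 240 mm (an assumption for illustration; real scans store the exact value in the DICOM PixelSpacing (0028,0030) attribute):

```python
# In-plane resolution of a 512x512 CT slice.
# The 240 mm field of view is an assumed typical head FOV; the actual
# value is stored per scan in the DICOM PixelSpacing (0028,0030) tag.
fov_mm = 240.0
matrix = 512
pixel_mm = fov_mm / matrix
print(f"{pixel_mm:.3f} mm/pixel")  # 0.469 mm/pixel
```

So the in-plane resolution is finer than the slice spacing, which is why meshes reconstructed from such scans look stepped along the scan axis.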
In the following screenshots we show how we rendered a skull using OsiriX Lite and Blender. We will use the MANIX sample file, which can be downloaded from the OsiriX website.
OsiriX Lite GUI
With the folder imported, we choose 3D Surface Rendering from the 2D/3D menu, which pops up the reconstruction settings. Basically we want high-resolution sampling, as close as possible to the native resolution of the scan. We set the Decimate setting to 0.1, though we have not seen any significant change in the data between 0.1 and 0.6 for the BONE preset. Values from 0.6 to 1.0 corrupt the mesh.
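Decimation reduces the polygon count of the reconstructed mesh. VTK's own filters work by edge collapse and vertex removal; the general idea can be sketched with a simpler vertex-clustering scheme (not what OsiriX actually runs), where vertices falling in the same grid cell are merged and degenerate faces are dropped:

```python
import numpy as np

def cluster_decimate(vertices, faces, cell_size):
    """Merge vertices sharing a grid cell of `cell_size`, then drop faces
    that degenerate. A crude stand-in for real mesh decimation."""
    keys = np.floor(vertices / cell_size).astype(np.int64)
    # Map every original vertex to the first vertex seen in its cell.
    _, first, inverse = np.unique(keys, axis=0,
                                  return_index=True, return_inverse=True)
    inverse = inverse.reshape(-1)
    new_vertices = vertices[first]
    remapped = inverse[faces]
    # Keep only faces whose three corners remain distinct.
    keep = (remapped[:, 0] != remapped[:, 1]) & \
           (remapped[:, 1] != remapped[:, 2]) & \
           (remapped[:, 0] != remapped[:, 2])
    return new_vertices, remapped[keep]

# Two triangles: the second one has two corners that land in the same
# cell as the first triangle's corner, so it collapses away.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
                  [0.05, 0.0, 0.0], [0.1, 0.0, 0.0], [0.05, 1.0, 0.0]])
tris = np.array([[0, 1, 2], [3, 4, 5]])
v, f = cluster_decimate(verts, tris, cell_size=0.5)
print(len(v), len(f))  # 3 1
```

This also hints at why aggressive settings corrupt the mesh: once the cell (or collapse tolerance) grows past the feature size, whole triangles degenerate and holes appear.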
Decimate settings in OsiriX Lite 3D Surface Reconstruction
We deactivate smoothing and use the 500 CT Bone density filter. Note that, apart from bone and air, everything else in the body's biological matter ranges from about 25 to 75 on a roughly 1000-unit density scale. The skeleton is therefore easy to extract, while brain matter or blood vessels are challenging and require a lot of mesh manipulation to refine.
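The material presets boil down to windowing the density values. A minimal sketch of that sorting, with illustrative windows (the numbers below are assumptions, not OsiriX's actual presets; real tissue ranges vary by scanner and protocol):

```python
import numpy as np

# Illustrative density windows on a Hounsfield-style scale; the exact
# cutoffs are assumptions, not OsiriX's presets.
WINDOWS = {"AIR": (-1100, -400), "SOFT": (25, 75), "BONE": (500, 3000)}

def classify(volume):
    """Label each voxel with the first window it falls into, else ''."""
    labels = np.full(volume.shape, "", dtype=object)
    for name, (lo, hi) in WINDOWS.items():
        mask = (volume >= lo) & (volume <= hi) & (labels == "")
        labels[mask] = name
    return labels

vol = np.array([-1000.0, 40.0, 700.0, 200.0])
print(list(classify(vol)))  # ['AIR', 'SOFT', 'BONE', '']
```

The narrow SOFT window is the crux of the problem described above: brain, muscle and vessels all crowd into the same 25-to-75 band, so a single threshold cannot separate them cleanly.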
It is now possible to export the mesh in a common CG exchange format. Wavefront (.obj) export often produced corrupted files, so we preferred STL (.stl). Here is our mesh imported into MeshLab: about 400k vertices and 800k faces.
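Binary STL is simple enough to read and write by hand, which makes it handy for sanity-checking an export: an 80-byte header, a 32-bit little-endian facet count, then 50 bytes per facet (normal, three vertices, attribute byte count). A minimal pure-Python sketch:

```python
import struct

def write_binary_stl(path, triangles):
    """Write [(normal, (v1, v2, v3)), ...] as binary STL: 80-byte header,
    uint32 facet count, then 50 bytes per facet."""
    with open(path, "wb") as fh:
        fh.write(b"minimal STL writer".ljust(80, b"\0"))
        fh.write(struct.pack("<I", len(triangles)))
        for normal, verts in triangles:
            fh.write(struct.pack("<3f", *normal))
            for v in verts:
                fh.write(struct.pack("<3f", *v))
            fh.write(struct.pack("<H", 0))  # attribute byte count

def stl_triangle_count(path):
    """Read back the facet count stored after the 80-byte header."""
    with open(path, "rb") as fh:
        fh.seek(80)
        return struct.unpack("<I", fh.read(4))[0]

tri = ((0.0, 0.0, 1.0),
       ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
write_binary_stl("tri.stl", [tri])
print(stl_triangle_count("tri.stl"))  # 1
```

Comparing this stored count against what MeshLab reports is a quick way to confirm the exporter did not truncate the file.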
Our mesh imported in MeshLab
From there we set up a Cycles renderer scene in Blender to produce our final illustration render.
Scene setup in Blender
Our final render
The ray-marching and surface-reconstruction code dates from 2002. We would expect more efficient techniques today, especially with high-density point-cloud reconstruction technologies running on the GPU. We are looking for open-source libraries that improve on the 3D surface-reconstruction algorithm, especially for real-time visualization.