3D Brains in Blender

In 2024, we published a paper that needed to show the properties of the outer layer of the brain, the cortex. Scientific visualization tools were already available, but they are usually a bit clunky and do not offer the flexibility of general-purpose 3D software.

I was already familiar with the 3D modeling software Blender, but no information existed on how to automatically create 3D models from our scientific results.

This post presents a tutorial to generate beautiful visual representations of study data. A benefit is that all images are created with code and thus easily replicable and reproducible.

What we can do

Once we understand our data, we can break free from the standard visualization tooling and create images like the ones shown below. The first image shows the brain cortex labeled according to the Desikan-Killiany-Tourville structural atlas. It uses red-green-blue (RGB) triplets for each position (vertex) on the average cortex of many individuals. The second image shows the cortical thickness of an individual person, provided as a single scalar value per vertex. This tutorial will walk you through generating these images using the Python console in Blender.

Left brain halves are colored according to anatomical labels (left) or cortical thickness (right).

Setting up Blender

First, install Blender—any version will do. When you open Blender, any screen can be switched to a different view using the button in the top left corner. In this tutorial, we will only use three windows: the 3D viewport (under general or Shift + F5), the Python console (under scripting or Shift + F4), and the properties tab (under data or Shift + F7).

We will hardly have to touch anything in the 3D viewport because we will create our scene programmatically. The only thing you should do is delete the default cube: click it and hit delete.

Once we have Python running, we need to install one extra module, NiBabel, which lets us load 3D medical imaging data. Since Blender ships with its own bundled Python, installing NiBabel from the regular command line may put it in the wrong place. You can therefore use the following commands in the Python console in Blender to install NiBabel from within Blender:

Blender-Python-Console

```python
import sys
import subprocess

subprocess.check_call([sys.executable, "-m", "pip", "install", "nibabel"])
```

Setting up the data

Before we start, a bit about the data we will use. For this demo, you can download the FreeSurfer average subject (fsaverage) shape and surface data from the FreeSurfer Cortical Parcellations page (direct download, 299 MB).

Blender-Python-Console

```python
from pathlib import Path

fsaverage = Path(r'C:\path\to\fsaverage')
```

In the surf folder you will find five types of 3D models (inflated, pial, white, orig, and sphere) as well as six types of scalar morphometry data (thickness, curv, jacobian_white, sulc, area, and volume). In the label folder you will find several types of label data, such as aparc.annot and aparc.a2009s.annot. Let's get started visualizing some of this data.

Blender-Python-Console

```python
hemispheres = ['rh.', 'lh.']
model_names = ['inflated', 'pial', 'white', 'orig', 'sphere']
morph_data = ['thickness', 'curv', 'jacobian_white', 'sulc', 'area', 'volume']
label_data = ['aparc.a2009s.annot', 'aparc.annot']
```

Let’s get started with visualizing an inflated right hemisphere with anatomical labels:

Blender-Python-Console

```python
hm = hemispheres[0]
model = model_names[0]
label = label_data[0]
name = label
```

Creating the mesh

You can’t just drag and drop the FreeSurfer data into Blender, since Blender does not know its data format. Therefore, we will use Python commands that programmatically generate the 3D scenes shown above when run in the Blender Python console. Essentially, they perform operations you could otherwise do by hand. You can either run them line by line to see what they do, or save them as a script and run them all at once. First, we will create our 3D model; then we will apply the coloring. Let us start with creating the shape.

Blender-Python-Console

```python
import nibabel.freesurfer.io as fsio
import bpy

# Load the mesh data
coordinates, faces = fsio.read_geometry(fsaverage / 'surf' / f'{hm}{model}')

# Create an object to apply our shape data to
bpy.ops.object.add(type='MESH', enter_editmode=False, location=(0, 0, 0))
o = bpy.context.object
o.name = f"my_first_{name}_model"
mesh = o.data

# Cast the numpy arrays of faces and coordinates to lists of tuples
c = list(map(tuple, coordinates))
f = list(map(tuple, faces))

# Set the coordinate and face data on the object
o.data.from_pydata(c, [], f)

# Rotate the object
# o.rotation_euler[-1] = 3.14  # optional: rotate
o.scale = (0.05, 0.05, 0.05)

# Update the mesh data
o.data.update()
```

Before we continue, some information on the loaded data will help in creating a coloring script. The model consists of 163842 points, also known as vertices. vertex_ids (loaded in the next step) maps each vertex to one of 76 regions; the names of these regions can be read from id_names and their colors from id_colors. Lastly, coordinates specifies the location of each vertex in space, and faces is a 327680-by-3 array of references to the vertices that each polygon is made of.
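To make the vertex-to-color lookup concrete, here is a standalone sketch with toy arrays in place of the FreeSurfer data (the region names, ids, and colors below are made up for illustration; read_annot returns arrays of this shape):

```python
import numpy as np

# Toy stand-ins for the arrays that read_annot returns:
# each vertex is assigned a region id, and each region has a row
# of RGBA values plus a label value.
vertex_ids = np.array([0, 0, 1, 2, 1])      # region id per vertex
id_colors = np.array([[255, 0, 0, 0, 1],    # region 0: red
                      [0, 255, 0, 0, 2],    # region 1: green
                      [0, 0, 255, 0, 3]])   # region 2: blue
id_names = [b'region_a', b'region_b', b'region_c']

# Color of vertex 2: look up its region, then take the RGB part
region = vertex_ids[2]
rgb = id_colors[region, :3] / 255
print(id_names[region], rgb)  # region_b, green
```

This is exactly the indexing the coloring loop below performs per face, except that there one vertex of each face decides the color of the whole polygon.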

Coloring with labels

Blender-Python-Console

```python
# Load color data
vertex_ids, id_colors, id_names = fsio.read_annot(fsaverage / 'label' / f'{hm}{label}')

# Create color space
mesh.vertex_colors.new()
color_layer = mesh.vertex_colors["Col"]
bpy.ops.object.shade_smooth()

for face, poly in zip(faces, mesh.polygons):
    vertex1, vertex2, vertex3 = face
    id = vertex_ids[vertex1]
    color = (*id_colors[id, :3] / 255, 1)
    for idx in poly.loop_indices:
        color_layer.data[idx].color = color

# Switch to vertex paint mode to see the result
# bpy.ops.object.mode_set(mode='VERTEX_PAINT')
```

Render Material

Now, create a material.

Blender-Python-Console

```python
bpy.ops.object.material_slot_add()
new_mat = bpy.data.materials.new(model)
new_mat.use_nodes = True
bpy.context.object.active_material = new_mat
principled_node = new_mat.node_tree.nodes.get('Principled BSDF')
principled_node.inputs[0].default_value = (1, 0, 0, 1)
color_node = new_mat.node_tree.nodes.new('ShaderNodeVertexColor')
rgb_node = new_mat.node_tree.nodes.new('ShaderNodeRGBCurve')
```

And link the color to the material object.

Blender-Python-Console

```python
new_mat.node_tree.links.new(color_node.outputs["Color"], rgb_node.inputs["Color"])
new_mat.node_tree.links.new(rgb_node.outputs["Color"], principled_node.inputs["Base Color"])
```

And switch to rendered mode:

Blender-Python-Console

```python
bpy.context.scene.render.engine = 'CYCLES'
for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        for space in area.spaces:
            if space.type == 'VIEW_3D':
                space.shading.type = 'RENDERED'
bpy.context.scene.render.film_transparent = True
```

This should get you the following:

label render example

Coloring with scalars

Rendering scalars can be tricky if you want something fancier than grayscale (which you do), since the conversion from scalars to colors may require a bit of tuning on your part. In this example, I will show how I applied the scalars and implemented a color scaling method. Let us use the pial surface and cortical thickness data for the left hemisphere, and repeat the mesh-creation process above.

Blender-Python-Console

```python
hm = hemispheres[1]
model = model_names[1]
morph = morph_data[0]
name = morph
```

Then

Blender-Python-Console

```python
# Load color data
scalars = fsio.read_morph_data(fsaverage / 'surf' / f'{hm}{morph}')

# Create color space
mesh.vertex_colors.new()
color_layer = mesh.vertex_colors["Col"]
bpy.ops.object.shade_smooth()

# Normalize the scalars
scalars -= scalars.min()
scalars /= scalars.max()
```

We will need to convert our scalars to RGB colors. A linear scaling would work, giving us a grayscale mapping. The drawbacks are that we are not using color, and that outliers can compress the contrast of the remaining values.
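One way to blunt the effect of outliers, whatever color mapping follows, is to normalize with percentile cut-offs instead of the raw minimum and maximum. This is a sketch, not part of the original pipeline, and the 2nd/98th percentile cut-offs are an arbitrary choice you would tune per dataset:

```python
import numpy as np

def robust_normalize(scalars: np.ndarray, low=2, high=98) -> np.ndarray:
    """Scale to [0, 1] using percentile cut-offs, so a few extreme
    values cannot compress the contrast of everything else."""
    lo, hi = np.percentile(scalars, [low, high])
    return np.clip((scalars - lo) / (hi - lo), 0, 1)

# 99 well-behaved values plus one extreme outlier
values = np.concatenate([np.linspace(0, 1, 99), [100.0]])
r = robust_normalize(values)
print(r.min(), r.max())  # the outlier is clipped to 1 instead of
                         # squashing the other values toward 0
```

With a plain min-max normalization of this array, every non-outlier value would land below 0.01; here they still span most of [0, 1].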

Blender-Python-Console

```python
import numpy as np

def face2gray(face: np.ndarray) -> tuple:
    c = np.mean([scalars[vertex] for vertex in face])
    return c, c, c, 1.
```

Therefore, we would like to create a mapping from the scalars to colors. In this example, we will map high values to red and low values to blue, through gray. The α (offset) and β (slope) parameters are data-specific but can be derived from the histogram of scalar values. On this demo data we will use α = 0.5 and β = 10:

Blender-Python-Console

```python
def face2redblue(face: np.ndarray, alpha=0.5, beta=10) -> tuple:
    clamp = lambda n: max(min(1, n), 0)
    c = np.mean([scalars[vertex] for vertex in face])
    c -= alpha
    c *= beta
    r = clamp(c - 1)
    b = clamp(1 - c)
    g = clamp(1 - (r + b))
    r += g
    b += g
    return r, g, b, 1.
```

Next, the coloring of the polygons looks as follows:

Blender-Python-Console

```python
for face, poly in zip(faces, mesh.polygons):
    # color = face2gray(face)
    color = face2redblue(face)
    for idx in poly.loop_indices:
        color_layer.data[idx].color = color
```

After this step, you can perform the Render Material step again, which should get you the following:

scalar render example

Others

Navigating the camera: The easiest way to navigate the camera is to hit Numpad 0, go to the view tab in the top right of a viewport, and under View, View Lock, hit Lock Camera To View. You can change the image size in the output tab in Properties (usually at the bottom right).

Lighting: Be sure to add some lights before rendering.

Shaders: You can play around with the Shader Editor (aka node editor) a bit. An easy parameter to tweak is roughness.