Metaball Demos

These demos are part of a Northwestern SURG (Summer Undergraduate Research Grant) project led by me on SPH (Smoothed Particle Hydrodynamics) fluid simulation. The project is mentored by Prof. Jack Tumblin in the Computer Science department.

Metaballs are iso-surfaces of position-dependent implicit functions. They are very useful for modeling mesh-less objects such as particle systems, because they produce blob-like shapes between neighboring elements.

In this project, we use Metaballs to model a uniform, continuous fluid shape formed from discretized particles. Each particle has its own implicit function: at a point in 3D space, the value of the implicit function is proportional to the inverse square of the distance to the particle's center. The rendering of each pixel is then determined by the sum of the implicit functions of all particles.
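The actual demos run in a shader; as an illustrative sketch (with a hypothetical `field_value` helper and an arbitrary `strength` constant), the summed implicit function might look like:

```python
# Sketch of the summed implicit function: each particle contributes a value
# proportional to the inverse square of the distance to its center.
def field_value(point, particles, strength=1.0, eps=1e-9):
    """Sum of the per-particle implicit functions at `point`."""
    total = 0.0
    for cx, cy, cz in particles:
        d2 = (point[0] - cx) ** 2 + (point[1] - cy) ** 2 + (point[2] - cz) ** 2
        total += strength / (d2 + eps)  # eps avoids division by zero at a center
    return total

# At the midpoint between two particles, both contribute equally.
particles = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
v = field_value((0.5, 0.0, 0.0), particles)
```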

The first demonstration shows Metaballs in 2D space: three particles, colored red, green, and blue, near the center of the screen. Each pixel on the screen is sampled using the implicit function of each particle.
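One simple way to realize the 2D demo, sketched here with a hypothetical `pixel_color` helper, is to blend the three particle colors per pixel, weighted by each particle's implicit-function value:

```python
# Sketch: color a pixel by blending particle colors, weighted by each
# particle's inverse-square implicit-function value at that pixel.
def pixel_color(px, py, particles):
    """particles: list of ((x, y), (r, g, b)) tuples."""
    weights = [1.0 / ((px - x) ** 2 + (py - y) ** 2 + 1e-9)
               for (x, y), _ in particles]
    total = sum(weights)
    color = [0.0, 0.0, 0.0]
    for w, (_, (r, g, b)) in zip(weights, particles):
        color[0] += w * r / total
        color[1] += w * g / total
        color[2] += w * b / total
    return tuple(color)
```

A pixel at a particle's center is dominated by that particle's color, while pixels between particles blend smoothly, which is what gives the blob-like look.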


The second demonstration extends Metaballs to 3D space. Instead of sampling each pixel directly, we now have a camera that specifies the location, direction, and field-of-view angles from which we are viewing. Each pixel now has a 3D location; we shoot “rays” originating from the camera and passing through each pixel's location. For each ray, we march forward incrementally, sampling the summed implicit function at each step, until the value reaches the desired threshold. This technique is called “Ray Marching”.
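The marching loop can be sketched as follows (a fixed-step version with hypothetical `step` and `max_dist` parameters; the real shader version is tuned differently):

```python
import math

def ray_march(origin, direction, field, threshold, step=0.01, max_dist=10.0):
    """March along a ray until `field` first reaches `threshold`."""
    # Normalize the ray direction.
    n = math.sqrt(sum(d * d for d in direction))
    direction = [d / n for d in direction]
    t = 0.0
    while t < max_dist:
        p = [origin[i] + t * direction[i] for i in range(3)]
        if field(p) >= threshold:
            return p  # first sample on or inside the iso-surface
        t += step
    return None  # ray missed the surface
```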

Once we have found a point on the 3D implicit surface, we need its normal, and here we have two ways of finding it. The first is to calculate the normal contributed by each particle influencing the surface, and then take the weighted sum of those normals, using the implicit function values as weights. The second is to sample the implicit function at points very close to the surface and use the differences in function values to estimate the gradient, which gives the normal. Once we have the normal, we can apply realistic Phong lighting to the object, modeling ambient, diffuse, and specular components with different material configurations.
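The second method can be sketched with central differences (the step size `h` here is a hypothetical choice); since the field grows toward a particle's center, the outward normal points opposite the gradient:

```python
import math

def normal_at(p, field, h=1e-4):
    """Estimate the outward surface normal from the field's gradient."""
    grad = []
    for i in range(3):
        lo, hi = list(p), list(p)
        lo[i] -= h
        hi[i] += h
        grad.append((field(hi) - field(lo)) / (2 * h))  # central difference
    # Field values increase toward the interior, so negate and normalize.
    n = math.sqrt(sum(g * g for g in grad))
    return [-g / n for g in grad]
```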

For fluid rendering, a very important component is proper reflection and refraction. Here we implement a simplified version: we take the location where the ray first hits the implicit surface, then apply the reflection formula, or Snell's law for refraction, to generate the resulting ray direction. We surround the scene with a 3D cube map of the environment, and the color rendered at that location is the cube map sampled in the resulting ray direction.
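The two direction formulas can be sketched as follows (unit incident direction `d` and unit normal `n` assumed; `eta` is the ratio of refractive indices across the surface):

```python
import math

def reflect(d, n):
    """Mirror reflection: r = d - 2(d . n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return [di - 2 * dot * ni for di, ni in zip(d, n)]

def refract(d, n, eta):
    """Snell's law; returns None on total internal reflection."""
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection
    return [eta * di + (eta * cos_i - math.sqrt(k)) * ni
            for di, ni in zip(d, n)]
```

The resulting direction is then used directly as the cube-map lookup vector.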

Once we have a relatively realistic rendering of the fluid, we scale the system up from only 4 particles to the 2k particles in the SPH fluid simulation system. We import particle information into our ray shader as large data textures to ensure fast data I/O. We employ data structures such as octrees to quickly query the particles whose implicit functions we need to sample. We also accelerate implicit surface finding by splitting the screen into four regions, distributing the load of finding the ray intersections across four units executing in parallel, and then joining the results later on.
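The octree query idea can be sketched on the CPU side as follows (hypothetical `capacity` and `max_depth` parameters; the shader version flattens the tree into textures). The point is to collect only the particles within a radius of a sample point, pruning whole subtrees whose bounding boxes cannot intersect the query sphere:

```python
class Octree:
    """A minimal octree over 3D points, centered at `center` with
    half-extent `half` per axis."""

    def __init__(self, center, half, capacity=8, depth=0, max_depth=5):
        self.center, self.half = center, half
        self.capacity, self.depth, self.max_depth = capacity, depth, max_depth
        self.points = []
        self.children = None

    def _child_index(self, p):
        # One bit per axis: x -> bit 2, y -> bit 1, z -> bit 0.
        return ((p[0] > self.center[0]) << 2 | (p[1] > self.center[1]) << 1
                | (p[2] > self.center[2]))

    def insert(self, p):
        if self.children is None:
            self.points.append(p)
            if len(self.points) > self.capacity and self.depth < self.max_depth:
                self._split()
            return
        self.children[self._child_index(p)].insert(p)

    def _split(self):
        h = self.half / 2
        self.children = []
        for i in range(8):
            off = [h if (i >> s) & 1 else -h for s in (2, 1, 0)]
            c = [self.center[j] + off[j] for j in range(3)]
            self.children.append(Octree(c, h, self.capacity,
                                        self.depth + 1, self.max_depth))
        pts, self.points = self.points, []
        for p in pts:
            self.children[self._child_index(p)].insert(p)

    def query(self, p, radius, out=None):
        """Collect all stored points within `radius` of `p`."""
        out = [] if out is None else out
        # Prune nodes whose box cannot intersect the query sphere.
        for i in range(3):
            if abs(p[i] - self.center[i]) > self.half + radius:
                return out
        for q in self.points:
            if sum((p[i] - q[i]) ** 2 for i in range(3)) <= radius * radius:
                out.append(q)
        if self.children:
            for child in self.children:
                child.query(p, radius, out)
        return out
```

With the particles indexed this way, each marching step only sums the implicit functions of nearby particles instead of all 2k.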


Rising Senior, CS & Math, Northwestern University