Rings User Documentation

Introduction
- Terminology
Tutorials
- Adding surfaces to a scene
- Moving, Resizing, Coloring, and Transforming surfaces
- Adding textures to a surface
- Adding shaders to a surface
- Adding lights to a scene
- Setting up the camera
- Rendering a scene
Development
- Writing new surfaces
- Writing new textures
- Writing new shaders
- Running Test Renders on Custom Surfaces, Textures, and Shaders


Introduction

Terminology

Scene - The camera and the collection of all surfaces and lights that are rendered.
Surface - Any object that can be rendered in the scene.
Light - Any object to which light in a scene can be traced.
Texture - A way of describing how a surface should be colored.
Shader - A method for shading a surface realistically.
Double - A 64-bit Java floating-point value.
RGB - A color value described by 3 double values (red, green, and blue components) between 0.0 and 1.0 (192 bits total).
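To make the Double and RGB definitions concrete, here is a tiny standalone sketch (plain Java; this is illustrative only and is not the API of rings' threeD.raytracer.graphics.RGB class):

```java
public class RgbExample {
    public static void main(String[] args) {
        // An RGB color is three 64-bit doubles (red, green, blue),
        // each expected to lie in the range [0.0, 1.0].
        double red = 0.25, green = 0.5, blue = 1.0;

        // Three 64-bit components account for the 192 bits mentioned above.
        System.out.println((3 * Double.SIZE) + " bits per color");
    }
}
```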


Tutorials

Adding Surfaces to a Scene

To create a scene using rings, surfaces must be added. This is done through the Surface Info Panel, which is opened by selecting Edit > Open Surface Info Panel. Once the Surface Info Panel is open, a new surface can be created by clicking the Create New Surface button. This causes a dialog to appear allowing a type of surface to be selected. In this example we construct a sphere and a plane.

Creating new surfaces

After being created, surfaces can be named to make it easy to distinguish them. Here we name the sphere "Sphere" and the plane "Plane".

Naming surfaces

Moving, Resizing, Coloring, and Transforming Surfaces

Properties of surfaces, including location, size, and color, can be configured from the Settings tab of the Surface Info Panel. Here we color the sphere a shade of blue.

Setting up the sphere

Transformations, such as rotation, can be applied to a surface from the Transform tab. Here we rotate the plane (which is by default an XY plane) -90 degrees so it can act as a floor for our example image. After this is done, we move it to (0.0, -1.0, 0.0) so it is below the sphere, which is at the origin.

Rotating the plane

Adding Textures to a Surface

Textures may be added to surfaces to create more interesting surface coloring. All textures are overlaid and blended with the base color of the surface (set in the Settings tab). To add a texture, select the Textures tab and press the "+" button. In this example we add a blue and white stripe texture to the plane. (The base color of the plane is set to white, (1.0, 1.0, 1.0), so the texture will not appear tinted.)

Texturing the plane

Adding Shaders to a Surface

To control the way surfaces are shaded, different shaders can be added. Shaders work in the same way as textures: all applied shaders are overlaid. By default, all surfaces use a diffuse shader (though this can be removed if needed), which produces flat, matte shading. To add a shader, select the Shaders tab and press "+". In this example we add a reflection shader to the sphere, giving it a metallic appearance.

Setting up shaders for the sphere

Adding Lights

For surfaces to be seen, lights must be added to the scene. To add lights, open the Light Info Panel by selecting Edit > Open Light Info Panel. This panel works in the same way as the Surface Info Panel. To add a light, click Create New Light, then select a type of light. There are three main types of lights, as shown below:

- Ambient Lights - Increase the brightness of shading on all surfaces evenly.
- Directional Ambient Lights - Simulate lighting from infinitely far away, arriving from the direction opposite the one specified.
- Point Lights - Simulate light that comes from a single point, the location of which may be specified.

After creating a light the Edit Light dialog will appear, allowing the user to specify any information that is required by that type of light. The color of the light is specified in the same way as a surface, with 3 values between 0.0 and 1.0. All lights also have an intensity, which is a real value between 0.0 and 1.0, that indicates how intense the light from the source will be. The direction of a directional ambient light is specified by the 3 components of a 3d vector. A 3d vector specifies a direction as shown below:


Specifying direction with a vector.

The direction of the vector that is specified for a directional ambient light is the direction in which it produces light. This means, for example, that a direction of (1.0, 0.0, 0.0) indicates light that will come from the left side, assuming the scene is viewed down the -z axis.

In this example we create an ambient light and 2 point lights, at (3.0, 3.0, 3.0) and (0.0, 3.0, 2.0).

Adding lights

Setting up the Camera

For a scene to be viewed properly, the camera needs to be configured correctly. The camera is specified by 5 different types of values, as shown below:

- Location: The point in 3d space from which all viewing rays originate (Type: point)
- Viewing Direction: The direction in which the camera is viewing (Type: vector)
- Up Direction: The direction in which the "top" of the camera is pointing (i.e. which way is up) (Type: vector)
- Focal Length: The distance from the camera location to the viewing plane (Type: real value)
- Projection Dimensions: The width and height of the area that will be viewed by the camera (Type: real values)


Specifications of a Camera (this camera has a projection width and height of 1.0)

One important thing to notice about focal length and projection dimensions is that the field of view depends on them. The field of view is the angle between the 2 viewing rays at either edge of the viewing plane (it is labeled above). The tangent of half of that angle is equal to the ratio of half the projection width (or height) to the focal length. So, for example, a focal length of 1.0 and a projection width and height of 2.0 results in a field of view of 90 degrees. Humans have a field of view of roughly 90 to 120 degrees, so make sure the focal length and projection dimensions are not out of proportion, or the perspective will look distorted.
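This relationship can be checked with a few lines of plain Java (a standalone sketch using only java.lang.Math; it is not part of the rings API):

```java
public class FieldOfViewExample {
    // tan(fov / 2) = (projectionWidth / 2) / focalLength, so
    // fov = 2 * atan((projectionWidth / 2) / focalLength).
    static double fieldOfViewDegrees(double focalLength, double projectionWidth) {
        return Math.toDegrees(2.0 * Math.atan((projectionWidth / 2.0) / focalLength));
    }

    public static void main(String[] args) {
        // The example from the text: focal length 1.0, projection width 2.0.
        System.out.println(fieldOfViewDegrees(1.0, 2.0)); // approximately 90 degrees
    }
}
```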

In this example we set up the camera looking at the origin from (1.0, 0.5, 2.0).

Configuring the camera

Rendering the Scene

Now that the scene is set up it can be rendered. Rendering options can be edited from the Render Options dialog. To open the Render Options dialog select Render > Options... In this example we render a 400 by 400 pixel image.

Render options



Development

Rings is written in 100% Java. Developing new components for rings is a relatively simple process, but requires some knowledge of the Java programming language. The following guides explain how to create new surfaces, textures, and shaders for rings using pure Java code. They do not, however, explain how to use these new classes with the rings engine. At this point in the development of rings, using new surfaces, textures, or shaders in the user interface requires modification to existing code (which I invite you to do, it being an open source project and all; for source code email me at ashesfall@flashmail.com). On the bright side, rendering of custom surfaces, textures, and shaders can be done directly from source code, which is explained at the end of these guides.

The terminology above is important to remember when reading these guides. Two classes, threeD.raytracer.util.Vector and threeD.raytracer.graphics.RGB, are used almost everywhere in the source code to represent 3d points (or directions) and RGB colors. The javadoc documentation is also referenced frequently.

Writing new Surfaces

All surfaces in rings are implementations of the Surface interface. This interface has five methods which must be implemented:

- public RGB getColorAt(Vector point) - Returns an RGB object representing the color of the surface at the location represented by the specified Vector object.
- public Vector getNormalAt(Vector point) - Returns a Vector object representing the normal vector to the surface at the location represented by the specified Vector object.
- public boolean intersect(Ray ray) - Returns true if the ray represented by the specified Ray object intersects the surface.
- public Intersection intersectAt(Ray ray) - Returns an Intersection object representing the intersections of the specified Ray and the surface.
- public RGB shade(Vector point, Vector viewerDirection, Vector lightDirection, Light light, Light[] otherLights, Surface[] otherSurfaces) - Returns an RGB object that represents the shaded color of this surface based on shading calculations involving the specified values.

Any class that implements the Surface interface can be added to a Scene and passed to the rendering engine to be rendered, but object oriented programming provides an even better option for writing a custom surface. Almost all surfaces share the same implementation of the getColorAt method (return some base color or the value of a texture), and because shading is a general process that works the same for any surface, the work done by the shade method is best delegated to a separate set of classes. Both of these operations are handled by the AbstractSurface class. Extending this class to create a new surface leaves only the getNormalAt, intersect, and intersectAt methods of the Surface interface to implement, and allows Textures and Shaders to be applied to the surface without any extra programming.

As an example, here is the source code for the Sphere class:

	public class Sphere extends AbstractSurface {
		public Sphere(Vector location, double radius) {
			super(location, radius);
		}
		
		public Sphere(Vector location, double radius, RGB color) {
			super(location, radius, color);
		}
		
		public Vector getNormalAt(Vector point) {
			Vector normal = point.subtract(super.getLocation());
			normal = super.getTransform(true).transformAsNormal(normal);
			
			return normal;
		}
		
		public boolean intersect(Ray ray) {
			Ray newRay = new Ray(super.getTransform(true).getInverse().transformAsLocation(ray.getOrigin()),
						super.getTransform(true).getInverse().transformAsOffset(ray.getDirection()));
			
			Vector a = newRay.getOrigin();
			Vector d = newRay.getDirection();
			double b = d.dotProduct(a);
			double c = a.dotProduct(a);
			
			double discriminant = (b * b) - (d.dotProduct(d)) * (c - 1);
			
			if (discriminant < 0)
				return false;
			else
				return true;
		}
		
		public Intersection intersectAt(Ray ray) {
			Ray newRay = new Ray(super.getTransform(true).getInverse().transformAsLocation(ray.getOrigin()),
						super.getTransform(true).getInverse().transformAsOffset(ray.getDirection()));
			
			Vector a = newRay.getOrigin();
			Vector d = newRay.getDirection();
			double b = d.dotProduct(a);
			double c = a.dotProduct(a);
			
			double discriminant = (b * b) - (d.dotProduct(d)) * (c - 1);
			
			// A negative discriminant means the ray misses the sphere entirely,
			// so there is no intersection to report (Math.sqrt would return NaN).
			if (discriminant < 0)
				return null;
			
			double discriminantSqrt = Math.sqrt(discriminant);
			
			double t[] = new double[2];
			
			t[0] = (-b + discriminantSqrt) / (d.dotProduct(d));
			t[1] = (-b - discriminantSqrt) / (d.dotProduct(d));
			
			return new Intersection(ray, this, t);
		}
	}
      

You may notice that intersection calculations are done with a ray that has been transformed by the inverse of the transformation specified by the parent AbstractSurface. This allows the surface to be transformed (rotated, scaled, etc.) through the methods provided by AbstractSurface.

Writing new Textures

Subclasses of AbstractSurface can be textured using implementations of the Texture interface. Textures that implement this interface are configured through an array of arguments, which can be any Java objects. The interface has two methods which must be implemented:

- public RGB getColorAt(Vector point) - Returns the color of the texture at the location represented by the specified Vector object. The arguments used by this method may be some default values or may be set up with configuration methods or the constructor.
- public RGB getColorAt(Vector point, Object[] args) - Returns the color of the texture at the location represented by the specified Vector object using the specified arguments.

As an example, here is some source code from the StripeTexture class:

	public RGB getColorAt(Vector point) {
		if (this.props == null)
			return null;
		else
			return this.getColorAt(point, this.props);
	}
	
	public RGB getColorAt(Vector point, Object props[]) {
		for (int i = 0; i < StripeTexture.propTypes.length; i++) {
			if (StripeTexture.propTypes[i].isInstance(props[i]) == false)
				throw new IllegalArgumentException("Illegal argument: " + props[i].toString());
		}
		
		double width = ((Double)props[0]).doubleValue();
		boolean smooth = ((Boolean)props[1]).booleanValue();
		String axis = (String)props[2];
		
		double value;
		if (axis.equalsIgnoreCase("x"))
			value = point.getX();
		else if (axis.equalsIgnoreCase("y"))
			value = point.getY();
		else if (axis.equalsIgnoreCase("z"))
			value = point.getZ();
		else
			return null;
		
		RGB c1 = (RGB)props[3];
		RGB c2 = (RGB)props[4];
		
		if (smooth == true) {
			double t = (1 + Math.sin(Math.PI * (value / width))) / 2.0;
			
			return (c1.multiply(1.0 - t)).add(c2.multiply(t));
		} else {
			if (Math.sin(Math.PI * (value / width)) > 0)
				return c1;
			else
				return c2;
		}
	}
      

Writing new Shaders

Subclasses of AbstractSurface can be shaded with implementations of the Shader interface. This interface only has one method which must be implemented:

- public RGB shade(Vector point, Vector viewerDirection, Vector lightDirection, Light light, Light[] otherLights, Surface surface, Surface[] otherSurfaces) - Returns an RGB object that represents the shaded color of the specified Surface object at the specified point.

The shade method takes seven arguments:

- Vector point - The point on the surface that is to be shaded.
- Vector viewerDirection - A unit vector in the direction of the viewer.
- Vector lightDirection - A unit vector in the direction of the light.
- Light light - The light (stores color, intensity, etc.).
- Light otherLights[] - An array of other lights in the scene.
- Surface surface - A reference to the surface to be shaded.
- Surface otherSurfaces[] - An array of other surfaces in the scene.

As an example, here is some source code from the HighlightShader class:

	public RGB shade(Vector point, Vector viewerDirection, Vector lightDirection,
				Light light, Light otherLights[], Surface surface, Surface otherSurfaces[]) {
		RGB lightColor = light.getColorAt(point);
		
		Vector n = surface.getNormalAt(point);
		Vector h = viewerDirection.add(lightDirection);
		h = h.divide(h.length());
		
		double c = h.dotProduct(n);
		c = Math.pow(c, this.getHighlightExponent());
		
		return (lightColor.multiply(this.getHighlightColor())).multiply(c);
	}
      

Running Test Renders on Custom Surfaces, Textures, and Shaders

The following code can be used to run the rings rendering engine with a scene:

	public static void main(String args[]) {
		int width = 400;  // The image width
		int height = 400; // The image height
		int ssWidth = 1;  // The supersample width
		int ssHeight = 1; // The supersample height
		
		Surface surfaces[] = {new CustomSurface()};
		Light lights[] = {new DirectionalAmbientLight()};
		Camera camera = new Camera();
		
		Scene scene = new Scene(camera, lights, surfaces);
		
		RGB image[][] = RayTracingEngine.render(scene, width, height, ssWidth, ssHeight, null);
		FileEncoder.encodeImageFile(image, new File("test.ppm"), FileEncoder.PPMEncoding);
	}
      

This code would output a PPM-encoded, 400 by 400 pixel image file.