Graphics

Czech Technical University in Prague
Faculty of Information Technology
Department of Software Engineering
© Adam Vesecký, MI-APH, 2019

Positioning

Homogeneous Coordinates

  • in Euclidean geometry we use Cartesian coordinates
  • in projective geometry we use Homogeneous coordinates
  • we can express all affine transforms and projections as one matrix
  • transformations can be composed by matrix multiplication
  • w = 1 for points, w = 0 for vectors
  • Example: a homogeneous matrix M built from a rotation matrix R and a position vector t
  • Homogeneous coordinates -> Cartesian coordinates: divide by w
  • Cartesian coordinates -> Homogeneous coordinates: append w (1 for points, 0 for vectors)
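The point/vector distinction can be sketched in Python (a minimal illustration, not from the slides): w = 1 makes a coordinate respond to translation, w = 0 leaves it rotation-only.

```python
import math

def mat_vec(M, v):
    """Multiply a 4x4 matrix by a 4-component column vector."""
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

def to_cartesian(h):
    """Points (w != 0): divide by w. Vectors (w == 0): drop w."""
    return [x / h[3] for x in h[:3]] if h[3] != 0 else h[:3]

# Homogeneous 4x4: rotation about z by 90 degrees in the upper-left 3x3,
# translation t = (10, 0, 0) in the last column.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
M = [[c, -s, 0, 10],
     [s,  c, 0,  0],
     [0,  0, 1,  0],
     [0,  0, 0,  1]]

point  = [1, 0, 0, 1]   # w = 1: affected by translation
vector = [1, 0, 0, 0]   # w = 0: rotation only
print(to_cartesian(mat_vec(M, point)))   # ~[10.0, 1.0, 0.0]
print(to_cartesian(mat_vec(M, vector)))  # ~[0.0, 1.0, 0.0]
```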

Rotation in 3D space

Rotational representations

Euler angles

  • Pitch, Yaw, Roll
  • simple, small size (3 floats), intuitive nature
  • the order in which the rotations are performed matters
  • gimbal lock issue - when a 90-degree rotation causes one of the three principal axes to collapse onto another principal axis

Axis + angle

  • axis of rotation plus a scalar for the angle of rotation
  • intuitive and compact
  • rotations cannot be easily interpolated
  • rotations cannot be applied to vectors directly

Rotational representations

Quaternions

  • similar to axis + angle, but forms an algebra with well-defined multiplication (a non-commutative division ring, not a field)
  • unit-length
  • a unit quaternion can be visualised as a 3D vector + scalar
  • permits rotations to be concatenated and applied directly
  • permits rotations to be easily interpolated
  • can perform only one full rotation between keyframes
Usually, Euler angles are used for fast rotation around one axis
and quaternions for complex yet slow rotations around all axes
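As a sketch of the axis + angle relationship above (helper names are hypothetical, not from the slides), a unit quaternion built from an axis and an angle rotates a vector via the Hamilton product q·v·q*:

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) from a unit axis and an angle in radians."""
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    """Hamilton product: composes rotations (b applied first)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q: q * v * conjugate(q)."""
    qc = (q[0], -q[1], -q[2], -q[3])
    w = quat_mul(quat_mul(q, (0.0, *v)), qc)
    return w[1:]

# A 90-degree rotation about the z axis maps x onto y
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(quat_rotate(q, (1, 0, 0)))  # ~(0, 1, 0)
```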

Rotation in affine space

Rotation about a fixed point

  • move the fixed point to the origin, rotate, move back
  • post-multiply order -> from the right or from bottom to top in the code
  • the origin of the object matters

OpenGL example:

glPushMatrix();

glTranslatef(250,250,0.0);    // 3. Translate to the object's position.

glRotatef(angle,0.0,0.0,1.0); // 2. Rotate the object.

glTranslatef(-250,-250,0.0);  // 1. Translate to the origin.

glPopMatrix();
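The same translate-rotate-translate composition can be sketched with explicit homogeneous 3x3 matrices for the 2D case (a minimal illustration; the matrix helpers are hypothetical, not the lecture's code):

```python
import math

def translate(tx, ty):
    """2D homogeneous translation matrix."""
    return [[1, 0, tx], [0, 1, ty], [0, 0, 1]]

def rotate_z(angle):
    """2D homogeneous rotation matrix (angle in radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(M, x, y):
    v = [M[r][0] * x + M[r][1] * y + M[r][2] for r in range(3)]
    return v[0], v[1]

# Same order as the OpenGL calls above: read from bottom to top.
M = mat_mul(mat_mul(translate(250, 250),      # 3. move back
                    rotate_z(math.pi / 2)),   # 2. rotate
            translate(-250, -250))            # 1. move pivot to origin

# The pivot itself stays fixed; other points orbit it.
print(apply(M, 250, 250))  # pivot unchanged
print(apply(M, 260, 250))  # rotated 90 degrees about the pivot
```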

Space

Model space

  • origin is usually placed at a central location (center of mass)
  • axes are aligned to natural direction of the model (e.g. a nose of an animal)

World space

  • fixed coordinate space, in which the positions, orientations and scales of all objects in the game world are expressed

View/Camera space

  • coordinate frame fixed to the camera
  • space origin is placed at the focal point of the camera
  • OpenGL: camera faces toward the negative z-axis

Clip space

  • a rectangular prism extending from -1 to 1 (OpenGL)

View/screen space

  • a region of the screen used to display a portion of the final image

World-model-view

Clip space

View volume

  • View volume - region of space the camera can see
  • Frustum - the shape of view volume for perspective projection
  • Rectangular prism - the shape of view volume for orthographic projection
  • Field of view (FOV) - the angle between the top and bottom of the 2D surface on which the world will be projected

Perspective projection

Orthographic projection

Lookat vector

  • a unit vector that points in the same direction as the camera
  • if the dot product between lookAt vector and the normal vector of a polygon is lower than zero, the polygon is facing the camera
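The facing test above can be sketched directly (a minimal illustration, names hypothetical):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def faces_camera(look_at, normal):
    """Polygon faces the camera when its normal points back toward it,
    i.e. when the dot product with the lookAt vector is negative."""
    return dot(look_at, normal) < 0

look_at = (0, 0, -1)                       # OpenGL-style camera looking down -z
print(faces_camera(look_at, (0, 0, 1)))    # True: normal toward the camera
print(faces_camera(look_at, (0, 0, -1)))   # False: back-facing, can be culled
```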

Animations

Animations

Stop-motion animation

  • predefined set of images/sprites
  • simple and intuitive, but impossible to re-define without changing the assets

Keyframed animation

  • keyframes contain values (position, color, modifier,...) at given point of time
  • intermediate frames are interpolated

Sprite animation

Skeletal 2D

Skeletal 3D

Vertex

Interpolation

  • Method that calculates points within the range of given points
  • Applications
    • graphics - image resizing
    • animations - morph between two transformations
    • multiplayer - morph between two game states
    • video - keyframe interpolation
    • sound processing - sample interpolation
  • Main methods
    • Constant interpolation (none/hold)
    • Linear interpolation
    • Cosine interpolation
    • Cubic interpolation
    • Bézier interpolation
    • Hermite interpolation
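The first three methods listed above can be sketched in a few lines (hypothetical helper names, not the lecture's code):

```python
import math

def lerp(a, b, t):
    """Linear interpolation: straight line between a and b, t in [0, 1]."""
    return a + (b - a) * t

def cosine_interp(a, b, t):
    """Cosine interpolation: same endpoints, but eases in and out."""
    t2 = (1 - math.cos(t * math.pi)) / 2
    return a + (b - a) * t2

def constant_interp(a, b, t):
    """Constant (hold): keeps the first keyframe's value until t = 1."""
    return a if t < 1 else b

print(lerp(0, 10, 0.5))             # 5.0
print(cosine_interp(0, 10, 0.5))    # ~5.0, with zero slope at t=0 and t=1
print(constant_interp(0, 10, 0.5))  # 0
```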

Bezier curve

  • parametric curve defined by a set of control points
  • most common - cubic curve, 4 points, 2 points provide directional information
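A cubic Bezier curve in Bernstein form, following the 4-point description above (a sketch, not the lecture's code):

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1] using the
    Bernstein polynomials. p0 and p3 are endpoints; p1 and p2 are the
    control points providing directional information."""
    u = 1 - t
    return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

p0, p1, p2, p3 = (0, 0), (0, 1), (1, 1), (1, 0)
print(cubic_bezier(p0, p1, p2, p3, 0.0))  # (0.0, 0.0): starts at p0
print(cubic_bezier(p0, p1, p2, p3, 0.5))  # (0.5, 0.75): top of the arc
print(cubic_bezier(p0, p1, p2, p3, 1.0))  # (1.0, 0.0): ends at p3
```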

Linear interpolation

Linear interpolation

  • for 1D values (time, sound)

Bilinear interpolation

  • on a rectilinear 2D grid
  • for 2D values (images, textures)
  • Q - known points (closest pixels)
  • P - desired point
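The bilinear scheme above, interpolating along x on two edges and then along y between the results, can be sketched as follows (Q are the corner values, P the desired point; a minimal illustration):

```python
def bilinear(q00, q10, q01, q11, tx, ty):
    """Interpolate inside a unit cell: lerp along x on both edges,
    then lerp the two results along y. q00..q11 are the four known
    corner values; (tx, ty) in [0, 1]^2 locates the desired point P."""
    edge0 = q00 + (q10 - q00) * tx   # along the y = 0 edge
    edge1 = q01 + (q11 - q01) * tx   # along the y = 1 edge
    return edge0 + (edge1 - edge0) * ty

# Corner intensities of a 2x2 pixel neighbourhood
print(bilinear(0.0, 1.0, 0.0, 1.0, 0.25, 0.5))  # 0.25: varies only along x
print(bilinear(0.0, 0.0, 1.0, 1.0, 0.25, 0.5))  # 0.5: varies only along y
```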

Trilinear interpolation

  • on a regular 3D grid
  • for 3D values (mipmaps)

Example: 1D interpolation

Example: 2D interpolation

No interpolation

Constant

Bilinear

Cubic

Rendering pipeline

Graphics api

DirectX

  • since 1995
  • widely used for Windows and Xbox games
  • current version - DirectX 12

OpenGL

  • since 1992
  • concept of state machine
  • cross-platform
  • OpenGL ES - main graphics library for Android, iOS
  • WebGL - a subset variant of OpenGL for web

Vulkan

  • since 2015
  • referred to as the next generation of OpenGL
  • lower overhead, more direct control over the GPU than OpenGL
  • unified management of compute kernels and shaders

Triangle meshes

  • the simplest type of polygons, always planar
  • all GPUs are designed around triangle rasterization
  • Constructing a triangle mesh
    • winding order (clockwise, counter-cw)
    • triangle lists, strips and fans
    • mesh instancing - shared data
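The strip layout can be illustrated by expanding strip indices into triangles while keeping a consistent winding order (a CPU-side sketch, not a GPU implementation):

```python
def strip_to_triangles(indices):
    """Expand a triangle strip into individual triangles.
    The implicit winding alternates, so every odd triangle swaps two
    indices to keep a consistent counter-clockwise order."""
    tris = []
    for i in range(len(indices) - 2):
        a, b, c = indices[i], indices[i + 1], indices[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (b, a, c))
    return tris

# 4 vertices, 2 triangles: a strip stores N+2 indices for N triangles,
# whereas a plain triangle list would need 3N.
print(strip_to_triangles([0, 1, 2, 3]))  # [(0, 1, 2), (2, 1, 3)]
```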

Terms

Z-Fighting

Vertex

  • primarily a point in 3D space with (x, y, z) coordinates
  • attributes: position vector, normal, color, texture (UV) coordinates, skinning weights

Fragment

  • a sample-sized segment of a rasterized primitive
  • its size depends on sampling method

Texture

  • a piece of bitmap that is applied to a model

Occlusion

  • rendering two triangles that overlap each other
  • Z-Fighting issue
  • solution: more precise depth buffer

Culling

  • process of removing triangles that aren't facing the camera
  • frustum culling, portals, anti-portals,...

GPU architecture

Rendering pipeline

  • Vertex Fetch: the driver encodes the command in a GPU-readable format and places it in a pushbuffer
  • Poly Morph Engine of SM fetches the vertex data
  • Warps of 32 threads are scheduled inside the SM
  • Vertex shaders in the warp are executed
  • H/D/G shaders are executed (optional step)
  • Raster engine generates the pixel information
  • data is sent to ROP (Render Output Unit)
  • ROP performs depth-testing, blending etc.

Rendering pipeline

Vertex shader phase

  • handles transformation from model space to view space
  • full access to texture data (height maps)

Tessellation shader phase (optional)

  • two shader stages and a fixed-function tessellator between them
  • Tessellation Control Shader - determines the amount of tessellation
  • Tessellation Evaluation Shader - applies the interpolation

Geometry shader phase (optional)

  • operates on entire primitives in homogeneous clip space

Rasterization phase

  • Assembly - converts a vertex stream into a sequence of base primitives
  • Clipping - chops off the parts of triangles that lie outside the view volume; surviving vertices are then transformed from clip space to window space
  • Culling - discards triangles facing away from the viewer
  • Rasterization - generates a sequence of fragments (window-space)

Rendering pipeline

Fragment shader phase

  • input: fragment, output: color, depth value, stencil value
  • can address texture maps and run per-pixel calculations

Final phase

  • Additional culling tests
    • pixel ownership - fails if the pixel is not owned by the API (OpenGL)
    • scissor test - fails if the pixel lies outside of a screen rectangle
    • stencil test - comparing against stencil buffer
    • depth test - comparing against depth buffer
  • Color blending
    • combines colors from fragment shader with colors in the color buffers
  • writes data to framebuffer
  • swaps buffers
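The depth test and color blending steps can be sketched for a single fragment (a software illustration with hypothetical names, not the fixed-function hardware):

```python
def process_fragment(framebuffer, depthbuffer, x, y, color, depth, alpha):
    """Depth-test the fragment against the depth buffer, then
    alpha-blend its color with the color already in the framebuffer."""
    if depth >= depthbuffer[y][x]:      # depth test fails: fragment discarded
        return
    depthbuffer[y][x] = depth
    dst = framebuffer[y][x]
    framebuffer[y][x] = tuple(alpha * s + (1 - alpha) * d
                              for s, d in zip(color, dst))

fb = [[(0.0, 0.0, 0.0)]]   # 1x1 black framebuffer
db = [[1.0]]               # depth cleared to the far plane

process_fragment(fb, db, 0, 0, (1.0, 0.0, 0.0), 0.5, 0.5)  # passes, blends
process_fragment(fb, db, 0, 0, (0.0, 1.0, 0.0), 0.9, 1.0)  # fails depth test
print(fb[0][0])  # (0.5, 0.0, 0.0)
```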

Shaders

  • programs that run on the video card in order to perform a variety of specialized functions (lighting, effects, post-processing, even physics or AI)

Vertex shader

  • input is vertex, output is transformed vertex

Geometry shader (optional)

  • input is n-vertex primitive, output is zero or more primitives

Tessellation shader (optional)

  • input is primitive, output is subdivided primitive

Pixel/fragment shader

  • input is fragment, output is color, depth value, stencil value

Compute shader

  • a shader that runs outside of the rendering pipeline
  • used for massively parallel GPGPU computing

Shader usecases

Cube Tessellation

Geometry shader grass

Screen effects

Vertex shader

  • 3D-to-2D transformation
  • displacement mapping
  • skinning

Tessellation shader

  • Hull Shader - tessellation control
  • Domain Shader - tessellation evaluation

Geometry Shader

  • sprite-from-point transformation
  • cloth simulation
  • fractal subdivision

Pixel/fragment shader

  • bump mapping
  • particle systems
  • visual effects

Example: Geometry shader

#version 150 core

layout(points) in;

layout(line_strip, max_vertices = 11) out;

in vec3 vColor[];

out vec3 fColor;

const float PI = 3.1415926;

 

void main() {

    fColor = vColor[0];

 

    for (int i = 0; i <= 10; i++) {

        // Angle between each side in radians

        float ang = PI * 2.0 / 10.0 * i;

 

        // Offset from center of point (0.3/0.4 to accommodate for aspect ratio)

        vec4 offset = vec4(cos(ang) * 0.3, -sin(ang) * 0.4, 0.0, 0.0);

        gl_Position = gl_in[0].gl_Position + offset;

 

        EmitVertex();

    }

 

    EndPrimitive();

}

Raytracing

  • conventional rendering converts each triangle into pixels on a 2D screen
  • RayTracing provides realistic lighting by simulating the physical behavior of light
  • not possible in real-time until recently
  • the light traverses the scene, reflecting from objects, being blocked (shadows), passing through transparent objects (refractions), producing the final color
  • APIs: OptiX, DXR, VKRay

Rasterization

Ray Tracing

Effects and Textures

Rendering features

Motion blur

Chromatic Aberration

Decals

Depth of field

Rendering features

Caustics

Lens flare

Subsurface Scattering

Ambient Occlusion

Texture

Texture mapping

  • a piece of bitmap that is applied to an object
  • may be used as a look-up table for calculations

UV coordinates

  • texture coordinates, range from [0, 0] (bottom-left) to [1,1] (top-right)
  • uv mapping - projecting a texture onto an object

Texel

  • an individual texture element

Texture Mapping

  • application of a texel on a 3D model

Types

  • diffuse map
  • height map
  • normal map
  • specular map
  • ...

Example: UV mapping

Texture mapping

Bump mapping

  • heightmap is used to generate normals
  • simple but not very accurate, heightmap uses only gray colors

Normal mapping

  • specifies a surface normal direction vector at each texel
  • uses RGB information as a 3D vector to give more accurate bump effect

Displacement mapping

  • heightmap is used to adjust vertices
  • provides realistic edges but requires a dense mesh
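The heightmap-to-normal relationship underlying bump and normal mapping can be sketched with central differences (an offline bake sketch; the z scale of 2.0 is an arbitrary bump-strength assumption, and the names are hypothetical):

```python
import math

def heightmap_to_normals(h):
    """Derive a per-texel surface normal from a grayscale heightmap
    using central differences of neighbouring heights."""
    rows, cols = len(h), len(h[0])
    normals = []
    for y in range(rows):
        row = []
        for x in range(cols):
            # Height slopes, clamped at the texture borders
            dx = h[y][min(x + 1, cols - 1)] - h[y][max(x - 1, 0)]
            dy = h[min(y + 1, rows - 1)][x] - h[max(y - 1, 0)][x]
            n = (0.0 - dx, 0.0 - dy, 2.0)   # 2.0 = assumed bump strength
            length = math.sqrt(n[0]**2 + n[1]**2 + n[2]**2)
            row.append(tuple(c / length for c in n))
        normals.append(row)
    return normals

flat = heightmap_to_normals([[0.0, 0.0], [0.0, 0.0]])
print(flat[0][0])  # (0.0, 0.0, 1.0): a flat surface points straight up
```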

Example: Texture mapping

Color texture

Normal texture

Displacement texture

Color mapping

Color + disp + normal

Texture filtering

  • there is not a clean one-to-one mapping between texels and pixels
  • GPU has to sample more than one texel and blend the resulting colors

Mipmapping

  • for each texture, we create a sequence of lower-resolution bitmaps
  • objects further from the camera will use low-res textures
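The mipmap chain can be sketched by repeatedly halving the texture dimensions (a minimal illustration):

```python
def mip_chain(width, height):
    """Sizes of the mipmap levels: each level halves both dimensions
    (rounded down, minimum 1) until a 1x1 texel remains."""
    levels = [(width, height)]
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        levels.append((width, height))
    return levels

print(mip_chain(8, 8))
# [(8, 8), (4, 4), (2, 2), (1, 1)]: roughly one third extra memory in total
```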

Nearest neighbor

  • the closest texel to the pixel center is selected

Bilinear filtering

  • the four texels surrounding the pixel center are sampled, and the resulting color is a weighted average of their colors

Trilinear filtering

  • bilinear filtering is used on each of the two nearest mipmap levels

Anisotropic filtering

  • samples texels within a region corresponding to the view angle

Texture filtering

Nearest neighbor

Anisotropic

Bilinear

Trilinear

Antialiasing

  • used to smooth sharp edges of vertices

FSAA/SSAA (Super-Sampled Antialiasing)

  • uses sub-pixel values to average out the final values

DSR (Dynamic Super Resolution)

  • the scene is rendered into a frame buffer that is larger than the actual screen
  • oversized image is downsampled
  • the pixel shader is evaluated multiple times per pixel

MSAA (Multisampled Antialiasing)

  • comparable to DSR, half of the overhead
  • the pixel shader only needs to be evaluated once per pixel

MFAA (Multi-frame sampled Antialiasing)

  • sample locations using MSAA across multiple frames

CSAA (Coverage sample Antialiasing)

  • NVidia's optimization of MSAA
  • new sample type: a sample that represents coverage

Example: Antialiasing

Lecture 8 Review

  • Rotational representation: Euler angles, Axis + angle, Quaternions
    • usually, Euler angles are used for fast rotation around one axis and quaternions for complex yet slow rotations around all axes
  • Spaces: Model space, World space, Clip space, View/screen space
  • View volume: region of space the camera can see
  • Animations: stop-motion animation, keyframed animation
  • Interpolation: method that calculates points within the range of given points
  • Terms: vertex, fragment, texture, texel, shader
  • Shader: a program that runs on GPU
    • vertex shader, geometry shader, tessellation shader, fragment shader, compute shader

Goodbye quote

"Not even death can save you from me!" (Diablo 2)