Thursday, October 20, 2011

I come across OpenGL questions a lot, since I developed a 2D graphics library on a Fujitsu graphics accelerator card. Some points to always keep in mind:

1. What the hell is OpenGL?
Open Graphics Library, or an open specification for a graphics library. It specifies how the graphics pipeline should work and how it should be invoked. Standard OpenGL has a client-server architecture, but the one we used for KGE is a direct library call, because of the hard-RTOS nature of the system.
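
To make the "library of calls" idea concrete, here is a minimal sketch of a fixed-function client drawing one triangle. The GLUT window/context setup is my assumption for a desktop environment; in the KGE case the driver library is invoked directly instead.

    /* Minimal fixed-function OpenGL client: draws one triangle.
     * GLUT is assumed here only for window/context creation; on the RTOS
     * target the driver library is called directly instead. */
    #include <GL/glut.h>

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);

        glBegin(GL_TRIANGLES);            /* vertex data enters the pipeline */
            glColor3f(1.0f, 0.0f, 0.0f);  glVertex2f(-0.5f, -0.5f);
            glColor3f(0.0f, 1.0f, 0.0f);  glVertex2f( 0.5f, -0.5f);
            glColor3f(0.0f, 0.0f, 1.0f);  glVertex2f( 0.0f,  0.5f);
        glEnd();

        glFlush();                        /* push the commands to the server/driver */
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
        glutCreateWindow("triangle");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }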

2. What is the OpenGL pipeline?

OpenGL functions in a pipeline fashion; a user can modify the pipeline state through the API it defines.
The pipeline takes its input from two sources:
raw pixel data or vertex data.
A third source can be a display list, but that is nothing more than a container of pixel/vertex data (a quick sketch of these inputs follows).
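
A small sketch of those input sources, assuming a current GL context: vertex data wrapped in a display list, and raw pixel data sent down the pixel path. The sizes and buffers are made-up values for illustration.

    #include <GL/gl.h>

    /* Sketch of the pipeline's input sources; assumes a current GL context. */
    void feed_pipeline(void)
    {
        /* (a) vertex data, recorded into a display list (just a stored container) */
        GLuint list = glGenLists(1);
        glNewList(list, GL_COMPILE);
            glBegin(GL_LINES);
                glVertex2f(0.0f, 0.0f);
                glVertex2f(1.0f, 1.0f);
            glEnd();
        glEndList();
        glCallList(list);                  /* replays the stored vertex data */

        /* (b) raw pixel data, sent directly down the pixel path */
        static GLubyte pixels[16 * 16 * 3];   /* assume filled elsewhere */
        glRasterPos2i(0, 0);
        glDrawPixels(16, 16, GL_RGB, GL_UNSIGNED_BYTE, pixels);
    }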

vertex data ----> evaluators ----> per-vertex operations and primitive assembly ----> RASTERIZATION

Evaluators: derive the actual vertex coordinates from the control points that are sometimes used to describe a surface (a small sketch follows the note below).
Per-vertex operations: convert vertices into primitives; if textures are used, their coordinates are generated and transformed here.
Primitive assembly: clipping is applied, followed by viewport and depth calculations; then culling is applied.
[After the primitive-assembly stage, one has the complete geometric primitives: the transformed and clipped vertices with their related color, depth, and texture-coordinate values.]
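
A small evaluator sketch, assuming a current GL context: the control points below are made-up values, and the evaluator derives the curve vertices from them.

    #include <GL/gl.h>

    /* Evaluator sketch: four control points describe a Bezier curve and the
     * evaluator derives the actual vertex coordinates from them. */
    void draw_bezier(void)
    {
        static const GLfloat ctrl[4][3] = {
            {-4.0f, -4.0f, 0.0f}, {-2.0f,  4.0f, 0.0f},
            { 2.0f, -4.0f, 0.0f}, { 4.0f,  4.0f, 0.0f}
        };
        int i;

        glMap1f(GL_MAP1_VERTEX_3, 0.0f, 1.0f, 3, 4, &ctrl[0][0]);
        glEnable(GL_MAP1_VERTEX_3);

        glBegin(GL_LINE_STRIP);
        for (i = 0; i <= 30; i++)
            glEvalCoord1f((GLfloat)i / 30.0f);   /* evaluator emits a vertex */
        glEnd();
    }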

The second source for the pipeline is direct pixel values:

pixel data ----> pixel operations ----+----> texture assembly
                                      |
                                      +----> RASTERIZATION

Pixel data / pixel operations: pixels are read from system memory and unpacked into the proper format; next the data is scaled, processed by a pixel map, and clamped, and then either sent to texture assembly or to the rasterization step.
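
A sketch of that pixel path, assuming a current GL context; the scale/bias values and the 64x64 luminance image are arbitrary choices for illustration.

    #include <GL/gl.h>

    /* Pixel-path sketch: unpack, scale/bias, then hand off to rasterization. */
    void draw_luminance_image(const GLubyte *image /* 64x64 luminance, filled elsewhere */)
    {
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);    /* how to unpack from client memory */
        glPixelTransferf(GL_RED_SCALE, 0.5f);     /* scale stage */
        glPixelTransferf(GL_RED_BIAS,  0.25f);    /* bias stage */

        glRasterPos2i(10, 10);
        glDrawPixels(64, 64, GL_LUMINANCE, GL_UNSIGNED_BYTE, image);
    }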

Texture assembly: applies texture images onto geometric objects; how this step is done varies from implementation to implementation, as it can be costly.
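
A texture-assembly sketch, assuming a current GL context; the texel buffer and its 32x32 size are assumptions for illustration.

    #include <GL/gl.h>

    /* Texture-assembly sketch: pixel data becomes a texture image that is
     * applied to geometry at rasterization time. */
    void draw_textured_quad(const GLubyte *texels /* 32x32 RGB, filled elsewhere */)
    {
        GLuint tex;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 32, 32, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, texels);

        glEnable(GL_TEXTURE_2D);
        glBegin(GL_QUADS);
            glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
            glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
            glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
            glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
        glEnd();
    }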

RASTERIZATION ----> per-fragment operations ----> framebuffer

Rasterization: converts geometric primitives / pixel data into fragments, where each fragment maps one-to-one to a square pixel of the framebuffer. Color, depth, stippling, line width, etc. are calculated and associated with each fragment. On top of each fragment, a series of operations is performed before the fragment is actually put into the framebuffer, e.g. texturing, the scissor test, the alpha test, the depth-buffer test, blending, dithering, logical operations, etc., and all of these operations can be enabled/disabled separately (a small sketch follows). Finally, the fragments are drawn into the framebuffer as pixels.
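
A sketch of toggling some of those per-fragment operations, assuming a current GL context; the test functions, scissor box, and blend factors are just illustrative values.

    #include <GL/gl.h>

    /* Each per-fragment test/operation is a separate switch. */
    void setup_fragment_ops(void)
    {
        glEnable(GL_SCISSOR_TEST);
        glScissor(0, 0, 256, 256);            /* discard fragments outside this box */

        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GREATER, 0.1f);        /* discard nearly transparent fragments */

        glEnable(GL_DEPTH_TEST);
        glDepthFunc(GL_LESS);                 /* keep only the nearest fragment */

        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  /* blend with framebuffer */

        glDisable(GL_DITHER);                 /* dithering can be switched off too */
    }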