Linux DevCenter    
 Published on Linux DevCenter

Animation in SDL: OpenGL

by Bob Pendleton

OpenGL Surfaces

OpenGL is a powerful, portable, and elegant API for drawing 2D and 3D graphics. OpenGL is an open standard that is constantly being developed and reviewed through the public Architecture Review Board process. OpenGL continues to evolve to reflect changes in hardware and software technology. OpenGL reflects the distilled wisdom of thousands of programmers and researchers, all working to create a great graphics API.

There are a few "gotchas" with OpenGL. First off, OpenGL implementations come in two distinct flavors. There are hardware-accelerated implementations of OpenGL and pure software versions with no hardware acceleration. Hardware-based OpenGL is best for games and interactive programs. Software implementations provide a minimal level of graphics support for interactive use, but are useful for rendering images at sizes and color depths not supported by video cards. If an OpenGL program runs slowly on your computer, you should make sure your video card has hardware 3D support and install the latest drivers. Hardware-accelerated implementations of OpenGL are available for all widely used operating systems including Windows, MacOS, and all versions of Linux and UNIX. Due to the popularity of OpenGL with game developers, we can count on solid OpenGL support from video card vendors for the foreseeable future.

Another rough spot in using OpenGL is the interface between OpenGL and the operating system. You have to interact with the OS to create a window or load a texture. Unfortunately, the functions needed to interact with OpenGL are different for each OS and windowing system. That is where Simple DirectMedia Layer (SDL) fits in. SDL provides a cross-platform API for working with OpenGL that is layered on top of the OS-specific APIs. You can write your code once and use it on every system that has both SDL and OpenGL.

Working with OpenGL Surfaces

An animation program written using an SDL OpenGL surface has the same structure as an SDL animation program written using hardware or software surfaces. The differences are in the details of using OpenGL for graphics instead of the SDL graphics APIs. Let's look at the differences between using the SDL APIs and the equivalent OpenGL APIs.

Initialize SDL

SDL supports three kinds of graphics surfaces: software surfaces, hardware surfaces, and OpenGL surfaces. Like software surfaces, OpenGL surfaces can be either windows on the desktop, or they can take over the entire screen for full-screen applications. The process of creating an OpenGL surface is different from creating any other type of SDL surface. The first step is the same for all SDL surfaces. We have to initialize SDL by calling SDL_Init().

  if (SDL_Init(SDL_INIT_VIDEO) < 0)
      printf("Failed to initialize SDL error=%s\n", SDL_GetError());

Initialize OpenGL

You have to tell OpenGL how to configure the graphics system before you create the surface. For software and hardware surfaces, the configuration is specified through SDL_SetVideoMode(). OpenGL surfaces support features that are not found in the other types of surfaces, so the configuration process is more complex and uses APIs that are specific to OpenGL. Before calling SDL_SetVideoMode() we must use SDL_GL_SetAttribute() to provide the extra information needed by OpenGL. The first parameter of the function is the name of the attribute to set. The second parameter is the value we are requesting. The following code is an example of using SDL_GL_SetAttribute() to configure the color format of an OpenGL surface.

SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  8);

Those three lines tell OpenGL that I want 8 bits for each color channel. In other words, I want millions of colors, and I won't settle for anything less. If the system doesn't support that depth, then the call to SDL_SetVideoMode() will fail.

OpenGL Attributes

The following describes the OpenGL attributes currently available in SDL 1.2.6. You do not have to set any attributes. You only need to set an attribute if you need to control the value of the attribute. If you do not specify any attributes, you will get a default display that may not meet your needs.


SDL_GL_RED_SIZE, SDL_GL_GREEN_SIZE, SDL_GL_BLUE_SIZE
The minimum number of bits of each color in a pixel. You only need to specify a value for these attributes if you need to force an RGB mode with a specific color depth.


SDL_GL_ALPHA_SIZE
This is used to specify the minimum depth, in bits, of the alpha buffer.


SDL_GL_BUFFER_SIZE
This attribute is the depth of the frame buffer in bits. It is greater than or equal to the sum of the depths of the red, green, blue, and alpha values. If you want 24-bit color and an alpha channel in 32-bit pixels, you would give a color depth of 8 for each color and a buffer size of 32.


SDL_GL_DEPTH_SIZE
The number of bits in the Z buffer. Most hardware provides either 16-bit or 24-bit Z buffers. Asking for a larger Z buffer depth will fail.


SDL_GL_STENCIL_SIZE
The depth in bits of the stencil buffer.


SDL_GL_ACCUM_RED_SIZE, SDL_GL_ACCUM_GREEN_SIZE, SDL_GL_ACCUM_BLUE_SIZE, SDL_GL_ACCUM_ALPHA_SIZE
These attributes are like the other color and alpha values, but for the accumulation buffer. The accumulation buffer is an off-screen buffer used for special effects. The accumulation buffer is usually at least twice as deep as the frame buffer.


SDL_GL_DOUBLEBUFFER
This is set to 0 for single buffering or 1 for double buffering. This attribute must be used instead of the SDL_DOUBLEBUF video mode flag. It is only needed if you are using double buffering.


SDL_GL_STEREO
This attribute is used to turn stereo rendering on and off. When generating stereo images, the frame buffer is laid out to have left and right buffers as well as front and back buffers.


SDL_GL_MULTISAMPLEBUFFERS, SDL_GL_MULTISAMPLESAMPLES
These attributes control full screen anti-aliasing (FSAA) on hardware that supports the multisample extension to OpenGL. If a system does not support the extension, these attributes will simply be ignored. For an example of how to use these, look at the source code for testgl.c that comes with the SDL 1.2.6 release.

New features and new attributes are always being added to SDL. Support for full screen anti-aliasing was added while I was writing this article. Support for the pixel buffer (AKA pbuffers) extension to OpenGL is under development.


A key point to remember about all of the attributes is that they do not specify the exact value that you will get. They specify the minimum acceptable value. For example, if you ask for 1 bit of green, you may get 8 bits. If you ask for a 16-bit Z buffer, you may get a 24-bit Z buffer. On the other hand, if you ask for a 32-bit Z buffer and your hardware doesn't support it, SDL_SetVideoMode() will fail. That means that you should ask for the minimum your program can live with, and hope for the best. Use SDL_GL_GetAttribute() to get the values of the attributes after the video mode has been set.

I've also included a program called modes.cpp. It iterates through a large number of different video modes, trying pretty much every combination of the attributes that I consider to be reasonable and a few that are clearly unreasonable. On my PC, all but the full screen anti-aliased modes work. But no matter what I ask for, I always get a 32-bit mode with 8 bits for each color component. When I ask for an alpha buffer, I get 8 bits of alpha. When I ask for a Z buffer, I get a 24-bit Z buffer. Why is that?

A basic principle of graphics programming is that you ask for what you want, but you have to be prepared to work with what you get. To start to get a handle on why you have to live with what you get, you have to understand the concept of an OpenGL visual. Go find a copy of glxinfo (if you are using the X Window System) or wglinfo (if you are using Microsoft Windows). If you are using UNIX or Linux, you probably have glxinfo installed already. If you are using Windows, you may have to download a copy of wglinfo. When you run one of these programs, it will list a lot of information about the version of OpenGL you have installed on your computer. There are several options you can use to select the kind of information that is printed out and its format. Try them all and read the output. It can be very enlightening.

The last thing these programs print out is a list of visuals that are currently available on your computer. A visual is a specific configuration of the graphics hardware of the video card. The visuals that are currently available are not necessarily all of the visuals that are possible on your video card. On my Linux system, I get different lists of visuals depending on the color depth of my desktop. When my desktop is set to use millions of colors (24 bits of color), the listed visuals are all 24 bits deep with a buffer size of 32 bits, and when I set it to thousands of colors (16 bits of color) the visuals are all 16 bits deep with a buffer size of 16 bits. You will get different results on Windows. Here are a few lines from the visual table for my computer generated by glxinfo:

Table Header

Vis  Vis   Visual Trans  buff lev render DB ste  r   g   b   a  aux dep ste  accum buffers  MS   MS
ID  Depth  Type   parent size el  type      reo sz  sz  sz  sz  buf th  ncl  r   g   b   a  num bufs

16-Bit Visuals

0x21 16 TrueColor    0   16  0  rgba   1   0   5   6   5   0   0   16  0  16  16  16  16   0   0
0x22 16 DirectColor  0   16  0  rgba   1   0   5   6   5   0   0   16  0  16  16  16  16   0   0
0x23 16 TrueColor    0   16  0  rgba   0   0   5   6   5   0   0   16  0  16  16  16  16   0   0

32-Bit Visuals

0x21 24 TrueColor    0   32  0  rgba   1   0   8   8   8   0   0   24  8  16  16  16  16   0   0
0x22 24 DirectColor  0   32  0  rgba   1   0   8   8   8   0   0   24  8  16  16  16  16   0   0
0x23 24 TrueColor    0   32  0  rgba   1   0   8   8   8   8   0   24  8  16  16  16  16   0   0
0x24 24 TrueColor    0   32  0  rgba   0   0   8   8   8   0   0   24  8  16  16  16  16   0   0
0x25 24 TrueColor    0   32  0  rgba   0   0   8   8   8   8   0   24  8  16  16  16  16   0   0
0x26 24 TrueColor    0   32  0  rgba   1   0   8   8   8   0   0   16  0  16  16  16  16   0   0
0x27 24 TrueColor    0   32  0  rgba   1   0   8   8   8   8   0   16  0  16  16  16  16   0   0
0x28 24 TrueColor    0   32  0  rgba   0   0   8   8   8   0   0   16  0  16  16  16  16   0   0
0x29 24 TrueColor    0   32  0  rgba   0   0   8   8   8   8   0   16  0  16  16  16  16   0   0

If you decode the rather condensed table headers, you see that there is a column in the table for every attribute that you can set with SDL_GL_SetAttribute() and a few that you can't. The following describes the relationship between table columns and attributes.

SDL selects which visual to use by comparing the values of the attributes you specified to the rows in the visual table. Any attributes you did not set are treated as having a value of zero. SDL looks down the list until it finds a row in which all of the attributes of the visual are greater than or equal to the attribute values you specified. Because it uses a greater-than-or-equal test, the smaller the value you request for an attribute, the greater the chance that the computer supports a compatible visual.

Memory Usage

The amount of video memory built into a video card is a scarce resource. If your program asks for an OpenGL visual that requires more memory than is available on the video card, the request will fail. This can happen even if the card supports the visual you asked for, because you might ask for a display size that is too large to fit. To help you pick a memory budget, the following describes the effect each OpenGL attribute has on memory usage.

When thinking about video memory usage, many programmers fail to consider how much video memory is already in use before the program starts. The graphics processor can work with data that is in video memory much faster than it can work with the computer's main memory. The OS and/or windowing system will try to take advantage of that speed by moving every graphic resource, from fonts to images, into video memory. Competition for video memory can cause your program to slow down or stall at odd times. It is a good idea to decide on a specific minimum video memory size, and both design to that size and regularly test on a machine with that amount of video memory.

Now let's take a look at each attribute and see how using it can affect the total video memory used by your program.

Attributes and Speed

The maximum performance of a video card is determined by how fast it can read from and write to video memory. That speed is controlled by two factors: the raw speed of the memory, and the width of the memory bus. The raw speed of the memory sets the read/write time, and the width of the bus is how many bits can be read or written at a time. The total memory bandwidth is the speed in reads or writes per second multiplied by the number of bits that can be read or written at one time. That means that if the memory can be written 100 million times per second (10-nanosecond write time) and the bus is 256 bits wide, then the system-write bandwidth is 25.6 billion bits per second. If a pixel is 32 bits wide, then this system can write 8 pixels at a time and write 800 million pixels per second. That number is the absolute limit to the number of pixels that can be written to this hypothetical video memory in a single second. I'm skipping over a lot of details, but no matter what designers do to get around this limit, at some point the architecture is limited by the bandwidth of the video memory chips.

The attributes you select control the amount of video memory bandwidth that is needed to draw a pixel on the screen. Let's look at a few cases in terms of a 256-bit memory bus. Note that to write a single pixel on such a bus, you may have to read 256 bits, modify the bits that represent your pixel, and write it back. But if you are writing more than one pixel, such as when a polygon is being filled, the odds are in your favor that you will be able to write more than one pixel per write. And, if you are lucky, you will be able to write a whole 256 bits' worth of pixels and not have to do a read at all. Most game programs draw rather large polygons, so they get lucky most of the time. The following describes the effect that using different buffers has on graphics performance. The examples ignore the effect of texture- and bump-mapping. Using texture mapping and bump mapping increases the number of reads that have to be done for each pixel that is drawn.

The preceding list only considers speed effects that result from memory bandwidth limitations. There are architectural effects that also come into play. For example, an architecture designed for using the Z buffer can mask the size effects of the Z buffer by placing it in a special block of memory that can be read and written in parallel with the color buffers. Or, a system designed to draw polygons may be surprisingly slow at drawing lines. There are many tradeoffs that have to be made. The actual performance of any given video card depends on a combination of its target market, target price, and the technology available at the time it was designed.

Create the OpenGL Surface

Once the OpenGL attributes are set the way we want, then we call SDL_SetVideoMode() to create the OpenGL surface. When you specify the SDL_OPENGL flag, the bits-per-pixel parameter is ignored; that information is instead provided by OpenGL attributes. The same is true for the SDL_DOUBLEBUF flag. The size parameters are used to specify the size of the display. The other SDL video mode flags work as expected.

if (NULL == (screen = SDL_SetVideoMode(640, 480, 0, SDL_OPENGL)))
    printf("Can't set OpenGL mode: %s\n", SDL_GetError());

Screen Flipping

The last thing we need before we can start animating is a way to swap the buffers. For SDL hardware and software buffers, we use SDL_Flip(). For OpenGL, we use SDL_GL_SwapBuffers().

I keep running into people who are confused by the way buffer swapping interacts with the video display. The video display hardware is constantly reading the contents of video memory and converting it to a video signal that your monitor then turns into the pattern of colored light that you see on the screen. The process of painting an image on the screen takes time. At 85 frames per second, it takes just under 12 milliseconds to draw the frame on your screen. The process is broken up into several phases, but the ones we are interested in are the frame time and the vertical retrace period. The frame time is the length of time from when the hardware starts displaying the current image on the screen until it starts displaying the next image on the screen. The vertical retrace period is a brief period at the end of the frame time when the video system has finished displaying one image but hasn't started displaying the next image.

If we change the display buffer during the frame time, the hardware will display part of the front buffer at the top of the screen and part of the back buffer at the bottom of the screen. Splitting the image like that causes a visual effect called tearing. We want the buffers to switch during the vertical retrace period so we never see parts of two frames on the screen at the same time. That means that when we measure the frames per second rate of a program it should never be faster than the vertical refresh rate of the monitor. Sadly, that is almost never what we see.

The problem is that some OpenGL drivers perform buffer swaps during the vertical retrace period and some do not. Others can be configured by the customer to synchronize or not. That means that a program that runs at 80 frames per second on one computer will run at 800 frames per second on an identical computer with a tiny change in the configuration of the video card. Of course, on the second computer, you will never see most of the frames that were drawn but they will be drawn. This difference in frames-per-second speed drives people nuts, because they don't understand what causes it.

The confusion is worsened by manufacturers' habit of configuring cards to ignore vertical retrace. Why do they do that? My guess is that they know that, given the choice between a video card that claims to run your favorite video game at 1,000 frames per second and one that only claims to draw 85 frames per second, most people will buy the 1,000 frames per second card. More must be better, right?

OpenGL Extensions

Hardware vendors have added many extensions to OpenGL. Some extensions were added as standard extensions by the ARB, and others were added by specific vendors. Many of the vendor extensions have been adopted by other vendors and have become de facto standards. It is not possible for SDL to provide a function call for every extension; I'm not sure it is possible for the SDL developers to even know about every extension. Instead, SDL provides SDL_GL_GetProcAddress(), which looks up functions by name and returns a pointer to the named function. You can then use that pointer to call the function. Using SDL_GL_GetProcAddress() allows you to access every OpenGL extension.

Custom Libraries

The idea that an application would load its own special OpenGL library seems a little odd when you first think about it. After all, don't you want to use the system library to get the best performance out of your video card? Well, no, not always. It may be that you know that a lot of the machines your application is going to run on do not have OpenGL support, so you include your own OpenGL library. (There is at least one commercial implementation of OpenGL that is based entirely on DirectX. It is designed to let OpenGL programs run on Windows without any other OpenGL support.) Or you may need to use software rendering, because you are generating images that are larger and have more bits of color or Z buffer than can be rendered on any existing video card. In those cases, you must load your own OpenGL library.

SDL provides SDL_GL_LoadLibrary() for loading custom OpenGL libraries. After you have loaded the library you must use SDL_GL_GetProcAddress() to retrieve pointers to all of the OpenGL functions your program will use. This function is not for the amateur or the faint of heart.

Animation with OpenGL

In the previous articles in this series, I used a simple animation program, softlines.cpp, to demonstrate SDL software surfaces. Then I converted it into hardlines.cpp to highlight the differences between hardware and software buffers. In this article, I'll continue with that theme by modifying the same program to do the same animation using OpenGL. The new program is called gllines.cpp.

The first change I have to make is to include the SDL OpenGL support library by adding an include statement:

#include "SDL.h"
#include "SDL_opengl.h"

Without this tiny change, the program won't compile.

The biggest change in the program is the removal of all of the software line drawing code. The first two programs had several hundred lines of code devoted to line-drawing routines. OpenGL has its own line-drawing code, so my old code can be deleted.

The next change is in the sweepLine class. SDL and OpenGL have very different ways of handling color. I had to make several changes in the program to accommodate those differences. In the SDL versions of the program, we used SDL_MapRGB() to convert colors to pixel values. We then used the pixel values to draw colored lines. In OpenGL, you set the color using the red, green, and blue color components just before you draw something. So I changed sweepLine to keep around the color components instead of just using a single pixel value. Then I had to change the actual line-drawing code to use OpenGL.

The next set of changes show up in main() before and after the SDL_SetVideoMode() call. Before setting the video mode, I had to add calls to SDL_GL_SetAttribute() to configure the OpenGL surface. To make sure that the program runs on as many machines as possible, I only ask for 1 bit each of red, green, and blue.

After setting the video mode, there are calls to glViewport() and glOrtho(). These calls put the (0,0) coordinate in the upper-left-hand corner of the screen and force the Y coordinate to increase down the screen. They also set the logical width and height of the window to be the same as the width and height measured in pixels. Programs tend to have built-in assumptions. If changes violate those assumptions, you are likely to introduce bugs. I'm trying to avoid that problem by making the coordinate system of the OpenGL surface match the coordinate system of an SDL software surface.

The rest of the changes are pretty small. The SDL code for creating pixel values for colors has been deleted. The call to SDL_FillRect() used to clear the back buffer has been replaced with a call to glClear(), which has the same effect. And the call to SDL_Flip() has been replaced with a call to SDL_GL_SwapBuffers().

When I tested that version of the program, it reported that it was running at over 500 frames per second and my system monitor showed that it was using 100% of the CPU. Every so often, the animation would stop and then jerk forward as the OS took the CPU away from my program to run another task. To fix that, I added one more short piece of code at the top of the main animation loop.

now = SDL_GetTicks();
if ((now - ticks) < 10)
    SDL_Delay(5);
ticks = now;

The code tests to see how long it has been since the last time the top of the loop was reached. If it is less than 10 milliseconds, then the program executes SDL_Delay(5) to slow down the program. After adding that change, the program still reports running at over 140 frames per second, but it only uses about 2% of the CPU.

Note: I used the magic value 5 for the delay because on a system where the clock ticks every 10 milliseconds (a lot of systems), you will get an average delay of 5 milliseconds (half of 10), and on systems where the clock ticks every millisecond (another large group of systems), you will get an average delay of 4.5 milliseconds. So asking for a delay of 5 milliseconds will, on average, give me a delay pretty close to 5 milliseconds. No other integer less than 10 has that property.

The final program is several hundred lines shorter than the original program. Many of these changes make the program smaller and simpler; I like changes like that. The majority of the changes were one-for-one substitutions of OpenGL APIs for SDL APIs, which did not force any structural changes on the program.


Conclusion

In this article, I have introduced all of the functions in SDL that interface directly with OpenGL. The gllines.cpp program demonstrates the use of those functions. Equally important, I have described the effects on memory usage and performance that can result from using the many different kinds of OpenGL buffers.

My next article will cover using SDL with OpenGL to solve common graphic programming problems. SDL has a small set of graphics operations that are ideal for preparing graphics for use in OpenGL applications.

Bob Pendleton has been fascinated by computer games ever since his first paid programming job -- porting games from an HP 2100 minicomputer to a UNIVAC 1108 mainframe.


Copyright © 2009 O'Reilly Media, Inc.