
I am currently writing a simple graphics application in C++ that just shows a room and objects in the room. I want to make it render using the GPU of the computer, but how do I do this with broad compatibility (meaning Nvidia, Intel, and ATI, old and new cards)? My computer uses an AMD/ATI chipset (instead of integrated graphics on the motherboard, I have a GPU integrated into the processor die, called an APU instead of a CPU, plus a dedicated graphics card, also ATI). However, the computer running the program will most likely have either integrated Intel or Nvidia graphics or a dedicated Nvidia card, so I want it to be able to use any graphics card, but not the CPU. I am using the GLUT library for this application.

UPDATE: The real problem is that the professor seems to think that extra code is required for the program to render using the graphics card as opposed to the processor. As I understand it now, all graphics cards support OpenGL regardless.
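For reference, here is roughly the kind of GLUT setup in question (a minimal sketch, assuming freeglut or classic GLUT is installed; the triangle is just a placeholder for the room geometry):

    // Minimal GLUT/OpenGL sketch. No vendor-specific code: every gl* call is
    // dispatched through the system's OpenGL interface library to whatever
    // driver (Nvidia, Intel, AMD/ATI) is installed.
    #include <GL/glut.h>

    void display() {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glBegin(GL_TRIANGLES);               // fixed-function drawing, for brevity
            glColor3f(1.f, 0.f, 0.f); glVertex3f(-0.5f, -0.5f, 0.f);
            glColor3f(0.f, 1.f, 0.f); glVertex3f( 0.5f, -0.5f, 0.f);
            glColor3f(0.f, 0.f, 1.f); glVertex3f( 0.0f,  0.5f, 0.f);
        glEnd();
        glutSwapBuffers();
    }

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutInitWindowSize(640, 480);
        glutCreateWindow("room");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }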

  • Mouse over your tag opengl for one answer.
    – Eric J.
    Commented Feb 22, 2015 at 3:58
  • OK, clarification. Turns out that what my professor meant was to use shaders to draw things instead of shapes. Makes a bit more sense, but still having difficulty with the implementation (a rough sketch of the shader setup follows these comments).
    – cluemein
    Commented Feb 22, 2015 at 17:59
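
A minimal sketch of the shader route mentioned in the comment above (assuming freeglut plus GLEW for loading the GL 2.0 entry points; the shader sources are placeholders and compile/link error checking is omitted for brevity):

    #include <GL/glew.h>   // must be included before GL/glut.h
    #include <GL/glut.h>

    static const char* vsSrc =
        "#version 120\n"
        "void main() { gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; }\n";
    static const char* fsSrc =
        "#version 120\n"
        "void main() { gl_FragColor = vec4(1.0, 0.5, 0.2, 1.0); }\n";

    // Build a GL 2.0 program object; call once after glutCreateWindow() and glewInit().
    GLuint buildProgram() {
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vs, 1, &vsSrc, NULL);
        glCompileShader(vs);

        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fs, 1, &fsSrc, NULL);
        glCompileShader(fs);

        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glLinkProgram(prog);
        return prog;
    }

With glUseProgram(buildProgram()) active before the draw calls, the vertex and fragment stages run as small programs on the GPU, even if the geometry is still submitted the old way.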

1 Answer


By using an abstract API that directs the drawing calls to the GPU's driver. OpenGL and Direct3D are just such APIs. Neither OpenGL nor DirectX is a library. Yes, you link against some library called libGL.so, opengl32.dll, or d3dxxx.dll, but these just talk to the GPU's driver, which then directs the GPU to do whatever it has been asked to do through the API.

Or in other words: If your program is using OpenGL or DirectX and links against the standard interface libraries on the system (which is what happens by default), then it will work with whatever GPU there is, as long as its driver supports the API in question.
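
If you want to see which implementation your calls actually end up at, you can query the current context; a small sketch (it needs a current OpenGL context, so call it e.g. from the GLUT display callback or right after glutCreateWindow()):

    #include <GL/glut.h>
    #include <cstdio>

    // Prints which OpenGL implementation backs the current context.
    void printRendererInfo() {
        std::printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
        std::printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
        std::printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    }

On a machine with an Nvidia, Intel, or AMD driver the GL_RENDERER string names the GPU; the same binary runs unchanged on all of them.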

  • Odd, as my professor is basically saying there are extra lines of code that need to be added in order to use the GPU instead of the CPU.
    – cluemein
    Commented Feb 22, 2015 at 4:32
  • Any chance your professor was talking about a program that would use the GPU as a coprocessor for numeric computations (as opposed to the more traditional use of the GPU to render graphics for the user to look at)? Commented Feb 22, 2015 at 4:46
  • @cluemein: Professors are not infinite wells of wisdom. And quite often professors will spread a lot of misinformation if they're lecturing on a topic that is not their prime focus of research (for example, a professor of systems design who was assigned a lecture on computer graphics, or whose lecture reaches the chapter on graphics architectures). Never blindly trust your professors; always do your own research. And if you find flaws in their lectures, meet with them during their consultation hours and point that out.
    – datenwolf
    Commented Feb 22, 2015 at 16:15
  • Unfortunately, part of the assignment I am working on requires me to make the program render things using the GPU. That is actually what it says, 10 pts for making the program render using the GPU instead of the CPU.
    – cluemein
    Commented Feb 22, 2015 at 17:41
  • @cluemein: Your program already uses OpenGL and GLUT? Then there's no way you can force it onto a certain implementation. If you were programming against the native OS API you might be able to trick the OpenGL interface library into using the software rasterizer fallback by choosing a pixelformat that's not GPU supported. Otherwise there's no way of forcing it. Could you please post the assignment somewhere? (A heuristic check of GL_RENDERER is sketched after these comments.)
    – datenwolf
    Commented Feb 22, 2015 at 18:46
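
Following up on the comments above: one heuristic way to demonstrate that the context is not backed by a pure software rasterizer is to inspect GL_RENDERER for the names of known software implementations (a sketch; the string list is illustrative, not exhaustive):

    #include <GL/glut.h>
    #include <cstring>

    // Heuristic: common software rasterizers identify themselves in GL_RENDERER.
    // Needs a current context; returns true if rendering looks software-based.
    bool looksLikeSoftwareRenderer() {
        const char* r = (const char*)glGetString(GL_RENDERER);
        if (!r) return true;                              // no usable context
        return std::strstr(r, "GDI Generic") != NULL      // Windows GL 1.1 fallback
            || std::strstr(r, "llvmpipe") != NULL         // Mesa software rasterizer
            || std::strstr(r, "softpipe") != NULL         // Mesa reference rasterizer
            || std::strstr(r, "Software Rasterizer") != NULL;
    }

Printing GL_RENDERER at startup and showing that it names the actual GPU is usually evidence enough that the program is not falling back to the CPU.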
