
I am part of a team developing an application using C++ with SDL and OpenGL.

On laptops, when the application is run, the dedicated graphics card is not used and creation of the GL context fails because the integrated graphics card does not support the version of GL we want.

I have a feeling that this problem is specific to the laptop in question and not something we can solve through code, but if anyone knows of a solution, that would be great.

  • Do you mean laptops with dual graphics cards? Does manually switching to the dedicated card help (in nvidia settings or wherever it is)?
    – riv
    Commented May 29, 2013 at 20:43
  • You might be able to use the target platform(s) specific API(s) to access what devices are available then pick which one to create the active context on. Though I have a feeling you are right and the inactive graphics device will not show up until turned on in the settings for the laptop as suggested by @riv.
    – kc7zax
    Commented May 29, 2013 at 20:46
  • @riv yes, it is a laptop with dual graphics cards. We can of course add the application to the list of applications that use the dedicated card in the nvidia/ati settings, but for end users we would prefer they don't have to do that. Commented May 29, 2013 at 21:05
  • The replies with __declspec(dllexport) are old and specific to the Nvidia Optimus driver. Windows 10 now has its own way to configure the high-performance GPU (see pureinfotech.com/set-gpu-app-windows-10). Are the replies still up-to-date, or is there a vendor-neutral way to achieve this in Windows 10 in the meantime?
    – jcm
    Commented Oct 28, 2021 at 8:25

2 Answers


Under Windows, the easiest way from C++ to ensure that the dedicated graphics card is used instead of the integrated chipset on switchable-graphics systems is to export the following symbols (MSVC sample code):

Enable dedicated graphics for NVIDIA:

extern "C" 
{
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

Enable dedicated graphics for AMD Radeon:

extern "C"
{
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Caveat: If the user has created a driver profile forcing the application to use the integrated chipset, these exports will not work.

I am unsure whether this would work similarly under Linux / macOS (unlikely).
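
For reference, here is a minimal SDL2 sketch (my own illustration, not part of the original answer; the window title and setup are placeholders) that combines both exports and then logs which renderer actually provided the context, which is a quick way to verify that the exports took effect:

#include <SDL.h>
#include <SDL_opengl.h>
#include <cstdio>

// Both exports must live in the .exe itself (or in a statically linked
// library) so the drivers can find them when the process starts.
extern "C"
{
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main(int, char**)
{
  if (SDL_Init(SDL_INIT_VIDEO) != 0)
    return 1;

  SDL_Window* window = SDL_CreateWindow("GPU check",
      SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
      640, 480, SDL_WINDOW_OPENGL);
  SDL_GLContext context = SDL_GL_CreateContext(window);

  // The renderer string names the GPU the driver actually selected.
  std::printf("Renderer: %s\n",
      reinterpret_cast<const char*>(glGetString(GL_RENDERER)));

  SDL_GL_DeleteContext(context);
  SDL_DestroyWindow(window);
  SDL_Quit();
  return 0;
}

On a dual-GPU laptop the output should name the dedicated GPU (e.g. a GeForce or Radeon model) rather than the integrated one when the exports are in effect.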

  • Thanks for noticing that! Two underscores is more correct, but _declspec is working as expected: stackoverflow.com/questions/1399215/… Commented Aug 20, 2016 at 21:23
  • Hey, how could I use that in a C# project? (WPF) Commented Feb 16, 2017 at 10:09
  • Awesome, thanks. Didn't expect it to work, but it does, at least with Nvidia! Also, didn't know you can dllexport stuff from .exe. Commented May 8, 2017 at 20:07
  • Should these symbols be exported in the main executable (.exe) or can they be in one of its DLLs? Commented Feb 1, 2021 at 15:03
  • @RomanKhvostikov It has to be from the main executable, or from a library that is statically linked to it. DLLs do not work.
    – pvallet
    Commented May 5, 2021 at 17:16

Does it use NVidia dedicated graphics? AFAIK, the process of automatically switching from integrated to dedicated is based on application profiles. Your application is not in the driver's list of known 3D applications, and therefore the user has to manually switch to the dedicated GPU.

Try changing the executable name of your application to something the driver looks for. For example "Doom3.exe". If that works, then you've found your problem.

If that didn't help, try following the instructions on how to make the driver insert your application in its list of 3D apps:

https://nvidia.custhelp.com/app/answers/detail/a_id/2615/~/how-do-i-customize-optimus-profiles-and-settings

But the above is only for verifying that this is indeed your problem. For an actual solution, you should check with the graphics driver vendors (AMD and NVidia) on the best way to insert a profile for your application into their lists. NVidia provides NVAPI and AMD has ADL and AGS. They're definitely worth studying.
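
To give a feel for the NVAPI route, here is a hedged sketch of creating a driver profile with the DRS (driver settings) part of the API. The profile and executable names are placeholders, and I have deliberately left out the specific setting IDs that force the dedicated GPU; those, and the exact structure and versioning details, should be verified against the headers and documentation shipped with the NVAPI SDK:

#include <nvapi.h>   // from the NVAPI SDK; link against the matching nvapi lib
#include <cwchar>

// Sketch: register a placeholder "MyApp.exe" under a new driver profile.
// Error handling is trimmed to early returns for brevity.
bool CreateNvidiaProfile()
{
  if (NvAPI_Initialize() != NVAPI_OK)
    return false;

  NvDRSSessionHandle session;
  if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK)
    return false;
  NvAPI_DRS_LoadSettings(session);

  // Describe and create the profile.
  NVDRS_PROFILE profile = {};
  profile.version = NVDRS_PROFILE_VER;
  wcscpy_s(reinterpret_cast<wchar_t*>(profile.profileName),
           NVAPI_UNICODE_STRING_MAX, L"MyApp");

  NvDRSProfileHandle profileHandle;
  if (NvAPI_DRS_CreateProfile(session, &profile, &profileHandle) != NVAPI_OK)
  {
    NvAPI_DRS_DestroySession(session);
    return false;
  }

  // Attach the executable to the profile.
  NVDRS_APPLICATION app = {};
  app.version = NVDRS_APPLICATION_VER;
  wcscpy_s(reinterpret_cast<wchar_t*>(app.appName),
           NVAPI_UNICODE_STRING_MAX, L"MyApp.exe");
  NvAPI_DRS_CreateApplication(session, profileHandle, &app);

  // A real installer would also apply the GPU-preference settings here via
  // NvAPI_DRS_SetSetting, using the IDs from NvApiDriverSettings.h.

  NvAPI_DRS_SaveSettings(session);   // persist the new profile
  NvAPI_DRS_DestroySession(session);
  NvAPI_Unload();
  return true;
}

An installer would call something like this once at install time. As noted in the comments below, going through NVAPI rather than writing to the profile stores directly protects against the storage location changing between driver versions.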

  • The goal is to prevent the end user from having to add the application to the list of applications that use the dedicated graphics card. This could also occur on non-nvidia devices. I'll take a look at the link you sent. Is there a similar solution for ATI cards? Commented May 29, 2013 at 21:09
  • @ConnorHollis: The places where the application profiles are stored are well known. The straightforward solution is to have the installer add an application profile for the drivers of AMD and NVidia.
    – datenwolf
    Commented May 29, 2013 at 21:49
  • But you should use the Nvapi from Nvidia for creating the application profile instead of writing to those places yourself, as the location of that information has already changed in the past and might change again at any moment. Commented May 30, 2013 at 10:58
  • @ConnorHollis These APIs are definitely worth a look. I've put links to them in the answer.
    – Nikos C.
    Commented May 30, 2013 at 11:57
