
My system has the following specs:

GPUs:

  • AMD Radeon(TM) Graphics
  • NVIDIA GeForce RTX 2060 with Max-Q Design

CPU:

  • AMD Ryzen 9 4900HS with Radeon Graphics

When I run vkEnumeratePhysicalDevices() it only returns 1 device: my integrated graphics (AMD Radeon(TM) Graphics). I figured out that I can force it to use my discrete graphics card through the Windows graphics settings (Settings > System > Display > Graphics Settings, then browse to the built .exe for the Vulkan project and set its options to High Performance). However, when I run the Vulkan project now, it still only shows 1 GPU on the system; it has just switched to showing only the RTX 2060.

This is obviously not something one could expect a player with no coding experience to do (if I were ever to finish a game with Vulkan). I reasoned that this might be happening because I'm on a laptop, so I tried disabling all power-saving settings, but nothing changed.

So I'm wondering: is there a way to find all GPUs on the system outside of Vulkan and set the physical device from that list, since Vulkan seems unable to find all GPUs at once?

Edit: I figured out that I can get it to see the dedicated GPU if I turn off the integrated GPU in Device Manager, then turn it back on.

Also I am getting the error: validation layer: setupLoaderTrampPhysDevs: Failed during dispatch call of 'vkEnumeratePhysicalDevices' to lower layers or loader to get count.

Edit 2: I call vkEnumeratePhysicalDevices() like so:

    void pickPhysicalDevice() {
        // First call gets the number of devices; second call fills the list.
        uint32_t deviceCount = 0;
        vkEnumeratePhysicalDevices(instance, &deviceCount, nullptr);
        if (deviceCount == 0) {
            throw std::runtime_error("Failed to find GPUs with Vulkan support.");
        }

        std::vector<VkPhysicalDevice> devices(deviceCount);
        vkEnumeratePhysicalDevices(instance, &deviceCount, devices.data());

        std::cout << "Physical Devices (" << deviceCount << "):\n";

        for (const auto& device : devices) {
            // Get the device properties.
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(device, &props);

            // Print the device and whether it is suitable or not.
            bool suitable = isDeviceSuitable(device) &&
                            props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU;
            std::cout << props.deviceName << " -- "
                      << (suitable ? "suitable GPU" : "unsuitable GPU") << std::endl;

            if (suitable) {
                std::cout << "Device ^ accepted" << std::endl;
                physicalDevice = device;
                break;
            }
        }

        if (physicalDevice == VK_NULL_HANDLE) {
            throw std::runtime_error("Failed to find a suitable GPU.");
        }
    }

My base Vulkan code is modeled off of this website: https://vulkan-tutorial.com/Drawing_a_triangle/Setup/Physical_devices_and_queue_families

  • If Vulkan can't find the drivers for them, it certainly can't get physical devices for them, since in order to do that, it would have to find the drivers for them. Your question is circular: how can I make Vulkan find the things Vulkan can't find? Commented Jun 24, 2021 at 4:38
  • @NicolBolas But that's the thing, Vulkan can find both of them. It uses the Integrated Graphics by default, and uses the 2060 when I force it to. It just can't find both at the same time.
    – Aejjee
    Commented Jun 24, 2021 at 4:53
  • This may be related to the driver or even the OS implementation. My setup of Windows 10 with a GTX 1650 and UHD 630 can find both devices in two device groups (one group for each) and can create a device from there (using the LunarG Vulkan SDK). Also, if you are using Windows, DXGI has the functionality, though somewhat limited in sharing etc., to let you choose an adapter. It seems Linux limits the GPU to the X or Wayland session, and the only option I know of to switch GPUs within a login is using Bumblebee. Commented Jul 26, 2021 at 6:13
  • Also, the LUID from DXGI is guaranteed to be the same as the enumerated Vulkan physical device's LUID (as long as they are the same device on the system), so you can reliably work on the same device in both APIs. From the error you are getting, it is very likely that there's a driver problem or a broken SDK installation on your system. Checking vkconfig from the Vulkan SDK tools may be a good start. Commented Jul 26, 2021 at 6:15
  • @shangjiaxuan I've updated all of my drivers, and I've reinstalled the SDK but the same issue persists. The weirdest part again is that once I disable my integrated graphics card in Device Manager, it manages to find the dedicated GPU. It's like my laptop is hiding my dedicated GPU when the integrated GPU is available. EDIT: In vkconfig it doesn't show anything anomalous except for only finding my integrated GPU.
    – Aejjee
    Commented Jul 26, 2021 at 7:27

1 Answer


After much reading of basically every forum on the internet that contains the function name vkEnumeratePhysicalDevices(), and after the week-long bounty on this question still went unanswered, I've found the answer. It lies in the interop between the AMD and Nvidia drivers, at least for my laptop (ROG Zephyrus G14). The forum where the answer was hiding is here: https://github.com/KhronosGroup/Vulkan-Loader/issues/552

From what I understood of the discussion over there on GitHub, what is happening, as @pdaniel-nv describes, is that the implicit layers VK_LAYER_AMD_switchable_graphics and VK_LAYER_NV_optimus both want to pick high-performance GPUs, but each only filters for AMD or Nvidia GPUs respectively (VK_LAYER_AMD_switchable_graphics wants to use AMD dedicated GPUs and VK_LAYER_NV_optimus wants to use Nvidia dedicated GPUs). So when both layers are active, the AMD and Nvidia drivers filter out each other's GPUs, and the list of available GPUs ends up empty.

So basically the solution is to disable one or both of those layers, either through the Vulkan Configurator or, better yet, in your code. The only downside is that the developer now has to write custom code to determine which GPU is optimal for their program, which honestly is no big deal in my opinion.

  • Thank you for taking the time and writing this. Helped me solve the issue!
    – rozina
    Commented Oct 12, 2021 at 7:59
  • I was just starting on the 'read every forum everywhere' journey, but happily one of the first steps landed me here. Thanks! Commented Jan 11, 2022 at 16:15
