
When I bought my computer a year ago, I somehow managed to change some setting(s) to allow me to use both graphics cards at the same time. Recently the dedicated one flamed out, so I took both my computer and the graphics card to a repair shop to have the dedicated card fixed. When I got it back, that setting had somehow been reset -- I don't know what he did, but whatever setting I enabled a year ago is now disabled and I've spent hours to no avail trying to figure out how to re-enable it. Currently I'm only able to get signals from the ports on the dedicated card -- nothing hooked up to the onboard card is getting a signal anymore.

I have read a large number of similar questions, and they say to look for display settings in the BIOS. I'm using Asus's UEFI BIOS, though, so the layout, naming, and organization vary drastically from what those answers describe, and I haven't been able to find any sort of video settings in Advanced Mode at all.

On the OS side, everything looks fine. I used a screenshotting program and slid the capture canvas onto the second monitor (hooked up to the onboard card), and it captured part of that monitor's desktop, even though the monitor itself is black with no signal (orange power light). In the same screenshot I also included Device Manager and Screen Resolution:

http://i.imgur.com/sC449wT.jpg
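
For what it's worth, a quick way to double-check that the desktop really does extend onto the dead monitor, without a screenshot tool, is the little sketch below; it assumes nothing beyond a standard Python install and uses ctypes to ask Windows how many monitors it counts and how big the combined desktop is:

    # Sketch: query the virtual desktop via the Win32 API (standard library only).
    import ctypes

    user32 = ctypes.windll.user32

    SM_CMONITORS = 80        # number of display monitors on the desktop
    SM_XVIRTUALSCREEN = 76   # left edge of the virtual screen
    SM_YVIRTUALSCREEN = 77   # top edge of the virtual screen
    SM_CXVIRTUALSCREEN = 78  # width of the virtual screen
    SM_CYVIRTUALSCREEN = 79  # height of the virtual screen

    print("Monitors on desktop: %d" % user32.GetSystemMetrics(SM_CMONITORS))
    print("Virtual screen origin: (%d, %d)" % (
        user32.GetSystemMetrics(SM_XVIRTUALSCREEN),
        user32.GetSystemMetrics(SM_YVIRTUALSCREEN)))
    print("Virtual screen size: %d x %d" % (
        user32.GetSystemMetrics(SM_CXVIRTUALSCREEN),
        user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)))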

Here's a screenshot of DxDiag for both of the displays:

http://i.imgur.com/BXvkTio.png

It's also worth noting that the second (VGA) monitor wasn't showing up in Screen Resolution at first. When I clicked Detect, Windows told me it had found a VGA display but that there was some problem with it, and I forced it to connect anyway. I haven't been able to remove the monitor from Screen Resolution to repeat that and capture the exact message, though.

What can I enable in the UEFI BIOS or on my Windows 7 (64-bit) OS to restore functionality to the onboard card? Please let me know if you need any information that I may have left out -- I'll reply back with it as soon as possible.

Edit: On second thought, the OS might not be detecting the VGA monitor as well as I thought it was. The second screen in the screenshot is much smaller than it was when the monitor was connected previously, and the resolution options Screen Resolution offers for the VGA monitor aren't constrained to the modes it actually supports:

http://i.imgur.com/5XMOcyn.png
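
And for completeness, here's another rough sketch (same assumption of a stock Python install; nothing here is an official AMD or Asus tool) that lists every display device Windows reports and whether it is currently attached to the desktop, via the Win32 EnumDisplayDevices call:

    # Sketch: enumerate the display adapters/outputs that Windows reports.
    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

    user32 = ctypes.windll.user32
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print("%s: %s (attached=%s, primary=%s)"
              % (dev.DeviceName, dev.DeviceString, attached, primary))
        i += 1

My guess is that if the onboard card's outputs never show up in that list at all, the problem is upstream of Screen Resolution (drivers or a BIOS setting) rather than monitor detection.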

  • It appears as though the stock drivers for the new graphics card were installed. Have you grabbed the latest drivers for your onboard and dedicated cards? The BIOS might not have anything to do with it; most of the time that setting is just for 'primary display', not 'allow dual display'. If you can post the BIOS version as well, that would help determine whether there is indeed an option. You can press the Pause/Break key when your computer boots to the BIOS screen to pause it before it finishes loading, in case that information flashes by too quickly.
    – txtechhelp
    Commented Dec 23, 2015 at 7:30
  • The dedicated driver is up to date, and I believe the onboard chip uses the same driver. I have an Asus A55BM-E motherboard, and, from what I'm reading, the onboard chip is also a Radeon 7000 series like the dedicated one (which would explain why there's only one driver in the Display section of the Device Manager). The UEFI BIOS version is 2.10.1208.
    – Drew
    Commented Dec 25, 2015 at 2:02

2 Answers


You need to change the setting in the AMD equivalent of the NVIDIA Control Panel.

On NVIDIA it would be: right-click the desktop -> NVIDIA Control Panel -> Manage 3D settings -> Global Settings -> "Preferred graphics processor", and set it to the high-performance GPU.

This assumes that you have both drivers installed, for Intel Graphics and for AMD.

  • I'm not seeing that anywhere in AMD's settings: (1) The screenshot of the right window is from the "Global Settings" part of the Gaming tab. Here are screenshots of the tabs: (2), (3), (4). "System" is just a read-only list of system specs. Also, do I really need Intel drivers? As I mentioned in a comment in the OP, the onboard gpu is from AMD and uses the same driver as the dedicated card.
    – Drew
    Commented Jan 11, 2016 at 2:55
  • @EchoFive There is no such thing as hybrid AMD+AMD as far as I know. What laptop model is that? Could you give me a Device Manager screenshot (display adapters)? In most systems with hybrid graphics there is some kind of Intel HD Graphics, and that card proxies everything (even when the high-performance GPU is in use), so the AMD setting might be in the Intel Graphics panel. Commented Jan 11, 2016 at 9:32
  • A screenshot of the display adapters is in the OP (first image). It doesn't look like I have any Intel drivers installed (if what I require is an Intel driver, I guess that means the guy I took it to uninstalled it since it was working before?) Also, it's a desktop -- not a laptop. It's a custom build, but the motherboard model (which houses the onboard) is in the OP.
    – Drew
    Commented Jan 11, 2016 at 17:08
  • Oh sorry then, I thought it was a laptop... it works in a totally different way on desktops... just turn off your integrated GPU in the BIOS. Commented Jan 11, 2016 at 23:11
  • That was the right setting, but I had to change it from "auto" to "force" -- not disable it
    – Drew
    Commented Jan 14, 2016 at 3:09

This is an old question, but it's still applicable on many systems which may or may not share the characteristics of the OP's system.

As for installing the iGPU driver (Intel, AMD, etc.), I would temporarily remove the dGPU and boot up with a monitor connected to the motherboard's video output. Windows 7 will then download or unpack any missing drivers and install them.
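
If it helps, here's a rough sketch (mine, not an official tool; it just wraps the wmic utility that ships with Windows 7 in a bit of Python) to confirm which display adapters and driver versions Windows currently reports:

    # Sketch: list installed display adapters and driver versions via wmic.
    import subprocess

    output = subprocess.check_output(
        ["wmic", "path", "win32_videocontroller",
         "get", "Name,DriverVersion,Status"],
        universal_newlines=True,
    )
    print(output)

If only the dedicated card shows up in that output, the iGPU driver most likely still needs to be (re)installed before its outputs will do anything.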

