
Many motherboards marketed as "gaming" have integrated Intel graphics. Examples include the ASUS B150I PRO GAMING/WIFI/AURA and the Gigabyte GA-Z170N-Gaming 5, but these are just two of many. Note the word "Gaming" in their respective names.

Now, I understand that if you want to build a gaming PC, you would most likely opt for Nvidia or AMD. This is because integrated video does not have a chance to compare with higher-end Nvidia/AMD offerings. Correct me if I'm wrong.

I understand that putting integrated graphics on a motherboard increases its cost. So there must be a reason why manufacturers do this. It looks to me as if putting an integrated GPU on a gaming motherboard is the rule rather than the exception.

However, I cannot figure out what this integrated graphics is good for. Could you please explain what it can be used for (the intended use, I'm guessing, but any other possible uses too), given that for a gaming PC one is most likely to use a discrete GPU?

If you think any of my assumptions are wrong, please point that out; since the whole thing does not make much sense to me, it is quite likely that one of my assumptions is wrong somewhere.

  • 11
    A somewhat esoteric use would be if you're running Linux but want to game in a Windows virtual machine by giving it control of your graphics card. In that case, you need a second GPU to display the Linux OS (there is no way to share a GPU between the host and the VM), and this little built-in GPU becomes very useful (see the sketch after these comments). Commented May 29, 2016 at 15:38
  • 13
    To get right to the point (since the answers are correct but avoid the critical detail): those motherboards do NOT have any graphics processor integrated. All they have is a video output connector that lets you use the integrated graphics if you buy a CPU that includes it. But if you buy a CPU without integrated graphics (prior to Skylake, you could buy Xeon CPUs and use them with gaming motherboards, and on Xeon the iGPU is optional), the motherboard will not provide it.
    – Ben Voigt
    Commented May 29, 2016 at 17:41
  • 11
    "Gaming" is a meaningless marketing term and nothing more. You can buy non-gaming motherboards and have better performance for less money
    – Keltari
    Commented May 29, 2016 at 22:21
  • 2
    "Nvidia or Radeon" - these are not the same kind of thing; it should be Nvidia or AMD (company names) or GeForce or Radeon (GPU brand names). Commented May 31, 2016 at 11:09
  • 3
    @Paparazzi real gamers don't fall for the hype. We make fun of those who do
    – Keltari
    Commented May 31, 2016 at 15:06
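
As a rough illustration of the pass-through scenario from the first comment above (a sketch only, not taken from any of the answers below): on a Linux host with the IOMMU enabled, the usual first step before handing the discrete card to a VM via VFIO is to check that it sits in its own IOMMU group, while the integrated GPU keeps driving the host's display. The sysfs path and kernel option used below are standard Linux conventions, not anything specific to the boards discussed here.

    #!/usr/bin/env python3
    # Sketch only: list IOMMU groups on a Linux host so you can check whether the
    # discrete GPU sits in its own group and can be handed to a VM via VFIO,
    # while the CPU's integrated GPU keeps driving the host display.
    # Assumes the IOMMU is enabled (e.g. intel_iommu=on on the kernel command line).
    from pathlib import Path

    IOMMU_ROOT = Path("/sys/kernel/iommu_groups")

    def list_iommu_groups() -> None:
        if not IOMMU_ROOT.is_dir():
            print("No IOMMU groups found - is the IOMMU enabled in firmware and kernel?")
            return
        for group in sorted(IOMMU_ROOT.iterdir(), key=lambda p: int(p.name)):
            for dev in (group / "devices").iterdir():
                # dev.name is a PCI address such as 0000:01:00.0 (often the discrete GPU)
                print(f"IOMMU group {group.name}: {dev.name}")

    if __name__ == "__main__":
        list_iommu_groups()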

5 Answers

16

There are a few wrong assumptions here, and they have led you to wrong conclusions:

Many motherboards marketed as "gaming" have integrated Intel graphics.

The graphics processor is on the CPU. Intel made this decision, not the motherboard maker. When buying Intel, the GPU cannot be avoided.

I understand that putting integrated graphics on a motherboard increases its cost.

It depends on what you're looking at. If you look at the price of the chips alone, the costs are not groundbreaking. On the LGA775 platform the GPU was integrated into the chipset, so some chipsets had an integrated GPU while others genuinely lacked that processing power. However, the low-end chipsets with a GPU (e.g. G41) were actually cheaper than the high-end chipsets without one (e.g. P45). So we can conclude that while an integrated GPU must increase the price of a chip, the increase is not enough to justify the cost of maintaining two lines of chips, with and without. This is probably why Intel decided to put a GPU on every single consumer CPU.

Now, since the GPU is already on the silicon, we can consider the costs the motherboard designer actually controls. To make the GPU usable, the designer adds the connectors (probably the most expensive part of implementing onboard video these days), the traces, and a handful of dirt-cheap passive components like tiny resistors and capacitors. Those costs are still negligible. On the lowest-end budget motherboards, shaving off a few dollars would probably at least get serious consideration, but on a high-end motherboard that is already expensive, any possible savings are negligible.

This is because integrated video does not have a chance to compare with higher-end Nvidia/Radeon offerings.

I cannot really call you wrong on this one: with the high end they can't compare. However, the old wives' tale that integrated GPUs are useless isn't true anymore! There are two desktop Intel processors (LGA1150 Broadwell: the Core i5-5675C and Core i7-5775C) with integrated Iris Pro Graphics 6200, which was a shock when it was released in Q2 2015. Its performance is comparable to a low-end discrete GPU, so it can actually be used to play most games at lower detail settings. If you're a gamer on a tight power or space budget (e.g. a console-sized living-room PC), I believe this would be the way to go. This integrated GPU was probably quite expensive, which is why it's only seen on a $276 CPU.

There is also an elephant in the room here. I believe you've assumed that "gaming" means "top performance". Well, it does not. It's simply a marketing strategy. Nobody can really say what the "gaming" label means, except that it implies aggressive styling and a higher price tag; basically, a premium product. So, when in doubt, just add every feature you can and you'll have one more point on the feature list. Like pretty RGB lights that most users will lock up in a case shoved under the desk, never to be seen again, or shiny metal over the PCIe slot that does nothing but look cool. (Seriously, lights? How are they in ANY way useful for gaming? I can't believe you questioned the integrated GPU while there are lights on the mobo!)

  • 2
    There are mobos with integrated graphics, not just processors. Commented May 29, 2016 at 20:53
  • 1
    "When buying Intel, the GPU cannot be avoided." Wrong! If you buy a multi-thousand-dollar Intel CPU, chances are the die space used for the GPU is instead spent on giving the CPU more transistors, which means more throughput. Just check it on ark.intel.com (this includes the latest gen). However, I'm not sure the video output of the motherboard would work. Commented May 30, 2016 at 0:01
  • And as far as "gaming" goes, it just makes for a better question in my opinion. I would rather ask about "gaming" boards, explicitly indicate in the very first sentence of the question that I understand it is a marketing term, and put the word gaming in quotation marks, than try to come up with a more awkward definition of the class of motherboard I'm asking about. Apparently even then people feel compelled to point out that it's marketing. Thank you, I already know that =) Commented May 30, 2016 at 8:20
  • 1
    "he adds the connectors [...] and a handful of dirt-cheap passive components like those tiny resistors and capacitors. Those costs are still negligible." I suspect a lot of people have little idea how negligable these costs are. Note that if you contract a chinese PCB manufacturer, they will usually throw in as many surface-mount resistors/transistors/etc as your design requires. They just don't bother charging for them. Even in the UK in small volumes, those components cost significantly less than a penny each. In small volumes, HDMI connectors are < 30p each, delivered from China.
    – Jules
    Commented May 30, 2016 at 13:16
  • 1
    To a motherboard manufacturer, dealing in large bulk purchases of components, I would doubt that the extra cost of adding an HDMI port to a motherboard amounts to more than about 20p. The only reason they might choose not to do it is if they're struggling to make enough space for other components, particularly connectors that need to contend for precious space on the edge of the board.
    – Jules
    Commented May 30, 2016 at 13:17
45

There are a few reasons. Firstly, nearly every single modern mainstream1 processor has an integrated on-die GPU, and the chipset supports it. Essentially your only cost is the traces and connectors, so it's a 'free' feature you can design in, unlike in older designs. Interestingly, many of the Sandy and Ivy Bridge-era Intel chipsets outside the Z series made you pick one or the other (H series) or didn't have onboard video at all (P series). Many earlier processor families used a PCIe 'slot' for an onboard chip, but most integrated graphics today is on-die.

Modern integrated GPUs do neat stuff like Quick Sync, which means that even with a discrete card the iGPU part of your CPU can be doing useful work. With earlier drivers you needed a display (or a dummy display), but you can now set up Quick Sync to work without one for faster transcodes or video playback. I'm sure AMD has something similar on their APUs (I've not used them recently); their integrated graphics are somewhat more powerful than Intel's, and paired with a discrete Radeon they might do switchable graphics to save power.
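
For example, here is a rough sketch of offloading a transcode onto the iGPU's Quick Sync encoder from a script (this assumes an ffmpeg build with QSV support is installed; the file names are placeholders, not anything from the question):

    # Sketch only: run an H.264 transcode on the Intel iGPU via Quick Sync (QSV),
    # leaving the discrete card free for games. Assumes ffmpeg was built with QSV support.
    import subprocess

    def qsv_transcode(src: str, dst: str, bitrate: str = "4M") -> None:
        subprocess.run(
            [
                "ffmpeg",
                "-i", src,
                "-c:v", "h264_qsv",   # hardware H.264 encoder on the integrated GPU
                "-b:v", bitrate,
                "-c:a", "copy",       # pass the audio stream through untouched
                dst,
            ],
            check=True,
        )

    qsv_transcode("gameplay_raw.mkv", "gameplay_qsv.mp4")  # placeholder file names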

It's also handy if your main video card has blown and you don't have a spare. I genuinely found this useful with my last PC, which had a GPU failure. Sure, you can replace the card, but it is well worth being able to confirm the diagnosis just by pulling out the old card and changing the output the monitor is plugged into.

So, in short: "All the pricy stuff is already there and Intel insists, so why not add a cheap feature?"

1 I'd consider most Intel LGA 115x processors and AMD APUs to be mainstream. The AMD FX series and Intel LGA 2011 are enthusiast-focused, though the FX series kind of overlaps with Intel's mainstream products on price. AMD fans may disagree.

As of 2018, things get a bit more complex. Intel's Core i3s and i5s are solidly mainstream. The i7 and i9 badges cover both mainstream and server-inspired models. As for AMD, Ryzen is the mainstream line and Threadripper is the enthusiast one.

  • 5
    AMD's cheaper processors are APUs. I suppose AMD's gaming/performance-oriented FX processors don't have on-die GPUs...
    – Journeyman Geek
    Commented May 29, 2016 at 9:09
  • 9
    The third paragraph happened to me last year. The GPU failed and I just had to change the HDMI cable from the GPU to the motherboard (Intel CPU graphics) to be able to use the PC until I bought a new GPU.
    – Edu
    Commented May 29, 2016 at 11:21
  • 2
    The difficulty of finding a mid-range CPU without integrated graphics is a bit beside the point. Even if only 25% of CPUs had an iGPU, it would still make sense for the motherboard manufacturer to add the output connector, allowing you to use the iGPU if you chose to buy one.
    – Ben Voigt
    Commented May 29, 2016 at 17:37
  • 17
    Integrated devices are so nice. I'm dating myself, but I remember when motherboards had a CPU slot, memory slots, and ISA slots, and that was it. You had to buy serial/parallel cards, video cards, and even IDE HDD cards separately. You kids today are spoiled with integrated sound cards, network cards, GPUs, SATA controllers, USB, etc. Plus we had to hand-crank the machines for power and walk uphill both ways to school in the snow, wearing newspaper for shoes.
    – Keltari
    Commented May 29, 2016 at 22:26
  • 6
    @Keltari -- too true. And those IDE cards were so much cheaper than the MFM/RLL controller cards they replaced... but don't forget the slot for the 8087: what folks today are really spoiled by is the fact that they get floating point for free when they buy a processor, rather than having to get another one just for that purpose... :)
    – Jules
    Commented May 30, 2016 at 13:07
5

Could you please explain what it can be used for (the intended use, I'm guessing, but any other possible uses too), given that for a gaming PC one is most likely to use a discrete GPU?

There are two uses I can think of for integrated video in enthusiast hardware:

  1. It can drive an additional monitor. Have one or two monitors driven by the expensive PCIe GPU. Use those for your games that demand performance. Drive an extra monitor off the integrated video and use that for email and web browsing. Modern high-end cards tend to be able to drive more and more monitors, but I still think this is a valid point.

    My primary system has a monitor I use to keep up a web browser while gaming. It is great for having a wiki, forum, or Arqade up with information I can use in the game. My GPU has enough outputs that I can drive it off the primary video. If it did not, I would not hesitate to plug it into the integrated video connector.

  2. Troubleshooting if your GPU is damaged. You still have backup video to use your system while troubleshooting your primary video, or while ordering a replacement and waiting for it to arrive.

  • Good points, especially number 1. I had not considered that, because I usually use graphics cards that can support two monitors, but if you don't have such a card or want a third monitor, it is certainly a possible use. Commented May 30, 2016 at 8:23
  • Re the additional monitor: it's been a while since I bought a new computer with an Intel chip, but the last AMD machine I bought had an integrated GPU that the motherboard's chipset couldn't enable if I wanted to use a discrete graphics card as well. I presume Intel systems (and/or more recent AMD ones... this was maybe 4 years ago now) have removed that restriction?
    – Jules
    Commented May 30, 2016 at 13:21
0

I have a high-end Nvidia card, and at some point the boot loader commonly used on a lot of utility discs stopped working with it. So I try to boot a clone disc for backing up, and I can't see the boot menu! Likewise with newer copies of UBCD and any "live" CD. Even Windows could not be installed, nor the Windows 10 update run, using the normal video card. I have to re-enable the built-in Intel video and plug a monitor into it in order to do this stuff.

So, I'm glad I spent $10 more and got the CPU with the integrated GPU, in this case.

  • 1
    I think this is wrong. The whole reason you could not see anything on screen is exactly that you have the integrated graphics and the output is redirected there. If you did not have the integrated graphics, the output would appear on the discrete card, because it has nowhere else to go, and you would not have the problem. Commented May 30, 2016 at 8:25
  • 1
    No, the built-in video can be disabled in the boot settings (not just left on auto which detects what's plugged in), and some boot discs (that stay in text mode) work fine.
    – JDługosz
    Commented May 30, 2016 at 8:33
0

Sometimes when you upgrade your graphics card, the new card's driver software might not match, so you have to use the motherboard's graphics output to start your PC and get the driver.

  • I've never had a situation where my video card couldn't downgrade to standard SVGA mode so I could get Windows to boot and download a new driver...
    – Jules
    Commented May 30, 2016 at 13:25
    I had such a situation last week. I tried to install an Nvidia driver update, and got a driver that does not work under Windows Vista - it would not even allow the computer to boot, so there was no way to request such a downgrade. I finally found the installation DVDs, booted from one of them instead of from the hard drive, and found that a certain repair program CLAIMED to do nothing useful, but actually restored the ability to boot from the hard drive. If you are running an Nvidia-based graphics board, don't even TRY to install a driver newer than 365.19.
    – milesrf
    Commented May 31, 2016 at 20:27
