fir

no gpu onboard ?


(I do not know much about these things, so I am asking.)

As far as I know, PCs are sold here without a GPU (?), where the CPU alone handles the graphics. Do many people use this option? Can games be written to run on such machines, without a GPU?

(Personally, as a consumer, I would like not to use too much energy, and I might want to buy a machine without a GPU at all; I am not sure.)

Edited by fir


There are no modern PCs sold without a GPU. Every computer available to the average consumer needs a graphics unit in order to function (even if just in VGA mode). What you are seeing is PCs sold without a discrete graphics card, in which case the GPU is just a little chipset soldered onto the motherboard or, more recently, integrated directly into the CPU itself (so that buying the processor gets you the graphics hardware as well, if your motherboard supports it and you aren't looking to use a discrete card instead).

 

Games and applications will work exactly the same, as the drivers for those GPUs still implement DirectX/OpenGL/etc. Perhaps not every feature, and certainly not at the same speed as a full-fledged graphics card, but, in short, you don't need any special code to write applications for them; the HAL (hardware abstraction layer) takes care of everything (barring driver bugs, anyway).

 

Many people use this option because they find integrated graphics sufficient for their needs. They won't be playing the latest Crysis on it, but for a lot of people (for instance, those who only browse the internet and play light games, or office desktops) that doesn't matter at all, so they use it because it's cheaper and uses less power.

 

For instance, the CPU I am using right now (a Sandy Bridge i5) has an integrated graphics unit, but I have never used it, as I already have a discrete graphics card and, in any case, my motherboard does not support integrated graphics (you'll find it's often either/or: few desktop motherboards let you switch between both, though laptops are usually designed to switch at will in order to save battery life).

 

EDIT: just to clarify, even if you are running a headless server you do need a graphics unit. Your computer will *not* boot without one (so if you are running headless, integrated graphics is actually ideal, as it takes no space and gets the job (no job) done, unless for some reason you want to do GPU computation on your server). For actually using a screen, the above applies as well.

Edited by Bacterius


Alright, thanks for the answer. But one more question: what is the difference between such a GPU-in-CPU (which seems to be the more modern approach, from the i3/i5/i7 era) and the integrated GPU I got, for example, with a Pentium 4 machine? Is the newest integrated hardware more powerful than the old integrated hardware? What is the difference?

As for games, most of them will work but slowly, or they will not work at all. Can the power of such GPUs be compared? For example, are discrete GPUs 5x or 10x faster?


 


It would be hard to compare the difference exactly, since there are so many graphics cards and integrated chips. As Bacterius said, an actual graphics card will almost always outperform the integrated one in both speed and capability. You would also expect integrated graphics to get better over time (though maybe not dramatically, since most people who need a powerful graphics card will just buy one rather than use the integrated chip). And again, as Bacterius said, integrated graphics can do what graphics cards do, but only to an extent.

Are you currently using an integrated chip, and trying to see whether a bottleneck in your program comes from the hardware or from your code? It seems like that may be the case. If so, you should ask about specifics and google the specs of the cards you want to compare, to see how your program would perform on more powerful hardware.


 

 


No, it is about the fact that I dislike 1) high power consumption, 2) another cooling fan, 3) the nvidia/ati differences (which make development harder), and 4) games with high graphics power requirements that are made without much thought (that is, I like 3D games, but not necessarily the heaviest ones).

So I am trying to decide what attitude to take toward integrated GPUs, which seem closer to the hardware I would like to like (and whether to consider them as the main platform to develop on, if I were to go into 3D development, which I am not doing now).

For example, I am trying to find out whether such an in-CPU GPU could be used as a least common denominator I could write for, so I would not have to bother with the newest nvidia and ati hardware.



For example, are discrete GPUs 5x or 10x faster?

Goes like this:

Intel: hit or miss. Often not properly documented in spec sheets. HD4000 (found on some Ivy Bridge chips) is the minimum considered decent. Haswell chips have a "Crystalwell" option with 128 MiB of L4 cache; this is the fastest integrated option and the most expensive, marketed as "Iris Pro" if memory serves.

Intel graphics is not really an option for gamers. HD4000 is the turning point, as it's more or less as powerful as a card worth 50 bucks. Crystalwell is a real game-changer; I'd say it's as fast as an 80-buck stand-alone card.

AMD's integrated graphics are better (Crystalwell being the exception). Worth, at best, 65-70 bucks.

In the end, they all get trashed by cards worth 100 bucks.


 



But what about energy consumption and noise? 100 bucks is not too much, but if it brings more power consumption and noise, I would not necessarily want it. Worse is a low amount of video memory: could it somehow be emulated, or does a game that needs more VRAM simply have no way to run on a card with less?

Edited by fir


Energy consumption is the big pro for integrated adapters. They can decrease whole-system consumption by a good 10% (and I'm being very conservative). Noise is harder to pin down, but I can say CPU-integrated adapters are the winners (albeit I can give you no numbers).
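As a rough sanity check on that figure, here is the arithmetic with illustrative wattages (the numbers below are assumptions for the sketch, not measurements):

```python
# Illustrative idle power draw, in watts (assumed values, not measurements).
system_without_dgpu = 60.0  # desktop running on CPU-integrated graphics only
dgpu_idle_draw = 10.0       # a modest discrete card idling alongside it

# Share of total system draw attributable to the discrete card.
saving = dgpu_idle_draw / (system_without_dgpu + dgpu_idle_draw)
print(f"dropping the discrete card saves ~{saving:.0%} at idle")  # ~14%
```

Under load the gap is much larger, since a gaming card can draw well over 100 W by itself.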

 

Integrated chips almost always (99.9% of the time) use system RAM. You have it or you don't. The same goes for discrete cards, except their dedicated memory sits on top of system RAM. In the end, if a game uses more VRAM than is available, performance will suffer. A lot. If you also run out of system RAM, performance will slow to a crawl.

Just halving the texture resolution is often enough to restore decent performance.
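To see why halving texture resolution helps so much, here is a back-of-the-envelope memory calculation (assuming uncompressed 32-bit RGBA textures with full mipmap chains; real engines usually use compressed formats, so these are illustrative figures only):

```python
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate GPU memory used by one texture.

    A full mipmap chain adds roughly one third on top of the base
    level (1/4 + 1/16 + ... converges to 1/3).
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

full = texture_bytes(2048, 2048)  # base level alone is 16 MiB
half = texture_bytes(1024, 1024)

print(f"2048x2048: {full / 2**20:.1f} MiB")  # ~21.3 MiB
print(f"1024x1024: {half / 2**20:.1f} MiB")  # ~5.3 MiB
```

Halving each dimension cuts the memory to roughly a quarter, which is why it is such an effective lever on VRAM-starved hardware.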

 

In general, however, you either need the performance or you don't. If you do, then integrated is not an option, and likely won't be for another couple of years, as they all sit on DDR3 (albeit Crystalwell might work around that issue).

Edited by Krohm




 

It sounds like you want to work on console games.

 

http://www.wired.com/gadgetlab/2013/05/xbox-one-development-photos/#slideid-138498
