So I was wondering: is any GPU better than an iGPU, or is an iGPU not that bad?
Depends on the GPU.
Not all GPUs are created equal, and there are older GPUs out there still in use.
On the Intel side, you have HD 630 graphics, common on i3 through i7 CPUs. Performance-wise, while better than you might expect, it was not specifically designed for gaming. In a modern GPU lineup, a GT 1030 / GTX 750 class of card will perform significantly better than HD graphics. To find an equivalent to HD graphics in the discrete GPU world, you might have to roll back 10 or so years. I suspect the HD 630 would trade blows with an old GeForce 8800 GT, although I still would not bet on it. I had an 8800 back in the day, and it would game at 720p and even run some stuff at 1080p. I recall it even ran Tomb Raider (2013) fairly smoothly at 720p (not 60 fps, but around console-level 30 fps). Intel's HD 630 gives a choppier experience than I remember; the game was barely playable. Point is, the 8800 GT is a very old card, probably 11 or 12 years old, a dinosaur, and arguably it would still outperform, or at least offer similar frame rates to, Intel's integrated GPU. Benchmark numbers would likely show the HD 630 beating a lot of old GPUs, but there is more to rendering than teraflops, or should I say with these chips, gigaflops...
On the AMD side, the APUs in the 2200G and 2400G are hugely stronger than Intel's HD 630. You can actually play modern titles on these, although even a weak GPU like a GTX 950/1050 still smokes these chips, quite significantly I must add. These AMD chips are great for miniature builds: mini PCs with mini cases like the InWin Chopin. Builds like that are meant for light gaming, emulation, and streaming; a good lounge PC, but not a console replacement. You are still better off with a dual core and a GTX 750 Ti than with any of these AMD APUs.
Only consider integrated graphics if you have a laptop and like to mess around with weaker or indie titles on the road, stuff like Axiom Verge and so on. Older titles run well on integrated graphics: games from around 2010 and earlier tend to play at close to full speed at 720p. Fallout: New Vegas or Fallout 3, for instance, run completely fine on integrated graphics, and The Witcher 2 is playable as well. Newer titles can be made "borderline" playable at super low settings, 720p or lower, but there is no point really; Fallout 4 on the HD 630, for instance, is hardly worth it. AMD's Vega 8 and Vega 11 can play modern titles no problem, though, with appropriate settings tweaks. If you have to rely on integrated graphics, go AMD; there is no competition from Intel here. Ideally, even if the budget is tight, you get a dedicated GPU. Even a weak one like an RX 560 or GTX 950/1050 will run every single game out there at medium settings, 30 fps or better, and at 1080p for the vast majority of them.
Put it this way: even on an extremely tight budget, you would be better off dropping down a tier on the CPU and adding a GTX 750 than relying on the integrated graphics of a better CPU. For gaming, a dedicated GPU is a must. A dual-core Celeron with a GTX 750 would perform better on average in gaming than a six-core i7 relying on the HD 630 alone.
thx so much!
Most say the GT 1030 is slightly better than, or about the same as, the 2400G APU. If you run faster memory like DDR4-3200 and overclock it some, the 2400G does well for what it is, but you are not going to play AAA titles well. Some go with it and figure they will upgrade later, maybe adding an RX 580, or a new CPU and an even bigger GPU, because the 2400G will bottleneck today's larger GPUs. I'm going with a 2400G since I have old games around; I'll upgrade later. There are plenty of benchmarks for the 2400G/2200G on the net. The 2200G has a slightly weaker iGPU but is certainly cheaper, so it has better price/performance. A common cheap build pairs a 2200G with an RX 570/580 GPU, since it is a 4-core CPU; the 2400G is 4 cores with 8 threads and has a slightly better iGPU. Usually one of them shows up in the build guides. So no, the 2400G is not good for a serious gamer, but it can stream and play lesser games.
thx for telling me!