Intel says mainstream gamers don't need a discrete graphics card
They have a long way to go, but I was watching some YouTube videos of integrated graphics running modern games — the FPS was awful, but higher than I imagined it would be.
The problem is that discrete cards haven't seen a node shrink in 4+ years. Pascal and Polaris should obliterate integrated graphics this year.
Once Intel gets hold of the stacked VRAM technology (HBM) that AMD and Nvidia are using in their upcoming GPUs, things are going to look a lot better. That said, no integrated chip will ever outperform a high-end dedicated graphics card.
Integrated graphics have always been "catching up"... what these reviewers haven't noticed is that they have never actually caught up ;) Lazy writing.
Thing is, CPUs are quite powerful now, and most users don't need anything beyond a decent quad core. Now that those cores can be shrunk down, there's more room on the die for an iGPU. For most users and gamers this means the same CPU performance on the chip but a bigger GPU, rather than more CPU cores. Obviously, Intel will always offer CPUs that don't focus on iGPUs. The Intel HD 4600 held up very well before I grabbed a video card. I'm pretty excited for laptops, since the cheap models are inching closer and closer to low temperatures with good performance. Integrated will never be as good as a dedicated card, but matching a low-end and possibly mid-range card in the future? Feasible. I'm sure it will have a price to show for it, though.