
Vista SP2 In April? Windows 7 Accelerated CPU Graphics?

Gordon Kelly



Well, it seems to be speculation time again with all things Redmond.

Two rumours are hitting the Web today, and I'm classing them as hmmmn and hmmmmmn. Hmmmn first...

Windows Vista SP2 is allegedly being readied for an April 2009 release, according to TechARP, with a release candidate as soon as February. I'd add more m's to this talk, but TechARP did get the scoops on Vista SP1 and Windows XP SP3, so it has a decent track record, and with just eight weeks to go until February (scary, right?) we should get to the bottom of this chitter-chatter pretty quickly.

Next up, Hmmmmmn: this is because Microsoft is claiming it will introduce a new system of running graphics acceleration on the CPU with Windows 7, theoretically allowing cheaper machines to still get all the eye candy of the new OS. Dubbed 'WARP' (Windows Advanced Rasterization Platform), it essentially enables DirectX 10 and 10.1 to be run off the CPU, and Microsoft claims it renders discrete graphics redundant on such machines. The company even pulls out some early benchmarks, showing Crysis hitting 7.36fps at 800 x 600 pixels on an eight-core Core i7 system using WARP, compared to 5.17fps on Intel DirectX integrated graphics.
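For the curious, WARP isn't a separate API at all: it plugs into the standard Direct3D device-creation path, and an application simply asks for the WARP driver type instead of a hardware adapter. A minimal sketch of what that looks like, assuming the Direct3D 11 runtime (which hosts WARP's 10/10.1 feature levels) is installed:

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    // WARP exposes the DirectX 10 / 10.1 feature levels entirely in software
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    got;

    // Request the WARP rasterizer instead of a hardware adapter
    HRESULT hr = D3D11CreateDevice(
        nullptr,                 // no adapter: WARP runs on the CPU
        D3D_DRIVER_TYPE_WARP,    // the software rasterizer the article describes
        nullptr, 0,
        wanted, 2,
        D3D11_SDK_VERSION,
        &device, &got, &context);

    if (SUCCEEDED(hr)) {
        // ...from here, render exactly as you would on a GPU...
        context->Release();
        device->Release();
    }
    return 0;
}
```

Because the switch happens at device creation, existing Direct3D code needs no other changes to fall back to WARP, which is presumably how Microsoft intends cheaper Windows 7 machines to pick it up transparently.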

Now, while dedicated GPUs will hardly be quaking in their boots, this could prove something of a threat to graphics on lower-end systems, especially with technologies like Larrabee on the horizon. That said, we'll wait for further development, and for news on how battery life fares, before we pin too much hope on this.

OK, to get with the Festive Spirit I'm reducing my cynicism to 'Hmmn' and 'Hmmmmn'... happy?


SP2 via TechARP

WARP via Custom PC


December 2, 2008, 2:13 pm

As long as SP2 fixes the problem with the way Vista handles data I'll be happy. I defragged my new laptop at the weekend and it took over 6 hours! It would have taken around 10-15 mins under XP. Also copying data to and from Vista machines is a pain and deleting large volumes is even more time consuming. How on earth did this happen? :-\


December 2, 2008, 3:08 pm

CPU accelerated graphics has got to be the most boring innovation for a long time. I can't wait for GPU accelerated programs! Where are they?


December 2, 2008, 3:29 pm

...Or how about a fast, stable, small-footprint operating system too...oh hang on, they've already got XP!


December 2, 2008, 3:48 pm

@steve, Also copying data to and from Vista machines is a pain and deleting large volumes is even more time consuming. How on earth did this happen? :-\

One word: DRM. :) I've got the same problem; copying data onto even a USB stick gets stupid rates. grrrr. Come on M$, copying/deleting/moving files is meant to be one of the main areas of the OS. And talking about defrag, can't they sort out NTFS to look after itself? A very low-priority thread keeping the file system in prime condition isn't that much to ask, and not these stupid scheduled defrags either, that decide to kick in while playing CS. :)
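As it happens, Vista's automatic defrag is just a Task Scheduler job, so anyone bitten by mid-game defrag runs can inspect or switch it off from an elevated command prompt. A sketch, assuming Vista's default task path:

```shell
rem Check whether the built-in scheduled defrag task is enabled
schtasks /Query /TN "\Microsoft\Windows\Defrag\ScheduledDefrag"

rem Disable it so defrag only runs when launched manually
schtasks /Change /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /Disable

rem Then run an analysis pass by hand when convenient (Vista defrag.exe syntax)
defrag C: -a -v
```

Disabling the task doesn't touch NTFS itself; it only stops the background runs, so the volume will slowly fragment unless defrag is run manually now and then.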


December 2, 2008, 4:51 pm

Wasn't Vista meant to ship with a new file system? I'm sure it got canned.

But copying and moving data in Vista is absolutely ridiculous.


December 2, 2008, 5:55 pm

Offloading graphics to the CPU? That's software graphics acceleration - what a brilliant and unique innovation!

Oh wait, it's 2008, not 1988.

I read a benchmark figure about WARP the other day, and DX10 rendering on a high end Core 2 Quad was able to push about 7fps on Crysis at only 800x600... sure, no sane man is going to be running Crysis via software rendering, but it shows just how slow and inefficient this concept is in comparison to utilising even a low-mid range contemporary GPU.

GPUs are seriously cheap and powerful these days, and even the integrated laptop chipsets are decent enough. It makes absolutely no sense to push everything onto the CPU, because it'll just max out your CPU cycles rendering objects that are far more trivial for specialised hardware, the GPU, to manage.

To me, this is just another sign of the Wintel monopoly rearing its head. Intel have a bug up their collective exhaust pipes about having their CPUs eat up the GPU market (gee, I wonder why...), but right now with the technology we have, there are very few benefits for the end user in this approach, and a lot of drawbacks. It's a step backwards if anything.

@ Steve - yes, WinFS. It was one of the few really interesting features announced during the early Longhorn development days, and it got dropped pretty fast when they realised that it would actually involve some serious work. There's still no mention of it appearing for Windows 7 as far as I'm aware either. It's a real shame, because NTFS is, quite frankly, complete pants.


December 2, 2008, 7:02 pm

I can't say I've noticed any problems with file transfer rates. I don't transfer a lot of data, but Vista's certainly perfectly capable of maxing out my USB drive's transfer rate.

@life: Eh? Intel are going to be the ones most threatened by this, as they have the biggest share of the integrated graphics market. Right now they're selling both the CPU and GPU for a system; Microsoft could be removing the need for those Intel GPUs.

This development is one where the devil is in the details. We may find out that the CPU is only faster than older GPUs, in which case it's not a threat to anyone and, assuming the CPU requirements are fairly low, will help more people run Seven. This is supported by the fact that the graphics tested must have been an X3100 or even a GMA 950, since the X4500 will get more than that in Crysis. http://www.extremetech.com/...


December 2, 2008, 10:43 pm

WinFS is not a file system, but a means of accessing the data through libraries. It was however indeed dropped for Vista, but IS in 7 (it is in the betas now).


December 3, 2008, 12:59 pm

@ Xiphias - Look over Intel's horizon at products such as Larrabee and Ibex Peak (and their prior purchase of Havok): they are attempting to shift more graphics processing tasks away from full-blown nVidia/AMD-style GPU card implementations and closer to their core (pun definitely intended) market, i.e. the CPU. If you have been along to any of their conferences in the last couple of years, you can see exactly where their focus is here.

Yes, Intel currently sell IGPs like the GMA family that crop up on every other laptop and budget mobo, but this is not their bread and butter, and they do not compete with nVidia/AMD in the mid-to-higher end of the spectrum. It's certainly an important area for them, but it is all in support of their processor business rather than in competition with it.

More than just Intel, zoom out a tad and take a peek at the wider industry. nVidia are moving along quickly with CUDA and PhysX, and AMD/ATI are peddling (Fire)Stream... these companies are intentionally bringing a lot more emphasis onto the GPU at the expense of the CPU. This development here seems very much like a geared response to attempt to prevent this potential evolution of the GPU from taking on tasks (and thus, product value) that have traditionally been within the processor domain.

I think AMD smelled this coming some time ago, or at least saw the potential was there, which is no doubt why they felt the need to take the risk on the ATI acquisition. Arguably, that has yet to produce the results they were looking for, but you have to wonder what position they would be in for this impending future if they had stayed outside of the graphics arena.

As for Microsoft and Intel, they have historically had a very clear symbiotic relationship, though this will always be superficially played down due to their phobia of (more!) anti-trust lawyers hounding them. Both are certainly individual companies, but, particularly where MSFT is concerned, it's a strong case of "better the devil you know".
