Thursday, December 15, 2005

Microsoft removes the graphics subsystem from the kernel in Windows Vista, and the relationship to OS X Tiger's Quartz 2D Extreme

A linked story states that Microsoft is going to remove the graphics subsystem from the Windows kernel and move it back into user mode, according to Microsoft infrastructure architect Giovanni Marchetti. I guess I must have missed this news, but it's huge. Here are some links I got from Google:

Vista graphics drivers to be more stable than XP drivers, ATI says - This article actually speaks to the driver model going to user mode.
Graphics Hardware and Drivers for Windows Vista - Really no insight, but this is the top hit for Longhorn Display Driver Model.
Windows Longhorn Display Driver Model - Details and Requirements (PowerPoint) - Look at slides 7 and 8 for the comparison between XP and Vista. This has a lot of information about exactly what MS is trying to achieve and what the benefits are.

Don't get confused, though: this isn't a reversal of moving the graphics subsystem into the kernel in Windows NT 4.0. That was done, if I remember correctly, to increase the performance of GDI over Windows NT 3.51, since the number of context switches needed to get graphics work done was so costly that running heavy video apps or games was not practical. This is the now-infamous move of Win32k.sys. Windows XP maintained this model, offloading a bit of work to the graphics card through GDI+, but this was just a band-aid: GDI was showing its age and no longer fits modern graphics hardware or the way applications are being used. Don't misunderstand, I am not a kernel engineer, but the trials and tribulations of this architecture have always been interesting to me. I have also read all the Windows Internals books, née Inside Windows NT in the Helen Custer days, except the latest. Here is the latest book on Amazon (thanks again, Mark R.). Vista doesn't revert to the previous NT 3.51-style model; the graphics subsystem in Vista is brand new.
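The context-switch argument can be sketched with a toy cost model. Everything here, including the cycle counts and function names, is invented purely for illustration; real costs depend on the CPU and OS:

```python
# Toy model of why per-call user/kernel transitions were costly for GDI
# in the NT 3.51 client/server design. All numbers are made up.

TRANSITION_COST = 100   # pretend cycles for a user->kernel->user round trip
DRAW_COST = 5           # pretend cycles to execute one drawing command

def unbatched_cost(num_calls):
    """Each drawing call crosses the protection boundary separately."""
    return num_calls * (TRANSITION_COST + DRAW_COST)

def batched_cost(num_calls, batch_size=64):
    """Commands queue up in user mode and flush in batches:
    one boundary crossing per batch instead of per call."""
    batches = -(-num_calls // batch_size)  # ceiling division
    return batches * TRANSITION_COST + num_calls * DRAW_COST

n = 10_000
print(f"unbatched: {unbatched_cost(n):,} cycles")  # unbatched: 1,050,000 cycles
print(f"batched:   {batched_cost(n):,} cycles")    # batched:   65,700 cycles
```

The point is that the expensive boundary crossing gets amortized over many drawing commands; moving the subsystem into the kernel in NT 4.0 eliminated the crossings entirely, at the cost of letting driver bugs crash the whole machine.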
Look at slide 9 in the PPT file: GDI commands are now rendered in SOFTWARE. The Vista graphics subsystem attempts to treat the GPU and video memory just as it treats the CPU and RAM. Multiple applications can access the GPU simultaneously, and video memory is virtualized just as RAM is. These functions are contained in the kernel and handled by DXGkrnl. DXGkrnl still calls an IHV piece of code, but the way I read this, that footprint is significantly reduced, so IHV code is less likely to be a cause of hangs. Most of the IHV work is done by the UMD, which I take to mean user-mode driver. Take a look at slides 18-20.

Here is why you should wait until Vista, and probably second-generation video cards, are out before buying hardware for the OS: until the hardware is built for it, GPU scheduling is done in software instead of hardware, which means it will be slow and will eat CPU. In this slide deck's terms, you really want hardware that supports the "Advanced" model.

The article states this brings Windows up to par with Mac OS X and Linux in terms of graphics subsystems. Ars Technica has a very good explanation of the OS X graphics subsystem, called Quartz, in 3 pages of information written when Mac OS X 10.4 Tiger launched in April:

Mac OS X 10.4 Tiger - Quartz
Mac OS X 10.4 Tiger - Quartz 2D Extreme
Mac OS X 10.4 Tiger - QuickTime 7
Apple's Mac OS X Quartz Feature

The only problem with moving all the work onto the GPU in Tiger via Quartz 2D Extreme is that it's disabled. The code is in there, but it's not turned on. Apple has never explained why; the only information that has come out is in the About the 10.4.3 Software Update notes, which state: "Disables Quartz 2D Extreme—Quartz 2D Extreme is not a supported feature in Tiger, and re-enabling it may lead to video redraw issues or kernel panics." So I think the graphics subsystem in Vista is going to surpass the capabilities in Tiger, at least until a post-10.4.3 software update enables Quartz 2D Extreme or OS X 10.5 comes out.
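The "video memory is virtualized just as RAM is" idea can be sketched as a toy resource manager: when VRAM fills up, the least recently used surface gets paged out to system memory, much like the VM manager pages RAM to disk. The class, surface names, and sizes below are all my own invention to illustrate the concept, not anything from the actual Vista driver model:

```python
from collections import OrderedDict

class VideoMemoryManager:
    """Toy sketch of virtualized video memory with LRU eviction."""

    def __init__(self, vram_size):
        self.vram_size = vram_size
        self.resident = OrderedDict()   # surface name -> size, in LRU order
        self.paged_out = set()          # surfaces demoted to system memory

    def touch(self, surface, size):
        """An app references a surface; make it resident, evicting
        least-recently-used surfaces until it fits."""
        if surface in self.resident:
            self.resident.move_to_end(surface)  # mark as most recently used
            return
        self.paged_out.discard(surface)
        while self.resident and sum(self.resident.values()) + size > self.vram_size:
            victim, _ = self.resident.popitem(last=False)  # evict LRU surface
            self.paged_out.add(victim)
        self.resident[surface] = size

mgr = VideoMemoryManager(vram_size=256)
mgr.touch("app1:backbuffer", 128)
mgr.touch("app2:backbuffer", 128)   # VRAM now full
mgr.touch("app1:texture", 64)       # forces app1:backbuffer out to system RAM
```

Each application behaves as if it owns the video memory, and the manager shuffles surfaces behind the scenes, which is the same trick virtual memory plays with RAM and the pagefile.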
The strange thing is that you can still enable Quartz 2D Extreme, but I think the 10.4.3 update note is there to warn those on the bleeding edge that they are doing A Bad Thing(tm). I would love to know what problems were encountered that made Apple decide to disable the functionality in Tiger.

One of the things I am wondering about is the Vista "Advanced" model video cards. Is Apple going to use these? If so, are they going to change OS X to utilize the hardware-based scheduling features? Even though Apple has been using hardware-accelerated compositing and VRAM virtualization in OS X for years now, is Apple going to have to change their implementation to get closer to the way Vista handles graphics operations in order to use the new video cards? What is the patent situation with the capabilities of these new cards? Is MS licensing these patents to ATI and Nvidia? Are Nvidia and ATI effectively just manufacturing cards to MS's spec? MS and Apple cross-license patents; I wonder if MS is using any Apple patents in the Vista graphics subsystem, or vice-versa, if Apple is using MS patents for Quartz.

Games on Mac OS X perform slower on similar GPU hardware than games running on Windows XP; see World of Warcraft as an example. I wonder how game performance will hold up now that the entire UI and all applications share the GPU as a virtualized resource. Do games on Vista perform on par with OS X? Are they as fast as XP, or faster? I haven't really seen anyone talk about this yet, not even the XP-to-Vista comparison. Does MS have a special "games mode" hack in the graphics subsystem to make it all run faster?

The Vista 1.0 graphics subsystem scares me from a backward-compatibility and Vista-native-app-stability perspective. My money's on at least two Service Packs before the Vista graphics subsystem is solid. Of course I could be wrong, and maybe Vista is taking so long precisely to get this right, but MS has done nothing to prove it won't take three versions.
I don't think MS has been very forthcoming about the "Advanced" model video cards, perhaps to prevent an Osborne effect on PC and graphics-card sales this holiday season. Just a theory, but I am glad I am not in the market for either upgrade until the "Advanced"-approved video cards are out.