
I guess I was speaking to the "wrong" crowd. We all have DX9 cards, or will buy them for gaming, and we know where to find them cheap. But the majority of people out there don't have one and don't know what it is. So they'll end up buying a new PC for Vista (like most bought a new PC for XP), and all new PCs (with the exception of some expensive, older Intels) are 64-bit compatible.
If you didn't know anything about PCs, would you start upgrading parts just to install a new version of Windows? Or would you want it to run "out of the box"? Odds are, either way, people will need to upgrade a video card to run it. Everybody in my family (parents, grandparents, brothers, sisters, wife) except me will need a video hardware upgrade to run it, and I'm betting most of them would just buy a whole new $500 PC instead, which would be 64-bit. Many of them bought PCs or laptops within the past two years, so their machines aren't old either; Vista-capable hardware just isn't installed in every newer PC or laptop.
I see it as the same thing as when 32-bit CPUs (the 386) came about: Win95 required them even though 16-bit 286s were still plentiful. And when Quake 3 required a GPU, people bought GPUs to run it, and now they're standard. I believe that if MS doesn't start requiring 64-bit, we'll never really see the PC market move to it.

I agree it would be expensive to upgrade to 64-bit, but DX10 is Vista-only and DX10 games will be Vista-only. So you'll probably end up spending a couple hundred on a DX10 video card along with a couple hundred on a Vista license. Throw in the new "Windows certified games" program MS plans on doing, and it gets even more convoluted.
So yeah, it's cheaper to just buy an ATI 95xx+ card (apparently, all cards lower than that stopped getting driver updates as of Feb '06), but most people won't do that anyway; they'll just buy a new PC.