NeoThermic wrote:
TomCat39 wrote:
As for the NVidia 6XXX gpu, just upgrade to 9XXX. Probably not just that simple. Most 6XXX cards were AGP; 9XXX is only PCIe.
Considering that the 6-series was nVidia's first card to have SLI, I'm going to disagree on that one. nVidia only made AGP versions because there was demand for them at the time, but anyone building a new system back in the 6-series days had the choice of PCIe and should've gone with it.
If they were building from scratch, sure. But a lot of people were just upgrading their video card, and AGP was the main platform in that case, which was the majority of cases at the time. I say this because only enthusiasts buy cutting-edge technology, mainly because the support and development isn't there at initial release. The average Joe goes with the cheaper, better-supported option until the bugs are worked out, unless he's green to PCs and has no history of buying cutting-edge hardware and then waiting 3+ months for stable drivers, most games working, etc. So with the transition card from AGP to PCIe (the 6XXX series), I'd say the AGP versions sold more than the PCIe ones. I think PCIe became more accepted with the 7XXX series, since it had been around for a few months by then. Also remember that, at that point in time and probably still, only about 3%-5% of the market was enthusiast. That's not a whole lot of people buying the latest and greatest. AMD and Intel don't make much, if any, money on their latest and greatest; the flagships are usually just bragging rights for both the companies and the owners.
NeoThermic wrote:
TomCat39 wrote:
And the top end cards are 500 USD each. The cost of a PS3 is only 600 USD right now.
Except the graphics ability of the PS3 is comparable to an nVidia 7800 (the RSX is basically a specially designed 7800). Yes, your 600 USD PS3 has the power of the card in your old machine. So stop talking about top-end graphics cards when the PS3 itself isn't even 1/10th of one.
NeoThermic
That's just it. I'm not making a direct comparison, because PC hardware equivalent to the PS3 is extinct long before 3 years have passed in the PC gaming industry. You can't compare a PC with a gaming life of 1 year to a console with a life of 3 years for a cost-over-time analysis; the time frames have to match. To get a PC to last 3 years before it's at or past the end of its gaming usefulness, you have to start with the best hardware at the time of the build. Thus all my comments about a "good" gaming rig. If you don't start off with top hardware, you're just shelling out more dough anywhere from 6 months to 2 years later in upgrades, which adds to the purchase cost and eats up whatever money you saved in the initial build.
All my comments, as I said in my previous two posts, are about one-time costs, starting from scratch, over 3 years (the average life of a console, or what it used to be). Even buying a mid-level gaming PC and upgrading it over those 3 years, you will still spend more than you will in the same 3 years on the one console. Granted, with that approach you usually get more than 3 years out of the PC, unless the PC architecture changes enough to require a near-total rebuild, such as AGP to PCIe. But that's not the point I've been arguing. I agree that the PC, as a whole, has a longer life span than any console, and that it is way more versatile. But for those extras, the PC is more expensive. If it weren't, who'd ever buy consoles? It wouldn't make sense beyond the casual gaming aspect.
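Just to put rough numbers on the cost/time point, here's a minimal sketch of the math. Every dollar figure here is a made-up illustration (not an actual quote), and the upgrade list is hypothetical:

```python
# Hypothetical cost-over-time comparison: mid-level PC vs. console over
# the same 3-year window. All dollar figures are illustrative only.

YEARS = 3

# Console: one-time purchase, plays its games for its whole lifespan.
console_cost = 600  # e.g. a PS3 at launch pricing

# Mid-level gaming PC: cheaper up front, but assumed to need upgrades
# to keep playing new releases for the full 3 years.
pc_initial = 700
pc_upgrades = [300, 250]  # e.g. a GPU swap plus a CPU/RAM bump (assumed)

pc_total = pc_initial + sum(pc_upgrades)

print(f"Console: ${console_cost} over {YEARS} years "
      f"(${console_cost / YEARS:.0f}/year)")
print(f"PC:      ${pc_total} over {YEARS} years "
      f"(${pc_total / YEARS:.0f}/year)")
```

With those made-up numbers the PC comes out at 1250 USD versus the console's 600 USD over the same window, which is exactly why I keep insisting the time frame be held constant when comparing costs.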
And NeoThermic, I don't think I ever said a console was as powerful as or more powerful than a PC, regardless of my comparisons of top-end PCs to the best current console, the PS3. So don't misconstrue my comparison as a comparison of raw capabilities; that's the farthest thing from the truth. I'm purely matching life spans. To do that, you have to find the PC hardware that holds out as long as a console usually does, and usually that means top-end hardware. Anything less doesn't last as long and requires upgrading before the time frame is met, so it can't be legitimately compared in a cost/time analysis: since cost isn't the constant, time has to be. But I'm sure you know the math behind it full well. I think everyone here is comparing equal costs without stating the difference in time between the two. A PC at the same price as a console definitely expires as a gaming machine far sooner than the console does. Can you imagine trying to play recent games on a PC that cost as much as a Wii? That's why that's not my comparison. Hopefully this clarifies what I've been arguing and why I still say PCs are more expensive as a gaming platform.
This has been my experience to date, and so it's also my belief, at least until consoles get a bit more expensive (which I see coming on the horizon). I just don't get why not a single person here even slightly acknowledges that it might be a possibility. The general consensus here is that PCs are cheaper than consoles, period. That just isn't true, given that the "average" PC is about 1000 USD and most consoles are half that. And that's before even touching upgrades, which just keep adding to the PC's cost.
One person said I can get a decent gaming rig for about 700 USD. To clarify, I disagree with the term "decent." Sure, I can get a rig for 700 that can play today's games, and maybe play them okay. But will that rig, left untouched (no more money spent on it), play games released for it 3 years from now? Probably not, or only at bare-minimum settings. I know the PS3, Wii, and Xbox 360 will play games released for them in 3 years' time (assuming they haven't been replaced by a next gen and still have games being released for them), and probably pretty well too.