[Games] Alan Wake
Who here remembers being excited about that game when they showed off trailers back in '05? xD Seriously, wow. Stuff like this makes me want to cry a little bit inside. Rewind nearly 5 years.
Remedy, the company behind the awesome awesome game Max Payne, announced a psychological thriller where you play as an author who has moved to a small mountain town and started to go crazy. Suddenly your wife is missing, except there's a nurse who looks like your wife or something, aaand the stuff that you're writing in your book starts to come true, and there's a light/dark dimension to the game where you're only safe near light, so during the nighttime you retreat to a lighthouse, and there are these mysterious spooky robed and hooded figures who come for you in the dark, and you're mostly armed with a flashlight or lamp, maybe a revolver at best. Cutting-edge graphics, DX10-only on Windows, also on Xbox. Beautiful visuals. More advanced physics and vastly bigger worlds than Half Life 2 and the other stuff we were enjoying in the 2004-2005 period.
At some point they showed the game off and were bragging about multicore support (Seriously? That shit's standard now. Goes to show its age). They were showing off how they could leverage not just dual-core stuff but quad as well. On the then-preproduction Intel Core 2 Quads they were all "Check it out, we have a thread for AI and game stuff, a thread for physics, and a thread which just handles streaming in the game world!"
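For the curious, the gist of that demo boils down to something like this -- a minimal sketch, nothing like Remedy's actual code, with the subsystem names and the chunk name just made up -- one long-lived thread per subsystem:

```python
import queue
import threading
import time

def ai_and_gameplay(stop):
    while not stop.is_set():
        # tick AI decisions and game logic
        time.sleep(1 / 60)

def physics(stop):
    while not stop.is_set():
        # step the physics simulation
        time.sleep(1 / 60)

def world_streaming(stop, requests):
    while not stop.is_set():
        try:
            chunk = requests.get(timeout=0.1)
        except queue.Empty:
            continue
        # pretend to load the requested chunk of the world from disk
        print("streaming in", chunk)

stop = threading.Event()
requests = queue.Queue()
threads = [
    threading.Thread(target=ai_and_gameplay, args=(stop,)),
    threading.Thread(target=physics, args=(stop,)),
    threading.Thread(target=world_streaming, args=(stop, requests)),
]
for t in threads:
    t.start()

requests.put("mountain_town_block_07")  # hypothetical chunk name
time.sleep(0.5)
stop.set()
for t in threads:
    t.join()
```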
During 2009 they announced that they were stopping development of the Windows version to focus on the 360 version instead. So much for all that "DX10-only! Look how great we run on a quad core!" business. Presently, the game is due out for a mid-2010 launch on 360 only.
Honestly I don't know what to feel about the whole thing. I mean, I loved Max Payne, and Alan Wake looked to be potentially pretty hair-raising and gorgeous. But that was 5 years ago. 5 years as in, the time between one console and the next one, or ~3 computer part generations. If that game came out 2 years ago it would've looked on par with other titles. I'm reminded of STALKER, which was just stunning when they showed it off in 2001... a little less so as the release date slipped from 2003 to 2007. Game delays, sadly, happen. Eye candy gets stale if you don't have it on shelves soon enough.
What really irks me about the whole situation, then, is losing a potential PC title to Xbox exclusivity. That wasn't Microsoft's call; the developers just didn't want to expend the effort to put it out on PC. It really saddens me; gone are the days when games were made for PC and then ported to consoles. Now we're lucky to get games made for consoles and ported to PC :|
Bleh. Maybe some day gaming will grow up. As all the people who used to be kids playing N64 on little TVs, all paid for by their parents, get real jobs and earn real money and it occurs to them that paying $1000 every 3 years for a computer is nothing in comparison to their $80/month cellphone bill, $3000 home theater setup, and $30,000 car... they'll get real gaming systems. 'cuz whatever your $300 console can do, your $1000 gaming box is going to do three times better.

PC: Sit and play on your own.
PC: Play online with your mates
Consoles: Play split-screen
PC: LAN Parties
Seriously, if you think that PC gaming has to be playing games by yourself, you haven't been exposed to the proper games and way of playing them. Of the 9 top-selling games on Steam right now (top 9 because one of the top 10 is listed twice, having two versions), 4 of them are multiplayer-match-centric, 3 are strictly MMOs, and only two--Divinity II: Ego Draconis and Dragon Age: Origins--are single player only. Of the games that I've played over the past year, most have been mostly multiplayer: TF2, Killing Floor, L4D, L4D2, Borderlands, CoD4, CoD:WaW, etc. Using Skype to provide constantly open voice chat, the only disadvantage compared to having a friend right next to you is you can't... clap them on the back when you survive a hard wave of zombies? I'd say that's more than made up for by the ability to pop in a game any time you've got 15 minutes to burn, instead of having to fit a trip to somebody's house into your schedule... if, indeed, they even live in the same state as you.
The best clan-based games are on the PC!
Expected response: "Consoles are better for local multiplayer because they're cheaper and smaller so you can get 4 friends in a game like Halo or Smash Bros."
PC gaming has such an unsociable stigma to it :/
It works in some cases, I remember many awesome games of NFS3 with my bro, both using the keyboard at the same time, whereas consoles seem to be designed for the somewhat simple common denominator: easy to use, and want more players? Just jam in some more controllers...
That said, I feel there is an important point we must make: there is LAN gaming via consoles, like 4 PS3s, and people can do that just as easily with their awesome PCs, but the one thing consoles are good at is that they are designed to be marketed towards the casual gamer, having the console in the living room, for example, for others to watch
As far as LAN/multiplayer goes, split screen is shit; either have the dedication to save for a rig/console to get the full experience or don't bother, I say, unless it's just for a game of "crazy search"
Bleh, I always feel my replies make little sense, I blame loz for distracting me :P
I won't deny that a single box isn't suited for several people playing simultaneously, in most cases. It's not a hardware thing as far as I know, just a software limitation... developers don't put in the distinction between multiple input devices because the assumption is that you'll have one person using each system. There's no reason I can think of that you couldn't do split-screen or same-view multiplayer on a single computer with everybody using their own controller, though.
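Roughly what that looks like in practice -- a minimal sketch using pygame, assuming it's installed and a few pads are plugged in; a real split-screen game would just route each pad's input to a different player's viewport:

```python
import pygame

pygame.init()
pygame.joystick.init()

# One Joystick object per attached pad; nothing stops a PC game from
# treating each one as a separate player.
pads = [pygame.joystick.Joystick(i) for i in range(pygame.joystick.get_count())]
for pad in pads:
    pad.init()
    print(pad.get_name(), "axes:", pad.get_numaxes(), "buttons:", pad.get_numbuttons())
```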
However, and this is at least one place where we'll agree, the ability to do several-person-one-system multiplayer, while a cheap solution, is far from ideal in almost every other way. The exceptions are same-view games like Pikmin 2, Little Big Planet, side-scrollers and fighting games, where drawing more characters isn't that hard. In games where your screen is chunked off into quarters, everyone ends up with a dinky chunk of the screen. Sure, everyone is disadvantaged the same amount, but I'd rather everyone have a decent screen than have everyone gimped to some VGA-resolution quarter of a 15-20" screen which opposing players can peek at. Hence my ending comments about how maybe gamers will grow up and buy real hardware.
Hooking a 360 or PS3 up to a 50" 1080p TV and $2000 sound system is the equivalent of letting a child drive a sports car. Hooking a Wii up is like... letting a hamster drive :| What's the point of spending so much money on a high-end home theater if you're just going to gimp it with sub-720p output? And yet people do it.
LOL how very true, I see crap like that all the time and think what a waste of money it is when it's not put to use x3
Besides, any console gamer can be a loner as well. For example, when I had my 360, I played it without hooking it up to the internet even once, nor did I have anyone else over or more than one controller.
So, I have to respectfully disagree with you. :3
As for the graphics card issue, Radeon has been talking about looking more into go-graphics cards. >.> But you didn't hear it from me.
-An override of the graphics chipset on the motherboard.
-A graphics memory "boost" specifically for the graphics chipset.
-A secondary graphics chipset that handles advanced OpenGL or more advanced DirectX attributes (advanced color correction, better pixel shading, better particle effect rendering, etc.)
When Dell released the XPS line of laptops, the three, or maybe four, companies that were developing go-graphics cards quit, focusing more on on-board chipsets that would be better for future laptops. Now, word on the underground is that Radeon is picking up the technology again, seeing as how most gamers now are on the move with laptops, yet the ones that are capable of playing today's games run $3000-4000. In other words, Radeon is trying to make a po' boy's fix for graphics woes at around $125-200.
Honestly though, I'd buy one if it means I could get OpenGL 2.3 support at the least on my laptop. Damn Dell chipsets... :/
"vastly bigger worlds than Half Life 2"
Sandbox Game vs Linear Sandbox Game..... Not really a fair fight xD
Um....did that help? ._.
Some games can actually pull off a game that is both Linear and Sandbox.
Halo 3: ODST is a great example. You can run around this city, but in the end you'll end up being shoved into the same point. They kinda shoo you into general directions and expect you to take it from there.
I'm still pissed at niBriS about that.
http://www.youtube.com/watch?v=gNjyw-D1YP0
I didn't know those were separate games. I think the reviewer was confused because they were talking about fire physics and being in a small mountain town with monsters and being a writer.
How come all this creepy stuff happens in small mountain towns? :P
Silent Hill, Quiet Mountain, Noiseless Bluff, Hushed Butte? They're all creepy towns with polygonal-shape headed inhabitants!
Alan Wake was just one of many PC/360 games that are now just 360.
Microsoft can take my money, but they need to send an executive over so I can lay down some cum ropes on their face, followed by a savage beating.
No, the profit they make is a combination of game sales, peripherals, and Xbox Live. Honestly I have no idea how much it costs for them to maintain Live, or what profit they pull on the peripherals.
Yeah, they'd still make money selling Windows to non-gamers, but that's beside the point. They can still make an extra sale to each gamer who uses it. It ultimately boils down to whether they want to make their $100-150 per person off of a Windows sale, or a bunch of game sales.
And what Microsoft wants versus what developers want isn't always the same, either. Console developers get their $50 whether it's a PC title or 360 title.
And if they want to eliminate Windows as a gaming platform, then why bother with DX10, 10.1 and 11?
Thx you :P
YaY for Canadian gamers
What caused the video game crash was so many companies making video games... a lot of horrible ones for the Atari 2600, which eventually caused it. Now, the reason the PC is an unfavorable platform is piracy; on the business end, they started implementing codes and codexes so they can "protect" their money and sell legit copies. Which is a pretty big turn-off for the consumer.
If anything, if it weren't for the console companies being able to license their products by approving each game released on their platform, what would stop people from copyright infringement and overall lawlessness with the integrity of their products?
Console games did it where the PC has yet to even try that... so, in other words, sure, Steam is great for making it legit and convenient, but you must be reminded that what makes a game isn't so much the hardware or how you play it.
It's all about the game design, art direction, and story that you play it for.
I swear most gamers need to get off their god damn high horse and just play what they enjoy.
Licensing and development costs for consoles are a double-edged sword. Sure, they might cut down on the number of crappy titles out there, because a game is a bigger investment and a bigger risk, but that also means you're potentially killing a lot of creative potential.
I mean, look at Portal. The game which served as the spiritual basis for Portal, solving spatial puzzles by making shortcuts through the level, was made by some people as a class project, basically. Killing Floor is a UT2k4 mod. Having a platform that anyone can dick around on for free is a great way for would-be developers to experiment with new ideas... you know, without paying thousands of dollars just to get a single development kit, to say nothing of game engines to play in D:
I play both mediums because each has its own exclusives; I just find what is most convenient for myself. However, remember, much like Hollywood, gamers won't play the second-runner if they can afford the better ones out there.
Sure, you can say modded games are a great thing, I don't doubt that, but still, the need to license a product is essential for the console business. It is also a major reason why you can trade in titles on the console platform... so unlike that PC title you'll never pick up a second time, a console game you can turn in for something else.
I thought the reason you can't resell PC games is because, with piracy so easy, they've had to adopt CD keys to ensure people aren't pirating. Really, that's what you pay for, because you can easily get a copy of a disc. Nobody will repurchase a PC game, indeed stores won't take them back once opened, because you could just record the key and use it yourself, preventing a future user from being able to. With consoles, the disc--more difficult to dupe (and in the case of previous systems, impossible or more expensive than a retail purchase)--acts as the key, verifying that you have an original copy of the game. Really, why else do you think it's the case that you can rent and resell every form of music (CD, cassette, record), movie (Blu-ray, HD-DVD, DVD, VHS), and interactive software (games for every console ever made) with the SOLE EXCEPTION of PC software starting from about the time when the Internet got big? Because only with the PC software does the possession of a disc not mean you have full access to the features of that title--namely, multiplayer. I'm pretty sure I'm within my legal rights to sell you a copy of a game that I bought, you'd just be foolish to buy if it had a code associated with it.
A notable exception to this is Steam. Because the software can remove a game as easily as it adds it, there's no reason they can't revoke your ownership of a game, so they could indeed let you give used games to people or trade them back in for a few bucks... but why would they want you to do that? Steam isn't Gamestop. They only have one competitor I've ever heard of, Direct2Drive or something, and I've never even been to their website, only heard the name in passing. There isn't so much competition that they feel compelled to offer that feature, and it will only lose them money. Any copy of a game you can't give to someone else is potentially a copy that they have to buy, or you have to buy for them (hence the gift feature). So while they have a model which COULD allow them to accept game returns and do rentals for titles that use Steam integration instead of CD keys (some Steam-purchased games come with keys that you have to manually enter), it'd just lose them money (and who wants to download a 10 gig game and "return" it a day or two later?).
I guess we should be more precise here. Yes, "Licensing" is the reason there is gaming. Without licenses to content and software you couldn't access them. I don't assume that's what you meant, though, and I'm curious as to how licensing got brought into this stuff.
As for the trading in, I find it a sheer benefit 'cause I don't go back to play titles that often... I play the game till I beat the snot out of it... and I don't really come back to it. Consider me that guy that plays a game till I exhaust almost every possibility before giving the game up. Even now I have very little incentive to play titles like MK vs. DC when I find Street Fighter 4 a much better-polished game, both being 2D fighters... however, I might even eventually return SF4 due to the fact that the only fighter I return to play over and over is downloaded to my 360's hard drive, and that's Marvel vs. Capcom 2.
No matter how much you disagree nowadays with the idea of licensed products, from the industry standpoint it's what was critical for there even to be an industry for games. In fact, no matter how much I whine, plead, or cry... because neither Nintendo nor Sega... not even Sony will release some titles here in America as licensed products, we end up missing a lot of stuff based on how they figure the average gamer would accept some of the games. Mother 2 (aka Earthbound) was a great game, but the marketing is what killed it, plus the time and place and even how to get it. It's still true: if they don't see that enough gamers will pick up a title due to its design, it never gets released here in the States; due to financial concerns, it just never gets released.
[India never got Fallout 3 due to the use of the NPC called the "Brahmin," and in Japan they wouldn't allow you to blow up the bomb in Megaton and renamed the Fat Man to something else.]
And I don't much care about what happened in 1983. I wasn't around. You weren't around. Gaming then was very, very, very different from gaming now.
I'm not saying there's no point in trading in. Just explaining why it doesn't happen with the PC. Frankly, with the rising issues of piracy on consoles, I'm surprised they haven't started to crack down on it as well with unique key codes. In 5 years it won't matter, as the majority of purchasing will be done electronically, and there will be no reselling. Can you resell Virtual Console games? What about Xbox Live Arcade titles? Stuff you buy off the Playstation Network?
And yes it was a much different time than now, but still we can always end up back at square one if the industry allows such a thing to happen.
Now as for piracy on consoles, it's actually, in my opinion, more difficult to find the hardware and such to get games illicitly for consoles. It takes too much, for its real worth, to go out of one's way and get the supplies needed. [I should know; I have a friend with a hax'd PS2 and he has to jump through hoops to get the damn thing to work, and to have a reason to use it.]
All's I heard was that there were a fuck-ton of people who got punished for pirating Modern Warfare 2 on the 360. And we had a bunch of DVD images on our server for PS2 games, and Sepf's PS2 played pirated games just fine :|
Now as for pushing a button... no, I just find the "Newer" gamers out there more concerned with graphics, art direction and presentation versus what is good game design. I'm looking at you Rockstar games >_O trading depth of game design for graphics and story telling.
But still, I digress. You're getting mixed up with the industry trying to get rights on making movie-based video games.
I'm talking about the Activision/Atari fiasco with games like Pitfall.
The younger generation doesn't have super-high-end computers, nor do they have a lot of money for games. In comparison, PC gamers who have a lot of money for hardware are going to want it to be worth their investment. And there's another boon to PC gaming too--when it comes to console games, every copy made, even if it's never sold, requires an up-front payment to Microsoft or Sony. If you think you're going to sell 1 million units and you only sell 500,000, then oops: not only do you potentially not break even on development costs and have to eat the cost of 500,000 boxes with discs in them, you've also gotta pay up to $10 per disc to Microsoft or Sony.
And piracy, last I heard, was actually becoming quite a big deal on consoles. Something about a million or so players being banned for having pirated copies of games? That's a lot D:
Also, I do know piracy is on the rise for the consoles. But it's more well known and easier to pull off on the PC. So it still has some catching up to do with piracy on the PC. Though at this rate it won't be long until it does.
In Dark Corners they have a moment when a huge mob is chasing you with weapons and you lack them... and you just have to run like hell to safety, and as you do, your sanity and such is teetering away. The protagonist gets more traumatized as he inspects and finds grisly things, or his vision will blur if he looks over high heights, and things like that.
In fact, what I enjoyed about the games was how they used sounds and delusional effects in-game to screw with the player. Much like the very beginning of BioShock before you go down to Rapture: it had those elements where they had you walking into a dark room, with the lights flickering on and off...
But still, what is probably most critically acclaimed in the horror genre is RE4, for finally scrapping the old formula of pre-rendered environments, limited inventory space that penalized you, and crappy combat, and shifting to over-the-shoulder aiming, an upgradeable inventory along with weapons, and the power to aim with the gun.
However, what is most important with the horror genre is not to make the gamer have to fight a very broken system, and to keep building the tension... the way RE4 did it, and did it right, was the cutscenes that have those "quick time" moments that you HAVE to look out for.
As for Alan Wake, I'm willing to bet that the extra five years hasn't done much to add polish. Kind of like how Too Human was a bland, slow, and rough experience after nearly ten years of development.
That's cool. But well, here's what developers say now: we go cross-platform because it increases our market range and therefore profits. Consoles are good 'cause the average schmuck and parents with kids own one. PCs are generally a niche enthusiast market for gamers. Everyone owns a PC, but 80% of them are $299 Dells or something cheap they got to go on the internet, and probably 5 years old at that. Bigger profits come from a market baselined on a static system configuration where exceptions and malware don't need to be accounted for in the developer's time, and the average living room may have one of these boxes.
What about piracy? PCs are such a hassle since everything on PC can be pirated. Ehh, why open ourselves to the risk anymore?
So while I can always enjoy my old titles again on a new rig with all the improved performance and have all the freedom in the world with how I can play the game, and use a KEYBOARD AND MOUSE, that's just not the market incentive. Let's make consoles more like computers to make it easier to make games on consoles. PCs are hard; let's just make them for consoles now.
<-rage
QFTMFT
Most of the hardware in my desktop is 4 years old now (I think? A lot of the non-important parts are older) and it still runs like a dream on crack bastard child of a greased monkey.
I mean sure, I can bring the thing to its knees pretty easily sometimes, but for gaming, I'm good to go.
Don't make me guess at acronyms >_<
I only first heard of Alan Wake at E3 last year, so I guess it shows how behind I am, but the trailer made it look pretty cool, so I put my preorder down before I read the whole history behind its development. It kinda irks me to see how some people are bashing it for being developed for so long and instantly deciding it's going to be crap. I know past records haven't been great (Haze and Too Human), but there have been times where it's worked in the title's favour (Team Fortress 2)... At least we know it's not going to be another Duke Nukem Forever. Funny how DNF is a racing acronym for 'Did Not Finish'.
I'm actually kinda looking forward to it. The premise of the way the game works, in a way, kinda reminds me of Eternal Darkness (in that the longer he stays awake, the freakier the stuff that starts to happen... not quite the Sanity gauge, but, eh), and I loved Eternal Darkness. It'll be an interesting experience, regardless of how good/bad it is.
And, for the record... I like the Wii. ;_;
Ultimately, the only hard difference between PCs and consoles is the idea of having a standard system for 5-7 years versus constantly evolving standards. You can use a teeny little display or 100"+ projector screen, headphones or 7.1 professional-quality sound system, thumbstick or mouse (in some cases) for either. They can have any genre for any number of gamers, local or online.
If I'm honest, I prefer the standard system idea that consoles run with, because it's one my wallet can keep up with. Maybe once I leave university and get into a full-time job, that may change, but, I'll still be a console gamer. I can see the benefits of PC gaming - games being, generally, £10 to £15 cheaper being a big bonus, as well as community mods for most titles. But I can't afford to keep updating hardware to keep up with graphical advances. That, and I use a laptop, so, it's going to be even more expensive when this inevitably becomes obsolete hardware.
It's also the one that's seen the lowest reduction. Xbox 360 Elites were £300 when they first came out, I got mine last year for £200 and now I think they're something like £180... the original price of a Wii. How much is a Wii now? Roughly £165.
I actually think Sony are the worst offenders for overpricing with this generation. Back when the PS3 was the cheapest way to watch Blu-Ray the price was almost - Almost! - justifiable. Now Blu-ray is more common - hell, this laptop has a blu-ray drive - the PS3 is still too expensive. Coupled with the lack of many decent exclusives - I can think of about 5, maybe 6 - and the fact blu-ray isn't quite as mainstream as DVD yet, I can't see myself getting one for quite a while. And the PSP is just as bad. I thought Nintendo was milking it when they wanted £150 for a DS with a fairly poor camera and no GBA slot (although the multimedia features were nice), but then along comes Sony wanting £225 for the PSP Go. I wouldn't pay £150 for a DSi - I won mine in a competition - and there's no way in hell I'd pay over £200 for a handheld. £100 for the original DS was a stretch. Good thing I like the DS. *Shakes fist at Nintendo*
I'm all over Blu-ray, but not for stand-alone players. I like to rip the discs to our computer. Accessible at all times by all systems on our network =D
And if I knew how to rip DVDs/Blu-Rays to my HD, I'd have no HD space left. X3
Bleh... it feels like a side-effect of something which, for all I know, is purely in my imagination. It feels like titles from the mid-1990s through mid-2000s were less about money and market saturation, and more about making good games. None of this "exclusive to that platform because they're writing a check with lots of 0s" stuff. Maybe it's just because I was only focused on PC games, I dunno... all I know is we didn't have CEOs dancing around back then talking in public about their love for milking us for every dollar they can D:
First off, the console splits hardware and software into clearly defined generations. I am actually in favor of this, as it means I essentially got more value for my money invested in hardware. I buy now, use it for a number of years as prime material before it gets rotated down the line for newer stuff. And personally, there's an enthusiast in me, but I'm going to state the simple fact that I don't have the time or money to be bothered with upgrades at monthly increments; even a year is pushing it for me. And from experience, 80% of people I actually know either don't have the proper expertise, desire, time, or money to have to buy new parts, nor do they want the hassle. As a representative sample, we can say the same for the average consumer: they want gratification and not headaches. Come home, play your game, don't worry what's under the hood, just that it's there and you can play it. The ease of use is what makes the console market; the titles are what sell the consoles.
About the only people who cater to enthusiasts are the hardware developers themselves. Generally, whatever sells the best and turns the most profit is where each developer is at, and whoever can sell the most units wins. On the other hand, your enthusiast hardware is still there, with diminishing performance returns per enthusiast price tag. Very niche in nature. GPUs, on the other hand, have to be the biggest gimmick product; graphics cards come with the most absurd pricing while all they generally handle is textures and polygons. Much of this is wasted real estate, yet they can push a card $300 more than it needs to be over a 6-10 fps return for the consumer.
There's another dimension to the hardware argument too. We are at physical material limitations now. We actually have been for a few years, since the Pentium 4 and the Athlon XP, and soon transistors simply won't be able to get any smaller. The demand has been to make "smarter" hardware, and likewise software needs to change to support the hardware. So despite hardware "generations," software has been scaling up to bridge these gaps we have. Most of what goes into your games is done by your CPU (again, the GPU really only doing polygons and textures, etc.), so DirectX in turn was developed to balance the traditional load, putting more of it on your empty real estate with 10 and 11. Now the standard in CPUs is multicore, so software is being made to utilize multithreading, but that still isn't a perfected practice, so new programming languages are being developed to use the prospect of parallel processing, scaling to what is expected to be 8 cores and beyond. Simply telling the application to put AI on one core and rendering on another core simply doesn't cut it.
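To put that last point in toy-code form (my own illustration, not anything from a real engine, and Python's GIL means this shows the shape of the idea rather than raw speed): instead of pinning one subsystem per core, you split one big per-frame job across however many cores exist, so the same code keeps scaling as core counts grow.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def update_chunk(objects):
    # stand-in for one slice of per-frame work (integrate positions,
    # resolve collisions, whatever)
    return [obj + 1 for obj in objects]

world = list(range(100_000))        # hypothetical pile of game objects
workers = os.cpu_count() or 4       # scale with whatever cores exist
size = -(-len(world) // workers)    # ceiling division
pieces = [world[i:i + size] for i in range(0, len(world), size)]

with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(update_chunk, pieces))

updated = [x for piece in results for x in piece]
assert len(updated) == len(world)
```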
I actually think these hardware limitations are what really made the generation gapping plausible and so well received in the first place. I'd like to say that Microsoft had some good insight or timing to enter the console industry when they did, or it was just massive coincidence.
The gaming industry itself is, well, simply not even competitive now. You can try telling me it is, but it's not. Some titles are going to sell no matter what. Everyone else is just there to make a game, make a buck. The console industry almost died off before Nintendo saved it, and now Nintendo is the largest perpetrator in ruining the industry again. During the 90's it really was about the quality of the game, and companies had to be careful and do something to set them apart from the rest. Now, risk is unacceptable. The developers generally desire larger budgets to make a bigger game, and being cross-platform helps that. Somewhere down the line they saw it was most cost-effective to take manpower from doing certain ports and bet it all on the winning horse. Which is where the console's designed-after-PCs model exploded back in the PC user's face. Consoles being designed identically to PCs has the potential to erupt in its own face as well, as the ease of making games for the various platforms has spawned the trash that floods the market today. There's a limit to what people will take, and I think a console in the $600-$1k price range is what will do it, but the companies have observed this limit and are going to play with it as much as they can. And it is like this that they can create a delicate balancing act between selling consoles and shovelware trash games flooding store shelves. As long as there are a few shining titles they can hog to themselves, they can ensure they will have a selling product, and the market won't collapse.
The ONLY thing even remotely related to that is that, with PCs, you have the OPTION to take things further. You can get more graphics hardware, bank balance-permitting. So you do have to choose at some point just how important the visuals are to you. Some people don't want to face that choice, but yanno what? Fuck 'em. In the neck. They have to choose for EVERYTHING ELSE in the world. How much is their car performance worth to them? How much is their meat freshness worth to them? How big of a TV screen are they willing to pay for? What are the most fancy shoes they're willing to get? How fancy of a phone do they need? How big of an MP3 player would they like? Even a console gamer has to decide how important visuals are to them when they invest in a display--bigger is better, higher resolution is better, higher bit depth is better, 60 vs 120 Hz is important, contrast ratio and display brightness are all non-trivial factors which make the difference between a $300 set and a $30,000 one. I deem it unacceptable for someone to get a console simply because they don't want to make one more choice--the value of the digit between the $ and the 00 in the price tag. It's a multiple-choice question. Default in the middle, 3. Find the average age, in years, of the games you've played over the past few months. For every year, decrease the digit value by 1. Further decrease it by 1 if graphics really aren't important to you. Increase it by 1, however, if they are important to you, and increase it by 1 for every year that you want your games to look at their peak.
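Taken literally, that multiple-choice question looks something like this (a toy sketch; the function name and the clamp to a single digit are my own additions):

```python
def budget_digit(avg_game_age_years, graphics_matter, peak_years=0):
    """Toy version of the 'digit between the $ and the 00' question."""
    digit = 3                          # default in the middle
    digit -= int(avg_game_age_years)   # -1 for every year of average game age
    if graphics_matter:
        digit += 1 + peak_years        # +1, plus +1 per year of peak visuals
    else:
        digit -= 1
    return max(1, min(digit, 9))       # clamping to a single digit is my own addition

# e.g. someone playing brand-new games who wants peak visuals for ~3 years:
print("$" + str(budget_digit(0, True, peak_years=3)) + "00")   # -> $700
```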
I'm confused about the whole GPU = gimmick bit, perhaps you could clarify that? I mean, I have some nice graphics hardware. It pushes 90-120 fps in a lot of the games I'm playing, and it ran me $400 at the time. I don't consider it a poor investment at all, and I fail to see how it doesn't improve my gaming experience.
I'm also confused about the physical limitations thing. People have been saying we're at the physical limits for years, and every prediction of that has been overcome by feats of materials engineering and picking the right metals to use in processors. I'm not saying that won't happen, but it's a touch early to cry wolf. Anyway, I thought the recent trend towards multi-core CPUs was for quite the opposite reason. It's expected that chips will continue to get smaller, with easily ten times the density we see today coming in the near future. The reason to make everything multithreaded is because it's physically possible to stuff more transistors in the same space, but a real bitch to make a single-threaded processor which is 4 times as complex. So the simple solution for the hardware guys is to instead make 4 processors on a chip and tell the software guys to split up the task, more or less. The switch to multiple cores was to keep processor design from being bogged down from their own massive complexity, or so I was under the impression. Am I wrong here?
And I simply don't understand any of this PC generational gap influencing console stuff you're talking about. You've been able to build multi-CPU (single core each) systems for a decade now. In the mainstream it didn't roll along until P4 hyperthreading and the Athlon X2, but even then... it didn't have much immediate positive effect. If you'll remember back 5-6 years, people were saying "Fuck that, if you want high-performance gaming, you get the single core with lots of gigahertz". People had to make flimsy arguments about "well you can re-encode video and do a virus scan while you play games" because initially, there was no reason to take a 2 GHz dual core processor over a comparable, likely cheaper 3.2 GHz single core. Nothing would take advantage of it, so you were just shooting yourself in the foot. The only compelling reasons to switch were when games did start to take advantage of that added performance, single core speeds stagnated, and people realized that the problem would only get worse over time. When you were able to tap into 100% of one core and 60% of the second you could match the performance of the 3.2, although by that time we'd moved beyond 2 GHz dual cores. With multithreading in software and hardware able to handle more operations per second than in older single-core systems, the adoption was only natural. I really don't get this generational gap thing you're talking about...
And from where I'm standing, MS entered the game industry at a pretty shitty time. The Xbox got its fair share of the beating stick, and the only good thing about it is that it took the brunt of the hits. The PS2 was posed as a behemoth, small and efficient with a huge back-library of PS1 games and all the developers that said system brought with it, and the Xbox was... a VCR-like brick with a single-digit-GB hard drive, a Pentium III and some Nvidia graphics which they later got so worked up about over pricing issues that it caused them to abandon Nvidia for their next system. It was oversized and under thought-out, and about the only good thing it did for gaming in general was to make online play more standard and expected. Prior to that, only PC gamers took online play seriously, and short of a few Dreamcast titles, only PC gamers had ever played online and/or with more than 4 people in a game, period.
I'm... not trying to tell you that the gaming industry is competitive. It's not, and therein lie the problems.
I think my only comfort is knowing that if PC gaming dies, it's going to fuck over console gamers. They'll never realize why it happens, but they'll be fucked over, same way Macs would be fucked over if the PC industry died. How do they have such nice, high-performance hardware to pick from when it's time to make a new system? The PC industry (as far as consoles are concerned, that only applies to their graphics parts, but still). If PC gamers weren't around, graphics cards would be even more niche than they are today. Mom and pop don't need more than a few gigaflops of GPU power for all their video decoding, window drawing needs. And look at professional (CAD) graphics hardware, that shit is always way more expensive than gaming hardware, even for almost the same hardware. Without PC gamers and their wallets subsidizing the development of companies competing for GPU price, performance and efficiency, console developers would have two options: have advanced chips custom-designed almost from scratch and pass that price on to the console gamers, or stick with Gamecube-level graphics chips (although that was made by ATI, even!) like the Wii is doing. And you wanna talk about gimmicks, if you don't have anything in the processing bag to differentiate your product, basically all you've got is the controller. And we can see what Nintendo is doing to differentiate itself in that respect :|
Yes, it is ignorance. Have you worked with the average consumer lately? Both with computers and even in retail? How often do I have to explain explicitly how the card goes into the slot, despite the picture instructions right ON the slot they HAVE to see in order to jam their card in? The console removes a step. It also Mac-ifies the gaming industry, and this is what devs like: an industry standard. First the language, then the hardware.
Due to present-day trends in game development, yes. Early days? Rage, STB Velocity, Voodoo? Not a chance; you had to make quite an investment then. Because I don't miss those days, I am thankful for how it is now. TVs also don't factor into most consumers' thoughts on buying a console. TVs and theater systems are the "console enthusiast's" high end. Inversely, those don't factor into the console so much as simply into their home entertainment system. Think of the console, for most, as an extension of their other big purchases.
I've been eyeing GPUs for a while. I am still using an 8800 GTS I paid a little under $400 for when it was a new card. I pull 120+ on most games, but typically I shouldn't have to worry about lower than an 80 fps average. I can't do something like, say, Nvidia 3D with this setup and run at 1680x1050, but I haven't had to cross that boundary yet. If that's what my current performance is, other cards won't need to deliver more because the human eye cannot see anything that high; thus, until I decide to do something more complex, or my games start to get obtrusively ugly or even unplayable, I don't need a new card.
As far as hardware limitations go, yes. CPUs were getting unmanageably hot, and they had to build smarter, hence splitting the load across two cores. Two cores can run in parallel faster because electrons have to pass through two smaller cores instead of one large core. Second, the transistors are tiny, and evidently transistors that are becoming mere atoms in thickness are taking too long for an electron to pass through. Engineers know exactly how small they can make their parts before an electron passing through causes them to explode, and current lithography techniques are showing limits. We are still going to get more out of CPUs made out of silicon, and the core scaling is continuing. This scaling will, however, hit the same physical limitation as the single-core CPU, which is the time it takes an electron to physically move through the components. They may even begin to make core sandwiches if certain techniques go well, but it is important to know that speed itself has not been increasing in the same manner, just the way data is being handled by the hardware. Though if graphene becomes viable as a semiconductor, that could mean a huge burst in technology. (I personally don't see a breakthrough in the quantum processor, light processor, or other things soon.)
MS entered, and the Xbox was like their hazing into the industry. By all accounts the 360 is a vast improvement, and now they aren't expected to leave. Price is important, and now they have been able to make their product so widely available because your $200-400 investment is riding years past when its hardware was originally expected to go stale. And yes, PC gamers have been playing online since well before any console. I remember having a fun match of Duke 3D with 12 people with my 33.6 modem >.>; Don't get me started on Xbox Live, though, paying Microsoft for the privilege of playing high-latency, lag-ridden games. I tried to move to consoles for my FPS needs before with GoW2, and between a controller being like trying to do surgery with a cudgel, and the fact that my bullets would hit a wall 2 seconds after someone's head was gone, I couldn't do it. And Duke 3D with 4 players lags like shit too on Live (gg)
Yes, if consoles collapse, the PC will be here for everyone. So will the games. Developers ARE shunning the PC market, and they are being short-sighted, not realizing that if they remove another market, they will harm their own, and consequently go back to the still-existing platform. Side note: Nintendo stopped competing; they began to play their own game and let MS and Sony fight each other. I don't consider them in the same business anymore.
Correct my GPU history if I'm in the wrong here, but there was only a short period where that nasty many-GPU-vendor situation existed. I mean, prior to the mid-90s, most everything was 2D sprites and dedicated graphics cards weren't necessary, or in many cases couldn't add anything... up through nearly 2000 even, you could get titles which worked in software... Half-Life and its follow-ups, Homeworld, etc. Not to show off my embarrassing youth, but back then I didn't even know what a graphics card was, and I doubt our systems had dedicated graphics, yet I still enjoyed games prior to 2000. And by 2002 when I did start paying attention to graphics (I remember drooling over the $400 Ti 4600 with *gasp* 128 MB of RAM running Jedi Outcast and Medal of Honor on a $900 ~18" LCD, back in 2002... then near pissing my pants when I bought my Radeon 9700 later that year), the choice had boiled down to ATI or Nvidia. Am I missing a bunch of heavy-hitting titles between the late 1990s and 2002 which relied strongly on graphics cards from that plethora of companies, or something?
Perhaps I should rephrase. I don't mean to make it seem that a consumer sits down and says "Hm, I want an Xbox, and I should get a new TV with it so it will look awesome." Perhaps they already have a TV. Perhaps they're getting a new TV. It doesn't really matter; the point is that for almost all things in life, people have to weigh how much it's worth to them and how much they need. It's true whether we're talking about horsepower in your car or inches on your TV. GPU power is much the same. A unit of horsepower or an inch of diagonal screen measurement or a gigaflop are all pretty abstract things to work with, but their effects are not: better car performance, more immersive movies, and more detailed, clear games respectively. As I said, I don't view "don't feel like placing a value on how good games look" as a suitable excuse to favor a console over a PC. It's lazy--even for an average American.
I had an 8800 GTS, served me well until I upgraded to my dual 260s (at $200 apiece after rebate, it felt like a good deal at the time). It felt like each 260 was just a bit more powerful than the 8800, which, combined with the less-than-doubled performance of using SLI, meant I could get approximately the same framerate in 3D (doubled performance, but two versions of each frame rendered, negating each other). The human eye can see past 80, but you really get diminishing returns. At least you have a good way of looking at things: so long as things aren't distractingly unsightly or jerky, your hardware is good. If only everyone had such rational thinking, maybe PC gaming wouldn't be viewed as such a nerdy upgrade-fest. Just because the hardware is out there doesn't mean you HAVE to buy it... I mean, you can always get a better TV than the one you own, but you don't see people saying "I don't want to get a TV, because tomorrow they'll have a better one, and I'll HAVE to upgrade to it for my movies to look as awesome" >_>;
Quantum processing is... yeah. Would do more harm than good if people ever figured it out xD Goodbye RSA encryption!
It was interesting to watch the Xbox go from being the most ridiculed system (although the Gamecube sold fewer, I think, there was still that bit of lingering Nintendo love) to almost the expected brand. Serious gaming just doesn't take place on a Wii, with very rare exceptions, and the PS3 is still more ridiculed despite the fact that it's been out for a year less, has fewer hardware issues, is faster, doesn't charge for online play, includes a Blu-ray player out of the box (with, they say, planned support for 3D in both movies and every PS3 game), and is on a smaller hardware revision. It says a lot, I think, that Microsoft went from having the ugly console which sold not a quarter as well as the PlayStation of the day to having the leading system which is cool and almost expected to play, in gaming circles.
I still don't see how this has anything to do with what you called the generational gap, though. They made a shift to a triple-core PowerPC which was a bit strange, perhaps if Intel had things rolling a year earlier then a Core 2 variant might've been favored... still, PowerPC is nothing new. And they used an R600-based graphics chip--that generation is only notable for two reasons, the first being that it was the basis for the 360 (self-referential!) and the second that it re-introduced, for the first time in several generations, hardware tessellation... again, because of the 360. The choice of a DVD drive is only notable because it's almost on the wrong end of the generational gap (you can't really fault Microsoft for not using Blu-ray or HD-DVD in a console from 2005, there really wasn't an alternative) but otherwise isn't relevant... they abandoned internal storage after single-handedly introducing it to the console market just 4 years prior, and after their competitors had adopted it... which I hardly think was a smart choice, again, wrong side of the generational gap. Perhaps the most remarkable thing about the hardware is that they used GDDR3, which isn't very impressive given that DDR3 was in use in what, 2002... but rather, because they used it as system memory. A little less impressive when you realize that they just did this so they could have shared memory between the GPU and the CPU, and anything less than GDDR3 would've made the GPU cry. Still, it's hard to argue with 22.4 GB/sec of system memory, my DDR2 only gets 8.5 per module and that stuff is much newer.
Oh wait, it isn't hard to argue with that at all, it's called 512 MB of shared memory. I would not buy a new GPU with anything less than 640 which would be pushing it (I'd much prefer ~1 GB per card), nor would I ever consider putting a RAM module less than 1 GB in my system (nor buy anything under 2 GB). A 1080p frame and a 720p frame (assume the game outputs at 720p and it's upscaled to 1080) runs 9 megs by itself, a 10k polygon model will in the very best case (a model which can be described as a continuous strip of polygons) take 120 KB and in the worst case (completely disjoint polygons) 1/3 MB, more if you'd like to store texture coordinates or anything else. Textures themselves can suck up space like nobody's business, and in this era we're not just talking one texture per object, we're also looking at parallax mapping, bump mapping, displacement mapping and possibly specular mapping. Video can be streamed from the disc with a minimal buffer, but music should be buffered and sound effects can pop up faster than the seek time of the optical disc, so those have to be in memory... there's the core OS for the system, the Xbox Live stuff running in the background... and let's not forget, oh, the GAME ITSELF. You know, the part which isn't pretty textures and wireframes and music, but big lists of objects in the level which you need to consider drawing every frame, every detail of every AI and physics simulation state for every relevant object in the game world, lists of objectives, in-game text, temporary scratch space that algorithms use for resorting things and storing values between computations...
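Quick sanity check on those numbers, assuming 24-bit color for the framebuffers and bare 12-byte position-only vertices (which seems to be how they were arrived at):

```python
MB, KB = 1_000_000, 1_000   # decimal units, to match the round figures above

# two framebuffers at 3 bytes per pixel (24-bit color)
frames = 1920 * 1080 * 3 + 1280 * 720 * 3
print(frames / MB)              # ~8.99 -> the "9 megs"

# a 10,000-triangle model at 12 bytes per vertex (x, y, z as floats)
print((10_000 + 2) * 12 / KB)   # ~120 KB as one continuous triangle strip
print(10_000 * 3 * 12 / KB)     # 360 KB fully disjoint, roughly 1/3 MB
```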
So really, the wonder of consoles is that they're able to do what they do with 512 MB of shared system resources. These days, I wouldn't run anything more than XP on 512 MB of non-GPU RAM, and I wouldn't seriously expect 3D games to be too happy with the situation (especially if I had 512 MB of RAM between the CPU and GPU). Hell, on my early-2005 build, I paid $220 for a pair of 512 sticks of DDR400...
That final point, about if consoles all died... that's what I was trying to make the other night with
Ultimately that's my view on stuff, either have unified standards which hardware and software can be developed to, or kill the proprietary hardware systems (consoles and Macs). Wouldn't that be an interesting game ecosystem, selling games which worked across a wide range of computer implementations (modern PC hardware) and simply had "optimized for blah" on the label? They'd design it for PC and market it as being optimized for the Microsoft Game Spec 4.0 or the Sony Game Spec 7.0 or the Nintendo Game Spec 1.5 (which, I figure, is about a fair version number for them right now c.c). Consoles would be just mass-produced PCs conforming to one or more of these specs, games would do the same. It'd be like a hybrid of the current generation system of game consoles mixed with the standards we expect in DVD or Blu-ray drives, and the numerical rating system that Windows uses to classify your performance. If all I wanted was a low-end experience with wacky controls I'd get a system conforming to the Nintendo Spec 1.5, aka Wii. If I craved more, I could always opt for a faster PC that conformed to and met the Nintendo Spec 1.5. Same graphics capabilities, just faster and more capable. You get the idea.
Ugh, imagine the savings in effort. PC architecture evolves very gradually, except for graphics--which deals in strict supersets. There wouldn't be any of the nonsense which plagued the PS3 early on, with developers struggling to figure out how to fully use the CPU, or the custom shaders for the Wii. Standards would evolve slowly so developers could adapt over time, rather than having the first batch of games for a system run like arse.
And yes, Nintendo isn't in the game business anymore. They're in the fitness business, as I remarked elsewhere. First they get gamers moving their arms about, then they force them to stand up with the Balance Board and introduce fitness games and weight tracking software, and soon they'll have a pulse-taking peripheral? That's shit you get for a home gym, not to play Mario or Unreal Tournament. I mean trust me, I'm all for immersion--the silly-expensive 3D glasses and Novint Falcon should attest to that--but shifting your body weight and taking your pulse isn't the way to get it.
Know what's fucking absurd about this whole thing? "I like consoles. They provide for cheaper gaming." $400 + $600 + $250 = $1250 = the cost of a decent desktop computer a few years ago, sans I/O. $300 + $300 + $200 = $800, which will still build a decent computer today. Those are the prices of the major consoles at launch, and presently. Cheaper my ass. Oh, + $300 for Xbox Live. Suck a fuck, cheapo console users, and learn some goddamn math.
To take a step back, yes, the emergence of the "3D accelerator" saw a whole slew of companies. I was probably in my hardware-enthusiast prime there and followed the stuff. What I would have done for a $600 Canopus Spectra, or the $2000 for a Seagate Cheetah 18 (which was just 18 GB with a 30k spindle) :O The flock got thinned, whether the companies died off, found different markets, or were acquired. I often remember feeling like I had just managed to scrounge up enough cash to get a new GPU or something, and in less than a year Homeworld came out and I wanted more.
"generational gap" is in reference to the current development mindset. Previously, games were clearly cut by the engine they use and the industry standards in technology. A "next gen" game was typically one that pushed present day performance and needed more, using new technology, lighting, shaders, whatever. Now, companies are cross platform, and picking the cash cow which is the console. They are forced therefore to work on the hardware they have, taking the current game engine generation and getting what they can out of it. Also yes its important to note to everyone that the reason games can "appear" to run well on a console is because the dev's know the device they are working with and know all the little dirty tricks and things to do to create the illusion of performance, or keep their game from bottlenecking. So, in this way an observable generation is created. Titles that come out on this generation will work acceptably or good on my hardware which is 3 years old. When talks for a new console come around, the titles are going to be tailored to fit it, if developers continue to follow this trend, and like getting a new console, it would officially be time to do a new build.
You also can't talk sense into console zealots. You can't beat it into them either (most games aren't cross-platform multi) and they don't listen even then. And yeah, they aren't cheaper. To me, the only real factor that distinguishes the environment is that a console in an idiot's hands is less likely to be riddled with malware that makes the device unusable. Even for people with fair computer knowledge, that stuff can still bite you in the ass like an STD.
And that's all I have to say on that D:
That said, you can still buy a decent gaming computer for at least another year or two by just buying about $300 worth of computer hardware. And even minimum-wage jobs make somewhat close to that in one week (speaking as someone working at a near-minimum-wage job at the moment), assuming a 40-hour work week.
And when you realize that you're paying something like $300 on top of the cost of an Xbox for the ability to play it online, and it occurs to you that $300 will buy a pretty nice graphics card 2-3 years into the life of your gaming rig, suddenly a gaming PC seems a lot more cost-effective.
Of course, if you're really thrifty, you'll hold off on getting that soda with your fast food meal every day, which will save the dollar a day necessary to have another $1000 in a few years for a complete system overhaul, should you wish it -_-