[Tech] On GPUs, Tessellation and DX11
Posted 16 years ago
Like everything I write, this was initially... pages. So here's the condensed version.
Computer graphics. In all modern 3D games, they consist of texturing and lighting wireframes. Wireframes, things like these. Arrangements of triangles which represent solid surfaces.
In the past 10+ years, a lot of things have changed. In texturing, we used to just have one texture per object to give it the right color and pattern. Over time we've added bump mapping to simulate the shading caused by bumps and dips on otherwise flat surfaces, parallax mapping to let us see into those virtual dips and cracks, and specular and environment mapping to tell the game how shiny surfaces are and what they should reflect. That's a big part of how we got from Half-Life to Halo 3. So too have there been breakthroughs in lighting: where we once could only afford to have static lighting, we added volumetric shadows which show the shape of the objects casting them, High Dynamic Range to make really bright objects seem brighter and better reflect how light interacts with our eyes, and even indirect lighting which allows light hitting objects to illuminate nearby objects--one of the reasons you never get a perfectly black shadow outside.
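(If you're curious what bump mapping actually does under the hood, here's a tiny toy sketch of the idea--my own Python illustration, not any engine's real code. The height map, light direction, and "strength" knob are all made up for the example; the point is just that you tilt the surface normal based on a height map and let ordinary diffuse lighting do the rest.)

    import numpy as np

    def diffuse(normal, light_dir):
        # Basic Lambertian shading: brightness is the cosine of the angle
        # between the surface normal and the direction toward the light.
        n = normal / np.linalg.norm(normal)
        l = light_dir / np.linalg.norm(light_dir)
        return max(float(np.dot(n, l)), 0.0)

    def bumped_normal(flat_normal, height_map, x, y, strength=1.0):
        # Estimate the slope of the height map at (x, y) with finite
        # differences, then tilt the flat normal by that slope.
        dhdx = height_map[y, x + 1] - height_map[y, x - 1]
        dhdy = height_map[y + 1, x] - height_map[y - 1, x]
        n = flat_normal + strength * np.array([-dhdx, -dhdy, 0.0])
        return n / np.linalg.norm(n)

    # A flat wall facing the camera, and a light coming in at an angle.
    flat = np.array([0.0, 0.0, 1.0])
    light = np.array([-1.0, 1.0, 1.0])

    # Toy height map: a flat patch with a single bump in the middle.
    h = np.zeros((5, 5))
    h[2, 2] = 0.5

    print(diffuse(flat, light))                          # same brightness everywhere: looks flat
    print(diffuse(bumped_normal(flat, h, 2, 1), light))  # darker on one side of the bump...
    print(diffuse(bumped_normal(flat, h, 2, 3), light))  # ...brighter on the other, so the bump "pops"

Real engines do the same thing per pixel in a shader, usually with a precomputed normal map instead of finite differences, but it's the same idea.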
But you know what hasn't changed much? Those wireframes. Whether it's MechWarrior 2 or CryEngine 3, our characters and powerups and landscapes are all ultimately the same wireframes, just varying in complexity. Sure, games these days include multiple wireframes for everything, so you can see a zombie's individual fingers when they're up in your face while the computer doesn't waste time drawing every digit when they're 100 yards off--but even that's a sloppy solution. What happens when you get really, really close to something, closer than the designers felt it worthwhile to add detail? Let's say you want to examine a trivial prop--maybe a faucet or a doorknob. In current games, the detail you see is limited to what the developers put in.
With any luck, that will soon change. With Windows 7 came the launch of DirectX 11. One of its features is hardware tessellation, which allows us to change and improve the way that we work with wireframes. Take a look at this. The model on the left was made by... I don't know, scanning somebody's head, or manually positioning the triangles. The model to the right of that, which is considerably more detailed, smooth, and realistic, was made entirely through an automated process (and the model all the way to the right was done with the same process applied a second time). Based on the 3D geometry of the original wireframe, the computer was able to divide each triangle into 4 smaller ones and calculate new positions for their corners to create a more curved and believable surface. It's the same stuff that Pixar does with their animated films to make sure characters don't have blocky noses and flat foreheads--they just make the triangles so tiny that you can't see any one individually. This is tessellation, and DirectX 11 brings support for it. This means that potentially, in a game which properly leverages this technology, you can get as close as you want to a curved surface and never see the flat shapes that make it up. Your computer just says "Uh oh, he's getting really close, we better make some more triangles!" and the object gets more detail. Obviously this won't be able to add extra details and imperfections to objects, like tire treads and bumps on somebody's skin, but your objects will be able to have smoother curves without the game developers wasting their time and storage space making a ten bajillion triangle wireframe for every pen cap and pebble in the game. Just imagine if your favorite games from a decade ago had this feature--you could play them today and while the textures might be blurry and the lighting simplistic, at least the character's hair, clothing, equipment, and the world around them would have reasonably smooth curves. It'd be much less painful to replay older games.
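(For the curious, here's roughly what that "split each triangle into 4 and smooth things out" step looks like in code. This is my own toy Python illustration--snapping vertices onto a unit sphere stands in for whatever smoothing rule a real subdivision scheme or the D3D11 tessellation pipeline actually uses--but it shows how a blocky mesh gains triangles and curvature with each pass.)

    import numpy as np

    def normalize(v):
        return v / np.linalg.norm(v)

    def tessellate(triangles):
        # One refinement pass: split every triangle into 4 via its edge
        # midpoints, then snap every vertex onto the unit sphere so the
        # mesh hugs the curved surface a little more closely each time.
        refined = []
        for a, b, c in triangles:
            ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2
            a, b, c, ab, bc, ca = map(normalize, (a, b, c, ab, bc, ca))
            refined += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        return refined

    # Start with a very blocky "sphere": a single triangle covering one octant.
    mesh = [(np.array([1.0, 0.0, 0.0]),
             np.array([0.0, 1.0, 0.0]),
             np.array([0.0, 0.0, 1.0]))]

    for level in range(4):
        print(f"level {level}: {len(mesh)} triangles")
        mesh = tessellate(mesh)   # 1 -> 4 -> 16 -> 64 triangles, smoother every pass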
Of course, the other big plus for DirectX 11 adoption is that it means adopting the changes of DirectX 10 as well. A new shader model, more programmable stuff you don't care about, geometry instancing, etc. The point is that it's updated and better, though its adoption has been hampered by XP users stuck with DX9. With Windows 7 proving to be a pretty big hit (estimates are ~40 million units by the end of the year and another 140 million in 2010, but those are just estimates!), game developers will hopefully start developing for DX10/10.1/11 first and foremost... sooner rather than later. DirectX 9 was an exciting and evolutionary change back in 2002, but we're coming up on its 7th birthday in about 3 weeks and I for one would be happy if it could retire before its 8th.
Questions, comments, always welcome. Or just geek out and share some cool video clips of pretty upcoming games. Also, please don't try to warp this into a "graphics vs gameplay" argument forum. That's so last year. A game should offer both. Finally, credit where credit is due:
kairhl initially and inadvertently prompted me to write this.
[Tech] Google Chrome OS
Posted 16 years ago
For those who don't know, Google let out some information about Chrome OS yesterday. It's their take on a computer operating system. Let's talk about it!
The basic idea, if I understand it correctly, is this: people spend so much time doing so much stuff in a web browser, so why not strip the OS down until it consists of little more than a web browser? Instead of a desktop email client, use Gmail in a browser. Instead of Excel and Word, use Google Docs in a browser. Instead of AIM, use Gmail chat in a browser. Cloud computing to the max. And before I forget to mention it, the intended use for Chrome OS is netbooks--although it can of course work on any computer.
Let it first be said that I consider this to be an interesting take on things which isn't entirely without merit. I feel that, if technology continues to improve as it has been in past decades, it's entirely possible that we'll be carrying around things similar in appearance to Star Trek PADDs some day. A thin touchscreen display dominating the device and some wireless hardware to connect you to a system which does storage and computations could totally work with the right network, storage and processing infrastructure. However, that is not the state of things, nor will it be until the end of the next decade, at the earliest.
Allow me to list off the immediately perceived benefits and downsides of this method of computing, as things are in 2009:
Good stuff
* No need to pay for the OS, save $30-130
* OS is lightweight, startup is fast and system ages better
* Documents and content are backed up on the 'net, and can be easily shared
Bad stuff
* Net access--what happens if you don't have it?
* Bandwidth--even a nice 10+ Mbit connection will slow your file access by a factor of 50
* Storage--what do you do with all your files?
* Performance--what if you need to do something like an image render, making a home movie, or play a game?
Let's tackle the good stuff first. Not having to pay for an OS is a big win for light users, no getting around that. And the OS will be less bloated than Windows and OS X I'm sure, with faster startup. Why is that, though? Because Google trimmed away a lot of the stuff that makes computers powerful. They're versatile machines which handle lots of local processes, provide fast access to terabytes of local files, and allow for GPU acceleration for fun (games) and work (image editing, etc). When you trim that away so you have a glorified web browser, well, you lose that stuff--the implications of which I'll get to later. And startup is faster? That strikes me as almost completely irrelevant in 2009. Both PCs and Macs are able to go to sleep and wake back up within seconds, faster than Chrome OS will boot, letting you get back to what you were doing. Stop shutting your computers down all the time and put them to sleep, problem solved. Finally, being able to share your documents is definitely good, but that comes with some limitations I'll also get to very soon.
So, the problems with this approach. The things which make me squirm in my chair and be glad that I'm currently using Windows.
Net access. It's great when you have it. At the apartment, with a land line connected to our wireless network that gives reliable download rates of 1 MB/sec and up, browsing the web is a reasonably smooth experience. Of course, that's not always the case. In the past 12 months I've been in several buildings on my former college campus, several airplanes, several airport terminals, several conventions, and other assorted locations which either: have patchy/slow free wireless, charge for wireless, have no wireless at all but good cell reception, have shitty cell reception, or have no cell reception at all. It is at these times, when I can't check FA or chat or even catch up on the news, that I am more glad than ever that I have movies and music and games and text on my local machine. Simply stated, unless you've got a system which makes heavy use of offline caching, you're going to have your computing experience interrupted by net availability. It takes the occasional and easily remedied problem of "why is this thing going slowly" and replaces it with the rather frustrating and unfixable issue of "where is my reception".
Bandwidth. On a good day with a good connection, it's not unreasonable to expect 1-2 MB/sec download and 200-300 KB/sec upload from a home wireless network. If you're on a shared/public/802.11g/cellular connection, expect to get as little as a tenth of that. Ever wait for your computer to do something while one of the lights blinks like crazy, maybe it's vibrating more than usual or you can hear a sort of humming/clicking noise? That's the hard drive trying to load and save files, and it's a very common bottleneck for performance. A standard hard drive might get 60 MB/sec transfer and be able to move to a random file in .02 seconds. A good 'net connection will be closer to 2 MB/sec transfer and relay and retrieve information in closer to .1 seconds. And a cell connection can be several times slower still. So all that waiting you have to put up with? It's not going to get any better. On a good connection, downloading something like a full-size photo can be pretty quick, you just wait a second or two. Uploading it can take 5 times longer, and of course if you're going to have files and such to download, you have to upload them there in the first place (I'm not talking general web browsing, rather using the web to store personal files, etc). Switch to a less optimal connection and suddenly that 5-10 second upload time is the expected download time, and your upload time starts to creep up towards a minute. So while Chrome OS might do a cold restart 40 seconds faster than your desktop, you're going to chew through those 40 seconds gained pretty damn fast when you actually try to, you know, look at something other than flat text. To put it in terms which are easier to grasp... time to copy 1 GB of new photos from your camera to your laptop: 30-60 seconds. Time to copy 1 GB of new photos from your camera to the 'net with 100 KB/sec upload: nearly 3 hours.
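(Those figures aren't magic, it's just division--same rough 2009 speeds I used above, so plug in your own numbers if you don't believe me:)

    GB_IN_KB = 1024 * 1024  # 1 GB expressed in KB

    def transfer_time(size_kb, speed_kb_per_sec):
        seconds = size_kb / speed_kb_per_sec
        return f"{seconds / 3600:.1f} hours" if seconds > 3600 else f"{seconds:.0f} seconds"

    print(transfer_time(GB_IN_KB, 60 * 1024))  # local hard drive at ~60 MB/sec: ~17 seconds
    print(transfer_time(GB_IN_KB, 2 * 1024))   # good 'net connection at ~2 MB/sec: ~512 seconds
    print(transfer_time(GB_IN_KB, 100))        # 100 KB/sec upload: ~2.9 hours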
Storage. Ignoring the fact that it might take you half the day to get your vacation photos uploaded to your online photo gallery even with a fully automated transfer, where do you expect to store them all? You can keep a limited number of them on hand at all times by making them into Gmail attachments, but they're limited in size and emailing things to yourself feels silly. You really want to use some sort of actual online image gallery. Google's offering, Picasa, only gets you 1 GB for free--and like YouTube, they gain rights to reproduce anything you submit to it, even if you don't make it public. Pretty sure they don't smile on porn in those galleries either. And they're just image galleries! And just for images you own! What if you've got a collection of your favorite... I dunno, LOLcat pictures. Or say you've got 5 GB of home videos, or 15 GB of music, or anything really--what do you do with it? Where do you put it? You're going to end up paying for it somehow, whether it's by dodging ads every time you want to listen to a song you put online, or letting Google feed you ads for gay cruises when it picks up on tags for some pictures you were sharing, or by flat-out paying for the service.
So you're going to have less storage than if you went with a regular OS and kept your files on the computer, it's going to be ten times slower to access it, and you're going to pay a lot more for it in all likelihood... assuming, of course, that you've got 'net access when you want it. Watching a movie on a flight is a prime example of all of these things: the movie occupies several gigabytes, which costs a small fraction of a dollar to keep indefinitely on your computer but much more online; it will take a day to upload, and only with a good connection could you stream it back in real time; and you're on a PLANE, so chances are you have no connection whatsoever.
And finally, Performance. Some things can be done fairly cheaply online--Gmail, Facebook, and FurAffinity are all services which do some behind-the-scenes computation to spit out the data your browser shows you personally. For these services, the computations are so slight that the occasional advertisement view generates enough revenue that the service usually pays for itself. Not everything is that way. World of Warcraft charges monthly fees (in theory) to pay for their server costs, because they need to run lots of systems 24/7 to ensure players can quest together. Of course, a game like World of Warcraft wouldn't run on Chrome OS because Chrome is all about running things in the browser, which WoW certainly won't do. The best you're going to get with an OS like that are Flash games, and even though my system can churn through the newest games with relative ease, it absolutely cowers at full-screen Flash animations and games with the settings turned all the way up. So to restate, the problem here is that the idea of doing all the work in the browser means that performance-demanding tasks have to be done there as well--and I think that companies will not be eager to rewrite programs to do all the heavy lifting on their end of things. Hey, it's either that or write every program in Flash. Have I told you about how abysmal interactive Flash performance is on real computers, to say nothing of netbooks?
When you think about it, there's a smooth continuity of computer distributed...ness... from the pre-WWW era Windows box to the Google Chrome OS. On the one side, everything is kept and done locally, with high performance but no connectivity. On the other side, everything is kept and done remotely, with low performance but high connectivity. Right now we're in a happy medium, I like to think. We watch YouTube videos from a web server through a web browser, and we listen to local music on a local media player (or edit local files in a local instance of Photoshop). There are programs which will upload your files to the web for backup purposes, Gmail Offline will download your email from the web for when you lose your connection, etc... all sorts of programs and tools help to bridge the gap between web servers and your computer, making sure you have the local speed and reliability combined with web connectivity. This is a good area to drift around and explore in.
Some might argue that I'm missing the point. Chrome OS isn't aimed at replacing Windows for gamers, or graphic artists, or network administrators. It's a netbook OS which seeks to decentralize data from your computer, and speed things up. As things are for the foreseeable future, the former makes a system potentially worthless for all tasks if connectivity slows down or is interrupted, and the latter is undercut by replacing optimized code running directly on the hardware with Javascript and Flash running in a browser. It's already possible to use Microsoft Office to compose a document on your system when you're offline, and use Google Docs (complete with its weak formatting tools and slower response time) when you're online. Opting for Chrome OS robs you of the ability to do the offline work, and potentially gives you worse performance, just so you can shave half a minute off your unnecessary start-up time.
I'm going to play Devil's advocate here for a moment. Google's whole argument for making Chrome the basis of the OS is that people mostly use computers these days to do things that are online, and online means in the browser, so why not just make everything you do live in the browser somehow? Well, maybe what they need to do is make 'online' NOT equate to something in a standard tabbed browser window. Make a browser window that doesn't look like a web browser, and let me navigate it to a site and run it in a way that looks like a local program. Oh wait, Chrome already does that. And Mozilla's Prism aims to do the same. They put it better than I could: "Unfortunately the web browser, which was originally designed for reading documents, is not an ideal environment for running applications. It is frustrating and time-consuming to wade through a mass of browser windows and tabs just to find your email client. Unstable applications can slow down or crash your entire browser. And many of the conveniences offered by modern operating systems are unavailable to web apps running in the browser." Modern desktop operating systems have window management tools, mini-apps, and high-performance applications running natively on the hardware--these things have been refined and redesigned for over two decades. It feels pretty damn foolish to discard them just because Google's logical follow-up for "People spend a lot of time in the web browser" is "let's eliminate everything except the web browser".
[Tech] Quote of the week
Posted 16 years ago
Taken from a MacRumors.com forum thread on H.264 vs VC-1 codecs:
"I'm sure we all know that the H.264 codec is supposed to be "open", but what does that mean? And why is this an advantage?"
It means you're not locked into a proprietary MS solution. It also means you have a number of companies competing to provide the best H.264 codec possible. Personally, I never make choices that lock me in to a single vendor, if I can help it.
Fucking LOL.
For those of you who don't see the irony, it's... on a website for rumors about Apple stuff c.c;
"I'm sure we all know that the H.264 codec is supposed to be "open", but what does that mean? And why is this an advantage?"
It means you're not locked into a proprietary MS solution. It also means you have a number of companies competing to provide the best H.264 codec possible. Personally, I never make choices that lock me in to a single vendor, if I can help it.
Fucking LOL.
For those of you who don't see the irony, it's... on a website for rumors about Apple stuff c.c;
[Games] Left 4 Dead 2
Posted 16 years ago
Let us discuss it.
When they first announced L4D2 and its very early release compared to its predecessor, I was, like many people, furious. I had heard rumors that Valve was going to release more maps and content for L4D--something like a total of 10 campaigns by the end of 2009. So... where was our fucking content? They've only released half a campaign (a shitty one) and a single, tiny map since the game came out. I had thought, from what I knew about L4D2 at the time, that they should make the two games out of three parts: the L4D engine, the L4D1 content, and the L4D2 content. Sell each for $20-25, so you can get either "game" for about $50, and if you own one you can get the content of the other for about $25 more. Expansion pack pricing, pretty fair, right?
And then I played the demo, and it immediately became apparent that such a thing never would have worked. L4D2 isn't just content, so by definition, it can't be a DLC pack. Sure, they added lots more weapons, maps, models, music and icons. But they also changed the look and feel of the game, changed game mechanics, and cranked up the gore. Simply put, for those who haven't tried the demo, L4D2 makes L4D1 feel like a half-assed community-created mod knock-off of L4D2, which is the actual professional game.
There's this intangible element in games which separates platinum-grade classics from 7.x-rated ho-hum fests which you wish you had rented before buying. In an FPS, it's often related to the weapons. The guns have to look detailed and believable, the shots have to be loud and powerful, the impacts need to be forceful and lethal... otherwise it just feels like you're firing off a paintball gun. Killing Floor showed me proper weapons in a zombie game--shots will take off limbs, blow apart heads, and they've got a lot of kick with a satisfying bang. L4D2 is much the same. Weapons feel powerful. Even the starting pistols will take apart common enemies, where L4D1 just gave you... a headless ragdoll and a splash of red wine. Then I go back to L4D2, where my shotgun removes somebody's jaw, peels off their lower back, or causes them to stumble around for a few seconds while their small intestine uncoils and leaves a short trail on the ground. Oh my god. This is a zombie game. Why the hell did I pay $50 not a year prior for some 2004-looking zombie shooter with... six guns? Seriously.
The truth of the matter is that I was never very satisfied with some of the things Valve did in L4D1. Although it was mildly amusing to see the Apple-style use of multi-meaning catch phrases on the campaign posters, I didn't really like how the campaigns were all entirely disjoint. Why were only those characters still uninfected and healthy 2 weeks after a pandemic? How did they find each other? How is it that they repeatedly escape and yet keep finding themselves in all these terrible situations? With the exception of the DLC pack's first 10 seconds, where they're shown in the crashed helicopter from the end of a previous campaign, there's absolutely no continuity. L4D2 is supposed to rectify that, providing a longer sort of story rather than dumping the characters in cool but cliché settings.
I was also annoyed at the special infected in L4D1... mostly when doing multiplayer. A proper zombie game should just feature hordes of zombies, but they have to throw things off with tanks and whatnot. And although they haven't totally fixed this in L4D2, they're at least getting the idea... making common infected who are harder to kill in some ways, like the police officers and hazmat zombies. Anything that mixes up the flow of gameplay without dumping a tank on you every level is good in my book.
So I guess, even though I would've preferred it if they gave more free content to L4D1, what I've seen in L4D2 is so much more polished and fleshed out that I'm happier with the opportunity to buy a -proper- zombie game. I won't deny that I love attention to detail, be it in the graphics, maps, music, or gameplay of a title... and I'm glad that Valve chose to crank the game up a few notches.
Here's hoping that the game (which I preordered a while back) is as good as the demo makes it look.
[Games] Borderlands. ZombieCat sounds like Clerks narrator.
Posted 16 years ago
You know the Clerks animated TV series?
zombiecat sounds about 90% like the narrator >_>
I know this thing because I played Borderlands with him (and a number of other people over the past 2 days). The game isn't perfect. It's like a convertible with no doors. It's a bitch to get everyone inside, but once that's taken care of, you can look forward to hours of exploring, having fun with your friends, shooting people and vehicular carnage (uh, I guess the metaphor falls apart on those last 2 things... at least, I hope it does D: ).
Because really, the game is fun. It's like a cross between Trigun, Fallout 3 and Left 4 Dead, to those of you who have experienced all of those. It starts out feeling like a stock FPS where guns do less damage (headshots aren't instantly lethal, etc) but after 10 or 20 hours you've got bad-ass weapons which regenerate ammo, cause people to explode when you hit them, set them on fire, etc. Given recent trends in gaming, I should be delighted just by the fact that I'm able to play for more than 8 hours without beating the game. And, I am. In true Diablo style, I can always raid enemy encampments to get a little more XP, improve my weapon proficiencies (Morrowind/Oblivion style here, where you improve some skills by using them!), and hope to get rare loot... although the experience is vastly better, because rather than being limited to seeing what's 50 feet away from me in an 800x600 window, I get a first-person experience which matches what I'm used to in the real world and doesn't leave me cursing the developers for my utter inability to see where I'm going.
It's a game which is good to play with friends, not just because they can balance out the gameplay (again in Diablo style, someone is a melee tank while another person does ranged supporting fire and yet another does healing and buff auras), but because the game isn't 100% "GO GO GO NEXT ENEMY ENGAGE NOW!" like you'll experience in Call of Duty games. This title is more playful and explorative (is that a word?). If you want to stop for a few moments to check out a new piece of equipment, or go joy riding in a Warthog-like buggy, or raid a starting camp to see how few bullets your now-overpowered weapons need to dispose of enemies, nothing is stopping you! It's kinda like a co-op GTA in that respect: you're in an open world that can be explored and traversed as little or as much as you want, with fights popping up almost anywhere (though in GTA you always started them, and here it's enemies spawning).
Of course, I'm only level 18, and there's still a lot of game to do. Still, if the next few days are anything like the previous few days, and they fix the difficult-as-fuck-to-host multiplayer, I'll be able to recommend the game without reservations to people who enjoy Diablo and FPS mechanics.
With that said I'm going to go play with my electrified SMG, cluster-shot-firing revolver, and 3x fire damage sniper rifle ~_~
[Games] **** YOU CONSOLES. JUST **** YOU.
Posted 16 years ago
Yanno, I normally try to be "live and let live" to some degree when it comes to stuff like this. I get it. You're on a budget, but you enjoy gaming. Maybe you're not tech-savvy but you want to use your home theater. There are reasons for people to buy gaming consoles, thus reasons for them to exist, and even though I don't like the hardware monoculture (or... biculture, if that's a word?) that emerges, I figure it doesn't really affect me very much so I should just let it slide. That part of me is now dead. Console bullshit is infecting PC gaming.
The thing which is really grinding on my nerves today is that Modern Warfare 2 is retailing for $60 on Steam. Normally when you pre-order a game on Steam, you get 10% off--$5 on a typical $50 title--so the game is $45. MW2, no discount. And that's fine, I can understand Activision/Infinity Ward not being on board with that. The problem is that the default price of the game has gone up $10, to match console prices. There's a reason that PS3 and 360 games sell for $10 more--it's because those systems initially sell at a loss, and take a long time to break even, so game sales are how Sony and Microsoft make their money. They command a fee to make games for their proprietary hardware (see kids, proprietary hardware is just so awesome, isn't it?), and that cost is passed on to the gamer. At the end of the day, the net result is that people who buy fewer games (or no games) get the system for the cheapest, and people who buy the most end up paying more for it once the hardware discounts are outpaced by the game price markups. This bullshit subsidizing does not occur on PCs--every company involved makes a profit on every part, sort of like Nintendo. That's why PC and Wii games haven't been affected by the price increase, until now.
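(If you want to see how that plays out, here's the back-of-the-envelope version. The $100 hardware subsidy and the sticker prices are numbers I made up for illustration; the $50 vs $60 game price gap is the real sticking point:)

    def total_cost(hardware_price, game_price, games_bought):
        return hardware_price + game_price * games_bought

    # Hypothetical console sold $100 below its real cost, recouped through
    # $60 games, versus unsubsidized hardware with $50 games.
    for games in (0, 5, 10, 15, 20):
        subsidized = total_cost(300, 60, games)
        unsubsidized = total_cost(400, 50, games)
        print(f"{games:2d} games: subsidized ${subsidized}, unsubsidized ${unsubsidized}")

    # Past 10 games, the $10-per-game markup has eaten the entire hardware
    # discount--exactly the "heavy buyers pay more" effect described above.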
Although I can only speculate as to the justification for raising the price of Modern Warfare 2 to $60, my best guesses would be A) "The game has a lot of content and with Spec Ops is basically 2 games, $60 is a bargain", B) "It's a very high-profile title which cost a lot to make", or C) "We're just bringing the prices up to the industry standard." I already explained why C is bullshit. B is equally bogus because at the prices and volumes the game will have, we're looking at something that will generate enough revenue to pay for another Lord of the Rings movie rather than some half-assed mission/weapon update of an already grossly profitable title from 2 years ago. And A is a moot point because FPS story modes these days are way too short, and Infinity Ward chose to make the main game lack any co-op. That's becoming an expected feature these days.
This comes on top of the earlier announcement that the PC version of Modern Warfare 2 will be gimped with the inability to do mods, dedicated servers, and custom content. It's all gotta use Infinity Ward's own matchmaking service. Call me paranoid, but introducing console pricing and console multiplayer limitations onto the PC platform in a really fucking big title like this doesn't bode well for things. What can we expect in the future, a complete shift toward an Xbox Live-like system where we have to pay to do multiplayer, because we aren't allowed to host our own servers?
Of course, these are just the latest offenses suffered by the PC gaming crowd. We already have to live with, or not live with in some cases, 1) later releases, 2) later updates, 3) senselessly console-exclusive titles, 4) the inability to do LAN play with a single copy of the disc, even though consoles allow 4-player split-screen, and 5) the lower frequency of "collector's edition" copies of games compared to console counterparts. We're second-class citizens, and it's disgusting.
And how could I forget, we all have to live with the shit performance of the current range of consoles. I'm not even going to consider the Wii, I'm just talking about PS360 here. They're using hardware which is several generations old, and it shows. Why do you think they can't even push 720p in most of the prettier games, and have to resort to freaking "600p" (only 25% more lines than 480p, i.e. PS2/GameCube-era graphics)? That, combined with the 360's limited DVD space (7 GB) and the PS3's limited Blu-ray transfer speed (9 MB/sec), as well as the limited memory for both systems (512 MB), means that only the PC is really able to have, load, access, process, AND display lots of high-grade content at any given time. Why am I bitching about the performance of systems that I'm not using? Because, as a game developer, how much extra effort are you going to invest into higher-quality models, textures, and effects when only your smallest market, PC gamers, will enjoy it? Gone are the Doom 3 days when a company says "We're going to launch this game with settings that aren't accessible until you own a graphics card which is currently not for sale". No, if you're going to sell 1 million copies to PC gamers and 5 million copies to console gamers, who are you really going to tailor the experience towards?
Consoles are dragging games down in complexity, graphical quality, and moddability while they bring UP the prices of games. Brilliant. And even as they bring down the experience for all PC gamers, their rigidity and simplicity helps to curb enhancements in gaming experiences before they can even take off.
Can you use a PS3 to get 3840x2160 resolution (quad HD) by using 4 projectors to make a massive screen? No, but you can move your projector back twice as far and get 1/4 the brightness and 1/13 the detail. What about getting a nice 120 fps on a 120 Hz display with your trusty 360? Ignoring the software cap, the hardware can barely push 60 fps as is. What about using the 360 with 3D glasses to give your games life-like depth? No hardware, no software, not enough performance, in short no support. Can your PS3's USB ports drive an amBX system to provide localized lighting cues and game-controlled fan feedback? What about using a Novint Falcon to provide realistic gun recoil, allow you to feel virtual objects, and provide other advanced touch feedback? No? No? What's that? You provide RUMBLE? You can thrash about like a can of angry bumblebees when the player fires a gun? And you'll UPSCALE to 1080p? You mean you'll provide the same image scaling to your games that PC users have expected from their photo, video, and game software since the '90s? Well aren't you just the CUTTING EDGE of all things new and immersive in gaming. (Note: Except for the quad-1080p setup, Sepf and I have all of these things working on our computers. Instead of 3840x2160 he runs his games at 5040x1050).
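(The resolution comparisons above are just pixel counting, by the way. The exact width of "600p" varies by game; 1066x600 is a typical figure I'm assuming here:)

    resolutions = {
        "480p (PS2/GameCube era, 4:3)": (640, 480),
        '"600p" (typical sub-HD console target)': (1066, 600),
        "720p": (1280, 720),
        "1080p": (1920, 1080),
        "quad HD projector wall": (3840, 2160),
    }

    quad_hd_pixels = 3840 * 2160
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels:,} pixels (1/{quad_hd_pixels / pixels:.0f} of quad HD)")

    # 600 lines is only 25% more than 480, and 1066x600 comes out to roughly
    # 1/13 the pixels of a 3840x2160 setup--the two figures quoted above.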
Less rage-filled summary for those who care:
* Consoles are possibly going to spread $60 games and trim custom content and servers from PC software equivalents
* Consoles are most likely limiting the amount and quality of content in PC games with their low-end hardware and widespread appeal
* Consoles are definitely hampering the adoption of more advanced input and output devices
Quit dragging us down with you.
[Games] Borderlands for $33.75, purchase time!
Posted 16 years agoA number of reviews have hit today for Borderlands, and most seem to place it in the high 80s without really having much to complain about. Humor, interesting visuals, good atmosphere, solid music and voice acting, good FPS gameplay, a good level of RPG features, crazy gunplay and insane loot-whoring.
As no serious issues have been raised,
sepffuzzball and I are game for pre-ordering it. After all, for $33.75, how bad can it be? We'll be able to take two other people into our 4-pack, but if we can find 4 other people willing to Paypal us the funds up front, Sepf and I will each get a 4-pack. Beyond that, however, all we can do is let people group up on their own. You can't gift away all four copies in a four-pack!
Group A:
1. twile
2. chaosie
3. bazz
4. firondraak
Group B:
1. sepffuzzball
2. ixyao
3. deusexbestia
4. fcskyrara
Although Sepf and I have each gotten 4-packs of the game ($135 on Steam, $33.75 apiece) after finding 6 other people, people who are interested can still put their names forward to look for groups to pool resources with.
Expressed Interest:
newfdraggie
lorddark
maxraccoon
akkeresu
Looking For Group:
briskfulstorm
sovy
minnieshoof
hellkat
[Games] Borderlands for $33.75, who wants in?
Posted 16 years agoBorderlands launches in a bit over a week for consoles, and a bit over two for PC. Although normally this sucks, the potential upshot is that we may be able to verify if the game is good by reading reviews while still having the preorder pricing in effect.
Steam's offering a bundle of 4 copies of Borderlands for $134.97, which comes out just under $33.75 per copy. This makes the console version ($60 + tax which is what, $63-65?) look embarrassingly expensive, which amuses me... but anyway, back on topic. If we're able to find a couple other people willing to chip in, we can all benefit from a discounted copy of the game.
So here's how it'll work. People can get in 'line' here, and probably a few days before the game releases, while pre-order pricing is still in effect, I'll go down the list and check with people to make sure they're still interested. Sending payment via Paypal will secure them a copy, to be gifted over when the game launches on the 26th. Even if the pre-order pricing goes out of effect unexpectedly though, the regular price is 4 copies for $150 = $37.50 each, which is still cheaper than a preorder or regular copy.
So who else wants in? It'll be first-come, first-served unless somebody is unreachable.
Note that even if I find enough people to fill up my own 4-pack, this is still as good a place as any to try and find other people to group up with. Of course neither I nor any FA staff can be responsible for deals made through this site, but if you're willing to risk giving somebody 35 bucks, this just might let you save 10 or 15.
[Furry] Birthday
Posted 16 years agoNot mine, my fiancé's. I'll be bending
sepffuzzball down over a couch all day, please leave presents in the rear :3
[Games] Borderlands
Posted 16 years agoEveryone who's mentioned this game to or around me (and only in the past couple weeks) has been really excited about it. I guess that's natural, given that people who aren't excited for a game tend not to talk about it, but I mean... I never heard about it until the start of the month, and it's coming out at the end of the month, and the hype seems rather abrupt.
So I'm interested in it. Anyone here who knows about it, share your knowledge with me. Aside from being an FPS (<3) with some RPGish elements (<3 <3) and having Diablo-grade loot (<3 <3 <3) and being an online co-op (<3 <3 <3 <3), what is there about this game which gets you all hot and bothered? I mean... that's only like 10 <3s right there. I need at least 11 to be convinced I need this game.
Go!
[Games] Twile's PC Game of the Year, 1995-2008
Posted 16 years agoBecause I can. These are titles which you should at least be familiar with, if you haven't played them. In most cases they are among the best games I have ever played. When the running is really close for some years, I'll give one-sentence runner up descriptions too.
1995: MechWarrior 2
Although this game mainly made the list because it's the only title from 1995 that came to mind, that doesn't make it any less awesome. Probably my first 3D PC game and certainly my first Battletech game, MechWarrior 2 and its successors also boasted some of the most complex controls of any game I've played. Seriously, probably two dozen keys for movement alone. But still, stomping around in a giant mech is a great pastime, and MechWarrior 2 was my first time doing this. Yay Thor!
1996: Civilization 2
At 13 years old, this is the oldest game that I've seriously played this year. Unfortunately it's a 16-bit application which means 64-bit Windows will be unwilling to play it, but I still keep it around and remember it fondly. The Civilization games defined my entrance and experience in the 4X genre, and I pity anyone who hasn't played them. This is the "just one more turn" series which is so addictive that it takes both willpower and a visible clock to avoid expanding your civilization until your vision is blurry at 5 AM. Seriously, in Civ IV they added an always-visible clock with an alarm feature. It's that addictive.
1997: Total Annihilation
Featuring some of the most epic game music in an RTS to date, TA landed a full 6 months before Starcraft. Although it would never have the 9.5 million units sold or excellent story and characters of its Blizzard competitor, it's a noteworthy title for actually being a 3D game, fielding hundreds of units, spawning loads of user-made content, and getting a successor of sorts within the next decade. In many ways, TA and Starcraft were polar opposites: Starcraft was dripping with charm and backstory, TA was fairly bland and the story was "These doods didn't want to have their brains put in robot bodies, so they started a civil war". Starcraft units were limited in numbers but highly memorable, TA units were extremely plentiful but easily forgettable. In Starcraft you could only select a dozen units at a time and build a few dozen good units, in TA you could select every unit on the map, including buildings, and the number of units you could have was in the hundreds and only limited for performance reasons. In short the game supported 3D at big resolutions with big numbers of units stomping around and blasting each other to smoking metal wreckages, which was pretty damn cool. It's also the place where, at the tender age of 10, I learned how to spell 'annihilation'.
1998: Starcraft (Brood War)
In many ways the complement to the previous year's TA, Starcraft belongs on every list of 'best PC games ever'--just ask the Koreans. From its story and cinematics to the units and gameplay, there's just so much to love about this 1998 title and its expansion pack. There aren't a lot of games that people will be rabid for a sequel to more than 10 years after the series' last release--Starcraft is one of them. Even ten years later, who could forget Kerrigan, the banishment of the Dark Templar, and building additional pylons? The RTS genre at its finest, folks.
Runner up: Half Life
Certainly a noteworthy game, Half Life defined a great model for FPS and told a mildly compelling story, starting the Valve tradition of raising more questions than they answer.
1999: Homeworld
Another RTS with absolutely beautiful music, Homeworld took the genre into space. With a highly motivating story (you're a civilization on a shitty planet, you discover a massive spaceship derelict in a desert and realize you're a race of exiles, and you fight your way back to your homeworld against the tyrannical empire that exiled you) and colorful, imaginative locations in deep space, the game was a delight to play. It also featured a lot of gameplay elements which still aren't too common in games: your resources, army and researched technologies are persistent from one level to the next so you can't just narrowly complete a level. You can use salvage tugs to capture enemy vessels--depriving them of the units, getting free ones yourself, avoiding losing your own ships, saving yourself huge sums of resources and getting the chance to reverse-engineer their technologies for yourself. There is a single unified resource (the Resource Unit), given that your technology lets you break objects down on the atomic level. All units can be repaired. All these things just make sense in a space-faring RTS of this sort, and make the game both a challenge and a delight to play.
Runner up: MechWarrior 3
Awesome for the same reasons as MechWarrior 2 but so much prettier and with some of the "you're on your own, your equipment and teammates are persistent from one level to the next" elements that made Homeworld cool, MW3 was an awesome game which was the pinnacle of the series, in my opinion.
2000: Deus Ex
Without a doubt, this is one of the best games I have ever played. With so many things just done so right, it's hard to know where to begin. The game takes place around 2050 when nanite-based augmentation is able to grant superhuman abilities to those with access to the tech. Rife with cyberpunk and conspiracy theory themes, the game has a compelling story which takes you all over the globe. Although the entire game (except chat cutscenes) is an FPS, there are a lot of RPG elements in the game--you can spend your experience points on upgrading your skills, you have a 2D inventory which forces you to carefully consider what equipment you bring along with you, weapons are upgradeable, there are multiple approaches to most situations (stealth, hacking, combat), your game actions and dialog options impact each other (saying certain things may trigger combat, taking non-violent approaches may earn or lose you favor from certain NPCs), and there are multiple endings. This game is difficult, entertaining, and so goddamn fun. Ten bucks on Steam, buy it if you have any respect for the FPS genre or sci-fi and conspiracy theories.
Runner up: Homeworld: Cataclysm
Building off the success of its 1999 predecessor, Cataclysm refined many of the imperfections of the series, improved balancing, had an even MORE compelling story, and featured an upgradeable mothership which slowly transforms from self-sufficient mining vessel to warship.
2001: Max Payne
The only game I've ever bought with a mousepad and the only mousepad I've ever used for more than a year, Max Payne is my favorite Third Person Shooter ever. It's gritty, it's got bullet time, and it's freaking MAX PAYNE. It also has such awesome theme music. From the thugs to the drugs to the cults, this game is all kinds of dark. And diving to cover while slow-motion firing your assault rifle at some punks just never gets old. And did I mention that the game is dark and gritty? The game STARTS with your wife and baby being murdered by some drugged up criminals. Max Payne is all kinds of good.
Runner up: Aliens vs Predator 2
One of the few 'flawless' FPS I've played, AVP2 tries to do so many things well and absolutely succeeds--from the space marine on the run from aliens to said wall-scurrying, shadow-dwelling aliens to the stealthy and advanced Predators, the game offers three distinct and entertaining ways to play.
2002: No One Lives Forever 2
Another of the 'flawless' FPS of the early 2000s, No One Lives Forever 2 puts you in the shoes of a less pornstarish female version of Austin Powers. Set during the Cold War, you play as a British secret agent who works for UNITY, trying to stop the diabolical and slightly silly organization of HARM from igniting the third World War (get it? You're in HARM's way). The premise is fairly serious but the game is just riddled with humor. Instead of a proximity mine, you use a small robotic cat which pounces on anyone who walks by and then explodes. You can throw bananas so people will trip on them. You can hit bunnies while on a snowmobile. Evil henchmen have conversations about the importance of having a really cool secret lair. You have a swordfight with a ninja in a house that's been sucked up by a tornado. And in the culmination of all things silly, you wield a tommygun with infinite ammo as you ride on the back of a large Irishman who is pedaling away on a tricycle and chasing after the unicycle-riding French midget Mime King through the streets of India. That's not to say that the game is just an exercise in silliness--it's also an exceptionally good FPS. There are elements of stealth, although you don't have to be stealthy, and like Deus Ex you are rewarded for your achievements with experience points that you can spend on your various skills. The gameplay, story, sound, and graphics are all just so polished that even 7 years later it's an absolute delight to play.
Runner up: Morrowind
'Live another life' was the marketing tag of this '02 first-person RPG, and at the time it was one of the best examples that I'd seen of the concept--from picking your race and skills to equipment, missions and dialog, it was perhaps the first game where I was able to and wanted to explore the wilderness freely and find everything there was to see and do.
2003: Warcraft 3
Although there were many good titles to hit in 2003, the best one I think was good ol' WC3. Although this may be in part because it was the last major push for the awesome Warcraft realm into the RTS genre, it was also just an exceptionally good game. The missions ranged from huge battles to small, tactical dungeon-crawlers, there was the standard Blizzard sprinkling of humor, the graphics were colorful and playful, and the story was memorable and motivating. And the cinematics, dear lord--I remember a friend who downloaded all of those before the game itself, because they were just so pretty. An excellent game which was well-balanced, fun, and a good balance of serious and joking, WC3 still holds up to this day.
Runners up: Unreal 2, Freelancer
The final 'flawless' FPS of the early 2000s, Unreal 2 was a great title which excelled in all the areas an FPS should--story, inventive weapons, exotic locations, and graphics--it's also one of the few FPS to ever make me genuinely cry. Freelancer was a great first/third person dogfighting and trading space sim which I still want to get into playing with friends to this day, and which defined easy and powerful controls for its genre.
2004: Half Life 2
Although the original Half Life didn't make it to be one of my top games because its genre-defining impact was lost on someone who didn't play it until after the impact had swept the industry, I was there for HL2 and thus can appreciate all the things it did so right. Continuing the Valve tradition of answering a question and raising ten, HL2 told fragments of a story with as many unspoken cues (settings, character expressions, general atmosphere) as it did through direct explanation provided to the player. It also brought us huge levels and outdoor environments with vehicles, physics puzzles, and the foundations for Garry's Mod. An excellent game in pretty much all respects, it's also notable for helping to usher in the era of widescreen gaming in a time when most displays were still 4:3 or even 5:4 and the Xbox 360 was still a year out.
Runner up: Far Cry
Lacking the memorable characters and story of HL2, this game is still notable for having very impressive graphics for the time, looking like Doom 3 while offering lush, sprawling tropical environments--months before HL2 and Doom 3 and from a developer nobody had ever heard of (also it was fun).
2005: Civilization 4
Much like Civilization 2 from 9 years before, Civ IV offered a highly addictive and entertaining nation-building experience. This was its first foray into true 3D graphics, and it did so beautifully. Many refinements were made over previous games, removing the things which most annoyed players and streamlining pointlessly menial tasks. There isn't really a whole lot to say about Civ IV, other than that it is the highest point of the series--and that it features an awesome main theme song.
2006: Oblivion
Oblivion and I have a love-hate relationship. I've put hundreds of hours into it, played through a couple times, even finished the game once--and yet it's just got so many flaws. Dozens, hundreds of little things that make no sense. Like how merchants will only permit X gold per transaction, but unlimited transactions in a row. Or how merchants are able to tell that an apple from 15 miles away is stolen, but thieves will eagerly buy up things you just stole from them. At the end of the day, though, the game offers a whole lot of fun and can be modded to remove a lot of the bugs and build off of the pretty strong platform it offers. And let's not forget the Triumphant Execution of the adoring fan.
2007: Team Fortress 2
I don't see how TF2 could NOT be the game of 2007--it's probably the most-played title from that year at present. Sure, Portal got critical acclaim and Mass Effect was very good and popular, but at the end of the day TF2 is what people play after work and before bed. Managing to inject humor and personality into purely multiplayer characters is no small feat, but Valve managed to do it, using just a handful of cinematics (which are purely optional, and not all complete) to provide more than 4 second sound clips from any given character. Not everyone enjoys class-based online FPS, but for everyone else TF2 is a blast to play and is positively dripping with potential for YouTube videos.
Runner up: Call of Duty 4 Modern Warfare
The game which brought the CoD series away from its WWII roots and possibly made good on Infinity Ward's earlier desire to make the best FPS, Modern Warfare provided excellent singleplayer and multiplayer action and did more for the irradiated Chernobyl setting in a single level than the entirety of the S.T.A.L.K.E.R. series.
2008: Fallout 3
Offering a blend of Oblivion-like "explore, steal, kill everything ugly" gameplay and S.T.A.L.K.E.R.-style irradiated post-nuclear holocaust wastelands, with black humor and somewhat turn-based combat, Fallout 3 was something that I knew all of nothing about until it launched. Once I looked into it I immediately understood why people were wetting themselves over it, and I must admit I wasted many weeks exploring and grinding and stealing everything I could. I really don't have a lot to complain about when it comes to this game, which is more than I can say for most these days. My biggest gripe is that they didn't include a set of armor which is hands-down the best, which makes me have to irritably weigh pros and cons. That's saying something.
Runner up: Left 4 Dead
Despite the controversy over L4D2, the original FPS zombie survival game is still a lot of fun and very replayable if you've got good friends willing to get your back.
[Furry] Join the Fuzzy Dragon Pile
Posted 16 years agoAttendees must be 18 or older because fuzzy dragons do not wear clothes ~_~
I had a chatroom where I turned people into fuzzydragons for the evening with the intention of having a furpile. It turned into a fur-plus-dicking pile x'D
[Games] Help test a new TF2 map!
Posted 16 years ago
draco18s has been working on a map for TF2, and he needs some help testing it out to see how it's balanced and how it plays. As such, starting at 7 PM EST on Thursday (October 1st, this week) he will be holding a test session. I know a lot of you's furries play TF2, and if you'd be willing to come by and play, possibly give some feedback and tell him about problems you notice, it'd be really helpful for him. He's hoping to have 12+ people at a time, so if you've got free time and TF2 installed, please consider helping out and bringing a friend or two!
Anyone interested can give me a note, give him a note, comment here with their Steam ID, etc and more details will be given as the event approaches.
Screenshots of the map can be found [here].
[Games] On Consoles and FPS
Posted 16 years agoIn trying to find people to play World at War with me, I found out first-hand that a lot of World at War players play the game on PS3 and 360. This disturbs me.
I understand the usefulness of consoles to some gamers, really I do. For people who have, say, $200-300 to spare on a gaming system but not $1000-2000, a console is not just a smart choice, it's the only choice. Half a decade ago, I used to make the argument that every household should already have a decent computer, and a graphics card costs the same as a console. Today I won't make that same argument, because with the exception of PC gaming, there aren't many compelling reasons to own a desktop over a laptop. So, consoles win the price battle.
And then some genres are just better-suited for consoles or console controllers. Games which involve a single shared view for several players, such as Smash Bros, make more sense on a single large screen--and even if you hooked a computer up to a TV, there would still be the complications of dealing with 4 different controllers at once. Games which deal with button combos, such as fighters, are better with console controllers because it's easier to have muscle memory for a smaller shape you wrap your hand around than a larger flat surface with a uniform button pattern over the whole thing (and fighters tend to use a shared view, as I already discussed). And games which utilize multiple analog inputs, such as racing titles, are far better suited for analog controllers--you can get these for PC of course, but the standard console controller has that right out of the box.
Not a single category I just listed applies to FPS. Not a damned one. In fact, if anything, they make FPS ill-suited for the platform. Shared screens? Many FPS run at 600p on consoles, and if you're doing 3 or 4 player stuff, that means you get 533x300 to work with. That's approximately the same resolution as an iPhone, half of standard-def. And hey, people can see your screen too. Multiple analog inputs? Yes, your fire and jump or crouch buttons really benefit from being analog. Oh wait, they don't. Your walking is analog too, so you're able to carefully pick the particular rate at which you slow-motion walk. Brilliant. And your aiming suffers because you're using a control stick.
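Rough math behind that 533x300 figure, for anyone checking (assuming a ~1066x600 frame for "600p", 480x320 for the iPhone of the day, and 640x480 for standard-def):

full_w, full_h = 1066, 600                 # assumed '600p' render target
quad_w, quad_h = full_w // 2, full_h // 2  # four-player split = one quadrant each
print("per-player view:", quad_w, "x", quad_h)                  # 533 x 300
print("vs iPhone (480x320):", (quad_w * quad_h) / (480 * 320))  # ~1.04x, basically the same
print("vs SD (640x480):", (quad_w * quad_h) / (640 * 480))      # ~0.52x, about half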
Yes, that's my main beef with FPS on consoles: the control stick. This is not a new complaint, but it is one that I'm going to emphasize because I hear 'A mouse isn't good for games' more than my tolerance level (one utterance per decade). If you use a mouse and get used to it, you will be a better FPS player. You can whip around in an instant, aim a single pixel to the side, or do anything in between. You are faster and more precise with your turning. Why do you think they have targeting systems and slight auto-aim in some console shooters? Why do you think PC versions of FPS are more fast-paced? Because everyone is a much better shot. Because in using your entire arm to give you 4 square FEET of potential workspace (realistically closer to 1 square foot) rather than just your thumb to give you 4 square INCHES of workspace, you're able to make much smaller or greater adjustments. Because in the grand scheme of things, being able to AIM down to the pixel is more important than being able to control how fast you walk down to the foot per second (although when I watch people play, they typically just walk at full speed 99% of the time anyway). And if you REALLY care about your movement speed that much, increasingly in games you're able to use a 360 controller, and some people use it just in their left hand, while using a mouse in their right (said people also sometimes use a Rockband drum pedal for crouching >_>).
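To put a number on the arm-versus-thumb comparison, using the same figures as above (4 square feet, or realistically about 1 square foot, against roughly 4 square inches of thumbstick travel--all rough guesses):

arm_generous = 4 * 144   # 4 square feet, in square inches
arm_realistic = 1 * 144  # the ~1 square foot of desk you actually use
thumb = 4                # ~4 square inches of thumbstick travel
print("generous ratio:", arm_generous / thumb)    # 144x the workspace
print("realistic ratio:", arm_realistic / thumb)  # still 36x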
If you don't want to or can't spend $1k on a gaming system, I'm not going to bite your head off, but if you try to otherwise defend FPS on consoles... well, may the best argument win.
[Games] Who here plays World at War?
Posted 16 years agoPC version, of course. If so, lemmie know so we can hook up and get some game on. I've been itching to do Nazi Zombies but I don't last more than round 8 or so by myself ;-;
For those of you who somehow don't know what it is and are too lazy to Wiki, it's a Call of Duty game back in the good ol' WWII era. I know what you're thinking. WWII? That's been beaten to death. Yeah well suck it, WaW is fun as hell and unlike modern games, the guns have FLAVOR to them. Marv says that modern cars all look like electric razors, I say that modern guns all feel like M16s. Back in the '40s, it wasn't about composites and 30 rounds a second, it was about metal shortages and mass production. And it was okay to have a gun that needed the bullets to be loaded one at a time :<
[Tech] Cool Windows 7 Tricks
Posted 16 years agoLet's share them.
I just discovered this one today:
* Middle Click or Shift-Click on an entry in the taskbar to bring up a new window for said program, when applicable. For web browsers it brings up a new browser window, for chat programs it brings up the window with your contacts list, for Explorer it brings up an Explorer window of your Libraries. Similarly, middle-click (Shift-Click doesn't work here) on a taskbar thumbnail of a window to kill that window. Bigger target than the red X button.
These others are generally documented but you still might not know about some of them:
* Got a small 16:9 display or just really value your screen real estate? You can use small icons in the taskbar to reduce the height of the taskbar to a mere 30 pixels.
* Prefer the classic taskbar look? Set the taskbar buttons to 'Combine when taskbar is full' or 'Never combine' to show labels for active programs, and unpin the default applications from the taskbar (right-click the entries to see the option).
* Windows Key + Left and Windows Key + Right will cause the currently active window to move and resize so that it fills the left or right half of the screen it is on. This is handy if you want to work with two windows but be able to see both at once. You can cycle through the different positions (Left half, standard window, Right half) by continuing to push the button to move the window in the desired direction. Also, if you have multiple monitors, this will cycle through into the other monitors (if you have a secondary monitor to the right of the primary, Windows Key + Right will move a window from being on the right half of the left monitor to the left half of the right monitor).
* Alternately, you can get the half-maximize effect by moving the window to the desired side until your mouse cursor hits the edge of the screen. You'll see an expanding animation, release the mouse to get your half-maximized window. This is difficult/impossible along some sides of multi-monitor setups, which is why the Windows Key + Left/Right is useful. When you exit these half-maximized states, just like when you exit maximized or minimized states, the window remembers its previous size and location.
* Windows Key + Up maximizes, Windows Key + Down minimizes.
* Grabbing the title bar for a window and shaking it will cause all other windows to minimize. Repeating the action will cause the other windows to go back to where they were. Handy if things are getting cluttered, I guess.
* If you right click on a taskbar entry (or, for the sake of touchscreen users, left-click and drag up) you'll get a potentially richer list of things to do with that application, not just Close. Applications which support these so-called Jump Lists offer support for things like "Open new tab" (for web browsers), "Go to store" (for iTunes), "Play all music" (for media players) and more. By default most programs that you open files in will give you a "Recent" list of recently opened documents. Application support for this will hopefully grow. Essentially it merges some features of the tray with the taskbar.
* If you want an image to be your background, make sure you make a JPEG version for that purpose (there's a quick conversion sketch right after this list). If you use anything except JPEG, Windows will make a crappy-quality JPEG version and use that instead. Don't ask me why.
* Thumbnails for a few programs (currently very few) will have controls below them. Windows Media Player 11 and iTunes 9 let you pause/play and skip to the previous or next track from three small buttons below the thumbnails.
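About that JPEG wallpaper tip: if you'd rather control the quality yourself than let Windows do the conversion, a minimal sketch with the Python Imaging Library (PIL) would look something like this (the filenames are made up, this assumes PIL is installed, and any image editor's "Save as JPEG" does the same job):

from PIL import Image

src = "my_wallpaper.png"  # hypothetical source image
dst = "my_wallpaper.jpg"  # JPEG copy to actually set as the background

img = Image.open(src).convert("RGB")  # JPEG has no alpha channel, so flatten to RGB
img.save(dst, "JPEG", quality=95)     # high quality; Windows' own re-encode is much lower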
That's all I can think of for now! As I said, please add any tricks that you've read about (and tested) or discovered on your own here. In a bit over a month this will be the new OS standard, so the sooner you learn these tricks, the sooner you'll start saving seconds here and there.
[RL] Bank Recommendations
Posted 16 years agoI need to move to a new bank, because my old one only has a branch up in NY where I no longer live. Anyone have recommendations/testimonials for/against any major banks? D=
[Tech] Your very own server
Posted 16 years agoThis journal was originally rather lengthy. In hopes of more people not Ctrl+W-ing the tab, I've cut through the fluff [;_;] to get to the meaty center.
At least once a month, somebody I'm watching posts a journal about a hard drive death taking a whole swath of their digital life down with it. Invariably, these people don't have their files backed up. All they can do is rage while they hear their hard drive clicking and laying waste to gigabytes of carefully organized music, years of personal logs, artwork which isn't available anywhere else, and whatever else they might have. The common solution offered by folks like
dragoneer is to get an external hard drive and back up your files regularly. And you know, that works fine if you've got a single computer. But if you're under the age of 18 or above the age of 22, and possibly even if you're in that age range, you've got multiple computers on your home network. What do you do? One external drive for each computer? One external drive for a computer that's always on? And what if you have files you want to share between those computers, too? Like a music collection, or video files, or personal documents?
Windows Home Server is a handy ~$100 OS which you can put on old desktops or new systems built specifically for the role. It allows you to pack a computer case to its proverbial gills with hard drives--be they old 80 gig ones you have sitting around, or massive 1500 gig ones purchased to maximize storage--and integrate that storage into your home network. You can make virtual folders which span multiple drives, letting you store terabytes of files in a single 'directory' even if you've only got 320 GB drives. These virtual folders can be accessible by any computer on your home network, or restricted so that only certain users can change the contents or even just read the contents. The server will perform regular (in my case, nightly) backups of all your systems to make sure you never lose more than a day's work, and it does it intelligently so duplicate files only get stored once--no need to keep around gigabytes of duplicate Windows files or MP3s that you've got copies of on each computer. Although while you're at it, why not dump all your MP3s into a single directory on the server, so you can access your music more easily? With a planned update for Home Server, it'll even integrate your files automatically into the Windows 7 libraries, so you don't have to care about where they are.
And if the stuff you're storing on your server is really important to you, personal files or carefully organized music you'd rather not lose, you can enable data redundancy for the virtual folders. The server keeps an extra copy of the files in those folders on more than one drive, so even if one of your hard drives does crash, any files you really care about which were on it exist somewhere else in the system and will be promptly copied over to other drives to again keep them protected. It's not RAID, where you decide up front to commit a certain amount of a certain number of drives to always duplicate data. It's a folder-level redundancy which lets you use two 500 GB drives for 1 TB of non-backed-up data (what you'd typically do), 500 GB of backed-up data (something else you could easily do), or any flexible combination of redundant and non-redundant storage. Starting out with 700 gigs of anime that you don't care too much about losing? No problem. Add in 3 GB of personal documents and music which you want to be backed up and protected from drive failure? All you have to do is click a checkbox. It's a degree of flexibility that you aren't going to get with an external hard drive, NAS system, or Apple's Time Capsule.
So whether you just want your files to automatically be backed up, or you want a comprehensive system with indexed searching across redundant multi-terabyte network shares, Windows Home Server has your networked storage covered. But it doesn't stop there. Indeed, some of the important benefits you'll get will be side-effects of just having the system set up. You don't have to wonder about where you should put your documents or media: put them on the server. If you're having a LAN party and everyone's going to install a particular game or patch, don't put it on DVDs or flash drives; dump it on the server in a public directory. The server becomes your general-purpose file-dump. And because you're keeping most of your bulky files off of your personal computer, you'll probably be fine with a single hard drive; even a 750 gig drive will store a hundred modern games. Backups are smaller and faster. Everything feels more connected and organized. You feel secure in knowing that everything is backed up and an inevitable drive failure will just mean you have a few hundred fewer gigs of storage space until the manufacturer sends a replacement. Knowing this, you might even do riskier things with your computer--putting a couple drives in RAID 0, for example, for dramatically faster hard drive speeds. Who cares if it's more likely to experience a crippling drive crash? You just pop in a new one and restore from your previous night's backup. Trust me. The security you feel from having all your stuff backed up, and the convenience you enjoy by having thousands of gigabytes of networked storage... they're well worth the cost of the OS.
I wrote this up (again, if you'll believe it, this is the shortened version) for several reasons, for those who are curious. First, two years back I had some journals discussing Windows Home Server. Now I have experience with it, and I'd like to share that experience with you. Second, a lot of people still don't have a backup solution, and this is a rather sexy one. Finally, lately I've been seeking out components for a mature, modern home network, both for myself and for some other people. In a recent journal I detailed how to turn Blu-ray discs into gorgeous 1080p rips in admittedly large 10-30 gig files. Even a pretty large 1 TB drive will only hold a few dozen, a hundred at most, before it's full up. So where do you store these files in a cost-effective manner? Your WHS box. Grab the most cost-effective or largest drives you can get (1500 GB are good right now) and toss them in for terabytes of space. Sepf and I have over 9000 GB right now, approximately half of the 18000 we could have if we upgraded to exclusively 1500 GB drives.
If you want to make a cheap but good WHS box, you will want (assuming you don't have other parts lying around):
Windows Home Server, $95
A cheap 35-watt CPU, $40
A cheap 6-SATA, 4-PCI-Express port motherboard, $80
2x512 MB DDR2 modules, $28
A generic case with 6 HDD spots, expandable to 12, with 500W power supply, $70
If you've got access to a PCI-Express graphics card and a DVD-ROM drive for about an hour, you can put them in the server for installation, and your grand total is about $300 for the hardware and OS. If not, grab the cheapest graphics card ($25) and the cheapest DVD drive ($20) you can find, and still remove them after installation (they will draw power, after all).
That hardware up there, with your hard drive of choice, will give you a functional system that can house up to 6 hard drives. After that point, you'll need a 3x5.25"-to-4x3.5" drive bay ($23) and a 4 SATA port controller card ($39) for each additional 4 drives you want.
The first case will max out at 12 hard drives, but there's nothing stopping you from getting a second case and power supply, and filling that one up with hard drives as well. Connect the drives to the power supply in the second case, and use long SATA or eSATA cables to connect the drives to the server--simple as that. The practical limit of drives you can physically connect to the motherboard is 6 onboard + 3x4 (PCI-based) + 3x2 (PCI-Express-based) = 24, which is conveniently as many as you can fit in two full towers. Really, if you need any more than that, I think you're beyond the scope of this guide =P
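If you want to sanity-check those numbers (or plug in your own prices and drive sizes), here's a tiny calculator. The figures are just the rough prices from the parts list above, nothing official:

```python
# Rough cost and capacity math for the WHS build described above.
parts = {
    "Windows Home Server": 95,
    "35-watt CPU": 40,
    "6-SATA motherboard": 80,
    "2x512 MB DDR2": 28,
    "Case + 500 W PSU": 70,
}
print("Base build: $%d" % sum(parts.values()))          # ~$313, before drives

# Drive ceiling as described: 6 onboard SATA ports, plus 4 more per PCI
# controller card (3 slots) and 2 more per PCI-Express card (3 slots).
max_drives = 6 + 3 * 4 + 3 * 2                          # = 24
print("Max drives:", max_drives)
print("At 1.5 TB each: %.1f TB" % (max_drives * 1.5))   # 36 TB
```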
[Misc] Filler journal
Posted 16 years agoUntil I have something less massive to occupy my front page, this journal shall serve to keep the page length down.
On the subject of filler, what is your favorite type of pie?
[Tech] Twile's Ultimate Look at Apple Hating
Posted 16 years agoThere are many things people shout back and forth when it comes to Mac vs. PC. "Macs are too expensive!" "Well what's the point of buying a computer if it gets a virus and stops working? Plus Windows sucks!" "Macs don't have any games, though!" People argue until their tongues or fingers fall off, but of course, pretty much everything that is talked about has another side to it, so their arguments are kinda moot points to me.
I do not hate Apple for the conventional reasons--in fact, I like a lot of the designs they implement, and understand their pricing. And I don't like Microsoft for making 'great' software; in reality, it pisses me off on a regular basis. No, I choose to support or hate these companies based on how their competitive practices will impact the computer and electronics markets.
I originally wrote this because, about a month ago, during the span of several days I was confronted on multiple occasions about why it was that I have so much spite for Apple. It is my hope that this will be the last time for a good long while that I'll need to write more than a paragraph about why I disapprove of Apple and anyone who supports the company financially. I have written this in the format of myself addressing a logically-minded individual who is curious about why I'm so polarized against Apple--something that I never get from actual Apple fans. The whole thing is over 6 pages and 3600 words, and although I wish you'd all read it, you can get the core of my complaint by reading Parts A, B and G (the very last one). Alternately, only read the bolded sections. It's like a mini essay!
Special note: Do not post any ignorant "____ sucks"-type comments to this journal. Even though you're mostly just here to see furry dragon porn, I'm still going to have to ask you to be respectful and thoughtful in my little corner of this website.
Part A: Summary
-- Twile, why do you dislike Apple so much?
I dislike Apple because they are an anti-competitive force in the technology market.
-- What do you mean by that? Isn't Microsoft like that as well? Why do you support them?
Apple uses business practices and strong software-hardware ties to create an anti-competitive segment of the market. Microsoft is only concerned with selling you their OS, because Zune and Xbox aside, they don't sell any hardware (I'll get back to those devices later). Simply put, both Microsoft and Apple want people to use their computer OS, and their OS only. However, only Apple is trying to push their hardware on you as well.
Part B: Elaboration
-- Why is that a bad thing? Aren't other companies, like Dell or HP, also trying to get you to buy their hardware?
Yes, they are. But importantly, they all use the same OS. See, right now I have an ASUS laptop, and I'm not very happy with its build quality. When I get another laptop, I can choose to get one which is not by ASUS. I can make that migration as easily as changing toilet paper brands, because I know all my software will still work on my next computer, since it will also run Windows. That's not the case with Apple products. If I buy a Mac once, most of the software I get used to using, or even purchase, will only run on a Mac. So if, for any reason, I decide I want a different brand of computer in the future, I have to give up all the stuff I was using.
-- Why does that really matter, though? I mean, Apple makes really nice hardware.
They do indeed. However, they don't provide all the sorts of things that you get in the PC world. Say I really get into gaming and I want something with more oomph than the Radeon 4850 in the iMac. You know, I want to play games on a 30" monitor, or a trio of 22" displays, and the iMac just doesn't have the power to push out the pixels I want. What do I do then? Apple doesn't make products for serious gamers. Or what if I want something with a carbon fiber or magnesium alloy shell, or a 1080p laptop with a swappable battery? I can choose between picking a PC that uses those and dumping all my previous software, or sighing and sticking with Apple until they hopefully make something more to my liking. Furthermore, as long as Apple is gaining marketshare from customers making the switch and very few people are able to switch back (for the reasons just stated), they will inevitably become a very large market force. In a worst-case scenario, they would become the dominant computer and OS manufacturer.
-- That doesn't sound like a worst-case scenario to me, really. Everyone's computers would just work. Viruses would be totally obliterated. Everyone would be able to benefit from the latest technologies.
Yeah, and computer technicians and hardware designers would lose their jobs, new sorts of malware would be developed, and people who couldn't afford new Macs would be stuck on hand-me-downs, right? No, in all seriousness, it would be a bad situation. You see, Apple is able to provide some of the finest hardware because of the PC industry.
-- What do you mean?
The PC industry works as follows. For every major computer part--CPU, GPU, hard drive, motherboard, etc--there are at least two manufacturers. They compete to make the best products which store the most, compute the fastest, take the least power, put out the least heat, take up the least space, and cost the least. Intel puts out their Pentium 4, AMD puts out their Athlon64. Intel puts out their Core and Core 2, AMD puts out the Phenom. Intel puts out the i7... and so on and so forth. PC manufacturers can pick and choose freely between these parts to make systems that they think will best suit the needs of their customers--be they small, low-power systems or massive computing powerhouses. It's never really the case that all of a company's products are worse in every respect than their competitors', so they can still sell things to finance the R&D needed to try and take back the markets they aren't doing well in.
-- Yes yes, that's all very straightforward and generic. What does this have to do with Apple?
Apple does mostly what other companies do. They pick the best parts available to them, and put those in their systems. Now, I shouldn't have to tell you that lack of competition can make companies lazy--especially hardware companies. If they have no competition, they don't even have to make products which are significantly better, they can just wait for their previous ones to slowly die off and need replacing. Of course, if you make a processor that can last >5 years, putting out faster ones every few years to convince people to buy new systems will earn you more money... but still. Less competition means less drive to innovate. It also means less favorable pricing. There are reasons why we have laws to prevent monopolies.
-- I still don't see how this reflects badly on Apple.
Well, if Apple were the only major hardware provider, what would happen to the companies that didn't provide the absolute best products? What would happen to AMD, who makes approximately a third of the CPUs in gaming computers? Apple decided a few years ago that they were going to switch to Intel CPUs, because the upcoming Core series was just such a good thing. If there weren't other big computer manufacturers, AMD wouldn't be able to keep selling desktop CPUs. They would die, and without their competition, Intel wouldn't innovate to its full potential. Imagine that happening to every part in the computer.
-- Oh. That would be bad.
Yeah. And it's worse for Apple than any other hardware provider. See, most companies offer a little bit from this guy and a little bit from that guy, in different models. They provide Intel and AMD chips. Radeon and GeForce graphics parts. Toshiba and Seagate hard drives. After all, the parts are either generic enough to be interchangeable, or they have their own drivers anyway. But Apple, for the sake of their unified ecosystem, tends to pick one provider whenever possible. And they stick with that provider, for all of their hardware, for as long as that hardware is on top. They optimize their software and their drivers for it. After all, at the end of the day, the big difference between the hardware in a Mac (or game console) and a PC is that Macs have consistent hardware and optimized support for it, while PCs have a lot of hardware variety and much less optimization.
Part C: The Apple Difference
-- So it's harder for them to switch their system components.
Yes. And that can bite them in the ass. Look back about 5 years. Apple had been using PowerPC chips, from IBM, for a decade. However, IBM was unable to provide chips that would fit the needs of the PowerBooks. I think they put out too much heat and needed too much power to be faster than the ones they were using. Anyway, the situation was so crappy, and the Core series was so promising, that Apple switched CPU brands (a switch which was no small undertaking). Imagine if PCs had died off in the 1990s, and Intel wasn't around to provide the Core series, because Apple was buying PowerPC chips. Or if AMD had died off and Intel didn't need to provide the Core series because they didn't have to address the Athlon64's growing popularity. Apple, and all its customers (us) would've been stuck with the PowerPC chips. The entire computer industry would've had to wait for IBM to get their shit together. That sort of stuff doesn't happen when you have multiple companies competing. And that competition wouldn't happen if Apple were the only hardware provider.
-- Alright. I get it. Having Apple as the computer company would lead to the market stagnating. But it's not like they can do anything about that. You don't get pissed off at Dell because if all other companies died off, they might only buy hardware from the best providers.
Well, there's a difference, and there is something Apple could do. As I said before, Dell and other companies use Windows. Apple and only Apple uses OS X.
-- But Macs can have Windows installed on them.
That they can. However, most people buy Macs for the whole experience, not just to wipe them and put Windows on. Only luxury-hardware-obsessed Windows users do that. Besides, if you already spent over 2 grand on a laptop, do you really want to shell out $100+ for another OS which you've heard is slower and less secure? Probably not.
-- What do you expect Apple to do about that? Stop selling Mac OS?
No, not at all. In fact, quite the opposite. They need to sell Mac OS even more. See, when Microsoft sells an OS, they let anyone install it on any hardware. When Apple sells an OS, you're not allowed to put it on a non-Mac. Even though there are non-Macs which can actually run Mac OS (known as Hackintoshes,
kipfox has one), it's a violation of the license to put Mac OS on them. That would be why nobody sells PCs with OS X on them. It's not because nobody would buy them--hell, people will buy PCs with wood panelling on them (no joke). It's because they would get their asses sued.
Part D: The Solution
-- So you want Apple to let anyone install OS X on any computer?
Precisely. If they sold OEM copies (OEM = Original Equipment Manufacturer... companies like Dell), other companies would be able to purchase them and sell OS X systems. You could see a Mac laptop retailing for $500, or a Mac desktop retailing for $300--half what Apple charges for entry-level stuff. If you're a light computer user on a budget, that could be a godsend. And companies could make actual dedicated gaming Macs, systems closer to the Mac Pro than the iMac in form factor, which can take one or two user-removable graphics cards and handle a quad core CPU for easily $1000 cheaper than Apple will sell a Pro to you for.
-- So... you don't like Apple because they won't let you use OS X without a Mac.
As silly as that may sound, yes. Although I have no particular desire to use OS X, I would much prefer that a market shift in that direction not mean Apple gaining a monopoly over the computer hardware market.
-- One of the problems that PCs have with reliability is that there are a lot of components out there which each have their own advantages and disadvantages, quirks and driver bugs, random incompatibilities with each other. If Apple allowed what you suggest, this might bring that problem over to the "Macintosh" realm.
That is true, and it's quite possible that OS X systems might end up having some of the same sorts of issues that Windows systems have today. However, assuming that OS X is as finely-engineered as Apple claims it to be, and as advanced and efficient as their press releases trumpet, I'd think that OS X wouldn't be particularly worse than Windows when it comes to avoiding such problems--unless working with their own tightly-controlled hardware specs has made them lazy about flexibility. In any case, Apple could still sell their Macintosh brand computers, for people who want to be as certain as possible that they'll have a finely-tuned computing experience. For people willing to pay the price premium, they would offer the assurance that every square millimeter of the hardware was pored over by the same people who work on the OS and software, to make sure it all fits together well.
-- So in essence, you're proposing that Apple allow itself to take on the role of Microsoft, with all the benefits and flaws their approach has.
Yes. See, there are currently three possible ways the OS market can go. Mac OS can die, and Windows will become the OS of choice for home and work computers. Windows can die and Mac OS can become the champ. Or they can remain locked in a bitter war which forces consumers to choose a side and invest money in that side until annoyances with their computer push them over to the other side. The last option, frankly, is the worst. Resources are wasted on advertising and re-developing the same features on two incompatible platforms--or intellectual property laws and pride keep genuinely good features from making it to the competing platform, so you get some good ideas on Macs and some good ones on PCs. In some cases the wheel has to be reinvented, and sometimes the new wheel is square because the round one is too close to the original patent. One platform really needs to become the victor for consumers to win. If Microsoft wins with their current strategy, the hardware market will continue to thrive and compete as it does now. If Apple wins with their current strategy, well, as I already described, it will spell doom for the productive market we know and love. Having one possible okay outcome is nice, having two possible okay outcomes is better. So yes, Apple really needs to take Microsoft's approach to OS licensing. It will benefit everyone.
Part E: Implications
-- Apple will miss out on hardware sales, and the customers it loses will end up with inferior hardware that's a nuisance for them. How does that help anyone?
Apple may sell fewer units if cheaper systems can provide a comparable experience, but for every system they sell, they'll bag $100 marginal profit from the OS license. And if they're able to sway the >1 billion Windows users onto their platform as a result, winning a lifetime contract for $100 billion from new OS sales every 3-5 years when the whole market upgrades, that could be a pretty big win. Besides, with people migrated to the Mac OS platform, it would be much easier for customers to make the switch to Apple hardware, under the assurances of top-notch construction, 8+ hour batteries, green manufacturing, and aesthetic perfection. Pushing that stuff is a bit harder when it means every phrase, program, and icon that a user is familiar with is going to be turned upside down because they have to switch OS as well.
And customers would have huge wins. More selection can only be good here. For those who just want Apple hardware, the increased hardware competition might drive prices down. For those who want cheaper hardware, it would suddenly be a viable option to have a new OS X-based system, with display and keyboard and all, for under $500. For those who want higher-end hardware, like gamers, that would suddenly be available too. Today, quad-core chips have been affordably priced for well over a year. Indeed, you can get a quad-core i7 processor for under $300. That's better than a Core 2 Quad, which is twice as good as the Core 2 Duo which is in the fastest consumer Mac desktops. A shift like this would allow OS X to be a viable platform for gaming and computationally-intensive work, as well as selling a system to every Mom and Pop.
-- This still strikes me as something that would weaken the integrity and tightly-woven nature of the Mac ecosystem.
It wouldn't necessarily make everything absolutely better in every way. Just the fact that customers using OS X would have more to choose than Portable/Desktop, Cheap/Average/Expensive, and 250 or 500 GB, could be viewed as a disadvantage. But if this doesn't happen, and Apple does win the software war, it'll cripple the hardware market. Likely, people would see how unreasonably anticompetitive it is, and Apple would get its ass sued for monopolistic business practices from their restrictive licensing rules. Whether that would happen before or after the hardware market got fucked over, I couldn't guess.
Part F: Beyond Mac
-- So, you're opposed to Macs. But Apple doesn't just sell Macs. They also sell iPods, Apple TVs, iPhones, movie rentals, etc. What about the iPod? Where's the lock-in with that? iTunes music is DRM free.
It's DRM free, now. After Amazon and other services already provided sub-dollar pricing for high-quality MP3s without DRM. Yes, you can take your iTunes-purchased music and put it on other devices. But you can't do that with your movies, TV shows, music videos, or apps.
-- Now you're just being unfair, and holding Apple to a higher standard than other companies. You can't play your Nintendo DS games on other systems, or your PSP games on other devices, or your mobile phone games on other phones. Game consoles in general are like that, if you decide you're through with Sony, you can't play your PS3 games on other systems. Why should you expect that sort of stuff from Apple?
I don't approve of closed hardware systems like that in general. That includes the Xbox and the Zune, Microsoft's babies. And they do make a game platform with open hardware specs--the PC. They also make movie ownership and rental systems which can be played on products from dozens of companies--the DVD. And mobile phones don't let you migrate your applications from one device to another? That's why I support OSes which bridge more than one phone, such as Windows Mobile.
Don't get me wrong. These choices aren't always the most convenient. PCs have many more things that can go wrong (or so very right!) than game consoles. Windows Mobile is much less tightly integrated than the Palm Pre or the iPhone. DVDs, whether they're viewed normally or ripped to a computer, are much less convenient than end-to-end digital distribution. This isn't just a matter of picking the most convenient and elegant solution, for me. As a responsible consumer, I feel obligated to support the solution which allows for the most competition and the least lock-in. That is the solution which is the most viable in the long term.
-- Alright. You've explained why you are opposed to the support of platforms like the Mac and iPhone. But you always seem to take things so personally. Why is that?
Well, to clarify, I'm opposed to the support of pretty much anything Apple until they significantly clean up their act. Even buying one of their monitors, or an iPod just to listen to music you already own, gives them money--the fuel of any company. But the reason I take it so personally and get so riled up about all this stuff is that, well, the scenarios I envision 5 or 10 or 15 years down the road are things which will affect me. If Apple becomes the dominant provider of computer hardware and OSes, and PCs as we know them dwindle into obscurity, or if everyone trades in their phone for an iPhone tomorrow, that will slash their competition and give me the painful option of entering their proprietary hardware ecosystem, or buying unpopular, poorly-supported fringe devices.
This isn't somebody buying a Dell laptop, and maybe moving to an HP in 3 years if they see another model they like. This is people buying into a hardware trap that few people leave, whether it's because they don't want to or can't. It's like people voting to change the government to a Monarchy, so they don't have to worry about voting in the future. If that Monarchy ends up coming to power and I have to bow down before it, I'll be pissed the fuck off.
Don't get me wrong. There are many days that go by when I say to myself, "For my next laptop I really want something like the Macbook Pro, though maybe with a few little tweaks here and there. Still, the part selection and aesthetics are top-notch and I can't find anything which is all-around better." However, for reasons I've just described, I can't buy a Macbook Pro and put Windows on it, even if that would give me the sort of computing experience I crave.
Part G: Wrapup
-- Wow. Just, wow. You're a closet Mac-lover so paranoid that you'll lose your freedom to choose how many graphics cards you have 10 years down the road, that your very moral fibers have been bent to see Apple-supporting as evil. You're so worried about harming the computer market that you won't even buy stuff that you want on some level.
And you guys think you have it bad hearing me talk about this a few times a year; I get to listen to me talk about it a few times a day.
Think about it this way. It might be more immediately gratifying to just eat junk food every day, but it can have terrible health considerations later on in life, so you don't do it. It might seem like a good idea at the time to steal something, or try to run from the cops to avoid a speeding ticket, but we're blessed with forethought so we don't do those things either. This is just applying critical thinking and planning to something else I care about. I care a lot about computers. So even though I might be able to get a laptop which is a little tidier and a little thinner from Apple today, I'd prefer to have more than just one hardware provider when I'm 30, so I don't support Apple.
I do not hate Apple for the conventional reasons--in fact, I like a lot of the designs they implement, and understand their pricing. And I don't like Microsoft for making 'great' software, in reality it pisses me off on a regular basis. No, I choose to support or hate these companies based on how their competitive practices will impact the computer and electronics markets.
I originally wrote this because, about a month ago, during the span of several days I was confronted on multiple occasions about why it was that I have so much spite for Apple. It is my hope that this will be the last time for a good long while that I'll need to write more than a paragraph about why I disapprove of Apple and anyone who supports the company financially. I have written this in the format of myself addressing a logically-minded individual who is curious about why I'm so polarized against Apple--something that I never get from actual Apple fans. The whole thing is over 6 pages and 3600 words, and although I wish you'd all read it, you can get the core of my complaint by reading Parts A, B and G (the very last one). Alternately, only read the bolded sections. It's like a mini essay!
Special note: Do not post any ignorant "____ sucks"-type comments to this journal. Even though you're mostly just here to see furry dragon porn, I'm still going to have to ask you to be respectful and thoughtful in my little corner of this website.
Part A: Summary
-- Twile, why do you dislike Apple so much?
I dislike Apple because they are an anti-competitive force in the technology market.
-- What do you mean by that? Isn't Microsoft like that as well? Why do you support them?
Apple uses business practices and strong software-hardware ties to create an anti-competitive segment of the market. Microsoft is only concerned with selling you their OS, because Zune and Xbox aside, they don't sell any hardware (I'll get back to those devices later). Simply put, both Microsoft and Apple want people to use their computer OS, and their OS only. However, only Apple is trying to push their hardware on you as well.
Part B: Elaboration
-- Why is that a bad thing? Aren't other companies, like Dell or HP, also trying to get you to buy their hardware?
Yes, they are. But importantly, they all use the same OS. See, right now I have an ASUS laptop, and I'm not very happy with the build quality of it. When I get another laptop, I can choose to get one which is not by ASUS. I can make that migration as easily as changing toilet paper brands, because I know all my software will still work on my next computer, because it will also run Windows. That's not the case with Apple products. If I buy a Mac once, most of the software I get used to using, or even purchase, will only run on a Mac. So if, for any reason, I decide I want a different brand of computer in the future, I have to give up all the stuff I was using.
-- Why does that really matter, though? I mean, Apple makes really nice hardware.
They do indeed. However, they don't provide all the sorts of things that you get in the PC world. Say I really get into gaming and I want something with more oomph than the Radeon 4850 in the iMac. You know, I want to play games on a 30" monitor, or a trio of 22" displays, and the iMac just doesn't have the power to push out the pixels I want. What do I do then? Apple doesn't make products for serious gamers. Or what if I want something with a carbon fiber or magnesium alloy shell, or a 1080p laptop with a swappable battery? I can choose between picking a PC that uses those and dumping all my previous software, or sighing and sticking with Apple until they hopefully make something more to my liking. Furthermore, as long as Apple is gaining marketshare from customers making the switch and very few people are able to switch back (for the reasons just stated), they will inevitably become a very large market force. In a worst-case scenario, they would become the dominant computer and OS manufacturer.
-- That doesn't sound like a worst-case scenario to me, really. Everyone's computers would just work. Viruses would be totally obliterated. Everyone would be able to benefit from the latest technologies.
Yeah, and computer technicians and hardware designers would lose their jobs, new sorts of malware would be developed, and people who couldn't afford new Macs would be stuck on hand-me-downs, right? No, in all seriousness, it would be a bad situation. You see, Apple is able to provide some of the finest hardware because of the PC industry.
-- What do you mean?
The PC industry works as follows. For every major computer part--CPU, GPU, hard drive, motherboard, etc--there are at least two manufacturers. They compete to make the best products which store the most, compute the fastest, take the least power, put out the least heat, take up the least space, and cost the least. Intel puts out their Pentium 4, AMD puts out their Athlon64. Intel puts out their Core and Core 2, AMD puts out the Phenom. Intel puts out the i7... and so on and so forth. PC manufacturers can pick and choose freely between these parts to make systems that they think will best suit the needs of their customers--be they small, low-power systems or massive computing powerhouses. It's never really the case that all of a company's products are worse in every respect to their competitors, so they can still sell things to finance R&D needed to try and take back the markets they aren't doing well in.
-- Yes yes, that's all very straightforward and generic. What does this have to do with Apple?
Apple does mostly what other companies do. They pick the best parts available to them, and put those in their systems. Now, I shouldn't have to tell you that lack of competition can make companies lazy--especially hardware companies. If they have no competition, they don't even have to make products which are significantly better, they can just wait for their previous ones to slowly die off and need replacing. Of course, if you make a processor that can last >5 years, putting out faster ones every few years to convince people to buy new systems will earn you more money... but still. Less competition means less drive to innovate. It also means less favorable pricing. There are reasons why we have laws to prevent monopolies.
-- I still don't see how this reflects badly on Apple.
Well, if Apple were the only major hardware provider, what would happen to the companies that didn't provide the absolute best products? What would happen to AMD, who makes approximately a third of the CPUs in gaming computers? Apple decided a few years ago that they were going to switch to Intel CPUs, because the upcoming Core series was just such a good thing. If there weren't other big computer manufacturers, AMD wouldn't be able to keep selling desktop CPUs. They would die, and without their competition, Intel wouldn't innovate to its full potential. Imagine that happening to every part in the computer.
-- Oh. That would be bad.
Yeah. And it's worse for Apple than any other hardware provider. See, most companies offer a little bit from this guy and a little bit from that guy, in different models. They provide Intel and AMD chips. Radeon and GeForce graphics parts. Toshiba and Seagate hard drives. After all, the parts are either generic enought to be interchangable, or they have their own drivers anyway. But Apple, for the sake of their unified ecosystem, tends to pick one provider whenever possible. And they stick with that provider, for all of their hardware, for as long as that hardware is on top. They optimize their software and their drivers for it. After all, at the end of the day, the big difference between the hardware in a Mac (or game console) and a PC is that Macs have consistent hardware and optimized support for it, while PCs have a lot of hardware variety and much less optimization.
Part C: The Apple Difference
-- So it's harder for them to switch their system components.
Yes. And that can bite them in the ass. Look back about 5 years. Apple had been using PowerPC chips, from IBM, for a decade. However, IBM was unable to provide chips that would fit the needs of the PowerBooks. I think they put out too much heat and needed too much power to be faster than the ones they were using. Anyway, the situation was so crappy, and the Core series was so promising, that Apple switched CPU brands (a switch which was no small undertaking). Imagine if PCs had died off in the 1990s, and Intel wasn't around to provide the Core series, because Apple was buying PowerPC chips. Or if AMD had died off and Intel didn't need to provide the Core series because they didn't have to address the Athlon64's growing popularity. Apple, and all its customers (us) would've been stuck with the PowerPC chips. The entire computer industry would've had to wait for IBM to get their shit together. That sort of stuff doesn't happen when you have multiple companies competing. And that competition wouldn't happen if Apple were the only hardware provider.
-- Alright. I get it. Having Apple as the computer company would lead to the market stagnating. But it's not like they can do anything about that. You don't get pissed off at Dell because if all other companies died off, they might only buy hardware from the best providers.
Well, there's a difference, and there is something Apple could do. As I said before, Dell and other companies use Windows. Apple and only Apple uses OS X.
-- But Macs can have Windows installed on them.
That they can. However, most people buy Macs for the whole experience, not just to wipe them and put Windows on. Only luxury-hardware-obsessed Windows users do that. Besides, if you already spent over 2 grand on a laptop, do you really want to shell out $100+ for another OS which you've heard is slower and less secure? Probably not.
-- What do you expect Apple to do about that? Stop selling Mac OS?
No, not at all. In fact, quite the opposite. They need to sell Mac OS even more. See, when Microsoft sells an OS, they let anyone install it on any hardware. When Apple sells an OS, you're not allowed to put it on a non-Mac. Even though there are non-Macs which can actually run Mac OS (known as Hackintoshes,
kipfox has one), it's a violation of the license to put Mac OS on them. That would be why nobody sells PCs with OS X on them. It's not because nobody would buy them--hell, people will buy PCs with wood panelling on them (no joke). It's because they would get their asses sued.Part D: The Solution
-- So you want Apple to let anyone install OS X on any computer?
Precisely. If they sold OEM copies (OEM = Original Equipment Manufacturer... companies like Dell), other companies would be able to purchase them and sell OS X systems. You could see a Mac laptop retailing for $500, or a Mac desktop retailing for $300--half what Apple charges for entry-level stuff. If you're a light computer user on a budget, that could be a godsend. And companies could make actual dedicated gaming Macs, systems closer to the Mac Pro than the iMac in form factor, which can take one or two user-removable graphics cards and handle a quad core CPU for easily $1000 cheaper than Apple will sell a Pro to you for.
-- So... you don't like Apple because they won't let you use OS X without a Mac.
As silly as that may sound, yes. Although I have no particular desire to use OS X, I would much prefer it for a market shift in that direction to not mean that Apple gains a monopoly over the computer hardware market.
-- One of the problems that PCs have with reliability is that there are a lot of components out there which each have their own advantages and disadvantages, quirks and driver bugs, random incompatibilities with each other. If Apple allowed what you suggest, this might bring that problem over to the "Macintosh" realm.
That is true, and it's quite possible that OS X systems might end up having some of the same sorts of issues that Windows systems have today. However, assuming that OS X is as finely-engineered as Apple claims it to be, and as advanced and efficient as their press releases trumpet, I'd think that OS X wouldn't particularly be worse than Windows when it comes to avoiding such problems--unless working with their own tightly-controlled hardware specs has made them lazy with flexibility. In any case, Apple could still sell their Macintosh brand computers, for people who want to be as certain as possible that they'll have a finely-tuned computing experience. For people willing to pay the price premium, they would offer the assurance that every square millimeter of the hardware was poured over by the same people who work on the OS and software, to make sure it all fits together well.
-- So in essence, you're proposing that Apple allow itself to take on the role of Microsoft, with all the benefits and flaws their approach has.
Yes. See, there are currently three possible ways the OS market can go. Mac OS can die, and Windows will become the OS of choice for home and work computers. Windows can die and Mac OS can become the champ. Or they can remain locked in a bitter war which forces consumers to choose a side and invest money in that side until annoyances with their computer push them over to the other side. The last option, frankly, is the worst. Resources are wasted on advertising and re-developing the same features on two incompatible platforms--or intellectual property laws and pride keep genuinely good features from making it to the competing platform, so you get some good ideas on Macs and some good ones on PCs. In some cases the wheel has to be reinvented, and sometimes the new wheel is square because the round one is too close to the original patent. One platform really needs to become the victor for consumers to win. If Microsoft does with their current strategy, the hardware market will continue to thrive and compete as it does now. If Apple does with their current strategy, well, as I already described it will spell doom for the productive market we know and love. Having one possible okay outcome is nice, having two possible okay outcomes is better. So yes, Apple really needs to take Microsoft's approach to OS licensing. It will benefit everyone.
Part E: Implications
-- Apple will miss out on hardware sales, and those missed sales will have inferior hardware that is a nuisance for customers. How does that help anyone?
Apple may sell fewer units if cheaper systems can provide a comparable experience, but for every OS X license that ships on somebody else's system, they'll bag roughly $100 of marginal profit. And if they're able to sway the more than one billion Windows users onto their platform as a result, that's on the order of $100 billion in new OS sales every 3-5 years as the whole market upgrades. That could be a pretty big win. Besides, once people have migrated to the Mac OS platform, it would be much easier for them to make the switch to Apple hardware later, under the assurances of top-notch construction, 8+ hour batteries, green manufacturing, and aesthetic perfection. Pushing that stuff is a lot harder when it means every phrase, program, and icon a user is familiar with gets turned upside down because they have to switch OS as well.
And customers would have huge wins. More selection can only be good here. For those who just want Apple hardware, the increased hardware competition might drive prices down. For those who want cheaper hardware, it would suddenly be viable to get a new OS X-based system, display and keyboard and all, for under $500. For those who want higher-end hardware, like gamers, that would suddenly be available too. Quad-core chips have been affordably priced for well over a year now; you can get a quad-core i7 for under $300. That outperforms a Core 2 Quad, which in turn has twice the cores of the Core 2 Duo found in the fastest consumer Mac desktops. A shift like this would make OS X a viable platform for gaming and computationally-intensive work, not just a system for every Mom and Pop.
-- This still strikes me as something that would weaken the integrity and tightly-woven nature of the Mac ecosystem.
It wouldn't necessarily make everything absolutely better in every way. Just the fact that customers using OS X would have more to choose from than Portable/Desktop, Cheap/Average/Expensive, and 250 or 500 GB could be viewed as a disadvantage. But if this doesn't happen, and Apple does win the software war, it'll cripple the hardware market. Likely, people would see how unreasonably anticompetitive it is, and Apple would get its ass sued for monopolistic business practices over their restrictive licensing rules. Whether that would happen before or after the hardware market got fucked over, I couldn't guess.
Part F: Beyond Mac
-- So, you're opposed to Macs. But Apple doesn't just sell Macs. They also sell iPods, Apple TVs, iPhones, movie rentals, etc. What about the iPod? Where's the lock-in with that? iTunes music is DRM free.
It's DRM-free now. But only after Amazon and other services were already selling high-quality, DRM-free MP3s for under a dollar. Yes, you can take your iTunes-purchased music and put it on other devices. But you can't do that with your movies, TV shows, music videos, or apps.
-- Now you're just being unfair, and holding Apple to a higher standard than other companies. You can't play your Nintendo DS games on other systems, or your PSP games on other devices, or your mobile phone games on other phones. Game consoles in general are like that: if you decide you're through with Sony, you can't play your PS3 games on other systems. Why should you expect that sort of stuff from Apple?
I don't approve of closed hardware systems like that in general. That includes the Xbox and the Zune, Microsoft's babies. And they do make a game platform with open hardware specs--the PC. They also make movie ownership and rental systems which can be played on products from dozens of companies--the DVD. As for mobile phones not letting you migrate your applications from one device to another: that's exactly why I support OSes which span phones from more than one manufacturer, such as Windows Mobile.
Don't get me wrong. These choices aren't always the most convenient. PCs have many more things that can go wrong (or so very right!) than game consoles. Windows Mobile is much less tightly integrated than the Palm Pre or the iPhone. DVDs, whether they're viewed normally or ripped to a computer, are much less convenient than end-to-end digital distribution. This isn't just a matter of picking the most convenient and elegant solution, for me. As a responsible consumer, I feel obligated to support the solution which allows for the most competition and the least lock-in. That is the solution which is the most viable in the long term.
-- Alright. You've explained why you are opposed to the support of platforms like the Mac and iPhone. But you always seem to take things so personally. Why is that?
Well, to clarify, I'm opposed to the support of pretty much anything Apple until they significantly clean up their act. Even buying one of their monitors, or an iPod just to listen to music you already own, gives them money--the fuel of any company. But the reason I take it so personally and get so riled up about all this stuff is that, well, the scenarios I envision 5 or 10 or 15 years down the road are things which will affect me. If Apple becomes the dominant provider of computer hardware and OSes, and PCs as we know them dwindle into obscurity, or if everyone trades in their phone for an iPhone tomorrow, that will slash their competition and give me the painful option of entering their proprietary hardware ecosystem, or buying unpopular, poorly-supported fringe devices.
This isn't somebody buying a Dell laptop, and maybe moving to an HP in 3 years if they see another model they like. This is people buying into a hardware trap that few people leave, whether it's because they don't want to or can't. It's like people voting to change the government to a Monarchy, so they don't have to worry about voting in the future. If that Monarchy ends up coming to power and I have to bow down before it, I'll be pissed the fuck off.
Don't get me wrong. There are many days that go by when I say to myself, "For my next laptop I really want something like the Macbook Pro, though maybe with a few little tweaks here and there. Still, the part selection and aesthetics are top-notch and I can't find anything which is all-around better." However, for reasons I've just described, I can't buy a Macbook Pro and put Windows on it, even if that would give me the sort of computing experience I crave.
Part G: Wrapup
-- Wow. Just, wow. You're a closet Mac-lover so paranoid that you'll lose your freedom to choose how many graphics cards you have 10 years down the road, that your very moral fibers have been bent to see Apple-supporting as evil. You're so worried about harming the computer market that you won't even buy stuff that you want on some level.
And you guys think you have it bad hearing me talk about this a few times a year; I get to listen to me talk about it a few times a day.
Think about it this way. It might be more immediately gratifying to just eat junk food every day, but it can have terrible health consequences later in life, so you don't do it. It might seem like a good idea at the time to steal something, or try to run from the cops to avoid a speeding ticket, but we're blessed with forethought so we don't do those things either. This is just applying critical thinking and planning to something else I care about. I care a lot about computers. So even though I might be able to get a laptop which is a little tidier and a little thinner from Apple today, I'd prefer to have more than just one hardware provider when I'm 30, so I don't support Apple.
[FA] Furry Dragon Roundup!
Posted 16 years agoRespond here with links to all the furry dragon characters you know of on FA (other than me, of course). I'll arrange them chromatically in the journal and see if we can't get a rainbow of dragon fuzz.
Notes:
1) Furry, in this case, means fur/fuzz/fluff/soft hair-based pelt on the majority of their bodies. Not that they're in the furry fandom.
2) Manes alone aren't enough, they're like extended hair!
Reds
twile
dragonorca
shirou14
amyth
Oranges
goddy
kokuhane
aellynh
Yellows
sashaws
zementh
Greens
Lucca_manadragon
soul4hdwn
fanur
Blues
fredrik
dr4g0nl0v3r
losian
rwwriter
draggy
sly
yuumei
nacht's NightRavenger
kentam
Purples
sovy
Whites
kyu
furrydrake
maliksr
sang
shiranes
mansonsdragon
busterdrag
dragonicar
the-thunderous-silver-one
manojalpa
Gray/silver
rex
sidian
talakestreal
tyrantdragon0
falindelstan
Blacks (that sounds so bad...)
zeocin
erete
ladysin
dragonkid
azimuthdragon
darkvyce
zay
Browns
bazz
kraz
karashata
diodrac
[FA] Nothing to see here.
Posted 16 years agoUSTREAM ToS wrote:(i) License Grant. Ustream.tv does not claim ownership rights in your User Submissions. However, by uploading, streaming, submitting, emailing, posting, publishing or otherwise transmitting any User Submission to Ustream.tv or on the Site, you hereby grant Ustream.tv a non-exclusive, worldwide, royalty-free, sublicensable, perpetual and irrevocable right and license to use, reproduce, modify, adapt, prepare derivative works based on, perform, display, publish, distribute, transmit, broadcast and otherwise exploit such User Submissions in any form, medium or technology now known or later developed, including without limitation on the Site and third party websites. You represent and warrant that you own or have the necessary licenses, rights, consents and permissions to grant the foregoing licenses to Ustream.tv. Ustream.tv will own all right, title and interest in and to all derivative works and compilations of User Submissions that are created by Ustream.tv, including all worldwide intellectual property rights therein. You agree to execute and deliver such documents and provide all assistance reasonably requested by Ustream.tv to give to Ustream.tv the full benefit of the rights granted to Ustream.tv by you.
I read that such things aren't uncommon on other free hosting sites, but um, screw that. Any ToS which gives itself the right to exploit something can go suck a fuck.
Besides, you're not allowed to draw porn on USTREAM. No, really. Not even in private channels. You can have mature-rated channels, but no porn, which leaves... graphic displays of violence? But, no porn. Bloody brilliant.
I realize that the site needs to give itself the right to store, re-compress, and rebroadcast user content for people to be able to view it, but I really just don't like the wording they use. It leaves too many open doors. Perpetual and irrevocable? What if you decide you want to take down your content? Modify, adapt, and prepare derivative works from? Aside from standard algorithmic compression schemes, why do they need to modify it?
Thanks to
crux for pointing this out.
Apparently it doesn't bother anybody that Ustream retains unlimited and indefinite rights to modify and reproduce anything you submit on the site for any reason they see fit on any medium or technology that may ever be developed. Because that's what other video streaming sites do, so it must be okay, right? I thought this was the same group of people who got their panties in a twist when their art is reposted by somebody else, and suddenly they don't care about unlimited, royalty-free reproduction? Fuck it, there goes that journal.
[Tech] Twile's Cheap(ish) Blu-Ray Enjoyment Guide
Posted 16 years agoLet's break up this streak of [Furry] journals with a nice [Tech] one.
We're more than halfway through 2009 and Blu-ray is, for better or worse, the official victor of the HD war. Despite that, many people don't have Blu-ray players. Maybe you don't have a nice high-def TV. Maybe you're not willing to shell out $200+ for a Blu-ray player. Maybe you're used to torrenting things because Blu-ray price premiums are unacceptable to you. Whatever the case, I've got a nice guide here on how you can enjoy the benefits of 1080p movies, as good as or better than the Blu-ray experience, in your own home, potentially for very cheap.
Here's what I'm aiming for. A single video file per movie which contains the exact same quality as a Blu-ray movie, including the options for multiple audio streams and subtitle tracks, without the need for physical media, which can be played at any time on any computer. No inferior-quality torrented rips which could get you in legal trouble. No mucking around with a several-hundred-dollar player which takes a couple minutes to start up. No DRM. Flexible subtitles which can be customized in appearance as you watch, rather than ugly subtitles that you get on the disc.
With the proper hardware ($70, the only unavoidable cost) and software (free, with some trickery or piracy), it's entirely possible to do just what I described. A Blu-ray quality multi-audio-track rip can take as much as 50 GB per movie, but in my experience sizes of 15-20 GB are much more common, even for live-action 2+ hour movies. That's barely over $1 in hard drive storage, if you buy a 1500 GB drive.
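For the curious, here's the back-of-the-envelope math on that as a tiny Python sketch. The $100 drive price is my own ballpark assumption for a 1500 GB drive, not a quote from any particular retailer:

    # Rough per-movie storage cost. The drive price below is an assumption
    # (roughly what a 1500 GB drive goes for), not an exact figure.
    DRIVE_PRICE_USD = 100.0
    DRIVE_SIZE_GB = 1500.0

    cost_per_gb = DRIVE_PRICE_USD / DRIVE_SIZE_GB  # about $0.067 per GB

    for rip_size_gb in (15, 20, 50):
        cost = rip_size_gb * cost_per_gb
        print(f"{rip_size_gb} GB rip: about ${cost:.2f} of drive space")

    # Prints roughly $1.00, $1.33, and $3.33 respectively.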
$70 to get yourself started, $1-2 to store each movie, and you'll never have to futz around with discs or Blu-ray players. Let's get right to it.
Here's what you'll need to get started:
* A Blu-ray drive for your computer, $70
* A Blu-ray disc, $20-30 on Amazon, $0 to borrow from a friend, $1 from Netflix (see footnote #1)
* An installation of AnyDVD HD, $0-150 (see footnote #2)
* mkvtoolnix, free
* RipBot264, and the programs it requires to run, all free
Optionally, you can also use:
* SupRip, free.
Now, here's what you do:
1) Install and configure all the needed hardware and software.
2) Pop your Blu-ray disc into your drive.
3) Use AnyDVD HD to decrypt and save the disc contents to your hard drive. This typically takes an hour or more.
4) Open RipBot264, start a new job, and direct it to any file in the BDMV\STREAM subdirectory from where you ripped the files to. Wait for it to analyze the file.
5) Pick the "playlist" you want (usually the longest one, i.e. the movie itself), then pick the appropriate audio stream and subtitles (if desired).
6) Press 'Ok' and wait for it to demux the movie. This involves reading all the ripped files, and writing out processed copies of them, so ideally you'll do it from one hard drive to a different one (even then, if you get a decent 60 MB/sec between drives, it'll take over 5 minutes to demux a 20 GB rip). Once that's done, ignore the re-encoding settings and press Done or OK or whatever to confirm that window.
Please note, at this stage, if you intend to re-encode the audio and video to save space you should actually look at the settings and pick things which are good enough quality for your liking. From this point on, RipBot can handle everything else. If you want the 100% Blu-ray quality experience, keep reading.
7) Locate the Temp folder on your computer. In my case, this was on the D: drive. Inside it you should find a temp folder for RipBot264. In that, there's a folder for each 'job' you queue up in RipBot. For now, there should just be job1. Open the folder.
8) In that folder you should see maybe two dozen files. The important files are video.mkv, which stores the ripped, Blu-ray quality video stream, a large audio file (which might be audio1.core.dts, or audio.ac3, or something similar), a chapters.txt file, and one or more .sup files with subtitles. Open up mkvmerge GUI (aka, mmg.exe in the MKVtoolnix installation directory) and drop the audio, video, and subtitle files into the box for "Input files".
9) You should get some things popping up in the "Tracks, chapters and tags" box below that. You should have a video stream, subtitles, and an audio stream. You can give names to each track by selecting them and entering a name (and language) below. This is useful when you have, for example, multiple audio tracks and you want to be able to keep track of which is which.
10) Pick an output location and file name and hit Start muxing. As before, ideally this is on a different drive, because you're copying those files all over again. This may take another 5-10 minutes.
11) If you want chapter markers in your video file, use the Chapters menu in mkvmerge to load the chapters.txt file, then pick Save to MKV from the Chapters menu and navigate to the MKV file you just made. It will take several minutes to add the chapter data. (If you'd rather do steps 8-11 from the command line, see the sketch after step 12.)
12) Play your newly outputted video file using a program like Media Player Classic Homecinema, or anything else which will do MKV files while providing GPU acceleration for decoding.
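If you'd rather skip the GUI clicking in steps 8 through 11, here's a rough sketch of the same mux done with the mkvmerge command-line tool that comes with mkvtoolnix, driven from Python. The paths, track names, and the exact .dts/.sup file names are assumptions based on what RipBot264 left in my job folder; adjust them for your own rip:

    # Mux video, audio, subtitles, and chapters in one pass with mkvmerge.
    # Paths and file names below are examples from my setup; adjust to taste.
    import subprocess

    job = r"D:\Temp\RipBot264temp\job1"  # RipBot264's temp folder for this job

    subprocess.run([
        "mkvmerge",
        "-o", r"E:\Movies\My Movie (1080p).mkv",   # output, ideally on another drive
        "--chapters", job + r"\chapters.txt",      # chapter markers from RipBot
        "--track-name", "0:Blu-ray video",
        job + r"\video.mkv",                       # untouched Blu-ray video stream
        "--language", "0:eng",
        "--track-name", "0:DTS 5.1",
        job + r"\audio1.core.dts",                 # main audio track
        "--language", "0:eng",
        job + r"\subtitles.sup",                   # Blu-ray subtitle stream, if kept
    ], check=True)

Because mkvmerge can take the chapters file in the same command, this folds step 11's separate chapter pass into the mux itself.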
Extra 1) If you want multiple audio tracks, such as other languages or director's commentary, you'll need to repeat steps 5 and 6, once per audio track. It's a bitch that it has to re-rip the video and store a temp copy of it every time, but that's just how things work. Because I tend to work with several movie rips at a time, to keep things organized I make a folder for each rip where I put copies of the files I'm going to mux together into the mkv. Audio tracks, the video, chapter data, subtitles. If there are multiple audio tracks, I name them appropriately so I remember what's what later on.
Extra 2) If the Blu-ray style subtitles aren't to your liking (I know they aren't to mine), you can run your .sup file through SupRip. This program examines the images of the subtitles and tries to extract the corresponding text into ASCII characters. It usually requires a bit of training, and it can get confused when letters are too close to each other, but with some luck it can churn through 1000 or 2000 subtitles (about what you get in a movie) in a few minutes and give you a .srt file. The subtitles will often be full of errors, mixing up upper-case I with lower-case L for example, or putting spaces where they don't belong. If you're a perfectionist like me you can run it through a spell-checker, do find-and-replace (replace ' L ' with ' I '), or even manually go through every line to ensure correctness (or swap random words with 'penis'). Most people will probably be fine with a very quick spell-check and a mass-replace of 'Lt' with 'It', etc., especially if they haven't seen the movie and don't want the plot ruined by reading all the subtitles. (A small script for that kind of clean-up is sketched after Extra 4.)
Extra 3) More automated subtitle tools can be found in SubRip, a DVD subtitle ripping tool. If you feed it the .srt file that Suprip made, you can tell it to do automatic corrections, such as eliminating spaces between numbers, swapping out '' with ", and so on.
Extra 4) The processed subtitles in the .srt file can be merged into the .mkv file with mkvmerge GUI, in the same way the audio and video were originally combined. You'll want to remove the .sup file you ripped from the disc, if you put it in the .mkv file to begin with.
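If you want to script the find-and-replace pass from Extra 2 instead of doing it by hand, something like this little Python sketch works; the file names and the fix list are just examples of common OCR slips, not an exhaustive set, so eyeball the output before you trust it:

    # Crude clean-up of common subtitle OCR errors in an .srt file.
    # Input/output names and the fix list are examples, not gospel.
    import re

    fixes = [
        (r"\bLt\b", "It"),         # 'Lt' misread where 'It' was meant
        (r" L ", " I "),           # a lone 'L' between spaces is almost always 'I'
        (r"''", '"'),              # two apostrophes OCR'd from a double quote
        (r"(\d) (\d)", r"\1\2"),   # stray space dropped into the middle of a number
    ]

    with open("movie.srt", encoding="utf-8") as f:
        text = f.read()

    for pattern, replacement in fixes:
        text = re.sub(pattern, replacement, text)

    with open("movie.cleaned.srt", "w", encoding="utf-8") as f:
        f.write(text)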
Footnote 1: Netflix service for 2 discs at a time with Blu-ray service goes for about $17 a month, and if you're close to a distribution center and get your discs in the mail the morning after they arrive, you can get up to 4 discs a week. At slightly over 4 weeks a month, that comes out to ~$1 per disc.
Footnote 2: AnyDVD HD is not a cheap program. A 21-day free trial is offered, after which point you'll need to spend $90 for one year of use, $150 for lifetime use, or somewhere in between for 2, 3 or 4 years. If you install it on a desktop you don't use, and just restore it to a pre-AnyDVD HD backup once the trial runs out, you should be able to use it for free indefinitely. Alternately, one can pirate the program, but because it receives frequent updates to support new movie releases, you'll likely have to re-download and reinstall just as often.
So there you have it. The entire process should take about as long as the movie is, although most of it is automated stuff--letting the files be ripped, demuxed, remuxed, etc.
It is considerably faster than re-encoding the audio and video streams, which on a quad core CPU can easily take 5-10 hours and leave your computer nigh-unusable. The downside is that you'll be sticking with the encoding from the disc, which might be things your computer/software doesn't understand (VC-1 or DTS, for example), and probably leave you with a 20 gig movie rather than a 10 gig movie. However, when you factor in the cost of the electricity required to keep a computer on for that encoding work versus the cost of storing the larger, perfect-quality rip, you'll probably find that the two are very close to each other. So for about the same cost you can keep the original audio and video quality, which should give peace of mind that nothing but trailers, menu fluff, and languages you don't speak were lost in the process.
For people using Windows Vista and 7, VC-1 and/or H.264 support should exist right out of the box, so all you have to do now is use a player like MPC Home Cinema, which respects chapter markers, lets you switch audio streams, supports soft subtitles, and gives you GPU acceleration to smooth your video playback, and you're set!
Final note: This is not meant to be a public forum for discussing the legality or morality of ripping Blu-ray movies that you purchase, rent, or borrow.
[Furry] What Do I Think Of You?
Posted 16 years agoFirst, a wrap-up of my previous journal:
After reading all the responses, it seems the general opinion of me, at least the one people were willing to voice in public, is this: I'm pretty nice, shy, and worried about what people think of me. I'm also opinionated and sometimes stubborn, arrogant, and a show-off. Fair enough assessment of me, I guess.
Oh, and as for my fursona, people think that it's pretty hawt but perhaps a bit too red.
Now for the real meat of the journal. As promised on my previous journal, I'm now going to give people the opportunity to ask me, publicly or in notes, what my opinion is of them. A few disclaimers first.
1) Just because I'm offering up my opinion of people doesn't mean I somehow think it's more important than other people's opinions of them. I'm simply willing to share it for those who are curious, in the same way that I was curious about how others viewed me.
2) I will be polite, but honest, and fair to the best of my ability. If I don't know enough about you based on our online/offline interactions, I'll say that.
With that all said, readysetgo! Ask away, and I'll tell you what I think of you.
[Furry] What Do You Think Of Me?
Posted 16 years agoHonesty and transparency can cause drama, but bottled up feelings can cause it ten times as bad. So let's lay all our cards on the table.
Tell me what you think of me. You can do it in two parts if you want, talking both about my character and me, the player... but what I'm most after is the latter. I want to know what sort of vibe I give off to people. Am I a quiet guy who mostly keeps to his own space and watches politely? Am I an opinionated jerk who seeks out beehives to stir up? Why do you think that? I promise to take any negative feedback or rumors in stride.
Respond in a comment if you want. Or in a note if you have stuff you don't want to mention publicly. Or email ( twiledragon @ gmail . com ) from an account which I won't be able to tie back to your FA account, if you want to be totally anonymous.
For my next journal, I'll do the opposite--people can comment and I'll tell them what I think of them and why.