[Misc] Happy Anniversary and Independence Day
Posted 15 years ago
I'm not good with words about feelings and shit so I'll try to put this in sensible terms D:
sepffuzzball, Hurry up and rebuild your computer so we can do romantic shit like slay zombies by candlelight while blasting Pendulum <3
To everyone else, happy 4th and try to keep all your fingers attached!
[Tech] Twile on Intellectual Property
Posted 15 years ago
This is a subject I've wanted to speak on at length for a very long time now, because it impacts many of the things that I care passionately about--everything from home entertainment to furry porn and technology. I've tried writing this journal a dozen times over, and this is my final attempt. Bear in mind that the following is a combination of my opinion and reasoning, and isn't grounded in actual factual law. I frankly don't care what's legal and not when it comes to such thought exercises.
When it comes to the topic of digital media, games, software, etc, I am a firm believer that you deserve to be fairly compensated for your work until you adopt unfair practices. Or in other words, your right to not have people rip off your content and sell it for their own profit ends when you start to profit on it. They're not supposed to profit on it? You're not supposed to profit on it!
We're supposed to live in a largely capitalist society. We have free competition to provide people with the goods they want, and to keep prices down: if I'm able to sell my oranges with $5 profit per crate, somebody else will get in the orange-growing business and sell theirs for $4 profit per crate. Then I lower to $3 profit per crate, and we do this until we hit no profit. Everyone wins the same amount: I can maintain my business and get an industry-standard wage doing something I (hopefully) love, you get oranges at the lowest price possible. Right? And we have government involvement to prevent abuses from companies: if companies are organizing to fix prices at artificially high points, they get raped. If there's no competition, measures are generally taken to prevent customer abuses as well.
This falls apart completely when it comes to creative stuff of the digital variety, with no marginal cost (what it costs to make one more unit available). If you make Lord of the Rings for $280 million and earn $2800 million, you just earned disgusting profit on an admittedly excellent movie trilogy. But still, disgusting levels of profit. Now clearly, you need to cover your $280 million bill, but after that point? You really should be giving your product away. Not because it's worthless, but because it would then fit into the model that everyone else has to play by: cover your expenses and no more.
But movie studios don't do this. They don't have to do this, they don't want to do this. I'm sure theater-goers would love it--free movies, just pay for popcorn! Theaters would love it too, as ticket sales don't give them money... concessions do. More people would see more movies. Isn't that what this creative business is supposed to be about, making content that evokes emotional responses and sharing it with people?
There's my first beef with groups like movie studios. After they cover their costs for a product, they start making pure profit that nobody can cut into, which isn't how things are supposed to work. People say that you can see another movie instead of (fill in the blank), but for most of the movies you'd actually want to see, that's just not an acceptable substitute. The Matrix trilogy is about as good a replacement for The Lord of the Rings trilogy as grapes are a replacement for lemons. They're a different flavor/experience even if they're both in the same rough category. Seriously, how far do you honor such an argument? "One company might own all the TV stations in the world and charge out the ass for them, but it's not a monopoly because you can go outside and play frisbee. They're both entertainment activities, right?"
But Lord of the Rings doesn't get off the hook so quickly. After making $2.5 billion in theaters, they released the DVDs, and then Extended Edition DVDs. Now, to their credit, these were within 9-12 months of each movie coming out in theaters, which wasn't so bad at the time (now it's usually 4-6). My complaint is with their pricing. If they're charging more than manufacturing costs, they're overcharging. We as a people already paid for the movies ten times over. No more money needs to be made. While almost giving away DVDs might seem silly, think about it. Did you not already pay for the right to see the movie? Perhaps several times? Several times each? Who exactly still needs to be paid for those $20/title discs? Torrenting and ripping should be totally acceptable for movies like Lord of the Rings--zero distribution cost is exactly what you need to drive the marginal price of movie ownership to zero.
And the worst offense comes to us in modern times. Return of the King was out in December 2003, the Extended Edition DVD was out in December 2004. Blu-ray discs started shipping in 2006. Where's Lord of the Rings? Well, it's digital and lossless on a hard drive somewhere, and it's broadcast on HDTV now and then, but not on Blu-ray. Maybe they didn't want to launch on a format that might not win? Blu-ray became the official HD optical disc winner in early 2008 when HD-DVD admitted defeat. And yet it took 2 more years to get the theatrical editions of the Lord of the Rings films out. 2 years, when the average Blu-ray mastering takes on the order of a month or two. And in stores they charge $80+ for the trilogy, from people who already paid that for the DVD trilogy, and again for the Extended Edition trilogy, and once more for all their theater viewings. And guess what? That's not the Extended Edition. Those come out in another 18-20 months. So you'll potentially have people (foolish, sure, but plausible fans) who will pay $50-80 for each of: theater viewings, DVD, Extended DVD, Blu-ray, Extended Blu-ray. Numbers-wise, such a person paid for their portion of the movie development about 56 minutes into the first movie (10% of the trilogy runtime, because only 1/10th of the revenue was needed to cover the film budget). The only reasons I can come up with for such behavior are financial motivation to release the movies later when there's a bigger Blu-ray install base, a desire to spread out more sales over time (I'm sure they sold at least a few DVDs between 2006 and 2010, some of which wouldn't have happened if Blu-ray were an option during that time frame), or trying to rekindle excitement about the franchise closer to the release of the Hobbit movies.
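If you want to check that math, here's the back-of-the-envelope version in JavaScript (all numbers rough, pulled from the estimates above):

    // Rough check of the "56 minutes" claim. Budget and gross are approximate.
    var budget = 280e6;              // production budget, USD
    var gross = 2800e6;              // worldwide theatrical gross, USD
    var runtimes = [178, 179, 201];  // theatrical runtimes, minutes
    var trilogyMinutes = runtimes.reduce(function (a, b) { return a + b; }, 0); // 558
    var budgetShare = budget / gross; // 0.10 -- only a tenth of revenue covered the budget
    console.log((trilogyMinutes * budgetShare).toFixed(1) + " minutes"); // "55.8 minutes"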
To be completely frank, it doesn't matter whether it was greediness or laziness on the part of this or any other studio. This shit arises because studios get to control prices, distribution methods/formats and release schedules. Let me explain what I mean. If, after its theater run, somebody had gotten access to the digital master copy of the full-resolution Lord of the Rings trilogy, we wouldn't have these problems. Free copies of the movies would be available as portable-ready sub-1 GB files, burned DVDs, Blu-rays, and a variety of all-digital copies for computer playback. Everyone could enjoy it at any time. And why not? The studio was fairly compensated for its investment.
Not that I'm saying I expect studios to just give stuff away. But let's say I could go to them, sign a contract saying I'll pay them licensing fees for each copy of their movies I sell, and in return I get access to their lossless digital master copy. I make my own re-encode and sell it at my own price, and they get... I don't know, $1 or something, for each copy I sell. I promise you that people wouldn't be coughing up $20 per DVD for long. And 1080p Extended Editions of the Lord of the Rings series would've been shipping before 2010-2012.
You know the ultimate solution to this shit for games, TV shows, movies, music, all such things? Make a digital store that sells licenses for low costs, not much more than you need to stay profitable. $50 million game which is expected to move 10 million units? Try $5-10 per copy. Then let users download DRM-free copies of their content. Use P2P networks (think: BitTorrent) to make distribution essentially free. Can people take the content they bought and redistribute it to friends or to anyone via hard drive transfers, DVDs, or BitTorrent? Sure. But there are enough honest people out there willing to pay $5-10 for a game or movie with no restrictions, no fear of lawsuits, and peak quality (especially when they'd normally pay $20 for a movie, $60 for a game) to cover development costs. And to top it off, if a better master of a movie ever comes out (think: 4K resolution instead of 1080p, 1080p instead of 480p, etc.), they just add an option to get that better version on their digital store. You already have the license for the content, so you don't have to pay all over again. That's how it should be. Better tech which allows for higher-grade distribution (Blu-ray vs DVD, whatever we use for 4K vs 1080p) should never be used as a crutch to push more sales of an old product.
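To put rough numbers on that, a quick sketch--the honest-buyer percentages here are pure assumption on my part:

    // Break-even license pricing: only the honest fraction of users pays,
    // and their purchases alone have to cover development costs.
    function breakEvenPrice(devCost, expectedUnits, honestShare) {
      return devCost / (expectedUnits * honestShare);
    }
    console.log(breakEvenPrice(50e6, 10e6, 1.0)); // 5  -- $5/copy if everyone pays
    console.log(breakEvenPrice(50e6, 10e6, 0.5)); // 10 -- $10/copy if only half do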
To the credit of some companies, not all practices are shit. Sometimes the costs for DVD sales are justified when movies didn't break even at the box office (see: Serenity). Sometimes that cost is justified when, say, Disney does a total remaster of a decaying classic. Sometimes a company will offer to take an old DVD and upgrade you to a new Blu-ray for a small processing and production fee, essentially acknowledging that you have a license to view the content and not double-charging you for the right to see it in its higher quality. And for many movies these days, you can enjoy a home viewing "just" 4 months after the theater run starts (another annoyance: I want home viewing of movies on launch day as part of the same fee I pay for the lifetime viewing license. I loved How to Train Your Dragon, and I want to watch the flying scenes in 1080p in my apartment, now). Still, gross profits being made on media sales are the rule instead of the exception. And that's lame.
Really, that's why I feel no guilt in the idea of ripping movies, such as rentals. I see no compelling reason why anyone should have to give their money to a studio for something that no longer needs to be paid for. Imagine if you had to keep paying to own your car after the company had already covered its production costs. Absurd, right? That you continue to enjoy it doesn't override the fact that the company has no further need to charge you for it.
I'll end this with something for you to consider and respond to. If I were to watch a movie on TV and walk out of the room during commercials, I'd be enjoying it for free. Furthermore, if I had a nigh-photographic memory, I could recall the movie at will. I wouldn't have to pay anything for it, ever. And yet the second I want to play it on a screen for myself instead of in my mind for myself, I have to pay. Why is this distinction made? Is it because there's something inherent about having a movie played to me versus reconstructing it in my mind which suddenly makes it able to be charged for? Or is it just because they haven't yet figured out how to prove that you were thinking about a movie and charge you appropriately? Why is using the camera in my mind to replay a movie I saw in theaters totally fine, but using my cellphone camera to do the same will get me in trouble with the law?
[Tech] Twile's (Sexy) Movie Database Project
Posted 15 years ago
As those who know me well are probably aware, I have at my disposal a lot of movies. I have for a long time sought a good way to browse these movies: something smooth and interactive with thumbnails and search filters and the ability to see movies with unusual amounts of actor overlap, all kinds of things. Unfortunately, for even the simpler things, I'm simply not pleased with what exists in the world. The closest I ever found was My Movies, which was partly a community effort to build up information about movies, partly a way to organize and start playback for your movies. But My Movies had no shortage of problems, and I am so absurdly picky.
What it boils down to is this: I want an excellent movie database solution. No IMDB, no Rotten Tomatoes--I want something that keeps track of what movies I have, lets me browse them in any way I can think up, and most importantly lets me start playing them with the press of a button. Thus, I'm creating an AJAX-rich application, to be viewed locally or over the internet, for browsing my movie selection.
For most of the past week, I've been working on the new project. I took my existing movie listing, an Excel spreadsheet, and imported the data into a MySQL database. This required me to brush up on MySQL. Then I created some simple pages to pull info from the database. This required me to brush up on PHP. Then I added the ability to edit and update information on the page without reloading. This required me to brush up on AJAX. Just last night, I added the ability to browse the movies with a search filter that filters the results in real time, updating with every keystroke (at least, every keystroke that changes the value of the search box! Arrow keys, or replacing a letter with the same letter, don't set off an update). At this point I'm making screencaps of title screens to spice up what's otherwise a bland text listing (albeit searchable and with pagination). So far, it's looking pretty sexy, even with just the 1/8th-of-1080p thumbnails. I can't wait until I've got enough functionality in place to not feel guilty about adding some style sheets.
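For the curious, the live filter boils down to an onkeyup handler that skips no-op keystrokes. A rough sketch of the idea (search.php and the element IDs here stand in for my actual setup):

    // Live search: only hit the server when the box's value actually changes,
    // so arrow keys and same-letter replacements don't set off an update.
    var searchBox = document.getElementById('search');
    var lastQuery = searchBox.value;
    searchBox.onkeyup = function () {
      var query = searchBox.value;
      if (query === lastQuery) return; // value didn't change: arrow key, etc.
      lastQuery = query;
      var xhr = new XMLHttpRequest();
      xhr.open('GET', 'search.php?q=' + encodeURIComponent(query), true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
          document.getElementById('results').innerHTML = xhr.responseText;
        }
      };
      xhr.send();
    };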
Your thoughts? Are you interested in having a copy (of the code and instructions, not the movies, you doofs) when I'm done? Is this a totally pointless project that just replicates some awesome thing that's already been made? Let me know! :3
[Tech] Newegg on PC vs. Console
Posted 15 years ago
I came across this a couple weeks ago and thought it was a surprisingly balanced, intelligent look at the practical ups and downs of PC vs Console gaming: Newegg's PC vs Console Gaming Page.
Edit: Page contents copy-pasted because it only displays for Chrome users:
www.newegg.com wrote:
PC Gaming vs. Console Gaming
Owning a PC gaming rig and a console gaming machine can be an expensive proposition, which is why most gamers choose one or the other. Since many popular software titles are commonly released for both PCs and consoles, it typically isn’t financially wise to invest in both sides. Newegg can help you weigh the pros and cons so you can decide on the gaming lifestyle that suits you best.
PC Gaming
THE GOOD: The majority of households have a computer of some kind. Chances are that you’re reading this on a personal computer and have passed some time playing Freecell or Solitaire. If your system is older or you have a business machine, a few simple upgrades, like faster RAM or a new video card, can turn your PC into a gaming rig. PCs can also display your games in higher resolution than a console. Even if a gaming console is hooked up to an HDTV many games don’t take advantage of the full 1080p high-definition resolution. Just make sure you have a good monitor with fast refresh rates or you may notice some stutter in your graphics. As an added bonus, PC software titles are typically $10 less expensive than console software titles. Also, popular massively multiplayer online games have so far been made exclusively for the PC. Finally, only PC gaming allows for true personalization. With a wide range of computer cases and input devices, you can truly stand out and show off your gaming personality. This customizability extends to the games themselves. Programming-savvy gamers are constantly creating new maps, plug-ins, patches and sometimes entire new games for existing titles.
THE BAD: Some technical know-how is necessary if you plan on keeping up with hardware demands. Not only do you need to know how to install upgrades, but you also have to understand if your components are compatible with each other. Staying on the cutting edge of hardware can also be very expensive, especially with multiple graphics cards and advanced cooling systems to prevent heat buildup from slowing performance. Additionally, since everyone’s computer is different, game developers have a harder time creating a title that will work flawlessly on every computer system. This reality has driven many software companies to develop exclusively for consoles where the hardware is uniform, resulting in less selection for the PC market. Additionally, since software piracy is rampant on the Internet, PC game developers have been forced to set up annoying anti-piracy measures that are typically just an inconvenience for paying customers and not a significant deterrent for pirates.
THE BOTTOM LINE: If you don’t mind doing a little research and doing your own upgrades, PC gaming can offer some of the most rewarding entertainment experiences around. The controls are more precise, the graphics are clearer and the online community is more mature. Best of all, when something breaks you can typically take care of it yourself.
Console Gaming
THE GOOD: Nothing beats the gaming console for affordable, high-quality gaming – especially after you wait a couple of years. While most households have a computer, even more have televisions. That means all you have to buy is a basic system and some cables to connect to your television and you can start gaming immediately. The price for a basic console is relatively affordable when compared to a high-end gaming PC. Technical know-how is kept to a minimum and network connections are simple to set up. Console gamers also get the widest selection of software titles, including games for younger players that are almost completely absent on the PC. The gaming console also enjoys uniformity in design that eliminates any guesswork in what the game will actually look like when you start playing. Best of all, you can rent console games at most video rental stores and can trade-in your purchased games at many video game shops for store credit.
THE BAD: When something breaks there is little you can do to fix it except send your console in for service if it’s still under warranty. Software titles are typically more expensive than PC titles by about $10. Furthermore, there isn’t much customization for your games. Some titles offer map editors, but the modifications available are severely limited. Many gamers may also be irritated with the frequent and typically long load times that plague console games. Some really great titles are also missing from the console market – namely anything worthwhile in the massively multiplayer online category as well as almost all real-time strategy games. Additionally, your game selection may further be limited to the console you choose since some titles are console-exclusive. Also keep in mind that the graphics of gaming consoles only really look good on an HDTV. If you don’t already own one, you’ll have to factor in that cost into your purchase if you want graphics that can match a gaming PC. Finally, expect to find a less mature online experience as you are assaulted with racial slurs, accusations about your sexual orientation and insults like “your mom is a dude!”
THE BOTTOM LINE: Console gaming is more affordable, simple to set up and easy to get into. The ability to rent and trade games in reduces your overall cost even more if you don’t care about owning anything. You’ll also have access to the widest selection of games, excepting a few popular categories that are next to impossible to play without a keyboard and mouse. Just make sure you have an HDTV or you may be disappointed in the visuals.
I would've figured they'd be pushing PC gaming heavier because I'm sure the parts for a gaming rig bring in more money, but there you have it, balanced and intelligent points for and against each.
The few points I found that seem amusingly wrong:
* It's not "many" console games that don't take advantage of 1080p, it's almost all of them
* You don't need a good monitor with a fast refresh rate to avoid stutter
* Heat buildup doesn't slow performance, it just makes your computer restart and sometimes die completely
Other than that though, spot on :o
[Me] Work Rant
Posted 15 years ago
As I've explained previously, I have a job as a User Experience strategist for making websites. That means I find out who the users are, what features they want, what language makes sense to them, what organization they'll understand, and I use that to create the contents of every webpage on the site. Figuring out whether there are separate News and Events sections, or whether we combine them into one, that's me. Deciding what to put on the homepage, how to organize it all, whether to use buttons or links, placing images and text, that's all me. I go from just a vision of "We want a website for people who want to travel to this country," and through much collaboration and research I make dozens of pages which just need to be duplicated in HTML and styled with final images inserted.
The problem I face at work is that everyone, including the clients, thinks they're a designer. They think that just because they're paying us to make the site, they can dictate what should go where. And can they? Yes. Should they? No. Why not? Because they're paying us to make the site. If they knew what the hell they were talking about, they would be doing it themselves and saving $150 an hour.
Take yesterday, for example. This company comes to us, and they've got a website for their product, but they want a new website. We burn through 30 hours (among other things) deciding what the top-level nav should be, what the sub-pages should be, what should be on every sub-page, so that users who are interested in their product/service can find all the information logically. We're all pretty happy with it, and we think it'll do well. The clients, however, don't think we get it. They tell us that they want us to use their existing site structure. And site content. So basically they just want to change the colors. Brilliant.
I guess a lot of people just don't realize that if your site layout, content and structure are shit, giving it a new look won't bring in more users. If people can't find the info they're looking for, they'll leave... doesn't matter how much shinier the tabs look.
[Tech] Server hardware information required!
Posted 15 years ago
I want to get two of these for home use. Aside from SAS-to-SATA breakout cables, does anyone here with the appropriate knowledge have any information about things I'd need to buy to make these work, or reasons I don't want to mess with these at all?
For reference, I'm trying to create a massive JBOD setup for my Windows Home Server box to spread dozens of terabytes of storage across dozens of hard drives, and I want the most space- and money-efficient way to do that.
[Misc/Furry/Tech] 100k Pageviews. Child porn. iPhone OS 4.
Posted 15 years ago
100k pageviews
Happened earlier this week. Notable only because powers of ten sound impressive :p
Child porn whatever in the UK
[Because people are misunderstanding and misquoting me, I shall rephrase the following sentence.]
Like it or not, there's nothing wrong with the outcome of child porn, real or otherwise, only the process of getting it. Just like there's nothing wrong with recordings of violence, real or otherwise. The people involved in the acts should be snagged by the law, and anyone recording the acts as a result of planning should be treated appropriately--as a collaborator--but you shouldn't punish people for looking at something even if you find the activity to be unsavory. None of this "if you watch it you might do it" bullshit; you're essentially arguing for fighting thought-crime simply because it could result in real crime. But yeah... if you want something banned just because you think it's gross, you don't deserve to live in a society where you can enjoy something other people think is gross.
If you enjoy watching real rape or violence, well, that's weird. That just makes you a sick person. If it's after the fact and there wasn't anything you could've done and you didn't in any way contribute to the event happening (paying for kiddie porn directly helps it happen, and thus should be illegal), then you're not a criminal, just sick in the head. And I don't mean that in an insulting way, but in the literal way. Relative to the norm, there's something wrong with what you like, and it makes you less of a nice person. "Nice" here measuring your ability to coexist peacefully with other people.
And that's real rape, real violence, real violations of real human rights. Make it fictional and all you've got as an argument is that you want to fucking punish people who are in some way mentally unwell and suffering from something they never chose to enjoy. Actively tormenting sick people, now who's fucked up? And, AND, that's assuming that the "sick" person enjoys the fictional content in certain ways--if you like fictional rape art because you like to imagine yourself feeling powerless and embarrassed and vulnerable with someone else exerting control over you, that's no longer "you are a bad person" sick like it would be if you enjoyed seeing a real person being abused... it's sick in a "that's strange and I can't imagine what you like about that" way. Kind of like how I feel about that Canadian sport with the brooms and the weight... curling, I think it is. Dear me, did I just indirectly compare curling to child pornography? o_o;
How the law passed in the UK relates to all this is up in the air. Some say it makes cub porn illegal. Some say it's been in place for a year and doesn't relate to furry stuff at all. In any case, I try not to get caught up in actual laws that exist which I can't impact; instead I focus on what's right and wrong regardless of current law.
iPhone OS 4
While at work (yes, to reiterate, I finally have a job now!) I spent about 30 minutes reading bits and pieces about iPhone OS 4.0. While I may have missed big selling points, what I saw did not impress me terribly, and in some ways it disgusted me.
They added multitasking support, with 7 different kinds of tasks that can be done in the background:
1) Background audio, for things like Pandora and GPS navigation
2) Voice over IP, for things like Skype
3) GPS and cellphone tower-based location reporting, for GPS and social networking apps
4) The ability for programs in the background to get notifications (say, a Twitter client detecting a new Tweet)
5) The ability to display notifications (the client telling you that a new Tweet is in)
6) The ability to perform tasks in the background (such as uploading photos to an online album)
7) "Fast app switching," which lets you pause an application and resume when it opens back up
I guess my question for them would be how this is any better, or indeed any different, than current multitasking stuff. At first I was thinking "Okay, it can do some special-purpose things in an effort to provide a solution for common examples of the usefulness of multitasking... GPS, Pandora, AIM, etc., without letting background tasks slow it down." Then I realized that "task completion" is hopelessly vague. If you can complete tasks in the background, then what's different between iPhone OS 4 and other mobile OSes in that regard? What could a program possibly be doing other than a) sending or receiving audio, b) maintaining an active connection with an online service, c) gathering location data, d) receiving information and notifications from an online service, e) displaying interactive notifications to the user, f) performing general tasks in the background and g) responding to user input? I don't get where the computational cycles were trimmed. I don't see how they did anything that justifies not including a basic feature like multitasking since they released the silly device 4 years ago or whatever. The fact that it only works on things newer than gen-2 iPhone/pod hardware means that Apple didn't do its job in making it a performance hit-free solution. I just don't get it.
They added tons of silly little features like a character count when you're sending an SMS (yeah, Apple counts that as one of the 100 new features), or customizable wallpaper. They also had some more notable ones, like a recreation of Xbox Live, the ability to make folders for your applications, and the ability to perform a web search from the regular search field.
Most notably, though, they added the iAd platform. This is the thing that I find to be downright disgusting. Let it be known, I don't like ads. I can't stand it when programs like Yahoo and AOL Instant Messenger give me animated advertisements that make noise and link to webpages and shit. Doing it on a mobile device where you have 8% of the pixel count is unacceptable. And yet that's exactly what Apple's pushing. Up to every 3 minutes, they say, an advertisement might sort of push its way up from the bottom of your screen. Like this. It won't happen in every application, but it's an option for app developers, and the developers get a 60% cut of the ad revenue... Apple takes 40% for bandwidth and offering the service. Of course, that's gonna be a mighty tempting offer for app developers. Free money, why not? The only reason not to include them is because you value a professional feel and you want it to make your app seem better than competitors, but if everyone starts doing it, then it will just make everything ad-swamped. But these aren't just simple ads. They're applications. They can have images, sound, videos, and even games. What a wonderful waste of bandwidth, processing cycles, and screen real estate on a platform where all of those are more precious than gold. And why are they doing this? That's why. Real classy, Apple.
Once more I'll reiterate that Apple isn't out to make any friends: they attack Google's use of ads by offering their own at the OS level, they attack Adobe's use of Flash for ads, video, and everything, and they insult every mobile OS provider by saying that what those companies are doing is shit, and then doing the same damn thing.
But yes, ads... let's do without them where we can. Mobile applications are one place this is possible, as evidenced by billions of installations of programs prior to iAd.
Net Neutrality
Finally, Net Neutrality shot down in a court case, or something. Which is super lame. Here's hoping that as soon as someone tries to take advantage of it, there will be more legal battles to bring back Net Neutrality D:
Discuss ~_~
Happened earlier this week. Notable only because powers of ten sound impressive :p
Child porn whatever in the UK
[Because people are misunderstanding and misquoting me, I shall rephrase the following sentence.]
Like it or not, there's nothing wrong with the outcome of child porn, real or otherwise, only the process of getting it. Just like there's nothing wrong with recordings of violence, real or otherwise. The people involved in the acts should be snagged by the law, and anyone recording the acts as a result of planning should be treated appropriately--as a collaborator--but you shouldn't punish people for looking at something even if you find the activity to be unsavory. None of this "if you watch it you might do it" bullshit; you're essentially arguing for fighting thought-crime simply because it could result in real crime. But yeah... if you want something banned just because you think it's gross, you don't deserve to live in a society where you can enjoy something other people think is gross.
If you enjoy watching real rape or violence, well, that's weird. That just makes you a sick person. If it's after the fact and there wasn't anything you could've done and you didn't in any way contribute to the event happening (paying for kiddie porn directly helps it happen, thus should be illegal), then you're not a criminal, just sick in the head. And I don't mean that in an insulting way, but in the literal way. Relative to the norm, there's something wrong with what you like, and it makes you less of a nice person. Nice measuring your ability to coexist peacefully with other people.
And that's real rape, real violence, real violations of real human rights. Make it fictional and all you've got as an argument is that you want to fucking punish people who are in some way mentally unwell and suffering from something they never chose to enjoy. Actively tormenting sick people, now who's fucked up? And, AND, that's assuming that the "sick" person enjoys the fictional content in certain ways--if you like fictional rape art because you like to imagine yourself feeling powerless and embarassed and vulnerable with someone else exerting control over you, that's no longer "you are a bad person" sick like it would be if you enjoyed seeing a real person being abused... it's sick in a "that's strange and I can't imagine what you like about that" way. Kind of like how I feel about that Canadian sport with the brooms and the weight... curling, I think it is. Dear me, did I just indirectly compare curling to child pornography? o_o;
How the law passed in the UK relates to this up in the air. Some say it makes cub porn illegal. Some say it's been in place for a year and doesn't relate to furry stuff at all. In any case I try not to get caught up in actual laws that exist which I can't impact, instead I focus on what's right and wrong regardless of current law.
iPhone OS 4
While at work (yes, to reiterate, I finally have a job now!) I spent about 30 minutes reading bits and pieces about iPhone OS 4.0. While I may have missed big selling points, what I saw did not impress me terribly, and in some ways it disgusted me.
They added multitasking support. They added 7 different kinds of tasks that can be done in the background. 1) Background audio for things like Pandora and GPS, Voice over IP for things like Skype, GPS and cellphone tower-based location reporting for GPS and social networking apps, the ability for programs in the background to get notifications (say, a Twitter client detecting a new Tweet) and also display notifications (the client telling you that a new Tweet is in), the ability to perform tasks in the background (such as uploading photos to an online album), and "fast app switching" which lets you pause an application and resume when it opens back up.
I guess my question for them would be how this is any better, or indeed any different, than current multitasking stuff. At first I was thinking "Okay, it can do some special-purpose things in an effort to provide a solution for common examples of the usefulness of multitasking... GPS, Pandora, AIM, etc., without letting background tasks slow it down." Then I realized that "task completion" is hopelessly vague. If you can complete tasks in the background, then what's different between iPhone OS 4 and other mobile OSes in that regard? What could a program possibly be doing other than a) sending or receiving audio, b) maintaining an active connection with an online service, c) gathering location data, d) receiving information and notifications from an online service, e) displaying interactive notifications to the user, f) performing general tasks in the background and g) responding to user input? I don't get where the computational cycles were trimmed. I don't see how they did anything that justifies not including a basic feature like multitasking since they released the silly device 4 years ago or whatever. The fact that it only works on things newer than gen-2 iPhone/pod hardware means that Apple didn't do its job in making it a performance hit-free solution. I just don't get it.
They added tons of silly little features like a character count when you're sending an SMS (yeah, Apple counts that as one of the 100 new features), or customizable wallpaper. They also had some more notable ones, like a recreation of Xbox Live, the ability to make folders for your applications, and the ability to perform a web search from the regular search field.
Most notably, though, they added the iAd platform. This is the thing that I find to be downright disgusting. Let it be known, I don't like ads. I can't stand it when programs like Yahoo and AOL Instant Messenger give me animated advertisements that make noise and link to webpages and shit. Doing it on a mobile device where you have 8% of the pixel count is unacceptable. And yet that's exactly what Apple's pushing. Up to every 3 minutes, they say, an advertisement might sort of push its way up from the bottom of your screen. Like this. It won't happen in every application, but it's an option for app developers, and the developers get a 60% cut of the ad revenue... Apple takes 40% for bandwidth and offering the service. Of course, that's gonna be a mighty tempting offer for app developers. Free money, why not? The only reason not to include them is because you value a professional feel and you want it to make your app seem better than competitors, but if everyone starts doing it, then it will just make everything ad-swamped. But these aren't just simple ads. They're applications. They can have images, sound, videos, and even games. What a wonderful waste of bandwidth, processing cycles, and screen real estate on a platform where all of those are more precious than gold. And why are they doing this? That's why. Real classy, Apple.
Once more I'll reiterate that Apple isn't out to make any friends: they attack Google's use of ads by offering their own at the OS level, they attack Adobe's use of Flash for ads, video, and everything else, and they insult every mobile OS provider by saying that what those companies are doing is shit--and then doing the same damn thing.
But yes, ads... let's do without them where we can. Mobile applications are one place this is possible, as evidenced by billions of installations of programs prior to iAd.
Net Neutrality
Finally, Net Neutrality was shot down in a court case--the ruling that the FCC couldn't punish Comcast for throttling BitTorrent traffic. Which is super lame. Here's hoping that as soon as someone tries to take advantage of it, there will be more legal battles to bring back Net Neutrality D:
Discuss ~_~
[Furry] Post-FWA
Posted 15 years agoWe're on our way home from FWA now, so here are my parting thoughts on the con.
People drank too much. People always drink too much. It makes me sad when alcohol and activities surrounding it (room parties and hangovers) tie people up from 9 PM to 1 PM.
Furries in groups make everything difficult. Differing schedules, diets, tastes and lots of social opportunities make it exceptionally difficult to get more than 5 people together for something as simple as food and a movie.
Fursuits are growing on me a bit. Maybe some year I'll get one--complete with cinnamon scent. Not this year, though. Get back to me on this in 2011.
At future conventions, I'll make sure I have contact info for -everyone- I want to hang out with. There were a number of people I couldn't find when I wanted to (or, really, at all) because we never shared contact info. Another friend was at the con and I didn't even know. Must share contact info and be more aggressive in pursuing social interactions.
Really, my biggest problem at cons is that I'm way too passive when it comes to social interactions. I'll cruise the con floor every hour and look for familiar faces, and try to round people up for meals and movies, but other than that, in the absence of planned activities, I'll just sit and keep my eyes peeled for friendly faces.
Overall, pretty fun. Saw How to Train Your Dragon in 3D, two nights in a row--I loved it that much (has nothing to do with having a dragon character, either... it was just adorable). Went to dinner with motherfuckin' Fel and Fennec (twice for the latter). Got a Moonstalker badge.
[Furry] FWA!
Posted 15 years agoIn the car now with
sepffuzzball and
darkshift. ETA 1:30 PM. We'll be there until Sunday afternoon.
There shall be
kipfox nommage and it shall be fantaaaastic.
[Tech] God of Laptops Part II
Posted 15 years agoA follow-up to God of Laptops Part I.
I went to the local Sony Style store a few hours ago to see if they had the 2010 Vaio Z series. For those who don't want to read my full journal last time, it's a high-end laptop--2.66-3.33 GHz dual core i7 processor, 4-8 GB DDR3 @ 1066 MHz, 120 GB SSD with internal RAID 0, carbon fiber shell, backlit keyboard, graphics card with 1 GB dedicated RAM that can play Crysis, and it weighs 3 pounds, the same as a Macbook Air, while still containing an optical drive. It has upgrade options to include 3G + GPS, and outside of the US, its 13.1" screen can be upgraded to a 1080p version.
As I was saying, I went to the local Sony Style store. The 2010 Z was the closest to the door, and god was it sexy. Sepf and I spent easily 10-15 minutes playing around on it. The screen, even at "just" 1600x900 (13.1" mind you!), was gorgeous. Very crisp, great viewing angles, bright, text was nice. A little more glossy than I'm used to, but it's acceptable.
The weight was startling. It's not super tiny like the Air--it's reasonably small, but not pencil-thin at any point, so it feels lighter than it is. Sepf and I picked it up and were (not for the first time) shocked at how light it was. Last time we thought the battery must've just been an insert to keep the weight down for people picking it up to feel the weight in the store, so this time we popped the battery out. Real battery. This fucker, which ties or bests my gaming desktop in every respect but GPU power, weighs less with the battery than you would think possible. It also wasn't hot at all on the bottom.
Fast as hell (slowest component got a 5.9 on the Windows performance index, and that was the RAM) and it was only the i5 version, bloated with pre-installed Vaio stuff. It'll only get faster. Keyboard feels better than the previous one they had in the store, but the keys feel oddly far apart and I'm not used to chiclet style (what Apple and some other companies these days use) either. But I guess I'll adapt to that in time.
When you go from dedicated to integrated graphics cards or back the screen goes black for about 3 seconds. It's a nuisance, I guess, but I won't be switching between the graphics cards very often. I'll just use dedicated until I plan to be mobile for more than a few minutes. Or if integrated is fast enough to do 1080p decoding, I'll use that until I want to play games.
But I'm not getting one yet. The hope/rumor is that once they sell off all the $4500 pre-built ones with 1080p screens (they have under 2000 of them), they'll introduce the 1080p screen as a standard customization option. They already do this in Europe, where the upgrade is easily under $100 and most definitely worth it. I mean, can you imagine--combined with 802.11n, it would be a portable Blu-ray streaming device at native resolution. A lot of people with 40"+ HDTVs these days don't even have native 1080p.
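A quick sanity check on that streaming claim, with rough 2010-era numbers--the Wi-Fi throughput figure below is my own conservative guess, not a spec:

    # Can real-world 802.11n carry a native Blu-ray-quality stream?
    bluray_peak_mbit = 48    # ~40 Mbit/s video ceiling plus audio, roughly
    wifi_n_mbit = 100        # conservative guess at real-world 802.11n throughput
    print("headroom:", wifi_n_mbit / bluray_peak_mbit)  # ~2x, so yes, comfortably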
Yeah, though, it's shaping up to be an extremely sexy laptop. $1900-2500 depending on the configuration (unless you spring for 512 GB of SSDs) and the amount of warranty you get, but if you use a laptop as much as I do, it just might be worth it ;)
I swear that after this I'll stop talking about it until I have one (which might not be for 6 months).
FYI I hate Sony, but at least they're not Apple and nobody else provides this sort of laptop to the best of my knowledge.
[Games] Starcraft II works in stereoscopic 3D
Posted 15 years agoI had an Overlord hovering above my keyboard. That wasn't unsettling at all.
...I should build a fleet of Carriers...
Edit: Carriers blew my mind. Or rather, their swarms of interceptors. It was like a cloud of robotic mosquitoes with mounted plasma rifles, going crazy not two feet in front of my nose.
*howls and rolls around*
Posted 15 years ago;O; Too many tacoooooos *flail*
[Furry] Going to FWA!
Posted 15 years agoSepf and I now have a room and will be there from mid-day Thursday to mid-day Sunday :3
The one other being, of course,
sepffuzzball. Because my bloo fuzz goes everywhere with me.
S'yeah, even though I'm not supposed to get vacation in my first 3 months of working at my new job, they're letting me take off Thursday and Friday for FWA. Sepf and I expect to arrive early to mid Thursday afternoon and leave at some point on Sunday, probably noon or a little thereafter.
Only thing is, because I juust got this job recently, and juust asked about the vacation recently (best not to ask before I'm hired, right?), we don't have hotel space reserved. So, anyone out there with space on Thursday evening through Sunday morning? Sepf and I would, of course, split rooming costs evenly.
If we do get any offers, don't be upset if you offered first and I opted for another. There's a lot of trust that goes with rooming with someone, and if I can find a room with people I know, I'll prefer that--you understand, I hope.
[Tech] God of Laptops
Posted 15 years agoWith a little help from my number one folf, I have come across what very well might be the God of Laptops. Something which, for a striking change, I might actually want to purchase.
See, normally trying to find a laptop is like deciding whether I want to poke out an eye or have my legs broken. I don't really want either of those things to happen. So many laptops these days are just absolute trash, which, when you're spending at least $1000 for anything decent, is just unacceptable. Plastic? No, sorry, plastic scratches, breaks, discolors and feels like a child's toy. Optical drive? Um, why should I devote a third of the volume of my computer to something which is never necessary after OS installation and can be imitated with an external USB DVD drive? High-resolution screen? Backlit keyboard? Good performance and battery life? It's like you can only pick half of the things you want in a laptop, and instead of saying "I want something which is fast, portable, elegant, and sturdy" you have to say "Wellll, I have a gaming desktop, so even though portable gaming might be nice if I go to a con or on a trip, I don't really neeeed it..." and make sacrifices. It doesn't matter how much money you throw at the problem. But this laptop--this is something different.
Sony Vaio Z series. Think what you will about Sony, they're going somewhere that nobody else seems to explore.
Start with a 13.1" screen, slightly smaller than I have now. Good size. Make it 1080p. Wait, what? Yeah, 1920x1080 with LED backlighting. You know the resolution your 60" TV has? You can carry that around on a 13.1" screen now. I've yet to see that sort of thing on any other device. Around that, build a case from carbon fiber and aluminum, making something which is light, strong, durable, and attractive. Add in a backlit keyboard with ambient light sensor and your standard suite of ports on the side--USB, Gigabit, HDMI, VGA (bleh!), that sort of thing. Speakers too. For good measure, toss in an integrated 3G connection, GPS capability, and of course 802.11n for wireless. And have an option for no optical drive, so you aren't wasting over a third of the internal volume on something you never actually need to use. So basically, with the exception of a 3D-capable/OLED and/or multitouch display, this thing has all the I/O you could really hope for. Which is sexy. If this was all the device had going for it and it was just standard processors and other parts, I'd be set to buy it already.
But let's look at the real guts. 2.66-3.33 GHz dynamic overclocking (dual core) i7 processor. Nvidia GT 330m discrete graphics with 1 gig of dedicated RAM (up to an extra 3 shared with system memory). 4-8 GB of DDR3 @ 1066 MHz, which actually makes my desktop jealous. Toss in a 120/128 GB solid state drive for good measure, to eliminate spinning parts from the computational side of things (there's probably still a fan inside for cooling). Performance-wise it's somewhat of a monster for its class. It'll play Crysis, L4D2, FEAR 2, Far Cry 2, and other ____ 2 games just fine (Mass Effect 2 and Bioshock 2 might be included in this list). Sure, if you opt for the native 1080p you'll take a performance hit, but the fact that you can run a modern game at close to 30 fps on the highest settings and 1080p, well, that's something you can't even get from consoles these days, and those fuckers have power bricks that weigh more than this thing. Battery-wise you're looking at 3.5-4.5 hours for video playback, 6-6.5 hours for document composition, somewhere between that for web browsing I assume, and 30 seconds for gaming (I kid--I don't know, but does ANYTHING get decent battery life when doing mobile gaming with respectable graphics? No.)
Let me try to recap and put this into perspective for you, just to make sure we're on the same page here. Performance-wise this thing has slightly more CPU oomph than the highest-configured 17" Macbook Pro and 20-50% more GPU performance, specs comparable in every other way (although the screen is 1080 pixels tall instead of the MBP's 1200). And it does this with the size, weight and battery life of the Macbook Air.
There are a few caveats, though. The only version they sell in the US with a 1080p screen is configured all the way up to $4500 (it's a true beast, quad SSD RAID 0 totaling 512 GB of storage space, Blu-ray burner, the works). Don't get me wrong, the 1600x900 version in the store was amazingly fine and beautiful, but there's something sexy about being able to play 1080p content at native resolution, and the resolution support for games would surely be better. They sell a 1080p upgrade for ~$75 in Europe, but to configure it up to what I would want (I'd settle for 4 GB of DDR3 and install my own 120 GB SSD as cost-saving measures) it's going to be $550 more than the American version, ignoring a) the shipping cost, b) worrying about what happens if it breaks, and c) finding someone in the UK who I trust enough to give $2500.
And that's another caveat for both models. The American model would be $2343 with tax and shipping, the UK model would be closer to $2900. Either one is very expensive for a laptop, but when you compare the performance to a Macbook Pro with an SSD, 8 gigs of RAM and 3.06 GHz dual core, you find that the Z offers a good 20% more performance with half the weight and $1000 savings. Regardless of how badly Apple overprices their uber-premium hardware, $2343 isn't a trivial amount of money. But given that I tend to use my laptop on and off for an average of 8 hours a day, every day, for three years... it actually feels a lot more reasonable.
Still, can you think of a more capable mobile device under 6 pounds? It's my belief that it's okay to spend more on something you'll truly love for years, and that it's far preferable to saving a few hundred bucks, maybe a thousand, and winding up with a device that pisses you off constantly. This laptop, unlike every single other one I have ever seen, embodies that belief.
Some shots of the Z, to show how pretty it is [link]. When you're looking at it, remember: this thing probably makes your desktop jealous.
Edit: Let it be known that
rinorex is a self-professed troll. Please disregard his comments.
[Me] I now have a job.
Posted 15 years agoSome of you may be surprised to find out that I haven't been employed since last spring, when I graduated (people who actually talk to me will be all too familiar with this fact, though). That changes here and now, because I turned in my signed acceptance papers for my job today. Or tomorrow, when they get sent out to the payroll office. Or yesterday, when I was given the forms. Maybe Monday, when I worked my first day. Or was it Sunday, when I found out they intended to hire me? Perhaps last Thursday, when they asked me to come in after the first day. Or Wednesday, when I had the interview and they said they wanted me to come in for a trial period.
IN ANY CASE, here's what happened. Two weeks ago, the UX Strategist (UX = User Experience, the guy who makes sure websites, software, and products in general make sense to users and leave them feeling satisfied and happy with their tasks complete) for a company down here turned in his two weeks' notice. On Saturday a week and a half ago I read the posting on Craigslist and sent in my resume that night--to my great pleasure at the time, I didn't have to redo my resume from when I last updated it back in September. That's how well the position fit everything that I trained for in college. I could finally boast about where I got my education and what my GPA was, rather than hoping the application form didn't ask about college education, because saying where I went would instantly overqualify me. On top of that, because (as you guys know) I value openness and honesty, I wrote a very candid email to accompany my resume, which stated that I had no experience outside of school and no experience with any of the software they listed, but that I'd spent a semester teaching, felt confident in my skills and thought I could learn what I needed to in order to fill the job. After all, if my lack of experience would disqualify me from working for them, it's best not to waste my time and theirs with an interview.
They wrote back on Sunday to schedule an ASAP interview on Wednesday of last week. At the interview they seemed pretty pleased and said they'd be willing to have me in on Thursday and Friday to evaluate my skills and whatnot. The first half of Thursday was uneventful (as a result, I sorta passed out for a while... <._.>; which people noticed ;~; ), the second half I worked with the boss for 3-4 hours. Next day the boss was out so I spent the whole time drinking cappuccino and reading design books. On Sunday they said they wanted me back in Monday, and between then and now I've been doing UX design for 10 hours a day... which is why I've been pretty sparse online as of late.
But yeah, they're hiring me (even though I blatantly fell asleep during my first 4 hours xD). Not to replace the guy who worked his last day today, but as a future asset. I'm a junior UX designer, with in theory fewer responsibilities, intended to learn more and grow into the role over time (though they're still searching for an experienced replacement). In practice they've still been working my pants off. Here, design a website for this retirement investment firm. Now one for that metal parts manufacturer. Now this site that sells surfboards and tubes. We go over some materials from the clients, but other than that, they sit me down at a computer and have me hammer out sample designs for 3-10 different pages on each site. I then annotate them so they can be presented to the clients for feedback. There's basically no review of my work--if I decide I want buttons instead of hyperlinks or whatever, I don't really have anyone questioning it. It's strange to be not even a week into working for them and already turning a document of what the client wants into a design they can look over.
This sounds all well and good, and it generally is, but there's a bit of a snag. Although I'll get my first paycheck before my next student loan payment is due, it's going to be a paper check, because the Direct Deposit stuff doesn't kick in until the second check. Which means I won't be able to get my check in the mail and to my bank in time for the loans--one of which will auto-deduct and bounce.
So anyone willing to gift or loan me some money for a couple weeks, well, that'd be just super. tweekdragon @ yahoo.com on Paypal, for those who are so interested and feeling generous. If you wanna call it a gift, or a 3-week loan, or money towards a co-commission, I won't complain. $130-ish should let my auto-deducting loan go through, $170 more and I can pay off my second student loan on time--if not, it'll be late and incur a $5 fee (trivial) and possibly result in them reporting my negligence to the important people who calculate credit (not trivial). As soon as I get my paycheck in my bank account I can pay people back via Paypal. No pressure, it'd just help a whole lot.
Basque` has just helped me with the situation though, so yay and many thanks to him ~_~
A bit of poetic justice/irony: the design studio that I work at uses Apple hardware exclusively. Now instead of just loudly bitching about anti-competitive market strategies, I get to deal with the frustration of a mouse that makes all the windows go into Expose mode if I squeeze too hard, icons that bounce when they want my attention, and no middle-click-to-open-in-new-tab for Firefox.
Anyway, that's where things are at, and where they're headed, and why I've been offline on IM programs these past couple weeks... and why I haven't commissioned much in the past year.
[Misc] Help a friend D:
Posted 15 years ago
ishiga-san needs help ._. http://www.furaffinity.net/journal/1221222/
[Tech] Gigabit Fiber Optic Internet
Posted 16 years agoNow for something much less controversial.
It's been public knowledge for a while that Google has been buying up "dark fiber" (unused optical fiber). However, Google has just recently announced their intention to connect between 50,000 and 500,000 homes to their fiber network and to provide 1 gigabit internet connections to those people. Let me restate that for effect: From 1990 to 2000 it was common to have a 56k = 56,000 bit/sec connection, from 2000 to 2010 it was more common to have a 1-10 Mbit = 1,000,000-10,000,000 bit/sec connection, and Google is going to be offering 1,000,000,000 bit/sec. That's a 100-fold increase over what, for example, Sepf and I have right now. And what we have is quite satisfactory for most applications.
That raises the question: why is Google doing this? There are a few reasons. First, as we know from recent dealings in China, Google believes in open, unrestricted access to information on the internet as basically a human right, v2.0. Owning a communications backbone and having ISPs coming to Google to make deals potentially means doing away with some troubling net neutrality issues--we won't have to worry as much about smaller websites with sub-billion dollar revenues having their traffic slowed in favor of bigger sites and services. Second, Google believes heavily in "the cloud", storage and processing services done on servers and then fed to your computer each time you click something in the browser. YouTube is music and videos in the cloud. Google Docs is office work in the cloud. Picasa is photo albums in the cloud. Google Maps and Street View, navigation in the cloud. Cloud stuff is limited by speed, and for many people, a 1 Gbit internet connection is going to offer faster transfer rates to a server than that user would get from their own hard drive. And finally and most importantly, for the first phase at least, Google wants to see what people do with that much bandwidth. Dial-up allowed for email and basic web browsing, but not much else. Cable internet makes sharing photos, viewing videos, and even exchanging large files of questionable legality a much more feasible thing. What will another massive jump in connection speed allow for, Google wonders?
And so do I. What sort of potential uses do you see if you and other people start to have always-on 1 Gbit internet connections? To once again clarify how fast this is, most people still have bottlenecks in their home or college networks that are 100 Mbit (10% of 1 Gbit), and they're probably not even aware of it. 1 Gbit = 116 MB/sec. That's an entire Xbox 360 game in under one minute, a complete Blu-ray disk with all the special features and extra languages in 3-7 minutes (just the movie in the language you want to listen to: 2-4 minutes), a CD-quality music album in 4 seconds (320 kbps MP3 version in just 1 and 1/4 seconds, if you're in a hurry). That's your average piece of furry porn in under 3 milliseconds--less than the blink of an eye by a factor of 100. With that incredibly obscene amount of speed once more emphasized, what sort of uses do you envision for such a thing, if any?
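If you want to check my math, the arithmetic is below--the file sizes are my own rough assumptions, picked to match typical 2010 media:

    # Download times at ~1 Gbit/s. 1 Gbit/s is 125 MB/s on paper; I use
    # ~116 MB/s to leave a little room for protocol overhead.
    RATE = 116 * 10**6  # bytes per second

    sizes = [
        ("Xbox 360 game (7 GB)",                 7 * 10**9),
        ("full Blu-ray, extras and all (40 GB)", 40 * 10**9),
        ("CD-quality album (450 MB)",            450 * 10**6),
        ("320 kbps MP3 album (145 MB)",          145 * 10**6),
        ("one image (300 KB)",                   300 * 10**3),
    ]
    for name, size_bytes in sizes:
        print(f"{name}: {size_bytes / RATE:.3f} seconds")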
I see it as, first and foremost, the ultimate death blow to optical media--you know, CDs, DVDs, Blu-rays?--in the medium to long-term. If you cruise a site like blu-ray.com as much as I do, you'll note that in a lot of the comments and articles, all the Blu-ray snobs are always thumping their chests about the amount of quality that a Blu-ray disc can store, and how you just can't stream that to someone's computer. Even though those users plan out their purchases a year in advance, feeding off every scrap of news about when The Lord of the Rings or Saving Private Ryan is coming out, in their opinion, nobody ever plans out when they want to buy a movie.
I usually point out that your average movie in Blu-ray quality takes up about 20 gigabytes and could be downloaded (through a legal service, of course) in about 5 hours on a fairly common ~10 Mbit connection--and better yet, during the second half of that download, you can watch the movie from the beginning, as the last parts download. Is 3 hours too long to wait between deciding you want a movie and having it in your hands? I would say no, because there are probably several hours between deciding that you want to go out and buy a movie, and actually getting the time to do it--and it's not like the driving, searching, purchasing and return trip are instant either.
With a gigabit connection, I no longer have to make this argument. A 2x Blu-ray drive reads at a peak of 72 Mbit/sec, approximately twice the highest rate I've ever seen in the movies themselves (20-30 Mbit). A gigabit connection is easily ten times as fast, and unlike the Blu-ray disc, isn't half the speed for the inner, slower parts of the disc. As I said before, we're looking at 2-7 minutes for a Blu-ray quality movie to download, depending on the length and quality of the movie and whether it's got special features or not. You can't make popcorn in 2 minutes. You probably can't get your shoes on, get to your car, and drive it out of your driveway/apartment complex in 2 minutes. If you tried to make it in and out of a store in 2 minutes, assuming you lived right next to it, they'd probably think you were trying to steal the movie and call security.
And not only is the process of obtaining the movie unreasonably quick, with pre-release distribution, it's possible to have the movie unlock the second the clock hits midnight and the release is okay to sell. Although... damnit! I'm thinking so pre-2010! Why do you need to preload the movie at all? Handle it like a YouTube video and stream it from where the user wants to watch, onwards. Write it to the player's storage if they plan on watching it again and you want to save bandwidth, but otherwise, you can download it 10-30 times faster than it can be played--you can say "I want to watch this movie" and literally the next second have it playing. Blu-ray quality, better than Blu-ray quality even. Ever hear about quad-1080p? Four times 1080p resolution. A gigabit internet connection could still stream that. Several of them at once, actually.
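For the stream-don't-download point, here's the headroom math. The quad-1080p bitrate below is my naive 4x scaling of a disc-quality 1080p stream, since nobody actually publishes such a figure:

    # How many simultaneous streams fit down one gigabit pipe?
    link_mbit = 1000
    bluray_mbit = 30          # about the highest bitrate I've seen on a disc
    quad_1080p_mbit = 4 * 30  # naive assumption: 4x the pixels, 4x the bits
    print("Blu-ray-quality streams at once:", link_mbit // bluray_mbit)  # 33
    print("quad-1080p streams at once:", link_mbit // quad_1080p_mbit)   # 8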
Movies, on demand. Games, exactly the same thing. Anyone who uses Steam knows that, barring traffic-related slowdowns, Steam can download games in a couple hours or less, depending on the size of the game. Like with movies, if you had a gigabit connection to their servers, you'd have that done in minutes. Even a monstrous 15 GB game could be on your computer in 2 minutes flat, assuming that your hard drive can even write that fast. Amusingly enough it would probably be faster for most users to not download the game at all, and instead stream the content directly from Steam's servers, just like they might do with movies.
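Funny thing about that 15 GB example: at these speeds the network stops being the bottleneck. The drive figure below is a rough guess for a typical 2010 laptop hard disk:

    # A 15 GB Steam download at gigabit speed vs. what the disk can swallow.
    game_bytes = 15 * 10**9
    net_rate = 116 * 10**6    # ~1 Gbit/s, in bytes/sec
    hdd_write = 90 * 10**6    # rough sustained write for a 2010 laptop HDD
    print(f"network-limited: {game_bytes / net_rate / 60:.1f} min")   # ~2.2
    print(f"disk-limited:    {game_bytes / hdd_write / 60:.1f} min")  # ~2.8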
Perhaps less obvious to some people, more to others, this would do insane things for file sharing. The same things I just quoted for movie and game distribution apply the same way to pirated file sharing, even more so if the files are re-encoded and compressed to be easier on users' hard drives. For most people, thinking up the movies they want to steal would take more time than the actual download process. Actually, in some ways it might hurt piracy efforts--one of the strengths of BitTorrent is that people can exchange pieces of files while they're still downloading, so the slower a download is, the more people with partial copies of the file are forced to stay around and upload their parts. If downloading a movie were a 30 second deal, you'd all but certainly have no leechers to swap pieces with during a download, and it would perhaps bottleneck a bit on the few dedicated seeders.
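Here's that swarm-thinning effect in numbers, for a 20 GB movie at two connection speeds:

    # How long does a downloader even stay in the swarm to trade pieces?
    movie_bytes = 20 * 10**9
    for label, mbit in [("~10 Mbit cable", 10), ("1 Gbit fiber", 1000)]:
        seconds = movie_bytes * 8 / (mbit * 10**6)
        print(f"{label}: in the swarm for ~{seconds / 60:.1f} minutes")
    # ~267 minutes at 10 Mbit vs ~2.7 minutes at gigabit--with windows that
    # short, two leechers will rarely overlap long enough to swap pieces.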
So, what are YOUR thoughts on the implications of gigabit internet speeds, and/or Google offering them?
[Tech] Apple, Inc. vs The World
Posted 16 years agoTriggered by this.
Alright alright, I know what you're thinking, I need to shut up about Apple already... but bear with me! This isn't me bitching about them directly, just entertaining an interesting fantasy.
Imagine... if pretty much every major company turned against Apple, all at once, as an organized move. They all have their differences, but they can all agree that Apple is fucking them over.
Google: You said that we're evil, and we compete with you in a number of services (smartphone OS, netbook OS, web browser, photo gallery manager, online photo albums). With you dead, we'll eventually reign supreme on mobile OSes at the least.
Microsoft: You insult us daily, and we compete with you in a fuckton of services (smartphone OS, netbook OS, laptop/desktop OS, portable media player (and quite possibly smartphone) hardware, gaming platforms, movie delivery, home theater integration, office suite, web browser and other countless competing applications). With you dead we'll provide the industry and home standard desktop OS.
Intel: You hold a whole group of people about two years behind in CPU adoption and never offered the Core 2 Quad despite its extreme popularity. Also, you are moving away from x86 processors in favor of ARM, and you use AMD graphics parts in some of your products and Nvidia in the rest. You don't offer Intel integrated graphics of any sort. With you gone we'll sell the same number of equivalent or better parts in the PC market, and not have to put up with your bossy bullshit.
AMD: You chose Intel over us back in '05-'06, and you hand them a guaranteed sale of millions of units on a regular basis, courtesy of your die-hard fans who upgrade every other quarter. Although you use some of our GPUs, you mostly use Nvidia's. And you offer no enthusiast GPU solutions, and enthusiasts are important for funding our development. With you gone, we might sell more desktop CPUs (we certainly can't sell fewer than zero to Apple customers) and we'll definitely sell more low-to-midrange graphics chips, and probably more high-end ones too.
Nvidia: You do use our GPUs, but you also sully an otherwise perfect computer lineup by using AMD graphics cards for the highest-performance offerings with no options for comparable Nvidia parts. You chose to make your own chips for the iPad, rather than using the chance of a new platform to switch all your mobile products over to our Tegra lineup. And like AMD says, you don't offer truly enthusiast solutions. Plus you like Intel, and we don't. If you die, Tegra will eventually become the undisputed platform of choice for mobile devices, and we'll sell more high-end GPUs.
IBM: Go fuck yourselves. PowerPC forever, assholes. When you're gone, we'll feel a lot better, and it's not like we were making sales to you anyway.
Everyone who sells A/V on iTunes: We don't have the freedom to dictate our prices. If you disappear, we can raise prices for premium content (things which are high-quality, like Blu-ray, or expensive to make, like certain TV shows and movies).
Adobe: You refuse to put Flash on any of your mobile devices and are pushing for HTML5 and iPhone Apps to replace video streaming and Flash games. If you died off, the artists using our creative software would just move to Windows and still buy our stuff, and Flash would have one less enemy.
And just like that, Apple would be really feeling the pain. Google sites and services would be patched to be incompatible with Apple's software only. Windows would be patched to commit suicide on Apple hardware, refuse to cooperate with iProds (my term for Apple products, typically iPods, iPhones, Apple TV, etc), and they'd stop selling Mac versions of Office. Intel, AMD, and Nvidia would all stop selling chips to them as soon as their previous agreements had been fulfilled, and IBM wouldn't take them back. Movies, TV shows, and music would be pulled from the iTunes store leaving mostly just iPhone apps, which don't have any other convenient place to be sold. And Adobe would make Flash incompatible with Safari and any other browser installations it could identify as being on Macs, as well as completely pulling all their creative software (Photoshop, Illustrator, etc) from the Mac platform.
In short, a next-gen Mac, if it existed at all, would be using a scaled up version of the iPad processor for both CPU and GPU work, be incompatible with every browser except their own, not work with some of the most frequented online sites and services, not be able to run Windows, Office or any Microsoft software at all, and be completely stripped of the industry-standard art applications which compel artists to seek out Macs above all else. They'd be stripped down (multitouch-enabled) pale shadows of computers running proprietary software on pussy hardware and be completely unsuitable for anyone who needs to use applications like Office/Photoshop or the Windows platform in general. Which, in a sentence, basically describes the iPad and its problems as a platform.
I'm sure it'd be a highly illegal anti-competitive move, and companies wouldn't risk it--which sucks, because it shouldn't be illegal, and they totally should go for it. "You wrote your software to be incompatible with Apple's platforms." "And they designed their platforms to be incompatible with our software in the first place." "You're refusing to sell certain products to Apple." "And they're refusing to buy certain products from us." Fair's fair: Apple can make anti-competitive decisions and pick 'n choose who they'll do business with and who they simply shrug off, so other companies should be able to do the same.
And they should do it, too. As I've said many times before, a company like Apple, which embraces one product from one manufacturer and leaves everyone else high and dry, might be tolerable when it's dealing in single-digit percentages of marketshare, but as those digits go up and it becomes more dominant, that behavior means less competition in the long run. Sure, (for example) Nvidia might be excited about selling GeForce hardware in every Mac, but they know Apple has no loyalties to them, as evidenced by Apple offering AMD chips for the high end and going with their own graphics parts in mobile platforms. GeForce sales in the Mac are a sure bet now, but that might not be the case in 3 years, and if, when the dust settles, there's nobody else to sell your GPUs to and you aren't the preferred supplier, your multi-billion dollar corporation just got headshot'd. All these companies need to be aware that Apple has no problem bossing them around and dictating the terms on which they will or won't do business, and if you think it's bad at 5-10% marketshare, imagine what it'll be like at 50-100%.
[Tech] iPad and the Mobile Conundrum
Posted 16 years ago
You probably expect me to bitch this out (if you know what I'm talking about) but instead I'm going to use it as a lead-in to a general mobile problem.
For those who aren't up-to-the-hour on this stuff, the iPad is a just-announced Apple product which is basically a mix of an iPhone and a MacBook Air display. Half an inch thick, 1.5 pounds, multitouch 9.7" IPS screen (1024x768), 16-64 GB of storage, custom 1 GHz processor, 802.11n wireless, 3G connection and more.
I'm of two minds on this. First, as I wrote about over two years ago, I think that the desirable mobile future is in slate-style multitouch tablets. Laptops are nice when you've got a table to sit at, but their keyboards and bases make them completely unusable in other scenarios. Solution? Make the display detachable and itself portable. Give it its own battery and transmit user input and display output between the display and computing base. Apple's approach stuffs everything into the display itself which has ups (simplicity) and downs (my version would let you use the display as just that, a display, for anything from a netbook-grade computer to a full gaming machine). But it's still an admirable attempt as far as form factor is concerned. And hey, they're even making a dock with a keyboard base--if that can be used unpowered, you eliminate the laptop benefit of a hardware keyboard. Aaand they got rid of the retard chrome accents that they've senselessly been horny for. Unfortunately, that's where Apple and I stop agreeing on things, and I'm not talking about business practices here. This is also where the Mobile Conundrum part of my journal starts up.
The recent trend, it seems, has been to make software, and then to make mobile software. We've got web browsers and then we've got mobile web browsers. Chat clients and mobile chat clients. Mobile versions of software are optimized for low-resolution screens with lower performance overhead and touch/multitouch input. But don't we want those things anyway? Aren't space-efficient interfaces with slick touch-enabled controls and low resource use good for laptops and desktops as well? And there's another odd division, too--the kinds of chips used in the devices. Now, my knowledge on this subject is extremely limited so correct me if I'm wrong on any of this stuff, but basically everything from desktops to netbooks uses your standard x86 architecture, while mobile devices almost invariably use the ARM architecture.
So here's where things stand. Desktops and laptops are slowly getting traditionally mobile features--touch-oriented inputs, for the most part. Mobile devices are getting enough computational power to do more than your 2000-era PDA text editor, and can do 3D graphics, web browsing, video playback and more. The two sides of computing, stationary behemoth and mobile lightweight, are converging towards each other, with devices like the iPad and netbooks all but passing each other (the iPad screen is bigger than some netbooks', with comparable weight, and some netbooks are severely underpowered and can't do half the stuff that the iPad can). And yet they still retain the division in hardware and software, even though they try to tackle the same problem.
There are three potential ways this can play out. First, everything becomes a computer and runs computer OSes and computer software. This is good because it brings an extensive back-catalog of software, but bad because much of it isn't designed for mobile use--interface and performance issues will exist at times. Alternately, everything becomes an ARM-powered device with a new (iPhone? Android? Windows Mobile?) OS using newly-coded software. This is good because it uses mobile design considerations to make for good mobile software and potentially better desktop software, but bad because everything you owned is now useless. Or we can continue to have two different platforms, one from desktop/x86 roots and one from PDA/ARM roots, competing against each other and doing well in their own markets, but being completely in-fucking-compatible with each other.
...I dunno about you guys, but I'm not okay with this shit. I want a unified computing architecture. I don't want games that only run on my Nintendo DS or an iPhone/Pod/Pad or laptop. For that matter I don't want any applications that only run on one platform. I want a smooth continuity in resolution, size and performance from the 640x360 (half-720p) smartphone to the most monstrous of desktop computers. Just in the past few weeks, Sepf was delighted to find that he could play Pokemon on his phone. To me, it's pathetic that we actually get excited when we can play our old games on new devices. That should be EXPECTED. They're general purpose computers. Or, they should be.
What are your thoughts? Do we teach our mobile devices to sit, multitask and high-rez, or try to squeeze more watts and grams from our netbooks? Or do you like the division between traditional mobile and traditional desktop?
Me, I hate mobile stuff. I don't like making compromises. I'm not going to pay for a phone which is 80% the cost of a laptop but can't do Flash or 720p video. I want a DS Lite-sized ultramobile PC running a multitouch-capable version of Windows 7. I want a 720p screen and some UI improvements to make it easier to navigate Windows with a fingertip on a smaller screen. Most importantly, I don't want to sacrifice features just because I'm moving from my desk to a coffee shop to a plane. I don't want a different OS, I don't want to have to close down an application to open another (ahem, iPhone OS), I don't want to have to re-buy games and applications with trimmed-down versions on my mobile device.
Long story short, while I can definitely appreciate the idea of a multitouch slate, I want it to be a computer rather than a PDA.
[RL] twile.age++; //It's my birthday!
Posted 16 years ago
23 years old now, on this the 13th of January. I think I'm starting to get gray furs D:
As always, presents may be left in the rear ~_~ (dommy jackals, otters, and wolves preferred!)
P.S., it also marks my 4th year dating sepffuzzball. Being bad at remembering dates but good at remembering my birthday, I asked him to be my mate on a day I knew I'd remember :3
[Games] Alan Wake
Posted 16 years ago
Who here remembers being excited about that game when they showed off trailers back in '05? xD Seriously, wow. Stuff like this makes me want to cry a little bit inside. Rewind nearly 5 years.
Remedy, the company behind the awesome awesome game Max Payne, announced a psychological thriller where you play as an author who has moved to a small mountain town and started to go crazy. Suddenly your wife is missing, except there's a nurse who looks like your wife or something, aaand the stuff that you're writing in your book starts to come true, and there's a light/dark dimension to the game where you're only safe near light, so during the nighttime you retreat to a lighthouse, and there are these mysterious spooky robed and hooded figures who come for you in the dark, and you're mostly armed with a flashlight or lamp, maybe a revolver at best. Cutting-edge graphics, DX10-only on Windows, also on Xbox. Beautiful visuals. More advanced physics and vastly bigger worlds than Half-Life 2 and the other stuff we were enjoying in the 2004-2005 period.
At some point they showed the game off and were bragging about multicore support (Seriously? That shit's standard now. Goes to show its age). They were showing off how they could even leverage, not just dual core stuff, but quad as well. On the then-preproduction Intel Core 2 Quads they were all "Check it out, we have a thread for AI and game stuff, a thread for physics, and a thread which just handles streaming in the game world!"
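For the curious, that "one thread per subsystem" pitch is easy to picture in code. Here's a toy Python sketch of the idea--not Remedy's actual engine, obviously, and all the names and tick rates are made up for illustration:

    import threading, queue, time

    stop = threading.Event()
    stream_requests = queue.Queue()   # regions of the game world to load from disk

    def ai_and_gameplay():
        while not stop.is_set():
            time.sleep(1 / 30)        # pretend to tick AI/game logic 30 times a second

    def physics():
        while not stop.is_set():
            time.sleep(1 / 60)        # physics typically ticks faster than AI

    def world_streaming():
        while not stop.is_set():
            try:
                region = stream_requests.get(timeout=0.1)
            except queue.Empty:
                continue              # nothing to load right now
            # a real engine would read that region's assets off disk here

    workers = [threading.Thread(target=f, daemon=True)
               for f in (ai_and_gameplay, physics, world_streaming)]
    for w in workers:
        w.start()
    time.sleep(1)                     # let the workers spin for a bit...
    stop.set()                        # ...then shut them all down

Three busy worker threads plus whatever the renderer needs--which is why a quad core was the natural demo machine.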
During 2009 they announced that they were stopping development of the Windows version to focus on the 360 version instead. So much for all that "DX10-only! Look how great we run on a quad core!" business. Presently, the game is due out for a mid-2010 launch on 360 only.
Honestly I don't know what to feel about the whole thing. I mean, I loved Max Payne, and Alan Wake looked to be potentially pretty hair-raising and gorgeous. But that was 5 years ago. 5 years as in, the time between one console and the next one, or ~3 computer part generations. If that game came out 2 years ago it would've looked on par with other titles. I'm reminded of STALKER, which was just stunning when they showed it off in 2001... a little less so as the release date slipped from 2003 to 2007. Game delays, sadly, happen. Eye candy gets stale if you don't have it on shelves soon enough.
What really irks me about the whole situation, then, is the loss of a potential PC title to Xbox exclusivity. That wasn't Microsoft's call; the developers just didn't want to expend the effort to put it out on PC. It really saddens me; gone are the days when games were made for PC and then ported to consoles. Now we're lucky to get games made for consoles and ported to PC :|
Bleh. Maybe some day gaming will grow up. As all the people who used to be kids playing N64 on little TVs, all paid for by their parents, get real jobs and earn real money and it occurs to them that paying $1000 every 3 years for a computer is nothing in comparison to their $80/month cellphone bill, $3000 home theater setup, and $30,000 car... they'll get real gaming systems. 'cuz whatever your $300 console can do, your $1000 gaming box is going to do three times better.
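For what it's worth, the math there checks out. A quick sanity check in Python (my numbers, assuming a ~5-year console generation):

    pc_per_month      = 1000 / (3 * 12)   # $1000 gaming box replaced every 3 years
    console_per_month = 300 / (5 * 12)    # $300 console over a ~5-year generation
    phone_per_month   = 80                # that $80/month cellphone bill

    print(round(pc_per_month, 2))         # 27.78 -- about a third of the phone bill
    print(round(console_per_month, 2))    # 5.0

The gaming box costs more per month than the console, sure, but next to the phone bill it's pocket change.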
[Tech] Digital Media: 3D movies and games in your home
Posted 16 years ago
If you're at all like me, you're a mild-to-heavy digital media whore. The idea of having a library of tens of thousands of organized songs excites you. You may have a budding movie collection. Maybe you torrent old TV shows which haven't made it to DVD (or perhaps they have). The prospect of your favorite blockbuster movie in 1080p makes you giddy. You don't just see computers for their modern communication purposes, but also appreciate their ability to hold literally years of music and video files which they can pull up in a second and stream to a home theater, laptop, or even a cellphone when you're on the go.
I've been intending for some weeks to do a series of journals on a variety of digital media topics: informational journals, opinionated rants, and general discussions. Things that excite me and people like me. In the past I've done a number of them on assorted topics such as DRM in Vista, digital image formats, naming conventions for video resolutions, why Apple should make an HTPC, the benefits of movies in HD, people who think that HD is a rich person's placebo, color depth and contrast ratios, 120 fps and/or 3D in games, a great way to enjoy Blu-ray quality on your PC, and how to make a good server to dish out all your multimedia. Today, I'll be focusing on the 3D stuff again.
The reason that I'm starting with the 3D side of digital media is threefold. First, Avatar was recently released, amidst a perpetual trickle of 3D CGI movies, and it shows a mature look at how 3D can be used to improve immersiveness in movies. Second, the 3D spec for Blu-ray discs was finalized a few weeks ago, meaning that during 2010 you'll start to see home theater setups which allow you to have 3D just like in the theaters--no red/blue glasses bullshit. Third, I've recently been using my own 3D glasses more often for gaming, and gotten a nice feel for them in my 9 months of ownership.
First, the run-down on the whole 3D movie/game deal, for people who don't know--those who do, skip this paragraph. To put it simply, traditional single-lens cameras take a whole 3D scene and flatten it into a 2D image, removing some of the great depth cues we use to determine distances and, indirectly, speeds towards and away from the screen. To see what I mean, simply close one eye. Keep your head still while you do this, because any motion causes motion parallax--close objects move through your field of vision more than far objects, which helps you tell how far away they are. You won't always have this in games and movies. In case it isn't obvious now, the reason that images appear flat on computer screens is that both eyes are getting the same image, and you might as well just be using one. Why does this eliminate depth cues? Among other things, our brain uses the slight differences generated by our eyes being 2-3 inches apart to figure out how close something is. If both eyes see the same image on a screen, it appears flat. If each eye could see a different perspective taken by two cameras side-by-side, however, there would be a natural sense of depth. Which is exactly what is done in 3D movies and games.
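If you want to put numbers on that, it's simple trigonometry. A little Python sketch (my figures, purely illustrative) showing how the angular difference between the two eyes' views shrinks with distance:

    import math

    EYE_SEPARATION = 0.065    # ~6.5 cm between pupils--the "2-3 inches" above

    def disparity_degrees(distance_m):
        # angle between the two eyes' lines of sight to a point straight ahead
        return math.degrees(2 * math.atan((EYE_SEPARATION / 2) / distance_m))

    for d in (0.5, 2, 10, 100):
        print(d, "m ->", round(disparity_degrees(d), 3), "degrees")
    # 0.5 m -> ~7.4 degrees; 100 m -> ~0.04 degrees

That's why stereo 3D sells the depth of things near the camera, while distant scenery looks about as flat as it does in real life.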
Onwards. The recent release of Avatar in 3D and IMAX 3D was, to me, a landmark. It wasn't the first movie in 3D by any means, and it wasn't even the first live action movie in 3D... but it was the first movie since 120 Hz displays became widely available (I'll explain this later) to show that a live-action film can artfully and maturely use 3D to enhance the sense of immersion (gotta stop saying film, there's no film involved now). In a way it is to 3D what Jurassic Park was to CGI in movies, a proof of concept in a major release. Now, I said maturely here, and some people might wonder why. Anyone who has been to a 3D show at a theme park or seen a 3D film geared at children will probably view 3D as a distracting gimmick, because in those cases the focus is the 3D effect and OOOOH LOOK I CAN MAKE THINGS STICK OUT OF THE SCREEN AT YOU! It's in the same way that early animations weren't of two people talking, but rather of a horse running, and early color demonstrations probably had a lot of bright colors. 3D for 3D's sake is a bad thing. 3D to grant you a sense of depth, speed, and physicality of surfaces is a good thing. Avatar did the latter. Say what you will about the plot or budget, they shot the movie with two cameras and edited the effects so that everything could be shown in 3D. Very rarely did things stick out of the screen, and it was never intentional or distracting. Occasionally you'd get bits of fluff and sparks which you had to resist the urge to bat away, because they felt like they were drifting lazily through the theater. Why, then, make the movie in 3D? Because it's not about making protrusions, it's about DEPTH. Depth BEYOND the screen. Suddenly your display becomes less of a moving picture box and becomes, in a visual sense, the opening to a live theater production taking place just inside the screen (and sometimes sneaking out a bit). This is extremely apparent when playing a 3D game with 3D glasses--the screen almost literally melts away, and it's just stuff behind the black frame of your monitor. You reach towards it to try and touch things, and you're actually surprised when your finger is stopped by this invisible wall separating you from what looks like a real (if poorly detailed) gun replica. Proper 3D rarely invades your personal space, instead it turns moving images into objects you can all but touch, events which are unfolding before your very eyes, and camera shots which place you in an almost wholly believable world. Avatar did this. Other movies will do this in the future.
The second reason I'm writing this is, as I said, the finalization of the 3D Blu-ray spec. This was met with mixed reactions by members of the Blu-ray community from what I understand, with some people claiming it's a gimmick and is worthless, others excited, and still more saying that it was too early and it would fragment the Blu-ray market with new players and new discs. To address these things, 3D is no more of a gimmick than color, sound, motion, or images period. They show you exactly what the director wanted you to see, and immerse you in a world of increasing realism and artfully crafted beauty. Early on, the mass production and distribution of photographs, "moving pictures", films with sound, and color films were expensive--but they all came through as the tech improved and people began to consider them to be important to the theatrical experience. So too will 3D. And for those who see merit but are afraid the Blu-ray market will be fragmented, there's really no need to worry--old discs will work on 3D-ready players, obviously, and 3D discs will work on older players, just without the 3D effect. You'll still be able to tap into one of the video streams and display it as a 2D image, you'll be able to buy a 3D movie and enjoy it on your 2D player, and then when you do feel like springing for the 3D equipment, you'll already have 3D movies in your library. How will the 3D Blu-ray stuff work in the home viewing environment? Well, that's up to the display manufacturers to decide, leaving space in the future for different implementations, but what you'll likely see the most of in 2010 is the 120 Hz display + active shutter glasses approach that Nvidia uses for its 3D glasses. Your TV or display will alternate between images for your left and right eyes, and you'll wear what look like sunglasses (~$65 a pair) which synchronize wirelessly with the TV and black out the left and right lenses at just the right time. The effect is that each eye sees the image it is supposed to, and only that image, making the scene look 3D. There are other possible approaches, of course--a display which shows two images on the same screen, each with a different polarity, can allow you to wear cheaper passive glasses, exactly what theaters do... and a display which uses a lenticular sheet (those postcards and advertisements on boxes which look different when you view them from different angles) can get a different image to each eye if you're in a particular location, with no glasses whatsoever. Hell, a dorky-looking head-mounted display with a screen in front of each eye could even work with it, but these systems are increasingly complex and awkward. But still, by the end of 2010, expect your PS3 to support 3D Blu-ray, expect 3D Blu-ray players and discs to be on shelves, and hell, you can already buy 3D-ready displays.
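Since the 120 Hz + shutter glasses approach is the one you'll actually see on shelves, here's how simple the core trick is--a little Python sketch of the frame scheduling (mine, not any vendor's actual code):

    import time

    REFRESH_HZ = 120
    frame_time = 1 / REFRESH_HZ          # ~8.3 ms per frame

    def eye_for_frame(frame_index):
        # even frames carry the left eye's image, odd frames the right's;
        # the glasses black out the opposite lens in sync with the display
        return "left" if frame_index % 2 == 0 else "right"

    for i in range(6):
        print("frame", i, "->", eye_for_frame(i), "eye")
        time.sleep(frame_time)
    # each eye ends up seeing REFRESH_HZ / 2 = 60 full images per second

Half the refresh rate per eye is also why 120 Hz is the magic number: anything slower and each eye drops below the 60 Hz people expect from their displays.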
The final piece to this 3D journal puzzle is that lately I've been using my 3D glasses more often, for gaming purposes. As I said in the journal I wrote within a day of first trying it out, the effect isn't always perfect, but it can be damn good in certain games. Killing Floor, a "zombie" survival FPS of sorts, plays very well in 3D--it has no targeting reticles painted on the screen, like the real world, so it's just you and your real-feeling/sounding gun and zombie mutant monsters. When you pull up your Pip-Boy in Fallout 3, the knobs and buttons stick out of the screen an inch or two, and it feels so fucking real that it's almost confusing when your fingers go through where you can plainly see them to be. Just tonight I was playing Burnout Paradise and, much like what I've experienced in Audiosurf, the sense of speed and motion is incredible. With the ability to feel how far away something is comes the ability to feel how quickly it is moving towards you, and therefore anything where you're flying at objects (or them at you) suddenly becomes very real. And in racing games which have in-car views, you could swear that you were in a very nice arcade simulation, with a real wheel and gauges in front of you. Even with poor textures and lighting, it feels like there are physical objects just a foot or two inside the screen, if that. This is truly something you have to feel to properly understand (and if you're in the Orlando area I welcome you to come try it out), because ultimately it's all about feeling. Objects feel real. Motion feels real. Your presence looking into or down on the scene is almost flawless. You could swear there was a little person jumping from roof to roof of a dollhouse-scale city when you play Assassin's Creed, actual tracer rounds being fired past you in Fallout 3, animated miniature monsters smashing each other in Torchlight. And this is on a 22" screen--on a 60" TV or a 120" projector image, the handguns will be the size of cannons, Altair will be a midget performing acrobatic stunts on the other side of your wall, and zombies will run at you, really run at you at full bore... run AT you, not just get bigger on the screen. Do so fucking want.
That's part of the reason I'm so excited about Avatar and the 3D Blu-ray spec being finalized. With any luck it will help 3D become cheaper, better, and more widespread. I want a 1080p 3D projector, but they don't sell them. They don't make them. For 5 grand you can get a 720p projector, but you're better off getting a massive 1080p screen for half the price and 80% of the screen size. When companies have an actual market for 120 Hz input, 3D-ready 1080p projectors, which will happen in 2010, I'll be that much closer to having a gaming experience which doesn't shatter the fourth wall, but instead causes it to melt away. There's nothing artificial about this sort of 3D, it's as natural as looking around in the real world. And importantly, unlike today's implementations in the home theater market, there's not going to be any bloody red and blue glasses messing up the color of your favorite movie, or stupid characters waving their hands about outside of the screen. Not unless you're watching a G-rated live action Disney film, that is.
Panasonic has been pushing 3D-ready stuff all this year. Sony has promised products this coming year, as well as 3D capability for the PS3. Samsung, Acer, and ViewSonic make 3D-ready computer displays. Mitsubishi makes 3D-ready TVs, and generic DLP HDTVs are already capable of displaying 3D content. 3D will finally be coming to the mass market for desktops, laptops, and home theaters. 2010.
[Misc] Twile's Twelve Days of Christmas
Posted 16 years ago
In 2009, the world gave to me:
12 terabytes of storage,
11 Star Trek movies on Blu-ray,
10 foot wide beds,
9 months of waiting (for a mediocre film),
8 speaker surround sound,
7 months without work,
6 massive pillows,
5 visits with Kip,
4 new survivors,
3 dimensions of force feedback,
2 furry conventions,
and 1 loving fiance.
Brief summary of the important developments of the year as seen by ye olde red fuzz :3
[Furry] Sexy/Not Sexy
Posted 16 years ago
I don't often do memes, but this sort is always fun, and I haven't actually talked about furry stuff in a journal for 2 months...
Sexy
* Hindquarter curves--tailbase, taint/balls, and bootay. Scappo really makes my point for me here.
* Taints, period.
* More penis than your body has room for. Preferably being used.
* Size differences. See previous reason, also, it's cute having a big guy going out of his way to be gentle, or a little guy eager to please enough that his larger companion can let loose =D~
* Black dicks. 'nuff said.
* Knots. Knotfucking, tying, or simply having them. Murr.
* Multiple penetration. What's better than fun with a friend? Fun with two friends. Or really as many as can coordinate their activities and positions together.
* Jackals, coyotes, otters, raccoons, foxes, huskies, bunnies, tigers, non-scary dragons, and hybrids thereof. Also nagas. And hyenas. Renamons with dicks, a number of Digimon really... yeah, this could be a journal by itself :|
* AEthian.
* Temporary bellybulges of most varieties, induced by or composed of friends.
Not Sexy
* Chastity. What's less sexy than not currently having sex? Not being able to have sex.
* Bodily waste and blood. I'll grudgingly accept them as necessary for life functions but that doesn't make them sexy >:[
* Pubic hair. Furries are already covered in soft fur, why add nasty coarse curly hair to their most touchable areas?
* Insects. Sorry.
* >2 tits per person. Any tolerance I may have for furry boobs goes out the window when we start talking about superfluous mini-boobs. On a similar note...
* Lactation. Just, no.
* Deliberate induction of pain. Not fun, not pleasurable, not sexy D:
* Clothing. I'm on display, shouldn't you be? ;3
New additions:
* Large nipples. I don't like nipples, period.
* Playing with/large amounts of foreskin. Lol wut?
That's all that occurs to me at this time :3
[Tech] Google Wave
Posted 16 years ago
Unlike some other recent Google betas, I'm totally enthusiastic about Wave. I have two dozen invites, who wants in? Also, if you already have Wave, note me your email so I can maybe add you =D