[Tech] iPad and the Mobile Conundrum
You probably expect me to bitch about it (if you know what I'm talking about), but instead I'm going to use it as a lead-in to a general mobile problem.
For those who aren't up-to-the-hour on this stuff, the iPad is a just-announced Apple product that's basically a mix of an iPhone and a MacBook Air display. Half an inch thick, 1.5 pounds, multitouch 9.7" IPS screen (1024x768), 16-64 GB of storage, custom 1 GHz processor, 802.11n wireless, a 3G connection and more.
I'm of two minds on this. First, as I wrote about over two years ago, I think that the desirable mobile future is in slate-style multitouch tablets. Laptops are nice when you've got a table to sit at, but their keyboards and bases make them completely unusable in other scenarios. Solution? Make the display detachable and itself portable. Give it its own battery and transmit user input and display output between the display and computing base. Apple's approach stuffs everything into the display itself, which has ups (simplicity) and downs (my version would let you use the display as just that, a display, for anything from a netbook-grade computer to a full gaming machine). But it's still an admirable attempt as far as form factor is concerned. And hey, they're even making a dock with a keyboard base--if that can be used unpowered, you eliminate the laptop benefit of a hardware keyboard. Aaand they got rid of the chrome accents that they've senselessly been so fond of. Unfortunately, that's where Apple and I stop agreeing on things, and I'm not talking about business practices here. This is also where the Mobile Conundrum part of my journal starts up.
The recent trend, it seems, has been to make software, and then to make mobile software. We've got web browsers and then we've got mobile web browsers. Chat clients and mobile chat clients. Mobile versions of software are optimized for low-resolution screens with lower performance overhead and touch/multitouch input. But don't we want those things anyway? Aren't space-efficient interfaces with slick touch-enabled controls and low resource use good for laptops and desktops as well? And there's another odd division, too--the kinds of chips used in the devices. Now, my knowledge on this subject is extremely limited, so correct me if I'm wrong on any of this stuff, but basically everything from desktops to netbooks uses the standard x86 architecture, while mobile devices almost invariably use the ARM architecture.
So here's where things stand. Desktops and laptops are slowly getting traditionally mobile features--touch-oriented inputs, for the most part. Mobile devices are getting enough computational power to do more than your 2000-era PDA text editor, and can do 3D graphics, web browsing, video playback and more. The two sides of computing, stationary behemoth and mobile lightweight, are converging towards each other, with devices like the iPad and netbooks all but passing each other (the iPad screen is bigger than some netbooks, with comparable weight, and some netbooks are severely underpowered and can't do half the stuff that the iPad can). And yet they still retain the division in hardware and software, even though they try to tackle the same problem.
There are three potential ways this can play out. First, everything becomes a computer and runs computer OSes and computer software. This is good because it brings an extensive back-catalog of software, but bad because much of it isn't designed for mobile use--interface and performance issues will exist at times. Alternately, everything becomes an ARM-powered device with a new (iPhone? Android? Windows Mobile?) OS using newly-coded software. This is good because it uses mobile design considerations to make for good mobile software and potentially better desktop software, but bad because everything you owned is now useless. Or we can continue to have two different platforms, one from desktop/x86 roots and one from PDA/ARM roots, competing against each other and doing well in their own markets, but being completely in-fucking-compatible with each other.
...I dunno about you guys, but I'm not okay with this shit. I want a unified computing architecture. I don't want games that only run on my Nintendo DS or an iPhone/Pod/Pad or laptop. For that matter I don't want any applications that only run on one platform. I want a smooth continuity in resolution, size and performance from the 640x360 (half-720p) smartphone to the most monstrous of desktop computers. Just in the past few weeks, Sepf was delighted to find that he could play Pokemon on his phone. To me, it's pathetic that we actually get excited when we can play our old games on new devices. That should be EXPECTED. They're general purpose computers. Or, they should be.
What are your thoughts? Do we teach our mobile devices to sit, multitask and high-rez, or try to squeeze more watts and grams from our netbooks? Or do you like the division between traditional mobile and traditional desktop?
Me, I hate mobile stuff. I don't like making compromises. I'm not going to pay for a phone which is 80% the cost of a laptop but can't do Flash or 720p video. I want a DS Lite-sized ultramobile PC running a multitouch-capable version of Windows 7. I want a 720p screen and some UI improvements to make it easier to navigate Windows with a fingertip on a smaller screen. Most importantly, I don't want to sacrifice features just because I'm moving from my desk to a coffee shop to a plane. I don't want a different OS, I don't want to have to close down an application to open another (ahem, iPhone OS), I don't want to have to re-buy games and applications with trimmed-down versions on my mobile device.
Long story short, while I can definitely appreciate the idea of a multitouch slate, I want it to be a computer rather than a PDA.

Your main desktop apps would have two 'modes' in the same program. You would have a full-powered desktop version with all the bells and whistles, with a UI that makes sense for use at a desk, and a portable version it switches to when it is streaming to a mobile device. The only real hang-up on this is bandwidth, which I think is going to be solved before high-powered ultra mobility is solved.
Desktops will always have the advantage with power, cooling, and reliability. They just need to be able to stream to portables more effectively.
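That two-mode idea can be pictured with a rough sketch (purely illustrative; the class, flag, and layout names here are all made up):

```python
# Sketch of an app that renders a full desktop UI locally but switches
# to a stripped-down layout when its output is being streamed to a
# mobile device over a constrained link.

class App:
    def __init__(self):
        self.streaming_to_mobile = False  # flipped when a mobile client connects

    def render(self):
        if self.streaming_to_mobile:
            # Portable mode: fewer panels, lower-bandwidth output for the stream.
            return ["toolbar(compact)", "canvas(low-res)"]
        # Desktop mode: all the bells and whistles.
        return ["menubar", "toolbar(full)", "palettes", "canvas(full-res)"]

app = App()
print(app.render())             # full desktop layout
app.streaming_to_mobile = True
print(app.render())             # trimmed layout while streaming
```

The point is that it's one program with one codebase; only the presentation layer swaps out depending on where the output is going.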
That said, I wonder if you can hack the iPad into being a drawing tablet. Because that would totally be awesome.
They do have drawing programs for the iPad, but I assume you mean something you'd plug into a computer?
and yes, I mean plugging it into the PC to use as a tablet like a cintiq.
What the FUCK are they thinking with a 1024x768 screen? That's 1.33:1 aspect ratio. I haven't used a 1024x768 screen in about 8 years now. I thought it was bad they were using 1.5:1 aspect ratio before, they just took a step backwards. They should be doing 16:9 like everyone else is tending towards. Fuck this 10x7 IPS bullshit, where's the 720p OLED screen? You want power savings, you want a fine display, you want thin, you want something worthy of looking at for hours at a time, it's a fuckin' OLED with inky blacks, blinding whites and vibrant colors. Furry porn looks like shit on all the screens in our apartment in comparison to Sepf's Nexus One's OLED display.
WHERE is the camera? How is Apple ever hoping to make video chat standard if they don't include it in all their products? Is it like with the iPod Touch, where they decided not to include it just so the iPhone still has a few selling points?
Why is there only one dock port on it? What if you want to have it in landscape orientation while you have it docked? Damn thing looks like it's ready to fall over and looks all kinds of silly standing up the way it is.
Custom chips? Really? It's bad enough that they're using ARM architecture at all. And that they have a proprietary OS. Now they're making their own hardware. Competition just died a little more today.
No phone use? Why can't I use a Bluetooth headset and send/receive phone calls from it? I can see that it might be unwieldy to carry around when shopping and whatnot, but around the house and workplace it'd do just fine. If I'm gonna spend $830 on something, I want it to be a replacement for a smartphone, not a supplement.
But on the subject of computing platforms... Would you really want to use some of the mobile software on your desktop? And vice versa, getting a full desktop program on the hardware isn't entirely feasible at this point. We can all agree that the overall goal on EVERY platform is to get resource usage low and interfaces as slick as possible; however, some things have to be sacrificed on the mobile front (I have yet to see drag-and-drop on a mobile platform, and good luck getting Photoshop CS5 on your iPhone whenever it comes out). Until hardware barriers are overcome this will probably be an issue for quite a while.
I really don't like netbooks as a whole. Sure, some people may love them, but really, I think they fit your definition of too much 'compromise.' Screen and keyboard too small to really be useful for anything, not enough processing power to play with the big boys, and still too big and bulky to be all that portable (I define 'portable' to be anything that I can fit in my pocket. I don't want to carry a laptop bag everywhere). Sure you get 8-10 hours of battery life, but... seriously, does that really do you any good when all you need to do is browse the internet or check email? You can do that on your phone now, seriously.
Personally, I think the niche the iPad and netbooks fill doesn't really need to exist. HOWEVER. Progress there does spell good things for the mobile phone territory (smaller chips = more power in mobile phones) and for the useful laptop territory (e.g., better battery life in a bulkier machine). So at the moment it's kind of a necessary evil for development to happen.
Whether I want every mobile program on a desktop or desktop program on a mobile is irrelevant, the points are that a) there are programs I'd like to have on both platforms which thanks to age or niche popularity will not be ported, and b) I don't want to pay for a program twice.
Will a mobile phone run Photoshop nicely? Probably not, compared to a desktop. But it should still be able to work, for anyone who wants to use it. They do make image editing tools for the iPhone, after all.
I don't like netbooks either, because they're always half-assing them. Apple showed with the Air and here that you don't have to make them crappy, though. Modern hardware is capable of everything in a small form factor except for advanced 3D gaming, which is generally an accepted tradeoff.
I'm in favor of devices with screens bigger than the iPhone and other smartphones. DS Lite-sized. Running a real OS. 'nuff said.
Now in their defense, you can dock it and use a keyboard... but in the PC's defense, you can get tablet and touchscreen models, too. In both cases, though, software design limits the games utilizing those novel input forms. iTouch games aren't designed to support keyboard controls. PC games aren't often designed to utilize tactile input.
So ultimately it boils down to who can crank the most polygons, draw the cleanest textures, and offer the most fun. With the exception of the last, there's no debate on the PC being the winner at those.
I think the ideal solution is to make the OS handle whatever UI software is on it, much as a server can detect whether its client is using a mobile device and respond accordingly--if it sees it's on an iPod, it shouldn't try anything fancy, should eliminate extra elements, etc.
If you put a basic OS on all devices that will tell the UI what its capabilities and restrictions are, then the UI (be it Windows/etc.) can adapt itself to its new home. Installing a new OS would be as simple as uploading the software to the device.
Also agree that the chrome should be independent of the OS. In a way it's like implementing widgets for websites--OS X draws radio buttons one way and Windows draws them another way, but they all still work, and a website can call for a radio button and count on a certain type of behavior. I want the same thing from the OS--it says "give me a window with minimize/maximize buttons on the frame and 3 control buttons above a text entry field" and the UI process can draw that any way it wants. Want the frame to be glass? Easily done. Want your Min/Close buttons to form a yin yang? Your funeral. Are you on a high-DPI touchscreen device? Draw the buttons at a higher resolution with more space between them so it's easier to push them. You get the idea.
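That capability-negotiation idea might look something like this (all names here are invented for the sketch; the point is the separation between what the app requests and how the UI layer draws it):

```python
# Hypothetical sketch: the app asks for abstract widgets; the UI layer
# decides how to draw them based on the device's reported capabilities.

class DeviceProfile:
    def __init__(self, dpi, touch):
        self.dpi = dpi      # dots per inch of the display
        self.touch = touch  # True if input is fingertip-based

class UILayer:
    def __init__(self, profile):
        self.profile = profile

    def button(self, label):
        # Touch devices get bigger targets with more spacing;
        # the app never needs to know which rendering it got.
        size = 48 if self.profile.touch else 24
        pad = 12 if self.profile.touch else 4
        return {"widget": "button", "label": label,
                "size_px": size, "padding_px": pad}

# The same application code, two very different renderings:
desktop = UILayer(DeviceProfile(dpi=96, touch=False))
tablet = UILayer(DeviceProfile(dpi=132, touch=True))

print(desktop.button("Minimize"))  # small, densely packed
print(tablet.button("Minimize"))   # large, finger-friendly
```

Swap the `UILayer` implementation (glass frame, yin-yang close buttons, whatever) and the application keeps working, which is exactly the radio-button analogy above.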
I'd wager that in a year, Apple will release a hardware upgrade offering shit that people wanted the first time around. Maybe they'll put a camera on it. They'll probably bring back the iPhone's chrome borders for no reason. The volume controls will disappear and reappear in another generation. They'll make it widescreen, perhaps. Minute little things that any idiot could've pointed out at launch. It wouldn't be unheard of from them; look at the iPhone or iPod lines. iPhone, launched 2007. 2nd gen came out in 2008 with 3G support and a slightly different form factor and build materials to improve transfer speeds and connection quality. 3GS came out in 2009 with... a compass, vaguely improved performance and a better camera. iPod Touch, launched 2007. 2nd gen in 2008 with the internal speaker and volume controls from the iPhone, which you'd think would be FUCKING OBVIOUS on a MEDIA PLAYER. In 2009 they again proved that they have no fucking idea what they're doing by adding the light sensor from the iPhone and voice control (with no built-in mic!) plus the vague speed upgrades, still leaving out the camera. iPod Nano, launched in 2005. 2nd gen came out in 2006, changing from colorless plastic with a metal back to brightly colored aluminum with curved edges. 3rd gen in 2007, going from tall and thin to short and fat, with the back made out of shiny metal again and muted colors. 4th gen came out in 2008 with the previous tall-and-thin form factor, yet again getting rid of the shiny metal backing and bland colors. In 2009 they added a camera, when people wanted the camera for the Touch.
It's like a fucking merry-go-round of ideas. Let's make it out of plastic with a metal back! No, let's make parts out of plastic! No, let's do aluminum! Back to metal backing! We need to make it shorter! Wait, let's make it tall and thin again! Did we add the camera to the right iPod? Shit, we forgot a volume control on the first iPod to not have a click wheel of sorts! *vomit*
I'd almost feel bad for iProd application developers if I didn't harbor such animosity for them. Imagine developing a game to run on the Apple mobile platform. I'll ignore the Nano, because even though it shows a fun random smattering of features, it's not a realistic comparison. First up, performance. There are 3 tiers: Classic and 3G iPhone, 3GS, and iPad. Camera? iPhone only. Light sensor? Not for gen 1 and 2 of the Touch. Resolution? 320x480 for everything but the iPad. Aspect ratio? 1.5:1, but 1.33:1 for the iPad. GPS on everything after the 2nd-gen Touch and iPhone, but half-assed on the Touch; compass only on the 3GS and iPad. Hardware keyboard available on the iPad only, but not something you can count on having... all you can count on is basically a gen-1 Touch with earbuds; the extra levels of resolution and performance, as well as the presence of a camera, compass and light sensor, are all luxuries that may or may not be there.
The thing about Apple that really pisses me off is how, once they've created a new model/device, the old ones pretty much cease to exist, as far as they're concerned. The same goes for the accessories to said devices. For example, this $50 AV cable I bought to go with my $450 iPod Touch won't work without the $50 Universal Dock, because oops, they forgot to put the authentication chip in the cable. Even then, there are a slew of docks, only one of which will actually do what it's supposed to. All this effort to watch a video, which must be encoded to extremely specific parameters.
Oh, and wouldn't it be really cool to use Bluetooth to sync your Touch to your computer? Wait, why would anyone want to do that? It's not like most computers are Bluetooth-ready, or like the chip has been in the iPod Touch for the last 2 generations. But hey! You can play the piano on it! Neat, right?!
Anyway, way off topic... I think that the iPad is a complete waste, at least for the next year or two. For people who can actually draw, it might make a good tablet, but I don't see many other uses, as of yet. In combination with one of those laser keyboards, it might catch on. For some reason, though, I don't see the iPad being anything more than a giant iPod.
1) Complete transparency on all levels of the technology must occur. In other words, one shouldn't have to know the details of how the underlying operating system or hardware works; one just needs to know that it works, and works reliably and predictably.
2) Complete separation of system code and application code. System code's encoding should be native to the operating system and to the hardware. Application code's encoding should be Java-like opcode which will run on any system in a secure virtual machine. Furthermore, users should be able to write and compile their own applications with a compiler that is built into the system.
3) The concept of the wireless peripherals should be vastly expanded, so that mice, keyboards and even monitors are extensions of the various computing devices. Furthermore, multiple personal computing devices should be able to seamlessly communicate with each other, so that an owner can freely copy or move applications and data between any of his devices, as long as they have the memory for it.
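The "Java-like opcode" idea in point 2 can be illustrated with a toy interpreter (the opcode names are invented for this sketch; a real secure VM would obviously need sandboxing, verification, and far more than arithmetic):

```python
# Toy illustration: application code as portable opcodes executed by a
# small interpreter, so the same "binary" runs anywhere the interpreter
# exists, regardless of whether the host CPU is x86 or ARM.

def run(program):
    stack = []
    for op, *args in program:
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()

# The same opcode list produces the same result on any host:
portable_app = [("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]
print(run(portable_app))  # (2 + 3) * 4 = 20
```

Only the interpreter itself (the "system code" in point 2) would need to be compiled natively per architecture; everything above it stays portable.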
But...I can promise you that big business will fight any such advancements. Any business which earns its income by producing hardware or software keeps people coming back to them by controlling innovation through intellectual property law. The advancements needed to make personal computing truly user friendly are a direct threat to such businesses.
No cameras on this thing, no HDMI, and no multitasking. I could go on, but wow that just killed it for me.
Well, here's a picture: http://www.ilounge.com/assets/image.....ssories/16.jpg All descriptions I've read on it say that it's a USB-to-Dock-Connector or Dock-Connector-based SD reader (and the picture would suggest both models are available).
The thing which is pretty fuckin' ironic to me is that my Archos 400-series, from 2004, had a built-in CF card reader so I could transfer photos from my camera to my 20-gig PMP, back when CF cards stored hundreds of megs rather than tens of gigs. Nobody seemed to care about it then. Now it's a $30 accessory for an iPad and people are getting excited about it because they think from the name that it's a clip-on camera D:
Fwiw, I think the tablet is fairly mediocre. I was expecting a little bit more from it.
1) I think the clincher is the most unobvious one: this fucker is gonna wobble like crazy when you have it on a flat surface [since the damn thing doesn't have a flat surface!], and typing on it will be a pain in the ass, just barely better than the iPod Touch to be honest.
2) Still no flash support, and I can't figure out why.
3) Maybe this is designed a bit -too- much for the common denominator.
2) Probably because no Flash support means a) better battery life and faster performance, and b) that'd require software to be written specifically for that system architecture to run Flash. We're not in desktop OS land anymore, Vanny. All software has to be made from fucking scratch.
3) How fuckin' weird is it that we're saying a .5" thick multitouch IPS display with a cell connection and up to 64 gigs of flash memory is too common denominator? Still, widescreen + OLED + camera + a real fucking OS, and they'd have it perfect.
I agree with you completely about the separation of mobile and desktop platforms. And the separation of just about every device out there. That was one of the reasons I was rather excited at first by the Android OS. I honestly thought it would be like having a legitimate computer. A Linux-based one, but whatever.
So the spectacular press release, the grand new device, is nothing more than a pre-emptive strike against Android OS netbooks.
I was truly hoping for a true Mac experience. Even if they ditched the pressure-sensitive stylus that artists can use in favor of simple, average-Joe-friendly touch input, so long as it was a traditional computer experience, I'd find it interesting.
But it's just an iPhone... biggie-sized.
Dunno how often you frequent my journals or not, but I'm the antithesis of Mac/Apple friendly, and I abhor their business model (provide the hardware, firmware, OS, default software, content distribution system, and the pricing and availability of everything except for 3rd party software... which they also regulate and can block at any time for any reason). So a true Mac experience would still be something I'd frown heavily upon, and want to fail, if only because it's Apple and they destroy the competitive environment... but at the same time, I'd give a silent nod towards that direction, of putting a full computer running a non-mobile OS in a mobile device.
I'm trying to find a niche for this thing, and ultimately, well... It's not for artists, because they need more processing performance and pressure sensitivity unless they just want to do crude finger painting and sketches. It's not for heavy readers, because it simply can't measure up to something with e-paper when it comes to battery life, weight and comfort of use. It might be useful for students, as 10 hours (theoretical) should last for a school day... but generally? It's... honestly, a toy for people to dick around with. Sure, you can sit back in a comfy chair at a coffee shop and browse your email, trying to look as important as Steve Jobs. And I guess if you're in a car and you have your shit with you, you can take it out and get GPS directions to somebody's house. And yeah, with a few keyboard docks in key locations you could use it instead of a laptop for simple document composition. But at its size and cost it just strikes me as woefully impractical. For the same cost, you can get a laptop. For a hundred bucks more, you can get a Mac laptop. And that'll do almost as much.
Let's let Apple tell it to us, though. What are we to do with an iPad? It's the best way to experience the web, email, and photos. ...why? What about an iPad slideshow is so much better than, say, a Macbook slideshow? And what about the web, and email? The web: typing in a link, reading stuff for a few seconds, finding something else to click on, reading for a few minutes, repeating until you have better things to do. Email: clicking on whatever is unread, reading it, and typing a response. I'd think that whatever has the highest-quality and largest screen, and the best keyboard, would be "the best way" to experience that stuff, unless I'm missing something crucial here. The web is all about information exchange, information is pure I/O. Display and keyboard. They might've made a better case if they said it was the best mobile gaming device, because at least then they'd be able to make debatable claims about it having the largest multitouch screen or whatever.
Here's the reality of the situation, not just for the iPad, but for anything in that form factor. A device with a 10 inch screen and a few more inches of bezel won't fit in any pocket on any piece of clothing you own. Nor will it fit on any belt holster; it would look absolutely absurd. It's like the size of a clipboard. So you can carry it around, like a clipboard, or have a backpack/carrying case for it. If you've got to have a carrying case it's no more portable than a netbook, just some odd ounces lighter, and if you carry it around like some managerial type person with a purpose, you're apt to get mugged, sneered at, or laughed at. The only fucking difference its portability makes, aside from the few ounces here and there it's got over a netbook, is that it's easier to use when you have no level surface to set it on, like a laptop. Standing up, sitting in a chair with no table or desk, whatever. And the trade-off is that it's correspondingly worse when you do have those surfaces and want to use them, because it doesn't stand up on its own. Yeah, you can carry a keyboard and dock with you, but then you basically have a laptop that snaps in two on command. More flexible? Sure. More portable? Only when you know you're only going to need it for light browsing and low-input work.
I've said it a hundred times before and I'll say it a hundred times more, unless a device is the size of a DS Lite or smaller, nobody is putting it in their pocket. Even a DS Lite is pushing it for people who wear tight jeans. Anything bigger than the DS Lite is going to require its own carrying case with a handle or shoulder strap, at which point you might as well just use a netbook, and anything less will simply be shit for any reasonable amount of text input. That's the size target for an ultramobile PC.
Sadly, a giant iPhone is not something I need. It will not replace my netbook and tablet. It won't even replace the gigantic, 3-4-hour-battery tablet PC I lug around.
Apple says that the iPad is better for media and productivity than an iPhone because it's bigger and faster, and better for portability than a laptop because it's smaller. Of course you can turn that statement on its head and say that it's worse for media and productivity than a laptop because it's smaller and slower, and worse for portability than an iPhone because it's bigger.
I think an inherent problem is that users are looking to consolidate devices rather than get "yet another electronic device" that they have to pay for, keep track of, keep charged, keep sync'd, etc. I don't want a netbook if it doesn't offer advantages over a laptop. I don't want an iPad if it can't run the same software a netbook or laptop can. I don't want a smartphone if it won't have the storage that a portable media player will have. I don't want a half dozen half-assed devices, I want one or two really solid ones. Apple and other companies don't see things that way, because they naturally like selling you as many devices as possible. So every product gets its own unique selling point with no direct supersets. The Shuffle is super small. The Nano has an FM tuner. The Touch has sheer storage with a touch screen. The Classic has sheer storage, period. The iPhone has a phone. The iPad has the biggest screen of any touchscreen-based Apple product. And the laptops have performance and software you won't get anywhere else. Sigh D:
This means the iPad can currently do more than the iPhone/iPod touch can. It will most likely ship with iPhone OS 4.0.
Yes, it DOES have multitasking; it's been confirmed via leaks already.
These are a few of the features of the upcoming iPhone OS that the iPhone 4G and the 2nd- and 3rd-gen iPod touch will have (the 1st gen probably won't get it).
Always remember this: whatever the iPhone has, the iPod touch, and now the iPad, will also have, since they all run the same OS, the iPhone OS.
You do not have multi-touch on iPhone / iPod touch UNLESS you are in an app that allows it (say the Virtuoso Piano app). This has been changed as of the iPhone OS 3.2 and iPhone OS 4.0.
That and I think the Android phones have multi-touch outside of apps so Apple had to have it too.
I have 48 apps on my iPod touch... I can think of maybe a FEW apps that would benefit, but it's not practical, just... a novelty.
The ARM architecture is phenomenally low-powered. The two worlds will bridge when battery technology catches up with the power requirements. Till then, they'll likely stay separate like they are.
The iPad has a 25 Watt-hour battery. If they simply halved the specs of an Air (single core 1.86 GHz, screen half the surface area, half the graphics performance) you'd imagine it would halve the power requirements from 8 Watts to 4 Watts, so an iPad battery would get you about 6 hours. Not as nice as the 10 hours it has now, but most people would be thrilled to get 6 hours out of a device that size, and then they'd have the full awesomeness of an x86 PC backing it.
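That back-of-the-envelope math is easy to sanity-check. A minimal sketch in Python, using the 25 Watt-hour capacity and the 8 W Air draw quoted above (the assumption that halving the specs halves the draw is, of course, just a guess):

```python
# Back-of-the-envelope battery life: hours = capacity (Wh) / average draw (W)
def runtime_hours(capacity_wh, avg_draw_w):
    return capacity_wh / avg_draw_w

ipad_battery_wh = 25.0        # Apple's quoted iPad battery capacity
halved_air_draw_w = 8.0 / 2   # assumption: halving the Air's specs halves its 8 W draw

print(runtime_hours(ipad_battery_wh, halved_air_draw_w))  # 6.25 hours
print(runtime_hours(ipad_battery_wh, 2.5))                # 10.0 hours, Apple's quoted figure
```

So a hypothetical half-spec x86 Air in an iPad shell lands at about 6 and a quarter hours on the same battery, versus the 10 hours Apple quotes for the ARM design.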
One of the things that has always interested me as an engineer is the decisions that are made in the creation of a product. There are invariably a *LOT* of things taken into consideration that are not just the technical aspects. Business decisions get made with alllll kinds of interesting inputs. In the end, there are usually a few key decision makers who weigh everything and make the final decision about what to put into a system and why.
All the tech details aside, and all the "they should have's cuz what it is is stupid" aside, I think it would have been pretty interesting to sit in on those meetings and hear why the choices for the iPad were made the way they were. In the end, the market will decide if they made the right choices or not. And one thing is for certain: it's going to be an interesting journey.
As I mentioned in a comment further down, the thing that would seem to be the most important to me is performance per watt. An Atom is supposed to idle at close to no power, so all we're concerned about is power draw under load. If the Atom draws twice as much power but is able to do twice as much work, that's hardly a fair comparison and they just need to underclock it or make an even more stripped-down version.
Honestly I've been trying to think of solid uses for it, and I'm pressed for good examples other than letting you check your email while sitting on your couch in a Snuggie, updating your Twitter while standing on the bus, or trying to find directions in a map application in public without using a sub-4" screen or trying to balance a laptop in one hand while you log in... set up a 3G connection... open up a web browser... look for and type in your address and destination, and wait for Google to give you directions. Its OS is too proprietary for it to be a strict laptop replacement, and it's too large to be a smartphone replacement.
They're just... excited, and not thinking clearly, and trying to get an admittedly neat (albeit not entirely practical) gadget. They're trying to justify its purchase as being something for school, which is already tens of thousands of dollars, so what's $500-800 extra on top of that? D:
.....And I just turned on Owl City on my iPod Touch.... o~o
I wonder how long it will take for someone to crack it and put windows on it? :P
And rivals to the iPad do exist; I know that Archos make a 9" tablet running Windows 7.
What on Earth would we talk about? We'd all be locked up in our homes on the damn things all the time.
We would probably even be displeased with that and say it's destroying humanity or something xP
Basically, there is no way of winning. There is no right, there is no wrong. It's the pessimistic hallmark of a postmodern society where everything breaks down.
Depressing, I know, but the boot fits in most cases.
Sorry, fresh off a lecture on postmodernism that made me think about everything in a new light :O
Love your journals by the way, interesting viewpoints that I both agree and disagree with.
Vogel~
If a computing product was made which did everything we want from a computer, I don't think it'd be boring at all. It'd be extremely liberating. It would open up new kinds of interactions, save time on old activities, and generally make things more awesome. Let me give some examples.
Imagine that our dream mobile device lets us take pictures of everything we're shopping for. Next week when we want to go shopping again, we check off all the items we need to re-buy, and it looks at our shopping list, checks stores for local availability, looks at current traffic conditions around our location, and says "If you go to these 3 stores you'll spend 8 miles and 14 minutes on the road, but save $15 over shopping at this location..." etc. Suddenly we can spend less time and money on shopping.
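The shopping part of that is essentially a set-cover-with-prices problem, and for a handful of stores you can brute-force it. A hypothetical sketch (the store names and prices are invented; travel distance and time would just be extra terms in the same objective):

```python
from itertools import combinations

# Hypothetical store/price data -- in the imagined app this would come from
# local availability lookups.
stores = {
    "StoreA": {"milk": 3.00, "bread": 2.50},
    "StoreB": {"milk": 2.50, "eggs": 2.00},
    "StoreC": {"bread": 2.00, "eggs": 2.25},
}
shopping_list = {"milk", "bread", "eggs"}

def plan_cost(plan):
    """Total cost of buying each item at its cheapest store within the plan,
    or None if some item isn't available at any chosen store."""
    total = 0.0
    for item in shopping_list:
        prices = [stores[s][item] for s in plan if item in stores[s]]
        if not prices:
            return None
        total += min(prices)
    return total

# Brute force every subset of stores that covers the whole list.
best = min(
    (plan for r in range(1, len(stores) + 1)
          for plan in combinations(stores, r)
          if plan_cost(plan) is not None),
    key=plan_cost,
)
print(best, plan_cost(best))
```

Brute force is fine for a few stores; with many candidate stores you'd want a smarter search, since exact set cover gets expensive fast.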
Imagine that, through very precise GPS positioning, digital compasses and image recognition, people are able to leave virtual graffiti anywhere in the world that they can physically visit. You hold up your dream device and by using the GPS and camera, the system figures out if there's any graffiti on what you're looking at, and draws it in accordingly. Wouldn't it be interesting and amusing to see notes, doodles, etc that past visitors had left? I imagine that people could make spontaneous treasure hunts by leaving riddles and clues in random locations. I'm sure that millions of users with such a system could come up with much cooler ideas than that.
At the very least, such a device would let people always have their important information at their fingertips, as well as real-time info on happenings all over the world. If there's a bored friend eating lunch alone in the same building, it would allow you to find them and have a conversation over your meal. If there's been a major accident on the highway you're on, your device could alert you within minutes or seconds, helping you to avoid the traffic and keep the area clear for emergency crews. If you see something awesome or that you want to remember, you can easily snap a photo or video clip and add a text, audio, or doodle to it to help you remember why it's noteworthy. I wouldn't say that this stuff is or makes people incredibly boring, displeased, or worse off.
I think that human psychology and behaviour have to be taken into account when a company sells a product.
There's a tiny example of how human psychology affects the rejection or acceptance of a product in the release of Windows Vista. There is no doubt that Vista earned a negative reputation, and this has since been repaired with the release of Windows 7. In talking to a number of people about the positive changes that Windows 7 has over Vista, it seems that the most positive change is the change in wording of many functions. Instead of asking "Are you sure you want to do this?", it will ask "The program wants to do this, is this ok?"...
My point being, psychology is important in relation to how a product is accepted. From how the design makes you feel, to how the wording on the screen makes you feel.
The iPad is aimed at the customer who knows nothing about computers and wants to have something they can have at home and show off to people when they come round. Whereas advanced netbooks (of which I know nothing because I am clueless about PCs) are aimed at the market of people who know what they are doing with regards to computer electronics. They are polarised markets.
In continuation, let alone a product that can "do everything" (within logical reason); a product that can "appeal to everyone" in a place where people are so polarised about electronics, is therefore illogical, impossible and a tragically utopian idea.
A product that appeals to you, Mr. Twile, you awesome ball of highly admirable electronic knowledgeability, and (I am guessing) someone that enjoys the technical side of electronics... I can't imagine that same product appealing to me (someone who knows nothing about silicon itself, let alone silicon technology), mainly because I want something simple, that's designed in a cool way and just "feels nice". I doubt you would buy something based on that alone... I would. Because I am electronically ignorant and intimidated by it, but happy nonetheless ^_^ despite the fact that the iPad can't do certain things that the electronics engineer wants to do... It's not made for the electronics engineer.
I will admit, however, that I don't fit into the target audience of the iPad. But I can see who they are aiming at. It's a "safe" product that can be put in the hands of everyone from clueless children playing EA games at home to the CEO of Clueless Incorporated in their office. It's not meant for the road. Their promotional video shows no clips of the device being used on the move. It's not portable in that sense. I'd feel like a dick using the iPad on the bus or train O.o
Let me put it another way, I gave the example of technical products not appealing to the clueless consumer because of intimidation. I think you are rightly offended by the iPad because it is a clueless product, but I don't think your customer demographic sits anywhere in its target audience, thus rendering your hatred of it (don't take this the wrong way, i admire you and I hope you can understand my point) irrelevant to the consumer it is made for.
This is such a complicated issue that is so much more than simple choice of electronics. It's about consumer psychology, capitalism and corporate competition also.
I suppose my argument, in clarification, is not against your love of technology. It is in defence of the people that are clueless about chips and need Apple's guidance in order to live their lives. That is who this device is made for. I can never support blanket damnation of a product based on one technical viewpoint. In continuation, the iPad does indeed have a place in the world's marketplace.
[is afraid of a big response from you now D: *hides* ]
Of course human behavior has to be taken into account when designing stuff. I took classes on that stuff. That's kind of a separate issue from the overall product market, though.
I do not believe that stuff for computer-illiterate users and tech-savvy users has to be on totally different pieces of software and hardware. More experienced users can use additional tools to tweak and modify things to their heart's content, less experienced users can get tools which help them tackle challenges in ways that they understand, and everyone can be treated to an intuitive, attractive computer experience. It's not illogical, impossible, or utopian.
Compare it to, say, a car. I don't know the first thing about car performance and inner workings, aside from the basic idea behind an internal combustion engine. And yet I can drive one, because I was taught (by a person, in this case) how a couple key controls work, and what the rules of the road are. A very car-savvy person can buy the same car and pop it open, swap out half the parts, and make it very different in appearance and function. Computers are similar.
You think I don't want products that are cool and feel nice? I'm incredibly picky about stuff I spend money on.
As far as issues I had with the iPad itself, they're either things that an average user would enjoy or just not care about, nothing that would break the experience. I think it should have a camera. I think it should have a 16:9 aspect ratio. I think it should be able to run a standard PC OS and applications. Now, you might say "But a simple user is less comfortable with a standard PC and applications, the iPad interface helps them understand things and makes it simpler" and you'd probably be right about that. But that doesn't mean we have to make a binary choice between iPad interface and standard PC interface. All you have to do is make an "iPad application" which runs on top of Windows or Linux or OS X and provides that same interface in full-screen mode. Advanced users can exit out of that and do regular applications if they so desire. The iPad UI would be just like Windows Media Center or Front Row in that respect, something that provides a streamlined entertainment experience on top of a full, flexible OS.
I doubt many children will be getting iPads for Christmas. Even the base model is $500, more than a PS3 and Xbox 360. And it's more fragile than either. I'm reminded of when I last went to a place to drop off a package and for whatever reason, the woman doing my paperwork was talking about getting a 360 for her son. She seemed to think it was terribly expensive and said she was shocked to find out that a lot of games are $50-60. Honestly I can't blame her, that's like taking an entire family of 5 out to see a movie for the evening.
I also doubt it's aimed at clueless CEOs. They might want one, but that's not the major consumer base they're going for.
If it's not intended to be used on the move, then why do they have GPS and cellular connection functionality with it? Sure, they don't show it being used on the move. But I've never seen an Apple commercial which shows that stuff, and a lot of their products (every iPod, the iPhone, and their laptops) all are. If it's not meant to be mobile, why sell a dock for it? Why obsess over its weight and not have a keyboard on it? I'm sorry, the iPad is DESIGNED to be mobile. It's meant to fill in where a laptop is less comfortable to use, and those are all very mobile settings.
RISC processors (which is how ARM chips work) are pretty much all microcontrollers, or, as they've been termed recently because of their dramatic increase in available power, SoCs (System-on-a-Chip). This just means that the RAM, input/output interface handling, ROM storage and other bits and bobs are all on one bit of silicon, which makes for a highly integrated little package that reduces the size of the devices. Unfortunately, because so much is on one chip (and even more so now that they've started including graphics on them), the actual CPU side can't be that complicated, or else the cost of the chip would escalate wildly.
On the other hand, we can look at the Intel x86 architecture. Traditionally (up until the recent Clarkdale and Pineview cores in the Core i3/i5 and Atom respectively) the only thing on the processor die has been the CPU, CPU registers and the cache. Nothing else. The RAM, storage, Interface and everything else are totally external to the chip. This means in essence that the system can be a whole lot faster, as the silicon real estate available for transistors is huge.
The main difference between the two, however, is that the RISC design is highly integrated and low power, whereas x86 is blindingly fast but consumes one hell of a lot of power. Most laptop CPUs now have a TDP (essentially how much power the chip dissipates) of between 20 and 45W, and a lot more for desktop machines, whereas your typical ARM CPU only draws about 1W total, while your laptop also has to power a GPU, chipset etc.
ARM designs could technically be infinitely superior to x86 if they started going multi-core (which I think the Cortex A9 can actually do) and pushed the clock speeds up. But the only issue with that is Microsoft. Ever since the days of Windows 2000 and Win XP, Windows has only ever run on x86 architectures. That makes it easy from a programming point of view because the x86 architecture MUST use a unified instruction set. All the assembler mnemonics are the same, but there is no set list of assembler instructions for RISC; for example, the instruction set for the Microchip PIC16F series which I use is completely different from an ARM processor's (and infinitely better too, ARM assembler code is a nightmare). So in the end, until Microsoft decides to run on a different architecture than x86, nobody else is going to have the resources to throw at other architectures and make them just as fast. And in the end, this is the same reason that Apple switched over to Intel processors: because of how many chips Intel was churning out, since EVERYONE runs Windows, it just made a lot more sense.
Basically stuff that I understood, with a little technical detail I didn't. Still though, I have to wonder... how much of a gap is there between RISC and CISC as far as performance per watt is concerned, with today's stuff? The MacBook Air, for example, "sucks down" an average of 8 watts (if its 40 Watt-hour battery and 5 hour battery life are accurate). And the iPad is supposed to take a mere 2.5 watts (again, if the 25 Watt-hour and 10 hour battery life quotes are correct). Clearly that's a big difference, a factor of 3, but the Air screen is also almost twice the surface area and uses older, presumably less power-efficient tech... as well as an SSD (which can by itself take as much power as the iPad). And I'm pretty sure that it's more computationally powerful, too.
So I'm really curious about how the conventional x86 chips measure up against Apple's custom ARM as far as performance versus power consumption. Subtracting the extra power for the larger Air screen and SSD and factoring in higher performance, how do they really compare? Basically, how many microwatts does it take per triangle rasterization and list sort for reasonably efficient chips from the two architectures?
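The implied average draws fall straight out of the battery figures quoted above (40 Wh / 5 h for the Air, 25 Wh / 10 h for the iPad). A small sketch; note that the relative-performance number at the end is purely my illustrative assumption, not a measured figure:

```python
def avg_draw_w(capacity_wh, battery_life_h):
    # average power draw implied by a battery capacity and a quoted battery life
    return capacity_wh / battery_life_h

air_draw = avg_draw_w(40.0, 5.0)    # 8.0 W  (MacBook Air)
ipad_draw = avg_draw_w(25.0, 10.0)  # 2.5 W  (iPad)

# Hypothetical: if the Air did, say, 4x the computational work of the iPad,
# its performance per watt would actually come out ahead despite drawing
# 3.2x the power.
assumed_relative_performance = 4.0
perf_per_watt_ratio = assumed_relative_performance / (air_draw / ipad_draw)
print(air_draw, ipad_draw, perf_per_watt_ratio)  # 8.0 2.5 1.25
```

Which is exactly why the raw draw numbers alone can't settle the RISC-vs-CISC question: you need a workload-normalized comparison.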
So performance per watt between RISC and CISC is essentially the same, but it generally boils down to how well the actual chip is made, not in terms of the instruction set.
With the introduction of products such as the ARM Cortex A9 and the Intel Atom (even if VIA's C7 was already occupying the market that the Atom is now dominating), the performance-per-watt of the two architectures is becoming remarkably similar. ARM's Cortex A9 uses under 1W per core, but then again it is only running at 1GHz. An Intel Atom N280 runs at 1.6GHz, and has a Hyper-Threaded core there too, so it can consume somewhere up to 8W. But then again, the Atom is going to be considerably faster at more compute-heavy tasks thanks to its faster clock speed and having more cache (which is something I'm not all that up to speed on myself, unfortunately).
However, because the two architectures are completely incompatible, we can't just say "That's faster than this" because there's no fair way of comparing them. We'd have to use different programs, assembled and compiled by different assemblers and compilers, written by different people. And having two alien applications won't give a reliable result.
I mean, the tasks that we're going to have, be they on desktop or smart phone, are going to keep on converging. Searching databases to find a song that matches your criteria. Displaying flash videos. Decoding H.264. Texturing and shading operations in 3D graphics. Just like with commercial servers, our primary concern in the mobile market now that we're doing real computing tasks needs to be performance per watt, assuming that things scale nicely.
If anything, computing is tending towards the cloud. The future, for the most part, will consist of 3 computers - the large, powerful media-server-slash-everything-else in your house, the laptop/netbook you carry around on the rare occasions that you need it, and a mobile device that's aware of two things: The world around it, and the corresponding information on the Internet.
And if you're living in a high-tech area, chances are you're already equipped with all of these. How, exactly, will personal computing get so much more advanced in the next 20 years that we need to rethink everything we know? The Sixth Sense device, while novel, is too intrusive for daily use. Wearable glasses with cameras, wireless access and intelligent finger-motion detection? I can't imagine too many people would want to walk around waving their arms in the air, looking like total idiots. It was cool in Minority Report, but only because one person was doing it.
Direct neural interfaces? Not for the next 50 years, and that's even with our "exponential rise" in technological achievement. The most people can do is make vague guesses at the colors - we can't even make out the shapes in the painting yet.
Then there's the Ghost in the Shell and the Matrix routes, neither of which have it ending well for humanity.
So, pray do tell, where to from here? What I WOULD like to see are physical public access terminals, scattered around like bus stops and corner cafes. If your mobile's not powerful enough (or you just hate the cramped keypad), shoot over to the nearest station, log in to your Google account, and check your email.
As for the "one size fits all" device - I can say with 99% confidence that there will never be such a thing. The 1% is predicated on the assumption that it'd be really expensive to surgically enhance the entire human race.
AR =/= cloud computing. Sure, they can go together, but they're... not really the same. When I think AR, I think (as I wrote above) virtual graffiti that you leave somewhere, which only shows up when you're using an app that ties into that database and pointing the camera at a location with said graffiti. Or the iPhone app where you move the phone around and it shows you the image the camera is capturing, with an overlay that shows the direction, distance and average ratings for any nearby restaurants.
Cloud stuff is all well and good if done in a mature way, but short of web mail, I've yet to see it implemented to my satisfaction.
And then there's wearable and ubiquitous computing. Also interesting stuff. Gah, I feel like I'm back in college, I took two courses and taught one on just this stuff.
And I'm also confused as to how you think cloud isn't being done in a "mature" "way". Google's search index? Maps? Wave? Twitter? Facebook? Goggles? Foursquare? Folding@Home? Azure? E3? If anything, cloud computing will reach its apex in the next 3-5 years.
I'm sorry, but Twitter, Folding and Wave aren't everything in computing. What happens if I want to play a complex 3D game, or watch a movie, or record and upload a video, or any other number of things? As cool as it would be if mobile devices could be rock stupid and just act as access terminals for a distributed cloud computing ecosystem, we're missing out on a lot of things. Bandwidth for one, keeping a solid 1 Mbit connection open at all times is by no means a given today, and that's enough bandwidth for FLAC-quality audio, not uncompressed video streams at desirable resolutions. Latency is another issue, the time between pushing a button on a mobile phone and getting a response from a computer a thousand miles away is too great to be a perfect substitute for local processing. Infrastructure is yet another, it's just not currently easy, as far as I know, to sign up for 100 gigs of cloud storage, x trillion computational operations and y MB-hours of RAM worth of computing which your mobile phone will display the results of.
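The uncompressed-video point is easy to quantify. A quick sketch (the resolutions, 30 fps frame rate, and 24-bit color depth are my illustrative assumptions):

```python
def uncompressed_video_mbps(width, height, fps=30, bits_per_pixel=24):
    # raw (uncompressed) video bandwidth in megabits per second
    return width * height * fps * bits_per_pixel / 1_000_000

# Even a modest 480p stream dwarfs a 1 Mbit/s link if left uncompressed,
# and 720p is another 3x on top of that.
print(uncompressed_video_mbps(640, 480))   # ~221 Mbps
print(uncompressed_video_mbps(1280, 720))  # ~664 Mbps
```

Hence the reliance on heavy compression like H.264 for any streaming over real-world mobile links, and why a 1 Mbit/s connection only comfortably covers audio and lightweight data.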
Nothing would please me more than to let a remote computer do all the work for my mobile phone. Why wouldn't I want to play Crysis at the native rez on my phone and have it look gorgeous, because my desktop can render it at super high settings on such a low resolution? What could be cooler than streaming the terabytes of SD and HD content from my home file server to my phone? It'd be all kinds of awesome if I could do that. But that just doesn't work right now. Neither my phone nor my computer has the software necessary to facilitate such a seamless connection. Neither my phone's download bandwidth nor my desktop's upload bandwidth is sufficient for more than a 2 Mbps video on a good day. And in the very few hours my phone would survive under such a communication workload, I would undoubtedly drop frames or my entire connection dozens of times if I were roaming.
Oh, also, airplanes and cell dead zones :| I think I pretty much exhaust the subject on my Google Chrome OS journal. I love the idea, but all we have bandwidth for now is at best streaming standard-def YouTube videos and the data for your emails and online photo album.
Took me two days to figure it out (and reading today's strip).
Like I mentioned elsewhere, I expected it to be along the lines of a tablet computer, like an HP TouchSmart but with Apple's OS X, hardware and design. I didn't expect anything totally revolutionary, just something done before but perhaps better.
I also love how in their video on the site they say some of the most redundant things:
"When something exceeds your ability to understand how it works, it sort of becomes magical..."
"It just feels right, to hold the internet in your hands as you surf it, and with a screen this large you can just see more of the web..."
"You can go through huge quantities of email really quickly and it's fun because you're doing it all with your hands..."
"When you take the product out of the box and hit the power button, the display immediately comes to life..."
When you read it, it definitely has less flair, and those lines certainly didn't help with my opinion. I'm probably bitching too much over this, but perhaps there is something in the works to make it something incredible. Who knows. I just don't see it catching on... unless people are better off than I think they are. Simply not for me.
The screen size thing though, that statement just makes me want to vomit on Steve Jobs. It is a 9.7 inch screen. That's a full 3.5 inches smaller than the smallest CRT or LCD I've used on a computer in the past 10+ years. Fuck that shit. A screen THIS LARGE? The only way you get SMALLER than that is if you use a smartphone or netbook.
Steveo should get a job at a chicken processing plant or some kind of factory. "You can remove huge quantities of chicken feathers really quickly and it's fun because you're doing it with your hands" >_>;
Or do you like the division between traditional mobile and traditional desktop?
There will always be a divide between mobile and stationary devices. A device that tries to be a jack of all trades will be a master of none, which isn't looking good given that having a cellphone or a desktop computer is not mutually exclusive. For someone like me either it fits in my pocket or it sits in a corner. No middle ground.
The market for the iPad is the netbook market. It is even priced fairly competitively for it (well, competitively for Apple, if you use the cost of the 16GB model iPad) to be in the netbook market. I don't think the iPad will succeed. It is a closed system competing in a market that has nothing but open systems, and it doesn't even have a USB port, so no plugging in your Western Digital Passport for more hard drive space unless you carry even more equipment with you. No Flash for you either, hahaha.
It'll probably still sell like hot cakes because it happens to have the logo of an apple with a bite mark on the back of it. At least it has an IPS screen. Although what is up with all that bezel?
Not a perfect example of course, because the problem with the knife is a physical limitation, while the software jack/mastery isn't constrained in that way. You would be right to point out that physical computer devices have the same limitations, but still, I find them to have a lesser impact. I mean, with properly designed processors, things should be able to shut down when not in use, right? Under ideal circumstances, your laptop could do the same work the netbook does with (ignoring the screen) the same power draw, but simply have the option to do it FASTER because it's able to provide and dissipate more power. Suddenly, the serrated edge of the knife can scale up or down as needed. If that were the case, and CPUs scaled nicely that way (which we're still not completely at yet), the only differences between a PDA-sized computer and a 20" desktop-replacement behemoth of a laptop would be the cap on how fast it can go, the size of the screen, the battery size, and the weight--where on that continuous scale you'd like your device to fall is a personal and financial preference.
I do understand what you mean about not having middle ground between the ultraportable and the stationary, though. When I've got the ability to have a half kilowatt of power and no battery concerns, I opt for the behemoth desktop which takes performance to an extreme. And if I had things my way, the only other computer I'd own would be a DS Lite-sized UMPC with a detachable keyboard dock. Just small enough to fit into a pocket, meaning with the right software and hardware it could double as my phone, but big enough to actually perform computer tasks on, and with the option to have a hardware keyboard when I can sit down and use it to my benefit. I wouldn't need to lug around a laptop bag, PMP, and/or phone... just one device. Hence my great frustration when no companies try to make such things, except OQO--which went out of business.
Honestly though, I've never understood the STRICT netbook market. People who have a netbook for literally nothing except browsing the web. They have a web browser, maybe antivirus (if they get a virus they just reload the OS, as the only software is a web browser!), that's it. And likewise I don't understand the iPad market. The devices aren't portable enough to, as you said, fit in your pocket, so they still need a carrying bag, which renders them only slightly more portable than laptops and basically nullifies that advantage. I've heard people say that they have a laptop and netbook too, because the netbook gets them a few more hours of web browsing when that's all they need, but honestly if they bought extra batteries instead of the netbook, they could have the same or greater browsing life as the combined devices, for the same or less cost, with the same or less weight. So really, I don't understand why netbooks get their own name, reputation, market, intended uses, when they're just... smaller laptops. The iPad is targeted at also being that, in a different form factor, but as I griped about above, it's really not--you no longer have a general purpose laptop that'll run anything, just slower and on a smaller screen. So fuck that.
You'd think if they were all up on being groundbreaking, saving power, having a good picture and being easy on the eyes they'd either go AMOLED (if they want to make a slate computer) or color e-paper (if they want to make an eBook reader). But no, they go for a technology which is better than most laptops have, but nothing new to them or the market as a whole.
The bezel probably serves a few purposes. First it gives you something to the sides of the screen to hold onto, so you can grip the device between your fingers. Second, it makes the device larger (so they have more space inside to work with) without requiring the screen to be even bigger and more expensive, power-hungry, etc. Third, it... I don't even know D:
They finally come out with a tablet, and there's no pen input for artistic things. That's complete bullshit, and I will be switching my major because I will not be in a job where they still use Macs like it's still the '80s or something... GOD DAMN they piss me the fuck off. This iTablet or whatever the fuck it is is the last straw for me, and I hope Win7 will finally kill off those fucking fanbois and they will realize that they have been following their precious Apple Inc. like lemmings right toward a cliff!
It's pretty fuckin' lame that Apple didn't throw a bone to the artists. I mean, they needed to add a tablet screen to a slate-style notebook, and they failed on both counts. What they have won't run real art programs, and it won't do any more pressure sensitivity than you'd have by dipping your fingertips in paint and smearing them on a wall. I can understand their decision, at the least, to embrace capacitive multitouch--that's what will benefit most users (let's face it, most people don't want to keep track of a $100 stylus and don't give a rat's ass about pressure sensitivity, but they DO like multi-finger gestures and pinch-to-zoom). But seriously, the fact that Modbooks exist is proof that there's a market for the slate Mac OS-running tablet. They can make a multitouch version and a tablet version at launch, aimed at mainstream users and artists respectively, and promise that within a year they'll make a version with both multitouch and tablet input (the tech does exist!). Problem fuckin' solved.
The really annoying thing about the whole iPad ordeal is the Apple fanatics who say that the PC's fate has been sealed and it's doomed to die because of the iPad. Now, I'm of the opinion that real x86 computers, Mac or PC, fill a definite need for customers, and we'll still have to see how things play out, but I don't see how it could doom PCs and not Macs at the same time. Although the hardware and OS are slightly different, the platforms are damn close neighbors.