Shall We Call It "Curated Computing?" 331

Posted by kdawson
from the art-it's-not dept.
medcalf writes "Ars Technica has an opinion piece by Sarah Rotman Epps on the iPad and other potential tablets as a new paradigm that they are calling 'curated computing,' where third parties make a lot of choices to simplify things for the end user, reducing user choice but improving reliability and efficiency for a defined set of tasks. The idea is that this does not replace, but supplements, general-purpose computers. It's possible — if the common denominator between iPads, Android and/or Chrome tablets, WebOS tablets, and the like is a more server-centric web experience — that they could be right, and that a more competitive computing market could be the result. But I wonder, too: would that then provide an incentive for manufacturers to try to lock down the personal computing desktop experience as well?" And even if not, an emphasis on "curated computing" could rob resources from old-skool computer development, as is already evident at Apple.


  • Like a museum (Score:5, Insightful)

    by mujadaddy (1238164) on Friday May 14, 2010 @11:54AM (#32208028)
    "It's very cold, and very beautiful, and you're not allowed to touch anything."

    Sorry, I'm more of a hot-rodder than a passive consumer.
  • by iamapizza (1312801) on Friday May 14, 2010 @11:55AM (#32208034)
    Please just bite the bullet and call yourself an Applogist. (Geddit, Apple Apologist?)
  • by fuzzyfuzzyfungus (1223518) on Friday May 14, 2010 @11:55AM (#32208038) Journal
    It's a "managed freedom institution".
  • an emphasis on "curated computing" could rob resources from old-skool computer development

    That doesn't necessarily have to be true. It's not like developers are en-masse converting to develop for mobile platforms. There is an ecosystem of desktop software that has to be maintained; however, the market for it is pretty much saturated. This means that new developers will probably lean towards mobile computing, because that market is new and pretty much open. As more people get these devices, that market will also start to get saturated, and probably much quicker as the gatekeepers try to keep the bad and duplicate apps out.

    • It's not like developers are en-masse converting to develop for mobile platforms.

      Major video game developers have already en-masse converted to develop for game consoles.

    • Yeah, that's the thing I don't get.

      I have an iPhone. I use it to make phone calls, email, listen to music, do light web-browsing, take pictures. That's about it. Sure there are other niche things I use on it, but for the most part those are the big 5 I use it for.

      My laptop, I use for everything else.

      Why do people think these "niche" devices have to be everything to everyone? They aren't. Here's your car analogy:

      People commute in cars to work every day. They also use those cars for various other travel reasons. If they want to store a LOT of materials in the back of their car, they're limited to either making several trips, borrowing a truck from a friend, or something else. If they were moving a lot of materials constantly, it would make more sense for them to use a truck.

      In short. Trying to force the idea on the public that having one of these devices will render any other computer obsolete shows a serious lack of critical thinking. (Just like my car analogy does)

      • Why do people think these "niche" devices have to be everything to everyone? They aren't.

        Until they're everything to almost everyone. At that point, if you're not in the class of "almost everyone", then the record industry, movie industry, and business software industry will assume you to be either A. an employee of an established, licensed, and bonded company, B. a student training to be an employee of such a company, or C. a pirate.

        • At that point, if you're not in the class of "almost everyone", then the record industry, movie industry, and business software industry will assume you to be either A. an employee of an established, licensed, and bonded company, B. a student training to be an employee of such a company, or C. a pirate.

          And this differs from today because? Oh, I see, because today they consider option B and option C to be identical.

        • by UnknowingFool (672806) on Friday May 14, 2010 @12:42PM (#32208664)

          Until they're everything to almost everyone.

          Why do people here on slashdot have this crazy notion that slashdotters are everyone? They're not. They are the minority. Most people couldn't tell you the difference between GPL, BSD, xfs, and X Windows. And they don't care. You give them a device and the first thing they care about is how do they do [some function]. The shorter the learning curve, the more they'll think it's some sort of magical device.

          Technology intimidates most people. Think of your average grandparent. They like the TV. They like radio. They have DVD/VCR players that have the wrong time. They hate computers. Why? Because they only want to learn just enough for them to use [some function]. They don't need to program the time on the VCR/DVD. They know to put in the media and press PLAY.

          There are products designed for slashdotters; Apple doesn't however design products for slashdotters. They design consumer products for the average consumer. They design professional products (MacBook Pro, Mac Pro) for the design professionals (graphic artists, photographers, musicians, film makers). Even their server line is designed for specific users. None of these are designed for geeks like you and me.

          The iPad is a limited device. It is not designed to replace the desktop. It is designed to be an extension of it. It is designed to consume media with limited ability to create. It is not for me but this fits for most consumers. They check their email and surf the web; they don't code.

          • > The iPad is a limited device. It is not designed to replace the desktop.

            Tell that to all of the gleeful Apple fanboys hyping this thing as the second coming.

            They're hoping/clamoring that it will wipe away the last usurper and bring forth a new walled garden utopia.

          • by ducomputergeek (595742) on Friday May 14, 2010 @01:25PM (#32209330)

            More and more I find myself in the "everyone else" category. Sorry, but I no longer have a desire to build a machine and spend all weekend hacking something together. I want something that just works. Apple's products do that for me. Since I bought my Dad an iMac, I've spent exactly 2 hours in 3 years on his computer, upgrading it to OS 10.6 last Christmas. Before, when I went to visit, it was 3-4 hours of me fixing his PC, which usually meant formatting and reinstalling everything.

            Honestly, I look to replace the iMac with an iPad 3G for my Dad next year. All he does is check email, track his stocks, read the newspaper online and that's it. Maybe a video from Youtube from time to time.

  • Oh good (Score:2, Funny)

    by Anonymous Coward on Friday May 14, 2010 @11:59AM (#32208074)

    As a 30-year-old man, I love having Big Brother make all the decisions for me, as I never grew out of a child-like mental state and cannot possibly make a choice by myself.

  • by Shag (3737) on Friday May 14, 2010 @12:02PM (#32208112) Homepage

    Hmmm, that'd be news to me, and to various people I know.

    If she means to say that a tablet would be better for the bathroom than a laptop, though, she might have a point...

  • by daveime (1253762) on Friday May 14, 2010 @12:02PM (#32208130)

    Can't Slashdot editors find ANYTHING newsworthy that isn't about Apple ?

    Fuck's sake, the content of this article boils down to "Apple's latest iDevice is equivalent to a gold-plated toaster, where user choice has been minimized, but leads to a better overall toast experience".

    It might be gold-plated, but it's still a turd underneath, and no amount of iHype or Apple apologists will change that.

    Bye bye karma, see you again sometime.

    • by tepples (727027) <{tepples} {at} {gmail.com}> on Friday May 14, 2010 @12:06PM (#32208188) Homepage Journal

      Can't Slashdot editors find ANYTHING newsworthy that isn't about Apple ?

      It's not just about Apple. It's also about Microsoft, which uses the same App Store structure for Xbox 360 indie games and Windows Phone 7 apps. (In fact, Apple appears to have copied much of the structure of the iPhone developer agreement and App Store from Microsoft XNA Creators Club and Xbox Live Indie Games.) And it's also about Nintendo, which was the first to require that all apps be approved by the device manufacturer.

    • by Mindcontrolled (1388007) on Friday May 14, 2010 @12:10PM (#32208250)
      To be honest, in the latest Apple-related stories, the Apple apologists have been the ones carpet-bombed with troll mods. I agree, however, that the whole Apple thing seems to be the current method of choice for our esteemed slashdot overlords to gather site impressions by keeping the flamefest running. Well, at least it is not climate change this time. Where's that gasoline? Gotta fuel the flames! Burn, iBaby, Burn!!!
    • by Culture20 (968837) on Friday May 14, 2010 @12:13PM (#32208278)

      Can't Slashdot editors find ANYTHING newsworthy that isn't about Apple ?

      Well, I heard that BP's next move with the oil spill in the Gulf of Mexico is to shove a bunch of Apple iPads into the pipe.

    • by bmo (77928) on Friday May 14, 2010 @12:16PM (#32208320)

      Article choice seems to be lackluster these past few years. We got a link to a nutjob calculating the end of the world with regards to the gulf oil disaster instead of like... well.. a link to NPR.

      http://www.npr.org/templates/story/story.php?storyId=126809525 [npr.org]

      However, I do cruise on by here every so often in the vain hope of a good story. The firehose method of "voting for stories" sucks.

      ObT: Yeah, Apple is a walled garden, so what? Some people can't handle anything else, and those who decry walled gardens as evil are entirely missing the point. It's better to live in a walled garden when you're entirely incapable of defending yourself from the barbarians at the gates.

      --
      BMO

    • by Abcd1234 (188840) on Friday May 14, 2010 @12:23PM (#32208392) Homepage

      It might be gold-plated, but it's still a turd underneath

      Why? If users like the experience and it lets them get things done, what makes it a "turd", exactly? Granted, it may not be your kind of turd (I'm more of a Linux guy, but god knows it can be a shitty experience sometimes), but that doesn't mean it's a poor product. It's just not marketed to you, that's all.

      • by ducomputergeek (595742) on Friday May 14, 2010 @01:11PM (#32209086)

        I switched to Mac about a decade ago, primarily because I needed a new laptop and was tired of trying to get things like sound cards to work on Linux at the time. Apple gave me a Unix laptop that also happened to have commercial software support like MS Office, and I've been sold ever since. My time is worth something to me, especially now. I deal with technology at work all day; the last thing I want to do when I get home is get on or fix another computer. Same when I go visit my Dad, hence why I got him an iMac. I've spent a total of 2 hours in 3 years working on it, and that was upgrading to OS 10.6. I used to spend 2-3 hours every time I was home.

    • by spire3661 (1038968) on Friday May 14, 2010 @12:30PM (#32208492) Journal
      The mobile market is ON FIRE. We are seeing all the big players shuffle and shove each other to give birth to the appliance internet. This is why we see Apple stories every day, because they have a HUGE mindshare in this arena at the moment. The iPad is only a turd if you were wanting a full-on x86 tablet.
    • by gstoddart (321705) on Friday May 14, 2010 @12:33PM (#32208524) Homepage

      Can't Slashdot editors find ANYTHING newsworthy that isn't about Apple ?

      Slashdot is reporting on what other people are talking about. Everyone is talking about the iPad, and in case you haven't noticed, Apple's track record with very successful consumer technology is hardly something you can ignore -- iPod and iPhone and iTunes have generated huge sales.

      It might be gold-plated, but it's still a turd underneath, and no amount of iHype or Apple apologists will change that.

      What, specifically, makes it a turd? What aspect of saying this a cool device makes one an 'Apple Apologist'? I don't even know what it means to be an Apple Apologist -- I don't even own an Apple computer, but I do own an iPod (four, in fact), and I'm certainly not "apologizing" for anything they've done. You think the Zune was some great new bit of technology that the world missed out on?

      You're entitled to your opinion, I just fail to see why people like you are so smugly convinced that this isn't a useful bit of technology. I'm very excited by this device, because it's one of the most novel computer devices I've seen in quite a while -- in terms of form factor and interface, as well as how I envision it being used.

      This is mostly about people bashing Apple, and acting like children and saying everyone who doesn't agree that this device is the work of Satan is a doody-head.

      I mean seriously, if the only thing you have to add to the discussion is "why are we talking about Apple again" ... stop reading the threads.

    • by Duradin (1261418) on Friday May 14, 2010 @12:41PM (#32208646)

      If this article wasn't here you couldn't be seen hating on it and if I didn't see you hating on Apple I might just start to wonder if you actually really did like Apple. You don't want people to start thinking you like Apple now, do ya? Better get that hate on!

  • by drinkypoo (153816) <martin.espinoza@gmail.com> on Friday May 14, 2010 @12:03PM (#32208132) Homepage Journal

    Walled gardens have obvious benefits and drawbacks. But more relevantly to this story (or summary, heh heh) this terminology already exists [wikipedia.org] and no new phrasing is required.

  • by oldhack (1037484) on Friday May 14, 2010 @12:04PM (#32208146)
    The key distinction is: are you buying hardware? Or are you buying hardware encumbered with a license restriction that effectively says you cannot "hack", where "hack" is whatever the vendor deems undesirable?
  • Locked Down (Score:5, Interesting)

    by RAMMS+EIN (578166) on Friday May 14, 2010 @12:07PM (#32208202) Homepage Journal

    ``Ars Technica has an opinion piece by Sarah Rotman Epps on the iPad and other potential tablets as a new paradigm that they are calling 'curated computing,' where third parties make a lot of choices to simplify things for the end user, reducing user choice but improving reliability and efficiency for a defined set of tasks. The idea is that this does not replace, but supplements, general-purpose computers.''

    That's fine and dandy, but we don't need *locked down* devices for that. You can make the choices for the end users just fine, without taking away their ability to make different choices. Ubuntu is a good example of this: you can get the streamlined desktop experience that Canonical provides by just going with the defaults, or you can adapt the environment to your liking, starting with things like changing desktop backgrounds and installing packages from the main repositories, and continuing all the way to running a custom kernel and third-party software completely independent from the repositories.

    By contrast, many of the 'curated computing' providers will sell you a device where you are prevented from doing many things, all _in the name_ of making things easier and more reliable for you. But really, that's a false dichotomy - your ability to deviate from it does not impact the ease of use and reliability of the default configuration.
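    The "sensible defaults you can still override" model described above is simple enough to sketch in a few lines of Python (a toy illustration only; the names `CURATED_DEFAULTS` and `effective_config` are invented here, not any real distro's API): settings resolve to a curated default unless the user explicitly chooses otherwise, so the streamlined path and the free path coexist.

    ```python
    # Toy model of "curated defaults, user freedom preserved":
    # every setting has a curated default, but nothing prevents an override.
    CURATED_DEFAULTS = {
        "background": "stock.png",
        "kernel": "vendor-default",
        "app_source": "official-repo",
    }

    def effective_config(user_overrides):
        """Start from the curated defaults; any explicit user choice wins."""
        config = dict(CURATED_DEFAULTS)
        config.update(user_overrides)
        return config

    # A user who accepts the defaults gets the streamlined experience...
    print(effective_config({}))
    # ...while a tinkerer can deviate as far as a custom kernel.
    print(effective_config({"kernel": "custom-3.x", "app_source": "third-party"}))
    ```

    The point of the sketch is that the override path costs the default path nothing: locking it out is a business decision, not a usability requirement.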

    • Re:Locked Down (Score:3, Insightful)

      by Em Emalb (452530) <[moc.liamg] [ta] [blameme]> on Friday May 14, 2010 @12:12PM (#32208272) Homepage Journal

      I would posit the average end user DOESN'T want a lot of choices. I'd say wanting to do whatever you want with a device is pretty much in the geek realm. (Overall) I'm not saying I agree with it, just saying that your average end-user doesn't care that they can't run a specific version of some (for example) SSH program on their phone. Hell, 99% of the world doesn't know what the hell SSH is.

    • Re:Locked Down (Score:5, Insightful)

      by JustinOpinion (1246824) on Friday May 14, 2010 @12:20PM (#32208370)
      This.

      It's a false dichotomy to discuss "streamlined user experience" versus "user freedom" as if one is completely at odds with the other. To provide a streamlined experience simply requires good design and sensible defaults. You don't have to lock-out the user from changing those defaults, accessing the full capabilities of the device, or repurposing the device entirely.

      Of course it makes sense that vendors of locked-down solutions would spread this misunderstanding. They want to enforce consumer lock-in to their product/services stack. By convincing customers that the lock-in is actually to their benefit, they now have people effectively begging to give up their user freedoms. What bothers me is that media outlets seem not to have generally caught on to this lie. Instead they repeat the false dichotomy, as if it were a fact of nature. I guess it is because computers are still fairly misunderstood by the public at large. (By comparison, most people would not buy it if they hired an electrician who installed locks on their fusebox, telling them that they'll have to call/pay him when the fuses blow... because only then can he guarantee a proper "electrical user experience"...)
  • by Tei (520358) on Friday May 14, 2010 @12:09PM (#32208224) Journal

    "curated computing"?

    We already have a word for that: dumbification.

  • by fuzzyfuzzyfungus (1223518) on Friday May 14, 2010 @12:09PM (#32208226) Journal
    I think that to focus on the "curated" aspect really misses (or obfuscates) a critical and ugly point.

    Consider the following analogy: You want your house to be aesthetically pleasing and pleasant to use; but know fuck-all about color matching and picking furniture. So, you hire an interior decorator. They "curate" your space and emit a list of suggestions. You can then make it so, or not. On the other hand, if you go to a museum, the curator's decisions are not suggestions, and they are generally tailored to fit the desired audience as a whole, not necessarily you. You cannot add, remove, or substitute anything. Your only choice is to attend the museum or not.

    In computing terms, the "interior decorator" situation is basically equivalent to the OEM providing a set of sane defaults, chosen for some mixture of security, ease of use, power, and cost. You can pick your interior decorator and, if you wish, you can deviate from their suggestions.

    The "museum curator" option, on the other hand, is the iDevice/carrier lockdown situation. You can either take it or leave it; but if you take it, that's it: the OEM retains cryptographic control over "your" property forever.

    The big difference is whether your "curator" is providing a list of suggestions, or a list of orders. The former, frankly, is something that OEMs (particularly the Wintel guys) really ought to do a lot more and a lot better. Sane, secure, usable defaults are a good thing. The customer shouldn't have to blow the stock image to hell and rebuild from scratch just to get a desktop worth using. However, any set of defaults that doesn't include a "screw this, I'll do it myself and take the consequences" button somewhere, allowing you to reject advice and do your own thing, is ultimately insidious and will inevitably be used as a tool of rent-seeking (as in consoles, where the OEM extracts a tithe for the privilege of being allowed to sell programs that run on the hardware, or as in the App Store), and likely censorship and all sorts of other fun stuff.
  • by decipher_saint (72686) on Friday May 14, 2010 @12:10PM (#32208240) Homepage

    So, here's a question. Does the "average" user who picks up an iPad expect it to be capable of more than what it does out of the box?

    This is something I just don't know. I bought a Netbook last year and even with that I could install whatever would run on it reasonably. I know I don't like the feeling that I'm limited in what I can run, not because of a hardware limitation but because of a conscious designer-driven decision (but that's just me).

    • Does the "average" user who picks up an iPad expect it to be capable of more than what it does out of the box?

      "There's an app for that." But then people run into the limitations of what Apple allows in an app and complain to other people. For example, a device with iPhone OS 2 or 3 can run Safari and iPod at once, but not Safari and Pandora at once.

      I bought a Netbook last year and even with that I could install whatever would run on it reasonably

      Same here. I use my netbook as the low-end laptop that it is. Of course, the danger here is that laptop companies will stop making netbooks in favor of "curated" or "walled" tablets if they see far more profit in the latter.

      • I guess the thing that is on my mind is, when someone buys into the "walled garden" will they see its limitations, turning it into a temporary market/fad?

      • by gstoddart (321705) on Friday May 14, 2010 @12:49PM (#32208786) Homepage

        Same here. I use my netbook as the low-end laptop that it is.

        But I don't see an iPad as even being in the same category as a low-end laptop, certainly not a replacement for it. This seems more like a device which is intended to be used differently from your existing devices, and quite possibly in conjunction with them. You sync your iPad with your main machine, and load up the media you want on it, plus you can surf on your wireless (or even 3G if you buy the fancy one).

        Of course, the danger here is that laptop companies will stop making netbooks in favor of "curated" or "walled" tablets if they see far more profit in the latter.

        I actually question if multiple vendors could successfully do this. If the underlying OS is still a Windows variant, it's a lot harder to design it to work differently and give a different experience and interface -- this throws away the mouse and starts over. Also, it's going to be tougher to create the infrastructure for things like app stores and the like.

        Apple has this thing running on the same OS they use for their phones, and already has built the infrastructure for stuff like iTunes. Microsoft could try to compete with this device, but I don't think I'd want to buy one from them.

        I actually think trying to compete with this device will be quite difficult -- Apple is trading on their ability to produce a really excellently functioning user experience on a well-defined platform. Nobody else really has that to draw on.

        • by fluffernutter (1411889) on Friday May 14, 2010 @01:25PM (#32209340)

          This seems more like a device which is intended to be used differently from your existing devices, and quite possibly in conjunction with them

          It would be great if these devices worked in conjunction with the other devices in my home, but the sad fact is the more 'curated' they are, the less able they are to do so. My iPod touch talks to one machine at a time in my home, I need to be physically connected to that machine, and I need to use a specific application on that machine. That's it! How boring.

          Sure, if I want to, I can load an app that uses some anonymous server in some anonymous location, as long as it is approved by Apple; but if I want to use all the machines in my home, I can forget it. Frankly, I think it is an absolutely abhorrent development in computing, and it is about a dozen steps back.

  • Kiosk Komputing (Score:3, Interesting)

    by microcars (708223) on Friday May 14, 2010 @12:12PM (#32208270) Homepage
    I thought it was a better term. Or maybe Consumption Computing?
  • by KGBear (71109) on Friday May 14, 2010 @12:14PM (#32208286) Homepage
    "moderated computing." Someone other than you decides what you can and cannot do. Good idea from the point of view of end users, people who really couldn't care less about the technology itself, only what it enables them to do. But terrible idea for the rest of us. How long until general purpose computers become a niche application or a hobby like ham radio? And of course become a "boutique" item costing orders of magnitude more than "consumer" toys?
  • Inevitability (Score:5, Insightful)

    by danaris (525051) <danaris.mac@com> on Friday May 14, 2010 @12:14PM (#32208290) Homepage

    Let's face it. We are geeks. We are always going to like the freedom and power to do whatever we want with our computers.

    But we are not the majority.

    Most people don't really care if their operating system allows them to recompile their kernel, write a new text editor, or even install arbitrary software. They would be happy enough to be able to install the stuff their friends have, not have to worry about viruses, and surf the web and chat with the aforementioned friends. And do some occasional work.

    Some of this stuff is still Not There Yet on the iPad. And maybe the iPad itself will not be the dominant device of its type once things settle down in a few more years. But I think it's foolish to expect that the completely-open, easily breakable, general-purpose PC is going to be the only, or even the primary, computing device that most ordinary people use in 10 years.

    PCs will certainly still be around. Business applications, by and large, will always be a poor fit for the iPad and similar devices. So will programming. So will some types of games (but not all!). And, heck, at least for the time being, the iPad requires a computer with iTunes on it for managing it.

    But for the vast majority of people, a fully-featured PC is overkill for what they want to do. We're entering a period of transition—and, I would say, moving further toward the maturity of the computer age. As many people have pointed out in previous discussions, in the 1950s, if you owned a car, you more or less had to know how to do a bunch of basic maintenance tasks. Now, many of the parts you had to maintain no longer exist (such as the carburetor, as I understand it—I'm not a car person), and most of the others you can't maintain on your own: you have to take it to the dealer or an authorized service center, or void your warranty. Computers today are just starting to move past where cars were in the 1950s. It's no longer absolutely necessary to know how to perform maintenance tasks, but it still makes things run much more smoothly. And with the iPad, not only do you not need to do those tasks—you can't.

    For some people, that will always be a dealbreaker. And you know what? That's OK. Apple doesn't care if everyone buys an iPad, any more than they've ever cared that not everyone buys Macs. The world will go on, but changed: instead of just computers, we'll have computers and "curated computing" devices.

    Dan Aris

    • by DerekLyons (302214) <fairwater&gmail,com> on Friday May 14, 2010 @12:37PM (#32208578) Homepage

      Let's face it. We are geeks. We are always going to like the freedom and power to do whatever we want with our computers.

      Some Slashdotters are computer geeks and want all that freedom and power; some of us are other kinds of geeks, and we just want our computers (desktop, smartphone, whatever) to just bloody work. For the first kind of geek computers are a toy, and for the second kind computers are a tool. It's important to recognize the difference, not only between kinds of geeks but also in how computers are viewed even among geeks.
       

      As many people have pointed out in previous discussions, in the 1950s, if you owned a car, you more or less had to know how to do a bunch of basic maintenance tasks. Now, many of the parts you had to maintain no longer exist (such as the carburetor, as I understand it--I'm not a car person), and most of the others you can't maintain on your own: you have to take it to the dealer or an authorized service center, or void your warranty. Computers today are just starting to move past where cars were in the 1950s.

      The problem with that analogy is that it's utterly false - for cars and computers. Garages and PC service techs both emerged fairly shortly after their respective technologies began to spread into the consumer market.
       
      What you're missing is that the story of technology in the mass market is the story of ever simplifying user interfaces. Electric starters replaced cranks, and shells and menuing systems replaced the command line.

    • by Culture20 (968837) on Friday May 14, 2010 @12:37PM (#32208580)

      Most people don't really care if their operating system allows them to recompile their kernel, write a new text editor, or even install arbitrary software.

      I was with you until "install arbitrary software". People _expect_ this, and are rankled when a device that (despite looking like an oversized iPhone) is universally recognized as a computer won't install Mac software they already bought. Insult to injury: they have to pay $15 or more for the privilege of using the same type of productivity software on the iPad that comes free with their usual OSes.

    • Re:Inevitability (Score:3, Interesting)

      by medcalf (68293) on Friday May 14, 2010 @12:49PM (#32208792) Homepage

      I think part of it is simply product maturity. To extend your car analogy, there are still car people. They are insane people who do things like fiddle with the software for their brakes (just to tie back to an earlier slashdot story) and program their fuel injectors and add new power sources. The average "car guy" of the past has been left behind, either to become today's super-geek car guy, or to become an average user of the cars he owns.

      The same thing seems to be happening in computers, where the average computer geek is being left behind. Those who are left will be super geeks on the computers, who actually know how to build their own circuits or use an iPad to transfer software to an Apple // or write code to modify locked down devices; most of the rest will become average computer users.

      I don't see this as a bad thing.

  • by ClosedSource (238333) on Friday May 14, 2010 @12:15PM (#32208306)

    We won't be calling anything "Curated Computing", but remember we can't all be famous phrase coiners.

  • The word (Score:5, Insightful)

    by Spatial (1235392) on Friday May 14, 2010 @12:17PM (#32208350)
    The word is 'appliance'.
  • by Assmasher (456699) on Friday May 14, 2010 @12:22PM (#32208388) Journal

    ...between reducing user choice but improving reliability and efficiency myself.

    Why do non-technical people believe the words that pour out of Jobs' gob? The man, and Apple's advertising, are infamous for saying things he knows are not true. Hell, my favorite recent example of this was when he bashed Flash for being designed for PCs as one of the reasons not to use it on the iPhone/iPad, when his company makes you use Objective-C! LOL. Guess what Objective-C was designed for?

  • by Animats (122034) on Friday May 14, 2010 @12:25PM (#32208412) Homepage

    "My friends, each of you is a single cell in the great body of the State. And today, that great body has purged itself of parasites. We have triumphed over the unprincipled dissemination of facts. The thugs and wreckers have been cast out. -- And the poisonous weeds of disinformation have been consigned to the dustbin of history. Let each and every cell rejoice! For today, we celebrate the first glorious anniversary of the Information Purification Directive! We have created, for the first time in all history, a garden of pure ideology, where each worker may bloom secure from the pests of contradictory and confusing truths. Our Unification of Thought is a more powerful weapon than any fleet or army on Earth. We are one people. With one will. One resolve. One cause. -- Our enemies shall talk themselves to death. And we will bury them with their own confusion. -- We shall prevail!" -- Apple, 1984. That's the copy from the famous Apple ad with the guy speaking to an audience of people in grey from a big screen.

    The Apple fanboys hate that paragraph (and will mod it down to "Troll" in about 30 minutes). But that's a clear statement of Apple's "walled garden" approach. They even use the same terminology: "A garden of pure ideology, where each worker may bloom secure from the pests of contradictory and confusing truths". As for the "Information Purification Directive", see the EFF's analysis of the Apple iPhone Developer Agreement. [eff.org] Apple tries to keep the Developer Agreement secret, but they accepted a NASA app, which made it subject to a FOIA request, and now anyone can read it.

  • by molo (94384) on Friday May 14, 2010 @12:27PM (#32208432) Journal

    Am I the only one who thought of the removal of Gnome UI customization when reading the description?

    -molo

  • by Mutatis Mutandis (921530) on Friday May 14, 2010 @12:29PM (#32208474)

    Curated Computing sounds like a bad idea to me, because those third parties are making decisions without actually knowing my needs and habits as a user. Therefore less choice is very likely to lead to less relevance as well. This is the kind of computing you get in a big company where a central IT department sets policies and standards for everything, and it generally drives people who try to develop something new or display some creativity into raging fury -- even if the choices that are being made for you aren't braindead.

    I think in the long term devices such as the iPad are going to be a success only if they can be personal enough. In theory a more convenient model could be one in which the system learns from my behavior as a user and adapts accordingly. However, so far this tends to be based on a frequency-of-use approach, which is rather limiting. It isn't much help to the less skilled user, who might never be able to find the right options. And there are potential privacy considerations, if this is focused on monitoring the behavior of a single person.

    A better mechanism could be a kind of 'Evolved Computing' working like this: I make myself a member of a peer group, based on common activities and common user interface preferences. I get a software package which may be inherently flexible but complex, perhaps too complex for daily needs. Monitoring the group statistics allows the system's managers to de-emphasize some features, and highlight or offer others that might be attractive to this group. As a user, I can be presented with tools that other members in my peer group have found useful, and can adapt or reject them. Another group may have another set of preferences, of course, but a particular group is offered the relevant subset in its user interface.

    It's nothing really new -- it's traditional user feedback, and the selection mechanism for iPod Apps or other extension packages. But it could be done more smoothly and intelligently.
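
    The peer-group mechanism described above could be sketched in a few lines. This is only an illustration of the idea, not any real system: the feature names, the threshold, and the data shapes are all hypothetical.

    ```python
    from collections import Counter

    def features_to_highlight(group_usage, threshold=0.25):
        """Given one usage log (a list of feature names) per member of a
        peer group, return the features used by at least `threshold` of
        the group -- candidates for highlighting in that group's UI."""
        members = len(group_usage)
        counts = Counter()
        for log in group_usage:
            counts.update(set(log))  # count each member at most once per feature
        return {f for f, n in counts.items() if n / members >= threshold}

    # Hypothetical peer group of four users and their feature usage:
    group = [
        ["mail", "browser", "crop_photo"],
        ["mail", "browser"],
        ["mail", "spreadsheet"],
        ["browser", "mail", "crop_photo"],
    ]
    print(sorted(features_to_highlight(group, threshold=0.5)))
    # prints ['browser', 'crop_photo', 'mail']
    ```

    Features below the threshold (here, the spreadsheet) would be de-emphasized rather than removed, so the flexible-but-complex package stays available underneath.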

  • by alen (225700) on Friday May 14, 2010 @12:30PM (#32208480)

    Over the last 10 years, home laptop sales have outpaced home desktop sales, but in the end most people still do the same basic things on laptops: Internet, email, YouTube, and a few others.

    This is where tablets come in. They are supposed to be good at the basics, with the iPad having other software allowing you to do some other basic computing tasks. The reason Win7 doesn't work on tablets is that the install is 15GB, and with 16-64GB SSD sizes that's too much space lost to the OS. iPhone OS, Android, and WebOS are all under 1GB. They will grow, just like Windows did over the last 30 years. The goal is to get the basics right, take over the market, and add other features later. Just like MS did with Windows.

    I'm thinking of going this way this year: buy a low- to mid-tier desktop PC that I'll build myself, or just buy a Mac Mini. It's enough to play older TW games and Civ4. For average computing on the couch, mobile computing, taking the kids' cartoons everywhere, etc., a tablet is the way to go. It's smaller and lighter than a laptop, and the iPad has some nice build quality; it feels better built than my $1500 HP business laptop.

    The desktop is your server where you keep most of the data; the iPhone/iPad is the client you take with you. The desktop is where you can hack. A lot of people don't care about their iPhone/iPad not being hackable. It's there to get stuff done or consume media, not to explore the file system.

    We've had home server systems being sold, but I think they are a waste of money. You just need an external HD in a Thermaltake external holder to hold your data. They also cost a lot in electric bills, and only the crazy OCD people think they need to have 20TB of their music and porn online and available all the time.

  • by hsmith (818216) on Friday May 14, 2010 @12:33PM (#32208516)
    So far tablets have been "meh" -- they have been around for almost a decade now and have gained no traction. Then Apple comes along and sells a million in a month without even trying. Is it a "power" device? No, it isn't. But it fits a niche. It sent HP's and Microsoft's attempts at reinventing the tablet into a tailspin and caused them to be scrapped. But what niche will the iPad fill? I have one and love it, for a few reasons, but I see it more as a device for my grandparents. They have no real need for a computer and its power. They want to send email, check pictures, and that is about it. Why should they spend $70 a month on cable Internet when they can spend $15 on 3G for the iPad? The iPad and the Ink have potential to become great thin clients as well. Years ago I was looking at the Tough Book "Notepad" thin clients; they were $2000+ EACH and were nothing more than a Citrix client. We shall see. The tablet market has been reinvented. It will be interesting to see what Android vaporware will come out; I'd love to grab one when it finally does.
  • by gimmebeer (1648629) on Friday May 14, 2010 @12:33PM (#32208522)
    This statement alone is enough to disregard the entire article.
  • by bartwol (117819) on Friday May 14, 2010 @12:33PM (#32208526)

    ...where you choose a vendor who will make your computer be reliable.

    Gimme a break.

    Apple weenies (and a bunch of slashdotters too) need to let go of that aged-out belief that Windows isn't reliable, or that an unreliable app makes the whole platform unreliable. You don't need to switch vendors...you just need to stop using the bad app(s).

    Apple users are going to great lengths these days to rationalize the fact that they have chosen a platform with somewhat limited choices. The fact is that they have chosen a computing STYLE, and in so doing, have to some extent limited their computing CAPABILITIES. So, for example, they choose the iPad STYLE and they lose Flash CAPABILITY.

    Enough of the "less-is-more" argument in Apple-land...less may be prettier, but it's still less.

  • by fermion (181285) on Friday May 14, 2010 @12:37PM (#32208570) Homepage Journal
    First, this is a trend. There was a time when I hacked my computer by soldering, when components were big enough for me to fix things in my own home. You don't hear people complain about not being able to solder a computer; that is no longer the expectation. Now people get upset because they can't upgrade a computer, as if removing four screws and pulling a cable gives them any great ability. But that is what the kids call freedom: freedom to go to the store and buy a part. Now most computers are laptops, and hacking is downloading programs and installing them, maybe opening them up and putting in a new hard disk or memory. Apple is a villain because you can't add a battery. And then we get to the silliness of a phone, a device that from any manufacturer is a closed walled garden. I don't see anyone building rogue cell towers at their home to get better reception, or to redirect calls to the landline. Maybe they are.

    And hackers think they are cool because they change the background image or download a naughty application. I am with them. There was a time when I thought putting the Bill & Opus motif on my Mac was the end-all; I thought I was hot. But that is really an adolescent rebellion against anything that is forbidden, not any kind of technical issue. For most of us there are things we hack and things that we need to just work. The PC is in every office because it can be administered and locked down in a way that few other OSes can. No one cares about hacking it because that is not its purpose. The same goes for the iPhone and iPad. How many people complained that they could not hack their Razr? It was a good phone and that is all we cared about.

    If one wants to fiddle, go and buy a copy of Make. What we don't need to do is think that Apple or whoever all of a sudden violated some basic human right. Most of us don't care that we can't pull out the water pump from our car, and are glad that we only have to see the mechanic once a year instead of every week. Most of us don't care that our televisions can't be repaired, but are happy that they give us a few years of good service and then die so we can upgrade. Most people don't want a phone or a computer that they continuously have to fiddle with and upgrade. Those who do have cheap ones they can buy. Just not the iPad. Which is OK, because if one really is a cool hacker, one does not need to show off with an iPad.

  • A better Answer (Score:3, Insightful)

    by 99BottlesOfBeerInMyF (813746) on Friday May 14, 2010 @12:40PM (#32208634)

    Clearly most Slashdot users prefer more choices over someone making choices on their behalf when it comes to computing. That's because we're computer geeks. The average person, however, gets real benefits from having a group of experts exercise more control over the device on their behalf. They also get real negative consequences, such as some applications they want never making it to the device they use, and less ability to migrate devices without losing their investment in apps.

    Okay, we know all that already, right? So now we come to what people are doing about it. Half the vendors are ignoring the benefits Apple has provided, secure in the knowledge that Apple's innovation will lose in the market. Half of them are emulating Apple, betting Apple is right. What none of them are doing, that I've seen, is innovating. Is there really no way to create a system that provides the benefits of "curated computing" without bringing about the drawbacks? Can't someone build a central marketplace for apps that are vetted, and hosted by any and all comers? Can't a phone or series of phones be built where there is a guarantee that the apps will be portable between those phones and have been vetted for security and performance concerns, so the user can make informed decisions? I've long advocated that the average desktop user doesn't have the information or the OS-level control they need to effectively know what apps to run and how much to trust those apps. I've long advocated that the only way to get proper unbiased information is to build into the OS a way to get greylists of what apps are trusted from multiple sources, weigh them, and then take good, automated action on behalf of the user while providing them the details they need. It's easier to put all this power into the hands of one company, but then you end up having to trust a single party (be it Apple or MS). So who's out there making a better solution? Come on Google, I'm looking at you.

    Using an app store should be a process of getting data from many parties. "Three out of four of your security feeds say the battery performance of this app is unacceptable and should be avoided". "Warning: this app only works on this phone and has no vendor promise to allow you to support other AndroidCert phones going forward. Be sure to take this into account." "Warning: this app is rated as malicious by two of your four security feeds. You will need to change your app settings to download it. This is not recommended." In addition, devices should be doing the right thing in the background, sandboxing apps and severely restricting ones that have not been vetted... maybe even refusing to run unsigned apps by default.

    It is not impossible to create a decentralized app store using data and servers from a variety of companies... a personalized store that only shows users the apps that meet their security, performance, and compatibility requirements; or at very least makes the needed data available to the end user. People complain about the Apple iPhone App Store, but complaining is not really very useful. Who's making something better? Who's making something that is going to take hard work, but which will make a store that gives users all the benefits of Apple's store and freedom besides?
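
    The multi-feed greylist idea above could be sketched roughly like this. Everything here is hypothetical illustration -- the feed format, the score values, the thresholds, and the app names are made up, not any real store's API:

    ```python
    def advise(app, feeds, weights=None):
        """Combine trust verdicts for `app` from several security feeds.
        Each feed maps app names to 'trusted', 'unknown', or 'malicious'.
        Returns 'allow', 'warn', or 'block' for the device to act on."""
        score = {"trusted": 1, "unknown": 0, "malicious": -2}
        weights = weights or [1.0] * len(feeds)  # equal weight by default
        total = sum(w * score[feed.get(app, "unknown")]
                    for feed, w in zip(feeds, weights))
        if total >= 1:
            return "allow"
        if total <= -2:
            return "block"
        return "warn"  # mixed or missing verdicts: surface details to the user

    # Three hypothetical subscribed security feeds:
    feeds = [
        {"FlashLite": "malicious", "NotesPro": "trusted"},
        {"FlashLite": "malicious", "NotesPro": "trusted"},
        {"NotesPro": "unknown"},
    ]
    print(advise("NotesPro", feeds))   # prints "allow"
    print(advise("FlashLite", feeds))  # prints "block"
    ```

    The point of the weighting is that no single party's verdict is final: the user picks the feeds, and the device takes automated action only when they agree strongly enough.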

  • by fantomas (94850) on Friday May 14, 2010 @12:41PM (#32208654)

    "...a new paradigm that they are calling 'curated computing,' where third parties make a lot of choices to simplify things for the end user, reducing user choice but improving reliability and efficiency for a defined set of tasks."

    How about "censorship" instead?

    Ok, I know I am playing devil's advocate but if the Slashdot headline was "China develops computing model where users have reduced choice but increased reliability, with the choices made by the State Education Department", I know the word censorship would be bandied around pretty quickly.

    Depends on who you want to make the decisions for you and of course a big question is how much opportunity you have to affect those decisions if you'd like to get involved in the process.

    • by 99BottlesOfBeerInMyF (813746) on Friday May 14, 2010 @01:20PM (#32209244)

      Ok, I know I am playing devil's advocate but if the slashdot headline was "China develops computing model where users have reduced choice but increased reliability, with the choices made by the State Education Department", I know the word censorship would be bandied around pretty quickly.

      There's a difference between censorship and choosing what to sell. When the government says you can't sell Catcher in the Rye, that's censorship. When Barnes and Noble decides not to sell Catcher in the Rye, that's just choosing what they want to sell. The former is an act of the government and the latter is just competition on what to carry. You can always go to another book store. You have no right to force a non-monopolist to carry a given product. When they don't do so... that's not censorship. To be perfectly clear, if China was controlling what choices a user has, that is censorship. If Apple only offers some choices, that's not. You can always use a different phone, or install Android on your iPhone and use a different App store, or jailbreak it and use a different App store.

  • by cwgmpls (853876) on Friday May 14, 2010 @12:50PM (#32208800) Journal

    Once a human is involved, all computing is curated, by definition. Because people don't usually talk binary, but computers do. To resolve this inherent human / computer interface problem, there first were programming languages like assembly. Then there were high-level languages like Fortran or Basic. There were OS commands and command-line interpreters. All of these were curated interfaces; they hid the underlying structure and provided the user only what they needed for a specific task.

    I remember when the first GUI interfaces came out in the early 80's -- people claimed they were not "real computing", but some limited, "curated" interface.

    It is amazing how far we have come since then. But still today, every time someone tries to make an incremental improvement in the human / computer interface, it is still derided as not "real computing", and some kind of strange novelty of limited usefulness. All computer interfaces are limiting, by necessity. That is not a weakness. On the contrary, computer interfaces are most powerful and productive when they provide humans with exactly what they need and nothing more.

  • How is this new? (Score:4, Interesting)

    by LoudMusic (199347) on Friday May 14, 2010 @01:24PM (#32209306)

    How the hell is this new? Because Apple did it? Have none of you been aware of game consoles for the past several decades? How is an iPad particularly different from a Nintendo DS or Sony PS3?
