Analyzing Apple's iPhone Strategy

Galen Gruman submitted InfoWorld's summary of Apple's grand strategy for the iPhone. He points out that the really important part of the new iPhone is the software, not the hardware. He covers the new SDK, ad-hoc app distribution, and more. It's a reasonable read if you have been ignoring the iPhone and want to know what the hype is about over this release, but it doesn't break any new ground if you've been paying attention.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:ATT Contract (Score:3, Informative)

    by UnknowingFool ( 672806 ) on Wednesday June 11, 2008 @10:30AM (#23747393)
    If it is a deal breaker for you, you can buy an unlocked one from O2 and have it shipped to the US, but it will cost you a lot more. Even if you got one on the T-Mobile network, I would suspect that the network connectivity features won't work that well. I guess the only solution then is to move to Canada. :P
  • Re:Objective C (Score:3, Informative)

    by crmarvin42 ( 652893 ) on Wednesday June 11, 2008 @10:37AM (#23747509)
    Why?

    I'll admit I'm not a programmer and I have a tendency toward reading pro-Apple sites, but I was under the impression that Objective-C is just an extension of C, and that regular C code would compile and run fine without extensive modification.
  • by SilentTristero ( 99253 ) on Wednesday June 11, 2008 @10:39AM (#23747539)
    From TFA:

    Still, neither of the iPhone DRM licenses enables the collaborative development that typifies open source projects. So Apple created a new "ad hoc" license that allows developers private distribution of iPhone executables to up to 100 registered handsets. Groups of coders can share work in progress binaries via e-mail or source code control.

    However, even the ad hoc license is not the wide-open solution that the open source community ultimately desires. An iPhone user should be able to opt into installing and running unsigned applications, a capability offered by all competing mobile platforms.
    This is the showstopper for me. A smartphone without a real freeware ecosystem will never truly thrive, for the same reasons that open source development and commercial software development drive each other on standard platforms.
  • by Anonymous Coward on Wednesday June 11, 2008 @10:43AM (#23747615)

    >Free Playstation 3, Nintendo Wii and Microsoft XBox 360
    Are you fucking kidding me? Someone ban this shit NOW!
  • Re:Objective C (Score:5, Informative)

    by bsDaemon ( 87307 ) on Wednesday June 11, 2008 @10:56AM (#23747835)
    Obj-C is often considered what C++ would have been if C++ had been done right. However, for quite a while only NeXT really used it. GNUstep, which was trying to clone NeXTSTEP, started supporting it as well.

    When Jobs came back to Apple (he had founded NeXT), Apple acquired NeXT and all of its technology. This is when OS X was born, and it's why OS X uses Obj-C.

    So, basically only MacOS X and GNUStep really use Obj-C in any significant way (at least that I'm aware of).

    The syntax is a little weird, and the targeted platforms are somewhat limited, so not many people know it or bother to learn it (unless they want to develop for Mac or GNUstep).

    It's a turn-off because people like familiar things and would rather use C++ or Java than Obj-C, I suppose -- and Obj-C is sort of the barrier to entry to Cocoa.
  • Re:Objective C (Score:3, Informative)

    by afidel ( 530433 ) on Wednesday June 11, 2008 @11:35AM (#23748549)
    MIDP Java is generally pretty small and fast, it's what's used on basically all smartphone platforms other than the iPhone and Windows Mobile (ok there's Symbian native, but I don't think most new development is going that direction due to the portability of Java).
  • Re:Objective C (Score:5, Informative)

    by Lally Singh ( 3427 ) on Wednesday June 11, 2008 @11:40AM (#23748623) Journal

    I'm not trying to slander Java, but I've never used a Java app that doesn't take up a disproportionate amount of processor and memory when compared to the same type of program written in some flavor of C.


    And many have said the same about C vs. assembler. The difference is that going lower-level would add a zero or so to the end of the app's price. Java is substantially easier to write apps in, in certain domains and at certain application sizes.

    C doesn't have features which make it reasonable to write very large applications (namespaces come to mind). You can do it (e.g. Unix kernels), but you have to be much more disciplined without those features. That discipline costs in terms of additional expertise (higher programmer salaries) and project management (more overhead for managers, documentation, etc.).

    Also, Java has features which make writing tools for it substantially easier than C. Better available tools also reduce the cost of software production.

    For the most part, Java sacrifices starting performance for long-term performance. Letting a Java app 'warm up' for a while will show substantially better performance than when it first started running.
  • Re:Objective C (Score:5, Informative)

    by Space cowboy ( 13680 ) * on Wednesday June 11, 2008 @12:07PM (#23749159) Journal
    Hmm - I have to assume you've not used ObjC much or at all - you have to take it with its class library (Cocoa), similar to Java, but it's ridiculously easy to use once you've spent a week or so learning it. Literally, it took me a week to be proficient in this "new" language.

    Applications don't need namespaces - frameworks do, but applications should be perfectly happy being run in their own (default) namespace. I think most people will be writing applications on the iPhone, not frameworks.

    As for tools, XCode comes with data-modelling tools to create entity relationship diagrams/models that integrate with your code, it comes with fantastic dtrace-driven graphical performance monitoring tools, and an excellent integrated gdb-based debugger which does things like fix-and-continue, step back, etc.

    Just putting some context into place,

    Simon
  • Re:Objective C (Score:5, Informative)

    by bsDaemon ( 87307 ) on Wednesday June 11, 2008 @12:10PM (#23749221)
    Yes, you can write generic programs in Obj-C. What are you going to do with it without the library framework, though?

    If you want to actually **DO** anything with it, then you need GNUStep or Cocoa. Sort of like, you can write C programs on any system with libc and the header files available, but without all the fancy extras, like gtk or whatever, you're severely limited in what you can do without having to start from scratch.

    I don't program in Obj-C. I don't use a Mac. I don't want to do either of those things. It's not FUD so much as an explanation.

    If you want to program Mac apps, you pretty much have to use it (or Java) from what I can tell. If you want to use it without GNUstep or Cocoa, then you need bindings for your toolkits, same as anything else.

    Just because you *CAN* do something doesn't mean people really do. Yes, there are Obj-C bindings for GTK. I don't know how many people use them, but I would venture to guess it's not that many. GNUstep software is written in Obj-C, same as Cocoa-using OS X software is.

    That is the point I am trying to make. And I don't "pretend" to use BSD.

    dick.
  • by sgtrock ( 191182 ) on Wednesday June 11, 2008 @12:19PM (#23749421)
    "While Linux likewise has the fanatical user base... they just have no way of monetizing it. Linux users like being locked into that platform, but not enough to actually pay for anything. They are happy to use hardware two generations out of date, happy with being completely locked into FOSS (since extremely few companies will write for Linux), etc, but not happy enough to actually spend any money supporting what they supposedly believe in. Look at Red Hat- they've been doing poorly for years now, and that's not going to change (although their dropping the failed "Linux on the Desktop" project will undoubtedly help them a great deal).

    While Apple has been gaining market share (up to 4-5%)... Linux's has remained flat for the past ten years (always around 0.65%, even as the size of the market has virtually exploded). Meaning... every Apple sold is coming from Linux's share of the market (either actual or potential). Which is good, since Linux has no chance of succeeding in competition with Microsoft, while Apple can do quite well with a tiny market share."

    Sigh. You're wrong on so many points that I don't know where to start. The Linux vendors in the server OS and application space have been making money hand over fist for that same ten years, you know.

    We just needed to see the desktop environment catch up, that's all. We needed the OS itself to get responsive enough in the face of no vendor support (and sometimes downright hostile responses to queries about drivers), we needed the applications to get good enough, and we needed some market force to get people to look at Linux as a desktop appliance. That'll settle the lack of vendor support all by itself.

    We've seen the OS get very responsive indeed, to the point that some games running under Wine are actually faster than the same games in Windows XP on the same hardware. Applications are out there to meet the basic needs of most consumers, while other options are becoming at least tolerable. Driver problems are largely resolved, with only a few holdouts refusing to either release binary drivers (not ideal) or provide any help at all to the people writing FOSS drivers.

    Finally, the fact is that your information about market share is a bit out of date. Every website-tracking company that publishes its global stats, from Hitslink to W3Counter to Xiti to TheCounter, shows that Linux began increasing its market share a while back. Depending on how far back a given site lets you see, you can argue that it started in early 2006. Certainly, every tracking site that goes back to December 2006 shows that when Vista was released, Linux began growing. That's market force number one.

    The second is the release of the Eee PC. All of a sudden, the hardware vendors realized that they could make a pot full of money selling a device without having to include Microsoft Windows or OS/X, and people would buy it. Not just buy it, stand in line all night to get one!

    Microsoft's response? A warmed-over, extremely limited version of Windows XP Home with a drop-dead date that's only two years out, and even then they want the hardware restricted. It has the hardware vendors so unimpressed that they seem to be flat out ignoring it.

    Asus stated that they expected to sell 40% of their eee line as Linux. Asus has also decided to include a small Linux distribution in the BIOS of every motherboard that they manufacture.

    MSI figures 50% of the Winds that they sell will be Linux. Acer has publicly stated that they're moving their entire laptop line over to Linux. Dell is still adding desktops and laptops to the pool of preinstalled Linux boxes (including the mini-Inspiron). HP is offering the Mini-Note with Linux side by side with the Vista versions.

    2008/2009 is the start of Linux moving into the mainstream. It's going to be fun to see how far it gets! :)
  • Re:Objective C (Score:4, Informative)

    by QuantumFlux ( 228693 ) on Wednesday June 11, 2008 @01:07PM (#23750353)
    You don't necessarily need Mac OS X or GNUstep to use Obj-C in any significant way.

    Debian Etch (and many other distros) has both the gcc-objc compiler and libFoundation libraries in the stable repository. I use them all the time to write GUI-less server applications. The Foundation library (the non-GUI toolkit for Objective C) makes it trivially easy (much like Java) to write a little piece of multi-threaded code that sits around waiting for input on a socket - WITHOUT all the overhead of launching yet another JVM instance.
  • by Anonymous Coward on Wednesday June 11, 2008 @01:12PM (#23750477)
    That's kind of shortsighted. Windows Mobile is not bad. It works far better as an embedded OS than Windows does as a desktop OS. There are also thousands of applications out there for it.
  • Re:ATT Contract (Score:3, Informative)

    by Kickersny.com ( 913902 ) <{kickers} {at} {gmail.com}> on Wednesday June 11, 2008 @01:38PM (#23750973) Homepage

    T-Mobile in the US because T-Mobile uses the 1700 MHz band
    Wrong.
    http://en.wikipedia.org/wiki/T-Mobile#United_States [wikipedia.org]

    https://support.t-mobile.com/knowbase/root/public/tm22037.htm [t-mobile.com]

    T-Mobile's domestic roaming partners all operate on the GSM 1900 band.
  • Re:Objective C (Score:2, Informative)

    by anomaly256 ( 1243020 ) on Wednesday June 11, 2008 @01:39PM (#23750999)
    Provided you're willing to install things from outside of Apple's AppStore Applet (I think I just broke my P key), Cydia and AppTapp Installer both have Python bindings for the iPhone's Objective-C UIKit framework and others. Not to mention Java wrappers for those frameworks too. And Ruby wrappers. Really, you're only limited to Objective-C if you _let_ yourself be limited to it.
  • by Jhan ( 542783 ) on Wednesday June 11, 2008 @01:48PM (#23751195) Homepage

    This is the showstopper for me. A smartphone without a real freeware ecosystem will never truly thrive...

    So host your freeware on the AppStore. They seem to encourage it, since a few of the apps in the keynote were free downloads.

    Make it, upload it, set the price to 0. Any iPhone user can download it for zero cost.

    Of course it still sucks that this free program will have been DRM'ed by Apple and can't be freely exchanged between phones, but such is life.

  • Re:Objective C (Score:5, Informative)

    by menace3society ( 768451 ) on Wednesday June 11, 2008 @04:22PM (#23754069)
    Okay, ready to learn Objective-C? Class names normally begin with capital letters and instances of classes begin with lowercase, just like Java.
    You send a message to an object with the syntax [object method:argument], similar to Lisp. If there are multiple arguments, it looks like [object method:argument arg2Name:argument2 arg3Name:argument3].
    You declare classes as follows:

    @interface MyClass : NSObject
    {
              float aFloat;
              NSString *string;
    }
    - (NSString *) string;
    - (void) setString:(NSString *)newString;
    - (float) theFloat;
    - (void) setFloat:(float)value;
    + (NSArray *) someArray;
    @end /* of @interface */

    Obj-C objects are always pointers. Methods (functions) that begin with a '-' are instance methods; they are called on an instance of the object (i.e. [instance method]). Those beginning with a '+' are class methods; they are called with [Class method].
    Use #import instead of #include. #import always checks to make sure it doesn't include a file twice, so you don't need to bother with #ifndef's.
    Here's an implementation file (the instance variables live in the @interface above, not here):
    @implementation MyClass

    - (id) init
    {
              if (self = [super init])
              {
                          string = [[NSString alloc] initWithString:@"This is a string."];
              }
              return self;
    }

    - (void) setString:(NSString *)newString
    {
              string=newString;
    }

    - (void) setFloat:(float)value
    {
              aFloat=value;
    }

    - (NSString *) string
    {
              return string;
    }

    - (float) theFloat
    {
              return aFloat;
    }

    + (NSArray *) someArray
    {
              return [[NSArray alloc] initWithObjects:@"one", @"two", nil]; /* example objects; nil terminates the list */
    }
    @end /* of @implementation */

    You can see that, as in Java, variables are in-scope within member functions.
    The method alloc is implemented in the ObjC base class, NSObject, and allocates memory for the instance. It will always be followed up with an init method of some kind.
    The type 'id' can hold any instance of NSObject or any of its subclasses.
    The variable 'self' refers to the current object. The variable 'super' refers to the current object, interpreted as if it were its parent class. Since every init but NSObject's begins with self = [super init], only NSObject needs to know precisely how the Objective-C runtime is implemented.
    Not shown here is how flags are handled, which is usually of the form [object shouldDoSomething], which then returns YES or NO. To set behavior, it's [object shouldDoSomething:YES].
    In Objective-C, NSStrings are denoted like C strings, but with an @ before the open quote marks: @"This is an NSString." [object description] will return an NSString that tells you something about object, usually for classes within the core frameworks it is a text representation of the data.
    The null pointer as an object is called nil. Messages sent to nil fail silently, so make sure you properly alloc and init your objects, and double-check that they actually respond to the methods you send them.
    Write to the console with NSLog(), which takes an NSString format string plus arguments.
    There. Now you know Objective-C. How the fuck hard was that?
    NB: I wrote this off the top of my head, and it's been a while, so there are probably a ton of bugs in it. But, you get the idea.
  • by sgtrock ( 191182 ) on Wednesday June 11, 2008 @10:23PM (#23758289)
    I think you're missing the point of FOSS. The whole point is to allow people to scratch their own itch. Naturally, that's going to take them in different directions. Naturally, that also means that UI designers are going to go down different roads. Maybe an analogy will help explain what I mean.

    There's an old carpenter's saying that has been adopted by us geeks that you may have heard: "When all you have is a hammer, everything looks like a nail." Well, I don't want just a hammer. I want a full toolbox, the pegboard full of specialized tools behind the workbench, all the power tools in the cabinet to the left, and the floor full of standalone workstation tools (lathes, bandsaws, table saws, etc.) for when I want to do some really heavy work. I don't expect my woodworking tools to all look and act alike. Why should I expect my computer tools to do so when they do such different jobs?

    I should probably note here that I seem to be a rarity in that I really don't like OS/X's UI. It is missing features that I regard as basic requirements after years of using Linux. While not exhaustive, my list of things it's missing includes true maximized windows, multiple workspaces, the ability to have more than one app displayed by default, etc. The showcased OS/X 'features' that I REALLY hate are things like the single menu bar at the top of the screen instead of letting each app display its own menu as part of the window, that incredibly annoying app dock at the bottom, and the very thing that you like most about it: the lack of flexibility in the UI.

    That's not to say either of us is right or wrong about UI choices, btw. What works for you doesn't work for me and vice versa. It's just that in my view, OS/X's major fault is that it assumes that everyone wants to work the way that their UI designers have laid things out. It thinks all anyone wants is a hammer and not a full toolbox.

    In actuality, the incredible flexibility of the FOSS development model and therefore Linux is a strength, not a weakness. It is why you see Linux used for everything from the smallest embedded device all the way up to the largest supercomputers and everything in between. No other OS out there combines that flexibility (the *BSDs can actually exceed it depending on how you measure) with its level of popularity. At this point in time, no other OS out there has the breadth and depth of available applications. Again, you can argue case by case that specific Linux apps or classes of apps don't measure up to counterparts available on other platforms. Taken as a whole, however, it's clear that no other OS can boast as broad a range of successful applications.

    In sum, my contention is that the very thing that you decry, the broad range of UI tools and interfaces, is what will benefit Linux on the desktop the most. The truly successful UI work will continue to gain popularity and see more widespread use as time goes by. The less successful efforts will collect a smaller number of adherents. Some will only see use in niches. Others will simply fade away over time.

    In all of this, who loses? Certainly not the developer community at large, although some number of them will inevitably see their personal favorites wither and die. The developer community will be much larger than it would be if you forced everyone to follow a single model. (assuming you could force a bunch of FOSS developers down a single path. Talk about herding cats!)

    Contrary to what you seem to believe, I don't think the users will be negatively affected, either. We are by nature an extremely adaptable species. You may find this hard to believe, but people have been adapting to new UIs ever since Ogg first tied a rock to a stick that Mog had been using to hit Gog over the head. It probably took Mog at most two tries to figure out which end was the UI and which was the business end of his new club. :)
  • Re:Objective C (Score:2, Informative)

    by Lally Singh ( 3427 ) on Thursday June 12, 2008 @10:06AM (#23763385) Journal
    Oh God, Categories make the fragile base class worse. So much worse.

    Example: I add a method to NSString, and so does someone in a library I'm using. Or just another programmer on the team, and our dev process doesn't manage categories as the landmines they really are. Same name, different semantics.

    Pop Quiz: which version gets loaded into the executable?

    Answer: No way to tell! It depends on the linking order! And no matter what, someone's code is going to get the wrong semantics for this method!

    Examination: There is no way to catch this at compile time. It happens and you have to run it in the debugger to figure out which category actually runs.

    The fragile base class problem refers to C++'s binary layout dependencies for superclasses: if you change a base class in one binary, then you have to recompile every shared library, plugin, etc that it links to. Otherwise it will use the old layout, which will lead to terrible, incredibly-painful-to-debug things. Some code thinks you have a 24 byte object, and appends subclass's members at offset 24. Others think you have a 20 byte object, and append subclass's members at offset 20. Hilarity ensues.

    Obj-C's binary layout mechanism avoids this. But categories are just as bad as the original fragile base class problem. It only works if you track *exactly who* adds what categories to what classes. It's a backdoor hack. But in too many places, it's been encouraged as a primary method of problem solving. What happens when two people want to implement NSTableDataSource differently on the same container? (If my understanding of which interface you implement is off, remember it's been 4-5 years since I really used Cocoa.)
