


Intel-Powered Smartphones Arriving Soon
adeelarshad82 writes "After years of promises to enter the smartphone market, Intel has finally done so. During his keynote at CES, Intel's chief executive Paul Otellini said that Intel has signed Lenovo and Motorola to contracts to use its Atom processors in smartphones. Unlike past launches, Intel has held Medfield back until its partners were ready to go to press as well. According to an early preview, Medfield pairs a 1.6GHz Atom CPU with an SGX540 GPU designed by PowerVR. This is the same GPU we've seen turn up in the Samsung Galaxy Nexus and Droid Razr, though Intel is clocking it higher, at 400MHz. Intel's new SoC encodes video at 720p at 30 fps, can play back 1080p at 30 fps, and supports 1920×1080 output via HDMI. The first smartphone to carry an Intel chip will debut on China Unicom during the second quarter."
Re:You haven't entered the market (Score:0, Informative)
Intel supplies chips, not phones. Intel is now selling their chips to smartphone manufacturers, meaning they are in the smartphone chip supply market.
Re:the specs and benchies are a YAWN (Score:4, Informative)
Intel should be making these on 22nm, with new products being released now.
I think the reason they don't is that 22nm requires more power to operate at a high speed.
??? How does that happen? Typically, the smaller the process node, the lower the energy use for the same design (and the higher the possible clock speed before heat becomes an issue).
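For what it's worth, the usual back-of-the-envelope argument is that dynamic CPU power scales roughly as activity x capacitance x voltage squared x frequency, and a node shrink usually lowers both the switched capacitance and the voltage needed to hit a given clock. A minimal sketch, with invented figures (these are not real 32nm/22nm numbers):

    // Rough illustration: P = a * C * V^2 * f. Shrinking the process node tends to
    // lower C and allow a lower V at the same frequency, so power drops for the
    // same design. All numbers below are made up for illustration only.
    public class NodeScaling {
        static double dynamicPower(double activity, double capacitance, double volts, double hertz) {
            return activity * capacitance * volts * volts * hertz;
        }

        public static void main(String[] args) {
            double f = 1.6e9;  // 1.6 GHz, matching Medfield's clock
            double pOld = dynamicPower(0.5, 1.0e-9, 1.05, f);   // hypothetical older-node design
            double pNew = dynamicPower(0.5, 0.8e-9, 0.95, f);   // same design after a hypothetical shrink
            System.out.printf("older node: %.2f W, shrunk node: %.2f W%n", pOld, pNew);
        }
    }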
Mark my words: in the next 5 years ARM is going to have a CPU good enough for a laptop. The cost structure of Wintel will make this a huge financial/profit opportunity for laptop makers.
I don't think this matters. The CPU is not why laptops are expensive, so even if some ARM manufacturer manages to match Intel in performance (unlikely), what motivation is there to dump all your legacy software that only runs on Intel?
Also, if AMD couldn't overtake Intel with chips that were at times superior, why do you think VIA or Qualcomm could?
The issue isn't cost, but performance. Even a low-end x86 (except Atom and AMD's equivalent) can significantly outperform an ARM chip. Also, have you seen the price of replacement notebook CPUs? They are a lot more expensive than similar desktop CPUs. It's not the only reason notebooks are more expensive, but it certainly is part of it.
ARM may in fact catch up to x86 - the question then is, will Intel focus more on its own ARM development? The performance focus has been moving from per-thread performance (where x86 is usually pretty good, and performance/watt is not necessarily a huge concern) to multi-threaded performance (where performance/watt translates quite well, since you just add more cores to raise overall performance). I believe ARM is better at performance/watt than x86, so with ARM catching up on core count (and probably exceeding x86 soon), x86 may indeed lose its lead. Of course, that only hurts Intel if it stays focused on x86.
Then again, I can see a slightly different future. Both multi-threaded and single-threaded designs have their advantages. For notebooks (and maybe desktops/servers) I can see a primary ARM CPU handling most of the work, and an x86 (or Power, or whatever) taking on the brute-force stuff when needed and otherwise powering down. Of course, applications and libraries would then need to ship both ARM and x86 versions, and the OS would need code to allow cross-talk between x86 and ARM, at least at the cross-process level, but possibly even within processes themselves. Then again, that would be a huge undertaking and possibly not worth the effort.
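To make the performance/watt point above concrete, here's a toy calculation (all figures invented, nothing measured): at a fixed chip power budget, the number of cores you can afford is budget / watts-per-core, so for a workload that scales across cores the total throughput is basically set by perf/watt, not by per-core speed.

    // Toy numbers only: smaller, slower cores can win on total throughput at a
    // fixed power budget if their perf/watt is better.
    public class PerfPerWatt {
        static double throughput(double budgetWatts, double wattsPerCore, double perfPerCore) {
            int cores = (int) (budgetWatts / wattsPerCore);   // how many cores fit in the budget
            return cores * perfPerCore;                       // assumes the workload scales across cores
        }

        public static void main(String[] args) {
            // Invented figures: a "big" fast core vs. a smaller core with better perf/watt.
            System.out.println("4 big cores:   " + throughput(2.0, 0.50, 1.0));   // 4.0
            System.out.println("8 small cores: " + throughput(2.0, 0.25, 0.6));   // 4.8
        }
    }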
Re:Dull Specs, but battery life? (Score:4, Informative)
1080p decode is useful for two things:
1) Decoding 1080p media and scaling it down to the display in real time, eliminating the need for a re-encode. It's inefficient and a waste of space, but in some use cases it's still better than re-encoding before loading the file onto the device
2) HDMI output
Re:Too late (Score:3, Informative)
For the most part, they don't need to. Android has already been ported, and 75% of the apps for Android are written with the standard SDK, meaning they're cross-platform Java applications that run in the VM.
That leaves the 25% of remaining apps that are written with the NDK. Of those, most can be recompiled by the developer with minimal effort (the NDK supports building for x86 or ARM, and most apps wouldn't require any changes to recompile). Of those that can't, or aren't, Intel is going to be supplying binary translation software (read: emulation). That part won't run all that great, but it will run.
Basically, the point is that Android is particularly well-suited to switching between architectures because not much of it (or its apps) is architecture-dependent.
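As an illustration of where that 75/25 split comes from (class and library names here are hypothetical): the only architecture-specific piece of a typical NDK app is the .so it loads, while the Java side is untouched, and rebuilding the native part for x86 is usually just a matter of adding the ABI to the NDK build (e.g. listing both armeabi-v7a and x86 with the old ndk-build).

    // Hypothetical sketch of the ARM/x86 boundary in an Android app.
    // Everything here runs in the Dalvik VM and is architecture-agnostic;
    // only libimagefilter.so (built with the NDK) is compiled per ABI.
    public class ImageFilter {
        static {
            // Loads lib/armeabi-v7a/libimagefilter.so on ARM devices and
            // lib/x86/libimagefilter.so on an x86 phone -- same Java code,
            // provided the developer shipped both .so files in the APK.
            System.loadLibrary("imagefilter");
        }

        // Implemented in C via JNI; this declaration doesn't change per ABI.
        public static native void blur(int[] pixels, int width, int height);
    }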
Re:You haven't entered the market (Score:5, Informative)
Most Android apps are Java; very few use any ARM-specific code at all. They should run fine on Android-x86 (which already exists, BTW).
Anandtech has the numbers (Score:5, Informative)
Re:what kind of power draw? (Score:5, Informative)
That is their claim in the graphs in the article - graphs that don't mention which competing devices are being compared and which have no numbers. But they are claiming to be middle of the pack in idle power consumption, which has always been the fatal flaw in x86 mobile devices until now. If they have really managed to get an x86 to idle at a couple of milliamps of current, then they are probably in the hunt. If not, it is all bogus, like an x86 tablet. Who wants a phone you have to charge daily even if you don't make a call or light up the display? It is all about idle time with these more mobile devices, not how many HD frames you can push for the hour or so the battery can hold up.
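To put "a couple of milliamps" in perspective, a quick back-of-the-envelope calculation (the battery capacity is just a typical smartphone figure of the era, not any specific device): standby time is roughly capacity divided by idle draw, which is why idle current dominates the experience.

    // Rough standby-time arithmetic: hours of standby = battery capacity (mAh) / idle draw (mA).
    // 1500 mAh is a typical smartphone battery of the time, not a specific device.
    public class StandbyTime {
        public static void main(String[] args) {
            double batteryMilliampHours = 1500.0;
            System.out.println("At 2 mA idle:  " + batteryMilliampHours / 2.0 + " hours");   // ~31 days
            System.out.println("At 20 mA idle: " + batteryMilliampHours / 20.0 + " hours");  // ~3 days
        }
    }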
Anand has done a really good job analyzing Medfield's performance and power usage, with actual comparisons against other shipping competitors.
So, to answer your questions:
1. Performance comparison -
SunSpider JavaScript benchmark (lower is better) -
Intel Medfield - 1331 - compare to iPhone 4S - 2250 & Galaxy Nexus running Android Ice Cream Sandwich - 1988
BrowserMark benchmark scores (higher is better) -
Medfield - 116425 - compare to iPhone 4S - 87841 & Galaxy Nexus running Android Ice Cream Sandwich - 97381
Intel's Medfield has a good 20-50% performance lead over currently shipping top-of-the-line ARM. Granted, most ARM phones are due for a refresh in 6 months, which should give them a performance boost of about 30-50%, but Medfield will still be in the same performance ballpark. It's definitely a viable option. Plus, a Medfield upgrade will also come out in 9-12 months.
2. Power consumption on standby -
Medfield standby - 18mW - compare to iPhone 4S - 38mW & Galaxy S2 - 19mW
3. Power consumption during 3G web browsing -
Medfield - 1W - compare to iPhone 4S - 1.3W & Galaxy S2 - 1.2W
4. Power consumption during 720p video playback -
Medfield - 850mW - compare to iPhone 4S - 500mW & Galaxy S2 - 650mW
Barring video playback, Medfield actually has better power consumption numbers than the iPhone 4S and Galaxy S2. Even in video playback, Medfield is only a little higher. Contrary to what people have been warning, Medfield is NOT a power hog and is in fact on par with currently shipping ARM.
Instead of getting into fanboyism, people should be excited by this news. Firstly, Intel is the small underdog here, while ARM is the 800lb gorilla. Secondly, it sounds like a cliche, but competition is almost always A Good Thing. Thirdly, I am personally extremely excited at the idea of a Medfield-based tablet - it would give us enough flexibility to run multiple OSes and the millions of legacy x86 apps and games sloshing around on the great wide interweb. It would also allow us to run Win8 when it eventually releases, which is an attractive proposition.
Lastly, if you put aside the purist RISC/CISC debate, x86 over the years has promoted and ensured an open ecosystem of OSes, applications, websites, and open source projects/communities. For all the goodness ARM contains, its advent in smartphones and now tablets has brought more walled gardens and vendor lock-in, not less. The root cause may very well be greedy corporations (heck, even El Goog is turning into one) and have nothing to do with ARM per se, but I'm calling it like I see it.
Re:What phone-centered software runs on x86? (Score:5, Informative)
Correct me if I'm wrong (and I could be), but AFAIK there is no strong ecosystem of x86 software geared toward use on a touch-screen phone. Granted, Win8 will run on x86 and will probably garner some touch-oriented software for the small screen, but it doesn't exist yet. So if I get one of these phones, which 'apps' will I run? I suppose there is the Android x86 port, but I would imagine that most existing Android apps would fail in that environment.
From Anand:
"By default all Android apps run in a VM and are thus processor architecture agnostic. As long as the apps are calling Android libraries that aren't native ARM there, once again, shouldn't be a problem. Where Intel will have a problem is with apps that do call native libraries or apps that are ARM native (e.g. virtually anything CPU intensive like a 3D game).
Intel believes that roughly 75% of all Android apps in the Market don't feature any native ARM code. The remaining 25% are the issue. The presumption is that eventually this will be a non-issue (described above), but what do users of the first x86 Android phones do? Two words: binary translation.
Intel isn't disclosing much about the solution, but by intercepting ARM binaries and translating ARM code to x86 code on the fly during execution Intel is hoping to achieve ~90% app compatibility at launch. Binary translation is typically noticeably slower than running native code, although Intel is unsurprisingly optimistic about the experience on Android. I'm still very skeptical about the overall experience but we'll have to wait and see for ourselves."
http://www.anandtech.com/show/5365/intels-medfield-atom-z2460-arrive-for-smartphones [anandtech.com]
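Since Intel isn't disclosing how its translator works, here is a purely conceptual sketch of the general idea (a made-up two-opcode "guest" instruction set, nothing to do with real ARM encodings or Intel's actual software): translate a block of guest instructions into host code the first time it's hit, cache the result, and reuse the cached version on later executions, which is why translated code is slower on first pass but tolerable afterwards.

    import java.util.HashMap;
    import java.util.Map;

    // Toy dynamic binary translation: guest "instructions" {opcode, dest, src}
    // with opcode 0 = LOAD_IMM and 1 = ADD are turned into host-side closures
    // once per block, then cached. Real ARM-to-x86 translation is vastly more
    // involved; this only illustrates translate-once, run-many.
    public class ToyTranslator {
        interface HostBlock { void run(int[] regs); }

        static HostBlock translate(int[][] guestBlock) {
            HostBlock block = regs -> {};
            for (int[] insn : guestBlock) {
                final HostBlock prev = block;
                final int op = insn[0], a = insn[1], b = insn[2];
                if (op == 0)      block = regs -> { prev.run(regs); regs[a] = b; };          // LOAD_IMM
                else if (op == 1) block = regs -> { prev.run(regs); regs[a] += regs[b]; };   // ADD
            }
            return block;
        }

        public static void main(String[] args) {
            Map<Integer, HostBlock> cache = new HashMap<>();   // translation cache, keyed by block address
            int[][] blockAt0x1000 = { {0, 0, 40}, {0, 1, 2}, {1, 0, 1} };  // r0 = 40; r1 = 2; r0 += r1
            int[] regs = new int[4];

            // First pass translates and caches; second pass reuses the cached host code.
            for (int i = 0; i < 2; i++) {
                cache.computeIfAbsent(0x1000, addr -> translate(blockAt0x1000)).run(regs);
            }
            System.out.println("r0 = " + regs[0]);   // 42 (the block reloads r0 each pass)
        }
    }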
Re:what kind of power draw? (Score:4, Informative)
Sorry for replying to my own comment, but I just realized that my link to the Anandtech article got stripped out.
Here's the link: http://www.anandtech.com/show/5365/intels-medfield-atom-z2460-arrive-for-smartphones [anandtech.com]
Re:Title fail then (Score:3, Informative)
I've heard similar things about their GPUs (I'm thinking of the older ones).
i945 chipset costs $x
i945 chipset + GMA950 = $x + $0.75
Factor in the marketing scam of "Centrino", which customers thought meant a certain model of CPU but really meant Intel CPU (Pentium M or better) + Intel chipset + Intel Wi-Fi. Throw in a 75-cent GPU and you have a laptop that consumers clawed over each other to get.
Even though Intel has a history of making garbage GPUs.
Their first foray was the Intel 740, a standalone AGP card. What a joke.
The i810 had an integrated version of that GPU, and it was pretty crappy. It didn't even support VESA modes above 640x480x16.
Generations of crappy integrated solutions branded "Exxxtreme graphics"
The GMA 900 (in the 910/915 chipsets), which Intel pushed Microsoft to qualify as Vista Capable even though it was physically incapable of running WDDM drivers and thus couldn't run Aero. Lots of pissed-off customers and OEMs, and lots of lawsuits. I think they had warehouses full of these things and were trying to foist them off. This crappy chip popped up again alongside the Celeron-M in the original 7" EeePC, as well as some early 9 and 10 inch models.
The GMA 950 represents the bare minimum that can run Aero. It was also forced on generations of Atom users when Intel tried to lock nVidia out of the platform (why would users want accelerated HD video on an anemic CPU...).
GMA500 Based on a PowerVR core, the specs are actually good, but it had horrible driver support under Windows and Linux. This was on Z-series Atoms.
Larrabee never panned out as a dedicated GPU either.
And in almost every case an Intel GPU underperforms an equivalent bargain-bin integrated GPU from AMD/ATI or nVidia. Most end users run 2D desktop software and watch video, so they aren't demanding users, but the Intel GMA still represents poor performance value. I think GMA stands for "Gimp-along Media Adapter" or "Garbage Media Adapter". I think some of their more recent attempts are a bit better, but I hate the software associated with whatever's in my i5 company laptop.