Dual-Core CPU Opens Door To 1080p On Smartphones 314
An anonymous reader writes "Following Qualcomm, Samsung is also close to launching a new smartphone processor with two cores. Based on ARM architecture, the new Orion processor promises five times the graphics performance of current chips and to enable 1080p video recording and playback. Next year, it seems, dual-core smart phones will be all the rage. Apple, which is generally believed to have the most capable processor in the market today, may be under pressure to roll out a dual-core iPhone next year as well."
Dual core iPhone (Score:2, Funny)
Re: (Score:3)
Okay then (Score:5, Interesting)
Re: (Score:2)
The future is now then, that's not much of a prediction.
Re:Okay then (Score:5, Interesting)
The GP2X is a dual 200MHz processor handheld console that runs off 2 AA batteries and was released in 2005. It was not that revolutionary, it was not that expensive, and it was not that difficult to program. You can program on both chips so long as you don't mind temporarily ditching the built-in MPEG-decoding firmware that runs on one of them (it's replaced on each bootup) or being very careful where you tread. The Nintendo DS was multiprocessor too, and not in the loose sense of "one CPU and a handful of support chips": two real, full, general-purpose, programmable chips. Most modern games consoles have more than one real chip, many of them using multiple specialised processors (e.g. IBM's Cell in the PS3, which is a nine-core part).
Massive parallelism has been around forever and it's in consumer electronics already and has been for quite a while. It might not be *designed* like that but you have always had home computers with multiple processors that can be programmed to operate in parallel. There were people misusing floppy drive controller chips and sound processors to do all sorts, and the GPU is "another" processor now, and one that's extremely good at running lots of things in parallel. Do you think that your CPU could ever keep up with software-rendering on a modern game? Or that a single-core "general purpose" GPU could?
Your professor is right, except for his timescales - parallelism is already here, right down to tiny embedded systems, and has been for years. It's just that hardly anybody uses it without some seriously specialised programming design beforehand. That tells you quite a lot about how expensive it is to use effectively. Hell, to a new programmer, even threading and re-entrancy can be a huge nightmare, and our solution at the moment is blocking locks and things like that (so that we can *think* about them as separate linear instances). If you can become an expert in parallelism you'll probably have a good career ahead - but a specialist one. Everyone else is just waiting until they can pass everything off to a dedicated chip and get told when it's finished - they don't *WANT* to properly program parallel programs (tongue twister!), they just want everything to happen serially and faster, or the closest approximation to that they can get.
Seriously, when you program any graphics library, you just throw a huge amount of 3D object data into another processor and let it get on with it. You don't care if it runs at 6GHz, or whether it's got a separate internal processor for every pixel of the final image running at 1MHz, so long as it damn-well draws the screen as quickly as possible and tells you when it's done. Parallelism is a hidden facet of modern architecture precisely because it's so necessary and so damn complicated to do it right. Programmers, on the whole, are much happier with linear processes. It's taken games this long to pass off physics, AI, etc. to a separate thread, and we've had threading for DECADES.
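A minimal sketch of that "pass it off and get told when it's finished" style, using Python's standard concurrent.futures; the render_chunk work function is a made-up stand-in, not anything from a real graphics library:

    # Offload work to a pool and wait for completion notifications,
    # so the calling code can stay comfortably serial.
    from concurrent.futures import ThreadPoolExecutor

    def render_chunk(chunk_id):
        # Stand-in for expensive work we would rather not think about in detail.
        return sum(i * i for i in range(chunk_id * 100_000))

    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(render_chunk, c) for c in range(8)]
        for f in futures:
            print(f.result())  # blocks until that piece of work is "done"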
Parallel things are just a placeholder at the moment because we can't up the GHz much, and even if we could, it wouldn't help that much with the tasks we want to do. So even operating systems are handing off everything they can to other chips - Ethernet TCP offloading, GPU, sound processors, RAID, you name it. It's all about making the general-purpose CPUs do as little as possible so the main use of the computer can continue uninterrupted. And parallelism is only used to increase the things we can do, not to break tasks into much more efficient subtasks. Most people who had dual-core or better in the early days wanted it so that other things (e.g. antivirus) got out of the way of the real process (e.g. games!) so it could have as much time running in its single thread as possible.
Parallelism can see fantastic gains but it needs to be done from the design stage each time. Mainstream multi-core products are too much of a moving target to do anything more fancy than run a number of huge, single-threaded programs at the same time. That's *not* the same thing, and never will be. Parallelism is specialised. Games programmers would give their right arm for everyone to have a single 100GHz processor instead.
Re:Okay then (Score:5, Informative)
Smartphones have been multicore for at least ten years. The early ones had two separate CPU cores, one for the application stack and one for the phone stack. One of the design requirements for the Symbian EKA2 kernel was a hard realtime nanokernel that could run both stacks as completely independent software stacks on the same CPU core, cutting costs.
Even when they then only had one ARM core, the SoCs were heterogeneous multicore chips. Something like TI's OMAP3530, found in a lot of devices, has a CPU core, a GPU core, a DSP core, and a couple of other specialised cores.
That's why the headline here is quite misleading. Doing 1080p H.264 decoding on a pair of 1GHz Cortex A9 cores might be possible, but it seems very unlikely. Chips from the last generation, however, could all do 720p H.264 decoding on the DSP and/or GPU cores. This is not a chip that has enough processing power in the ARM cores to decode 1080p video; it is a chip that has two ARM cores and also, independently, has enough processing power to decode 1080p H.264 streams.
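Rough pixel-rate arithmetic behind that point (the 30 fps figure is an assumption for illustration, not from the article):

    # Pixels per second a decoder has to produce, assuming 30 frames per second.
    fps = 30
    rate_720p = 1280 * 720 * fps    # ~27.6 million pixels/s
    rate_1080p = 1920 * 1080 * fps  # ~62.2 million pixels/s
    print(rate_1080p / rate_720p)   # 2.25: 1080p is over twice the pixel throughput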
This part of TFA made me laugh:
Apple’s A4 processor, which is based on an ARM Cortex-A8 design, has been generally described as the most capable chip combination in the smartphone landscape today
I don't know who these people are, but I suspect that they are ignorant Apple fanboys. The thing that makes the A4 interesting is that it removes a lot of stuff that most ARM SoCs ship with, because Apple didn't need them. Photos of the die indicate that the A8 core (the same core that everyone else has been using for a year or two) is a stock part, unlike, for example, the Snapdragon which is very heavily tweaked (new floating point pipelines and so on).
Re: (Score:3, Funny)
But if we believe the parallel universes theory, aren't we all programming in parallel already?
Re: (Score:2, Insightful)
Re:Enlighten me please (Score:5, Informative)
It seems they are talking about recording 1080p, not viewing 1080p. You don't have to view your recorded videos on the phone.
Re: (Score:2)
It would be nice to be able to play it back on a hooked-up TV straight after taking it... that still requires the device to be able to play back 1080p, which is far easier than encoding it...
Re:Enlighten me please (Score:5, Insightful)
Re: (Score:2)
What's wrong with using this pixel abundance to shoot 1080p instead? There would still be enough pixels to allow for noise reduction.
Re: (Score:3, Insightful)
How does lossless digital zoom compensate for lens quality? As for noise reduction, wouldn't a lower resolution sensor of the same physical size be better than an algorithm?
Re: (Score:2, Interesting)
How does lossless digital zoom compensate for lens quality?
I just meant it as an example of the fact that current phones' camera sensors do offer more pixels than are needed for 720p recording, so they use them for nice extras such as digital zoom. So why couldn't the sensors of the near future use them for 1080p recording instead?
As for noise reduction, wouldn't a lower resolution sensor of the same physical size be better than an algorithm?
A lower resolution sensor, with the same S/N ratio, would capture less detail. Just dismissing higher resolution sensors as a marketing gimmick ignores the technical advancements in image sensors.
I suppose that higher resolution sens
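Quick numbers on the "more pixels than needed" point; the 5 MP and 8 MP sensor sizes below are assumed typical figures, not ones from the thread:

    # A 1080p frame is only ~2.1 megapixels; 720p is ~0.9 megapixels.
    mp_1080p = 1920 * 1080 / 1e6   # ~2.07
    mp_720p = 1280 * 720 / 1e6     # ~0.92
    for sensor_mp in (5, 8):       # assumed sensor resolutions
        print(f"{sensor_mp} MP sensor: {sensor_mp - mp_1080p:.1f} MP left over at 1080p")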
Re: (Score:3, Interesting)
I think GP is talking about the fact that you only get the higher resolution with optics of matching quality. All optics have some inherent blurring; only the degree depends on quality (and with very high quality optics, you may run into the diffraction limit: http://en.wikipedia.org/wiki/Diffraction-limited [wikipedia.org])
So if you combine a high-resolution sensor with a second rate lens, all you achieve is that the blur where a sharp edge should be is spread across more pixels.
To some degree this is OK because it makes the i
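For the diffraction-limit point, a back-of-envelope check: the Airy disk diameter is roughly 2.44·λ·N, so once it grows past the pixel pitch, extra sensor pixels stop buying extra detail. The wavelength, f-number and pixel pitch below are assumed typical phone-camera values, not figures from the comment:

    # Diffraction-limited spot size vs. pixel pitch, with assumed typical values.
    wavelength_um = 0.55   # green light, micrometres
    f_number = 2.8         # assumed phone-lens aperture
    pixel_pitch_um = 1.4   # assumed small phone-sensor pixel

    airy_diameter_um = 2.44 * wavelength_um * f_number  # ~3.8 um
    print(airy_diameter_um, "um blur spot vs", pixel_pitch_um, "um pixels")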
Re: (Score:2)
1080p on a cellphone is nothing more than a sales gimmick
It's irritatingly commendable that cellphone makers feel they must create a market for a resolution at 5-inch sizes that doesn't even exist at 4 to 7 times that size for living-room enjoyment.
Where the hell is our overdue 1080p on sub-40-inch TVs? The industry is holding out at the low end: if it weren't for this 3D LCD fad, we'd sooner be seeing 20-inch TVs finally get 1080p so they, and classroom slide projectors, can finally catch up to 15-year-old desktop resolutions.
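For scale, the pixel density a 1080p panel would give at a few screen sizes (a quick sketch; the diagonals are just examples):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # Pixels per inch along the panel diagonal.
        return math.hypot(width_px, height_px) / diagonal_inches

    for diagonal in (4, 20, 40):   # example screen sizes in inches
        print(f'{diagonal}": {ppi(1920, 1080, diagonal):.0f} ppi')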
Re: (Score:2)
On Google [google.com] you lazy ass idiot.
Re: (Score:2)
classroom slide projectors
Remove the words "classroom" and "slide" from there. I think those have always been analog. I meant to highlight how 640x480 and 800x600 plagued projectors, so that college presentations meant for 1024x768 and higher were butchered because the prof had to manually lower the resolution just so they could be projected overhead for the class. This sacrificed font clarity and screen real estate, and it still seems to be a problem on our more modern business Smartboards when the video source is on PCs and laptops desi
Re:Enlighten me please (Score:4, Interesting)
Except that they're shrinking the die size enough that these new dual-core, 1.5GHz+ chips will use less power than the current Snapdragon ones.
Re: (Score:3, Insightful)
As an owner of 1080p recording equipment, let me say that any phone that records 1080p will be a giant turd.
When you record at 1080p your lens quality becomes very important. Even at 720p you can get away with crappy NTSC lenses, but 1080p will show off bad glass right away.
1080p in a phone is a ragingly stupid thing. It's stupid in all the consumer camcorders that are under $1800.00 right now as well. (Nasty purple fringing and plain old awful lenses on these things coupled with really low record
Re: (Score:2, Interesting)
Re: (Score:2)
Re:Enlighten me please (Score:5, Informative)
The feature is important for 1080p output; combined with HDMI it makes a phone compatible with most projectors, LCD/LED TVs and modern monitors. I can easily see myself walking in and displaying a video or presentation stored on my phone. Ideal for impromptu sales pitches or just bringing a movie over to a friend's place.
Re:Enlighten me please (Score:5, Funny)
I have a hard time understanding how 1080p is such a great feature on screens 4" or smaller in diameter.
You raise an interesting question, one which will likely point to the next big paradigm in smartphones: circular screens.
Re: (Score:2)
and this new Samsung SoC design will most definitely find its way into systems like the next-gen Apple TV, plenty of Samsung TVs and cheapo STBs.
and even with a crappy lens it's still
Re: (Score:2)
What good is a delicate tiny miniaturized HDMI connector with a great stiff heavy hulking cable attached to it?
LG is releasing a series of dual-core smartphones (Score:5, Informative)
Screw smartphones, how about a new NSLU2 (Score:2)
...with one of those bad boys in it. And an eSATA interface. All your home server problems solved!
Re: (Score:2)
"Plug computing" ((c) Marvell) is the successor of that direction. There's stuff like this [globalscal...logies.com] hitting the market, though still single-core for now.
Once these new Cortex A9s are more in the wild, I'd expect them to find a home in the wall wart machines as well.
Not dual core. More RAM! (Score:2, Insightful)
Now I can watch amazing 1080p on a 4.5" screen. My cinema experience is now complete.
Smart phones don't need dual core. They need more RAM.
App designers are guaranteed certain resources when the application runs on a phone. This is why a single-tasking paradigm was popular, because it simply guaranteed these resources. Multitasking requires sharing of memory. Without swap space enabled, memory may run out quickly. Android has mechanisms for saving a program's state and killing off the least-recently-used ap
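A toy sketch of that least-recently-used eviction idea (a generic Python illustration, not Android's actual low-memory killer):

    from collections import OrderedDict

    class AppTable:
        """Keeps at most max_apps 'running' apps; evicts the least recently used."""
        def __init__(self, max_apps):
            self.max_apps = max_apps
            self.apps = OrderedDict()          # name -> saved state

        def bring_to_front(self, name, state=None):
            if name in self.apps:
                self.apps.move_to_end(name)    # mark as most recently used
            else:
                self.apps[name] = state
                if len(self.apps) > self.max_apps:
                    victim, _ = self.apps.popitem(last=False)  # kill the LRU app
                    print("killed:", victim)

    table = AppTable(max_apps=3)
    for app in ("mail", "browser", "maps", "mail", "music"):
        table.bring_to_front(app)              # prints "killed: browser"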
Re: (Score:3, Interesting)
Yep, I have ~400MB available on my Desire (576MB in there total according to the spec sheets), and it's still not enough.
My old Milestone (256MB RAM) was constantly killing off applications in the background because it was running out of RAM, sometimes not even saving the app states properly, causing me to lose my place (doesn't sound too bad, but it gets annoying quickly)... my Desire fares better, but there's still the occasional low memory kill when I have a lot of browser tabs open.
What I want is a giga
Killer feature. (Score:3, Insightful)
And by killer, I mean battery killer.
I think smartphones need to go back to basics. I'd take a smartphone that lasted 4 days of normal use on a single charge anytime over a new one that does shit I don't really need anyway 10% (or even 30%) faster.
Once they've got battery life back under control, get back on performance.
Re:Killer feature. (Score:5, Funny)
Fine. So you're a member of the 1% of all cellphone users that doesn't regularly connect their phone to their TV to watch HD movies.
Maybe you should try joining the rest of us in the 21st century, with chargers at home, at the office and in our cars!
Re: (Score:2)
you're a member of the 1% of all cellphone users that doesn't regularly connect their phone to their TV to watch HD movies
Did you mistype, or are you really claiming that 99% of all cellphone users regularly connect their phone to their TV to watch HD movies?
Or is this a "whoooooosh" moment for me?
Re: (Score:3, Interesting)
Certainly a woosh.
OTOH, when I bought my Evo, I read plenty about the poor battery performance. Not a big deal, I thought, as I didn't buy it because I make a lot of calls, and I figured that I would have some buffer to work with because of that.
Unfortunately, what I didn't necessarily predict, is that I would be using wifi tethering as much as I do. This eats up battery pretty quickly. I bring my netbook to all sorts of places so that I am not tied down with school, and can work on and turn in assignments even
Re:Killer feature. (Score:5, Interesting)
Not necessarily; some tests by ARM/Symbian/Nokia strongly suggest that an n-core chip at x frequency is a good way to get considerable energy savings over a single-core chip at n*x frequency. Of course, whether or not it would be used that way is another thing...
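The usual reasoning is the CMOS dynamic-power relation P ≈ C·V²·f: halving the clock typically allows a lower supply voltage, so two slower cores can finish the same work for less energy than one fast core, provided the workload parallelises. A toy comparison with made-up voltage numbers:

    # Relative dynamic power, proportional to cores * V^2 * f (capacitance ignored).
    def rel_power(cores, volts, freq):
        return cores * volts ** 2 * freq

    single = rel_power(1, volts=1.2, freq=1.0)   # one core at full clock
    dual = rel_power(2, volts=1.0, freq=0.5)     # two cores at half clock each
    print(f"dual/single power: {dual / single:.2f}")  # ~0.69 for the same total throughput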
Re:Killer feature. (Score:5, Informative)
Battery life is fine if you keep the screen off. I get a standby power draw of roughly 5mA on average on my Desire. That works out to about 280h of standby time, and that's with a bunch of always-connected applications (Google Sync always active, an IM client, SIP client) in the background, and WiFi and Bluetooth on. Turn all that stuff off and I get values more around 3mA... 466h.
Obviously a screen that draws almost 100x as much (seriously, at full power the AMOLED screen draws close to 300mA!) is going to kill off the battery very quickly.
In comparison, the SoC uses very little power (full CPU load on the Desire's Snapdragon is 40mA higher than idle - tested with SetCPU's stress test) and scales very well with load. If you really want to increase use time, build more efficient screens... fuck the processor.
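Those figures check out if you assume the Desire's roughly 1400 mAh battery (the capacity is my assumption; the current draws are the poster's):

    battery_mah = 1400   # assumed HTC Desire battery capacity
    for label, draw_ma in (("standby, everything on", 5),
                           ("standby, radios off", 3),
                           ("screen at full power", 300)):
        print(f"{label}: ~{battery_mah / draw_ma:.0f} hours")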
Re:Killer feature. (Score:4, Interesting)
Wow, I want these new cellphone chips in notebooks and network attached devices! They are far ahead of the watt sucking crap from Intel and AMD.
Re: (Score:2)
Well, they ARE already partially available in NAS type devices. As a matter of fact, the cheapo Buffalo LinkStation Lite I have sitting out in the hall runs on an ARM chip of some sort, IIRC...
Smartbooks and tablets are starting to run on ARM as well. Shouldn't be too long until your wishes are granted ;)
Re:Killer feature. (Score:4, Funny)
Battery life is fine if you keep the screen off. I get a standby power draw of roughly 5mA on average on my Desire. That works out to about 280h of standby time, and that's with a bunch of always-connected applications (Google Sync always active, an IM client, SIP client) in the background, and WiFi and Bluetooth on. Turn all that stuff off and I get values more around 3mA... 466h.
Obviously a screen that draws almost 100x as much (seriously, at full power the AMOLED screen draws close to 300mA!) is going to kill off the battery very quickly.
In comparison, the SoC uses very little power (full CPU load on the Desire's Snapdragon is 40mA higher than idle - tested with SetCPU's stress test) and scales very well with load. If you really want to increase use time, build more efficient screens... fuck the processor.
So what you're saying is that if you never actually look at your phone, or use it as a phone, you can run idle applications in the background. Colour me impressed.
Re: (Score:2)
Correct. That's why I never manage more than 12 hours on a single charge, and carry around an external USB power pack with a 4400mAh LiIon battery in it...
The gist of what I was trying to say was, however, that the CPU is NOT the component that draws so much power on smartphones; it's the screen, and to a lesser extent (contrary to popular belief) the radios.
Re: (Score:2)
Battery life is fine if you keep the screen off. I get a standby power draw of roughly 5mA on average on my Desire. That works out to about 280h of standby time, and that's with a bunch of always-connected applications (Google Sync always active, an IM client, SIP client) in the background, and WiFi and Bluetooth on. Turn all that stuff off and I get values more around 3mA... 466h.
Obviously a screen that draws almost 100x as much (seriously, at full power the AMOLED screen draws close to 300mA!) is going to kill off the battery very quickly.
In comparison, the SoC uses very little power (full CPU load on the Desire's Snapdragon is 40mA higher than idle - tested with SetCPU's stress test) and scales very well with load. If you really want to increase use time, build more efficient screens... fuck the processor.
So what you're saying is this new dual core processor will be great for watching 1080p movies with the screen off (or on an e-ink screen) ?
Re: (Score:3, Interesting)
No, I'm saying that given the same other hardware, this processor won't affect battery life negatively in a very noticeable way.
Battery life is already crap, and it's not because of the processors used. All this power optimization should be taking place where it's needed most... crappy AMOLED screens with twice the power draw of LCD when displaying anything remotely useful (i.e. not a mostly black screen), for instance.
Re: (Score:2)
True, but then you wouldn't be able to receive calls ;)
Just saying, the screen is (in most use cases) the component that draws the most power, usually by far. Manufacturers need to stop screwing around optimizing power draw for when the phone is idle with the display off, but rather make it so that it'll last 24 full hours with the screen ON!
WTF do I need 200+ hours of standby time for if after using the phone for 6 hours heavily it's completely dead?
Nokia also uses ARM 11 (Score:2)
"Apple, which is generally believed to have the most capable processor in the market today"
Huh? I thought Apple used the same processor, ARM 11, as Nokia.
Re:Nokia also uses ARM 11 (Score:4, Informative)
iPhone 4, iPod Touch gen.4, iPad, and Apple TV gen.2 all use the Apple A4 processor, which is an ARM+GPU manufactured by Samsung.
Yes but the tech press loves Apple (Score:3, Funny)
If you haven't noticed, everything Apple does is always "brilliant" and "innovative" according to the tech press. Doesn't matter if they are releasing something that is the same as everything else. For example the Apple TV gets praise lavished on it as an amazing on-demand streaming device, even though nearly every Blu-ray player with an ethernet port also does streaming and, of course, plays DVDs and Blu-rays on top of that.
For that matter, it might even be Apple PR copied verbatim. It is amazing how many
Re: (Score:2)
After that, Apple will write a benchmark for Anandtech, who will publish it without question.
I instantly know when I hear things like "Snappier", "User Experience", "Superior UI" and "responsive/ness" that the poster has absolutely
Re: (Score:2)
Totally OT, but I remember visiting Anandtech when the kid was still in high school and thinking, wow, what a bright young dude. The last time I was there, however, it seems as though it's nothing but a bunch of what you just mentioned.
Sorry, though, I don't even have a lawn.
Re: (Score:2)
Probably a bit but I've never seen one of his tests replicated. Plus he tends to leave out a lot of factors (such as the starting and ending dBm for the "Death Grip" tests he did).
I used to, then the housemate killed it (lawns do use a lot of water and need a lot of maintenance). However if it makes you feel better you can get off my woodchips.
yes but nokia is taiwanese (Score:2)
or Japanese or Swedish or Swiss or something like that. Apple is American.
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:3, Informative)
Re: (Score:2)
Lies! Apple puts the second core in the sticker!
Obligatory XKCD (Score:2, Insightful)
The most capable mobile processor (Score:5, Informative)
Apple, which is generally believed to have the most capable processor in the market today, may be under pressure to roll out a dual-core iPhone next year as well.
This is silly. Apple is using Samsung's processor, an OEM version of the Hummingbird (which is not exclusively sold to Apple by any means). So if anyone has "the most capable [mobile] processor in the market today" (and even that statement could be debated), it's Samsung (certainly not Apple).
Re:The most capable mobile processor (Score:5, Informative)
A4 has a PowerVR SGX535 GPU, which can push 28 million triangles/sec whilst the Galaxy S has a PowerVR SGX540 GPU that pushes 90 million triangles/sec.
http://bit.ly/bM3JeK [bit.ly] note: the article lists the iPhone 3GS at 7 million triangles/sec with 28M struck out, but IIRC it's actually the other way around (7M was rumoured, but it was actually 28M).
Re: (Score:3, Insightful)
Distortion field - the GPU isn't Apple's, and neither is any part of the SoC they call the A4. It's just a marketing diversion. A more truthful way to put it would be to say that it's Apple's codename for the chip they're currently buying, much like G4 was a marketing term attached to some CPUs Apple bought from elsewhere earlier; it didn't make them Apple-designed CPUs.
There are other "but"s that could be attached to it, but none of them really changes the fact that anyone with cash could order chips pretty muc
Tablet/Dock (Score:2)
Where to get data from + Re:Tablet/Dock (Score:2)
Well, a 10" touchscreen is not the point. The point is that you want to connect it to ANY 45" TV via an HDMI link. You've got to have a little bit of imagination there. Couple it to a Bluetooth mouse, a BT keyboard and a power adapter and you have a mobile workstation.
However, if you want to use it as a media player, the market forgets that not every 1080p is equal. The main problem is how to get data INTO the phone. An HD Blu-ray image is 40GB. With current SD card storage you can load one of such 108
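To put numbers on the storage problem: a Blu-ray image may be 40GB, but phones encode at far lower bitrates, so a card goes a lot further (the 20 Mbit/s figure below is an assumed recording bitrate, not from the article):

    # Minutes of 1080p footage that fit on a card at an assumed encode bitrate.
    card_gb = 32
    bitrate_mbit_s = 20                                    # assumed phone encoder bitrate
    seconds = card_gb * 8 * 1000 / bitrate_mbit_s          # GB -> Gbit -> Mbit -> seconds
    print(f"~{seconds / 60:.0f} minutes on a {card_gb} GB card")   # ~213 minutes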
Given the restrictions ... (Score:2)
...on the phone by the manufacturers and carriers, what's the whole point of having that much power? Recording and watching 1080p video? Pfft... the lack of imagination is pathetic. I have tons of apps that I'd like to work on, if only the phone platform were as open as the PC platform. Laptops just don't have the mobility and form factor required for ubiquitous interaction.
I just wish more manufacturers would put out high-end mobile devices for the MeeGo platform. Can't wait to get my hands on the
Misplaced Priorities (Score:2)
We're all worried about smartphone CPUs that can decode 1080p video when none of them have screens that can display it.
Stop and think about it for a minute.
Marketspeak has totally infiltrated discussion about display resolutions and I am branded as an idiot when I bring up that 720p involves a lot more than 720 vertical pixels, and that progressive scan doesn't mean shit on LCDs to begin with.
I would love to be proven wrong here, but I have no idea why progressive v. interlaced is even brought up anymore.
Re: (Score:3, Interesting)
HDMI or DisplayPort out? You will likely find more and more products with this port: http://en.wikipedia.org/wiki/PDMI [wikipedia.org]
Consider: where before only Apple could claim having a single standard port, now everyone can. End result: you can dock any device to any TV without worrying about carrying the right cable. Should make parties more interesting, I suspect.
Misleading Headline? (Score:5, Informative)
It's also about maths (Score:2)
I wonder whether multiple cores are enough, or whether you also need something like DMA [wikipedia.org] or a data crossbar [wikipedia.org] in the hardware architecture, and whether current SD cards can cope at that pace.
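On the SD card question, the sanity check is simple: compare the encoder's output rate with the card's minimum rated sustained write speed, 10 MB/s for a Class 10 card (the 20 Mbit/s recording bitrate is again an assumption):

    video_mbit_s = 20                 # assumed 1080p recording bitrate
    class10_write_mb_s = 10           # Class 10 minimum sustained write speed
    needed_mb_s = video_mbit_s / 8    # 2.5 MB/s
    print(f"headroom: {class10_write_mb_s / needed_mb_s:.1f}x")   # ~4x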
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Compression, I understand, is mandatory for storage because of external memory bandwidth constraints
Re: (Score:2)
Re: (Score:2)
If it's run on any of the CPUs/GPUs, then the stream needs to be transferred to the CPU/GPU RAM
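A rough idea of why compression matters for bandwidth: an uncompressed 1080p stream in YUV 4:2:0 at 30 fps (both assumptions for illustration) needs on the order of 90 MB/s, versus a few MB/s once encoded.

    # Uncompressed 1080p30 in YUV 4:2:0 uses 1.5 bytes per pixel.
    width, height, fps = 1920, 1080, 30          # assumed frame rate
    bytes_per_pixel = 1.5                        # YUV 4:2:0
    raw_mb_s = width * height * bytes_per_pixel * fps / 1e6
    print(f"raw: ~{raw_mb_s:.0f} MB/s")          # ~93 MB/s to move around uncompressed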
Re: (Score:2)
Interesting
The original Nintendo DS has a 2 core CPU (Score:2)
Re: (Score:2, Informative)
HTC Tilt 2 ? (Score:2)
I'm not entirely certain, but I thought that the HTC Tilt 2 (AT&T and others) had two processors. Granted, one was for the phone subsystem, and the other for the OS, but still. Maybe someone can elaborate.
Either way, I've been thinking we need to add more processing power to phones for a long time. They are way too slow for the things we want to be able to do with them. But, the offset is the battery life. :(
HD? bah. (Score:2)
Wake me up when smart "phones" do 1080p 3D (two CCDs) @ 60fps, for 8h on one battery.
Re: (Score:2)
Wouldn't that ideally be 6 CCDs? Three for each "eye", one each for R, G and B.
David Lynch (Score:3, Interesting)
David Lynch talks about watching film on a cell phone. [youtube.com]
Re: (Score:2)
Apple already preparing for this (Score:3, Informative)
Re: (Score:2)
But the iPhone will have Dual iCore® technology. And have Steve Jobs' approval.
Re: (Score:3, Funny)
Re: (Score:3, Insightful)
Who needs multiple cores when multi-tasking has been decreed irrelevant?
Or did I miss an update where multitasking was invented and gifted to the world by Apple?
Re: (Score:3, Informative)
My guess is that the guy who wrote that pcworld article has not actually programmed for the iPhone. The article makes a big deal of the programmer having to do something about adding multitasking to their applications, but from what I've gathered from a few colleagues who have made some iPhone apps (some very popular), it actually requires an extremely small amount of work. By Wikipedia's definition of multitasking the iPhone does multi-task, though I've noticed a lot of people trying to redefine the term lat
Re: (Score:3, Informative)
I've heard that the iPhone does the kill behaviour too.
Not to come across as too Fanboi-ish, but the N900 does it marvellously. Next step - Nokia, please make a slimmer, prettier Maemo/MeeGo phone? Please?
Re:Apple? (Score:5, Insightful)
True, seems like they've been setting the pace. Touchscreen phones were pretty much non-existent outside of Palm and a few Windows Mobile 6 phones until the iPhone came out, and even those phones were highly dependent on a stylus; the iPhone was the first touchscreen without a stylus. Ever since the original iPhone everyone's been playing catch-up, and while others offer faster CPUs and more megapixels, no one offers the 200,000+ apps, the huge fan base and the chance to be a millionaire app developer. [techradar.com] In fact, some of the largest Android game developers have boycotted the Android Market. [androidandme.com] Do I care if the camera is 3MP or 5MP? No. Do I care if the phone offers the apps I want? Of course; these aren't just phones anymore, they're pocket PCs.
Re:Apple? (Score:5, Insightful)
Re: (Score:2, Interesting)
So what you're saying is that you want to pay your hard-earned money for a PC that the developers will actively seek to prevent you from gaining root access on, whose apps can only come from one place
Yes.
Re: (Score:3, Informative)
http://www.androidguys.com/2010/08/08/google-removes-easy-root-android-market/ [androidguys.com]
http://www.androidguys.com/2010/06/29/att-explains-opt-android-market/ [androidguys.com]
Re: (Score:2)
There are reasons for doing CPU-intensive things (even if not particularly 1080p video) on a portable device the size of a phone that you carry with you as much as your phone. But you are right that battery is a problem. Something that should hopefully have days of autonomy could end up with a few hours when using a powerful CPU, apps that take advantage of it, and a big, bright and colorful touchscreen, unless it uses a high-capacity battery. Before adding even more power-hungry capabilities to phones, some optimization of the powe
Re: (Score:2)
What is considered CPU-intensive? I'm sure people could say the same about netbooks, but I develop 10MP raw images on mine. Works just fine, but a little slow.
Re: (Score:2)
Develop? You mean process. You aren't using chemicals.
Re: (Score:3, Informative)
Re: (Score:3, Informative)
The N900 comes with 32GB of flash built in, so it's enough for an episode of a TV show at 1080p by your metrics. I think you're talking nonsense though. Blu-ray discs store 25GB per layer. If your assessment were accurate then this would be enough for 2 hours on a dual-layer disc, one hour on a single-layer disc. Given that most Blu-ray movies come on single-layer discs and don't take up the entire layer, and that TV stations use less bandwidth than Blu-ray, I wonder where you are getting this '1080p that i
Re: (Score:2, Redundant)
Re: (Score:2)
That's what he just said. What are you trying to contradict exactly?
Re: (Score:2)
It's not essential to see all the pixels, but when you have 1080p material and a phone, wouldn't it be good if the latter could play the former?
And maybe played by the phone itself? I know, it's a craaaazy idea but some people might actually want that...
You mean if the latter played the former? Wow, that IS a crazy idea!
Re: (Score:2)
And maybe played by the phone itself? I know, it's a craaaazy idea but some people might actually want that...
And some people want to have sex with wallabies.
Re: (Score:2)
There's some speculation it's a Mali GPU (Score:3, Interesting)
After Samsung "announced that it is adopting the Mali [GPU]...for its future graphics-enabled ...SoC ICs" [eetimes.com], it sounds plausible that the speedup and the lack of information about the GPU could relate to this Mali technology from ARM.
ARM has recently released source [arm.com] for some parts of the Linux drivers for current Mali GPUs under GPLv2, which might be the first step towards ARM SoCs with fully open GPU drivers.
There are no guarantees, but at the moment it appears that ARM is much more receptive to the idea of