Cellphones / Handhelds / Technology

Dual-Core CPU Opens Door To 1080p On Smartphones

An anonymous reader writes "Following Qualcomm, Samsung is also close to launching a new smartphone processor with two cores. Based on the ARM architecture, the new Orion processor promises five times the graphics performance of current chips, along with 1080p video recording and playback. Next year, it seems, dual-core smartphones will be all the rage. Apple, which is generally believed to have the most capable processor on the market today, may be under pressure to roll out a dual-core iPhone next year as well."
  • Okay then (Score:5, Interesting)

    by Jbcarpen ( 883850 ) on Wednesday September 08, 2010 @02:21AM (#33505768)
    My parallel programming professor likes to harp on the fact that nearly all new computers in the future will be multicore. Apparently he's right.
  • by dsanarco ( 1817934 ) on Wednesday September 08, 2010 @02:29AM (#33505806)
    1080p on a smartphone screen is overkill, but the phone can output over HDMI to a screen capable of displaying 1080p.
  • Re:Killer feature. (Score:5, Interesting)

    by sznupi ( 719324 ) on Wednesday September 08, 2010 @02:45AM (#33505880) Homepage

    Not necessarily; some tests by ARM/Symbian/Nokia strongly suggest that an n-core chip at frequency x is a good way to get considerable energy savings over a single-core chip at frequency n*x. Of course, whether or not it would be used that way is another thing...
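
    A back-of-the-envelope sketch of that argument, assuming the standard dynamic-power model P ~ C * V^2 * f and assuming the lower clock also permits a lower supply voltage; all of the capacitance, voltage and frequency numbers below are made-up placeholders, not measurements of any real chip:

        # Why n slower cores can beat one fast core on energy:
        # dynamic power scales as C * V^2 * f, and lower clocks allow lower voltage.
        # All numbers are illustrative assumptions, not data for any real chip.

        def dynamic_power(cap_farads, volts, freq_hz):
            """Switching power of one core: P = C * V^2 * f."""
            return cap_farads * volts ** 2 * freq_hz

        C = 1e-9      # effective switched capacitance per core (assumed)
        f = 500e6     # base clock of each core in the dual-core case (assumed)

        # One core clocked at 2*f typically needs a higher supply voltage.
        single = dynamic_power(C, volts=1.3, freq_hz=2 * f)

        # Two cores at f can run at a lower voltage; total is the sum of both.
        dual = 2 * dynamic_power(C, volts=1.0, freq_hz=f)

        print(f"single core @ 2f: {single:.2f} W")   # ~1.69 W
        print(f"dual core  @  f: {dual:.2f} W")      # ~1.00 W, same nominal throughput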

  • by bemymonkey ( 1244086 ) on Wednesday September 08, 2010 @03:17AM (#33505996)

    Yep, I have ~400MB available on my Desire (576MB total according to the spec sheets), and it's still not enough.

    My old Milestone (256MB RAM) was constantly killing off applications in the background because it was running out of RAM, sometimes not even saving the app states properly, causing me to lose my place (doesn't sound too bad, but it gets annoying quickly)... my Desire fares better, but there's still the occasional low memory kill when I have a lot of browser tabs open.

    What I want is a gigabyte or two of RAM and swap on SLC NAND... *drools*

  • by Anonymous Coward on Wednesday September 08, 2010 @03:18AM (#33506010)

    1. Why can't my HD camcorder play games?
    2. Streaming things from a phone is weird. Why, with full DRM, and online stores, wouldn't it make more sense for a token to be present in your account which enables streaming to multiple devices from the source and/or downloading for when on the go?

    2a. Why would someone who keeps HD content on a phone and wishes to watch it on a TV have a problem with internet connectivity?
    2b. The only really good use case I can think of is people using their phone as a replacement for those DVD/BD binders. For all those times I just had to take my HD movie collection with me and jack into someone else's HDTV. So pretty much not most hotels, airplanes, buses, etc., and maybe the people I'm going to visit.

    3. I can completely understand the desire to stream stuff from a phone, just because you'll eventually have something on it and want to share it with others on a bigger screen: pics, movies, music, etc. BUT, HD content is going to waste so much space uselessly until you get to that HDTV. Your own HDTV will more than likely have better things to keep the media on in your own home, like a PC, PS3, Xbox, the freaking disc, etc. So... other people's HDTVs. I don't get it.

  • Re:Killer feature. (Score:3, Interesting)

    by pspahn ( 1175617 ) on Wednesday September 08, 2010 @03:44AM (#33506114)

    Certainly a woosh.

    OTOH, when I bought my Evo, I read plenty about the poor battery performance. Not a big deal, I thought, as I didn't buy it because I make a lot of calls, and figured I would have some buffer to work with because of that.

    Unfortunately, what I didn't necessarily predict is that I would be using WiFi tethering as much as I do. This eats up battery pretty quickly. I bring my netbook to all sorts of places so that I am not tied down with school, and can work on and turn in assignments even when on a fishing trip in Wyoming.

    The consequence is that I now bring my USB cable damn near everywhere I go. It is pretty compact and fits in a thigh pocket without issue, but it does give me a sense of still being wired even though I'm not supposed to be. My brother, who also got an Evo after seeing mine (he's an early 20's iThing convert), opted for the macho aftermarket battery for $60 and I have to say, I think I'd like the same. What I'd really like to get, however, is a decent solar charging backup battery I can clip on my backpack so that I always have some extra juice when I need it. And after searching around for the right answer, I've concluded that everything is either a cheapo solution that won't last, or is still quite overpriced.

    Ultimately, I always have my portable jump-start/air compressor/radio/roadside hazard light battery thingy that has a USB port to rely on.

  • Re:Killer feature. (Score:4, Interesting)

    by fnj ( 64210 ) on Wednesday September 08, 2010 @03:46AM (#33506124)

    Wow, I want these new cellphone chips in notebooks and network-attached devices! They are far ahead of the watt-sucking crap from Intel and AMD.

  • Re:Killer feature. (Score:1, Interesting)

    by Anonymous Coward on Wednesday September 08, 2010 @03:47AM (#33506140)

    What he said, with the exception of "smart".

    I want something like what the Nokia 1100 was. Just a basic cell phone with two weeks of battery life, not expensive, perfect for calling people. So easy that even my great-grandmother could use it. And if you lost it, you just bought a new one for 25 euros.

  • Re:Apple? (Score:2, Interesting)

    by trickyD1ck ( 1313117 ) on Wednesday September 08, 2010 @04:21AM (#33506306)

    So what you're saying is that you want to pay your hard-earned money for a PC whose developers will actively seek to prevent you from gaining root access, and whose apps can only come from one place?

    Yes.

  • by mr_exit ( 216086 ) on Wednesday September 08, 2010 @04:25AM (#33506326) Homepage

    Except that they're shrinking the die size enough that these new dual-core, 1.5 GHz+ chips will use less power than the current Snapdragon ones.

  • by peppepz ( 1311345 ) on Wednesday September 08, 2010 @04:29AM (#33506342)

    How does lossless digital zoom compensate for lens quality?

    I just meant it as an example of the fact that current phones' camera sensors already offer more pixels than are needed for 720p recording, so they are used for nice extra things such as digital zoom. So why couldn't the sensors of the near future use them for 1080p recording instead?

    As for noise reduction, wouldn't a lower resolution sensor of the same physical size be better than an algorithm?

    A lower resolution sensor, with the same S/N ratio, would capture less detail. Just dismissing higher resolution sensors as a marketing gimmick ignores the technical advancements in image sensors.

    I suppose that higher resolution sensors are needed for still photography, where exposure times can be longer, the camera can be assumed to be still, and the captured image can be read out from the sensor without hurry, so the noise is lower.
    Movie recording just reuses the same sensor for a different purpose, where indeed a lower resolution sensor would be more adequate and cheaper, but then the phone would shoot worse photos. </speculation>
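
    A quick arithmetic sketch of that headroom: 1080p only needs about 2.1 megapixels and 720p about 0.9, so a typical sensor has plenty of pixels left over for crop zoom. The 5 MP and 8 MP sensor sizes below are assumed examples, not figures from the thread:

        # How many sensor pixels 720p/1080p actually need, and how much
        # "lossless" crop zoom a given sensor leaves over.
        # The 5 MP / 8 MP sensors are illustrative assumptions.

        from math import sqrt

        def megapixels(width, height):
            return width * height / 1e6

        def max_lossless_zoom(sensor_mp, target_mp):
            """Linear crop-zoom factor before dropping below the target resolution."""
            return sqrt(sensor_mp / target_mp)

        mp_720p = megapixels(1280, 720)     # ~0.92 MP
        mp_1080p = megapixels(1920, 1080)   # ~2.07 MP

        for sensor_mp in (5.0, 8.0):        # typical 2010-era phone sensors (assumed)
            print(f"{sensor_mp:.0f} MP sensor: "
                  f"{max_lossless_zoom(sensor_mp, mp_720p):.1f}x zoom at 720p, "
                  f"{max_lossless_zoom(sensor_mp, mp_1080p):.1f}x at 1080p")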

  • David Lynch (Score:3, Interesting)

    by xororand ( 860319 ) on Wednesday September 08, 2010 @04:43AM (#33506404)
  • by Qubit ( 100461 ) on Wednesday September 08, 2010 @04:46AM (#33506426) Homepage Journal

    After Samsung "announced that it is adopting the Mali [GPU]...for its future graphics-enabled ...SoC ICs" [eetimes.com], it sounds plausible that the speedup and the lack of information about the GPU could relate to this Mali technology from ARM.

    ARM has recently released source [arm.com] for some parts of the Linux drivers for current Mali GPUs under the GPLv2, which might be the first step towards ARM SoCs with fully open GPU drivers.

    There are no guarantees, but at the moment it appears that ARM is much more receptive to the idea of open GPU drivers than Imagination Tech (PowerVR GPUs) or Nvidia.

    I think it's a shame that AMD isn't moving faster w.r.t. the embedded/mobile market. Sure, they're planning to make SoCs with a GPU on the same silicon [anandtech.com], but as of last week they're not currently interested in competing with ARM for market share [cnn.com]. And AMD is the chipmaker that's most actively supporting and creating open drivers for its graphics hardware.

    It'll be interesting to see where the hardware goes in the next couple of years. Can Intel (and AMD, if they get serious) pull market share from ARM, or will the RISC chip reign supreme?

  • by Lonewolf666 ( 259450 ) on Wednesday September 08, 2010 @05:44AM (#33506622)

    I think GP is talking about the fact that you only get the higher resolution with optics of matching quality. All optics have some inherent blurring; only the degree depends on quality (and with very high quality optics, you may run into the diffraction limit: http://en.wikipedia.org/wiki/Diffraction-limited [wikipedia.org])

    So if you combine a high-resolution sensor with a second-rate lens, all you achieve is that the blur where a sharp edge should be is spread across more pixels.
    To some degree this is OK, because it makes the image appear less blocky and a later interpolation to get smooth edges becomes unnecessary. But I think at some point the higher resolution of the sensor is just wasted.
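
    A rough way to see where that point is: compare the diffraction-limited Airy disk diameter (about 2.44 * wavelength * f-number) with the sensor's pixel pitch. The sensor width, pixel count and f-number below are hypothetical examples, not the specs of any particular phone:

        # When do extra sensor pixels stop buying detail? Compare the
        # diffraction-limited Airy disk diameter (~2.44 * wavelength * f-number)
        # against the pixel pitch. All sensor/lens numbers here are assumptions.

        AIRY_FACTOR = 2.44
        WAVELENGTH_M = 550e-9              # green light

        def airy_disk_diameter_m(f_number):
            return AIRY_FACTOR * WAVELENGTH_M * f_number

        def pixel_pitch_m(sensor_width_m, horizontal_pixels):
            return sensor_width_m / horizontal_pixels

        # Hypothetical small phone sensor: ~4.5 mm wide, ~8 MP, behind an f/2.8 lens.
        pitch = pixel_pitch_m(4.5e-3, horizontal_pixels=3264)
        airy = airy_disk_diameter_m(f_number=2.8)

        print(f"pixel pitch: {pitch * 1e6:.2f} um")   # ~1.4 um
        print(f"airy disk:   {airy * 1e6:.2f} um")    # ~3.8 um
        if airy > pitch:
            print("The blur spot covers several pixels; extra resolution is partly wasted.")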

  • Re:Killer feature. (Score:3, Interesting)

    by bemymonkey ( 1244086 ) on Wednesday September 08, 2010 @05:48AM (#33506640)

    No, I'm saying that given the same other hardware, this processor won't affect battery life negatively in a very noticeable way.

    Battery life is already crap, and it's not because of the processors used. All this power optimization should be taking place where it's needed most... crappy AMOLED screens with twice the power draw of LCD when displaying anything remotely useful (i.e. not a mostly black screen), for instance.

  • by peppepz ( 1311345 ) on Wednesday September 08, 2010 @06:03AM (#33506676)
    See for yourself: I've not found much Qualcomm information, but Texas Instruments publishes a lot of documentation [ti.com] about their smartphone platforms.

    From a quick read of the user guide, I see that their OMAP3 platform apparently has two modules for the camera interface: a "Camera ISP" module, which fetches data from the CCD in raw, YUV, RGB or JPEG format, and an "IVA" module, which appears to be a DSP with hardcoded functionality for mainstream codecs but can also be programmed for what they call "emerging" codecs. This module can then DMA its output to the application CPU module. All the cores are on the same SoC and are interconnected by two Sonics buses, one of which must bear the bandwidth of the data coming from the sensor.

    What's impressive is that there is much less hardcoded logic involved than one might think. The OMAP4 leaflet claims its IVA core can deliver 30 fps 1080p encoding and decoding for H.264 HP, MPEG-4 ASP, VC-1 AP, MPEG-2 MP and On2 VP7.
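
    A toy sketch of that dataflow as I read it (sensor -> Camera ISP -> IVA encoder -> DMA to the application CPU); the class and function names are placeholders of mine, not TI's actual API:

        # Toy model of the pipeline described above: the ISP pulls frames off the
        # sensor, the IVA block encodes them, and results are DMA'd to the app CPU.
        # Names are my own placeholders, not TI's API.

        class CameraISP:
            def capture_frame(self, frame_id):
                # Stand-in for raw sensor data arriving over the interconnect.
                return {"id": frame_id, "format": "raw"}

        class IVAEncoder:
            def __init__(self, codec="h264_hp"):
                self.codec = codec        # OMAP4's leaflet claims 1080p30 for several codecs

            def encode(self, frame):
                # Stand-in for the DSP doing the heavy lifting off the main CPU.
                return {"id": frame["id"], "codec": self.codec, "payload": b"..."}

        def dma_to_app_cpu(packet, sink):
            sink.append(packet)           # models the DMA transfer to the application CPU

        encoded_stream = []
        isp, iva = CameraISP(), IVAEncoder()
        for i in range(3):                # three frames, just for illustration
            dma_to_app_cpu(iva.encode(isp.capture_frame(i)), encoded_stream)
        print(len(encoded_stream), "encoded frames handed to the application CPU")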

  • by hitmark ( 640295 ) on Wednesday September 08, 2010 @06:35AM (#33506770) Journal

    HDMI or DisplayPort out? You will likely find more and more products with this port: http://en.wikipedia.org/wiki/PDMI [wikipedia.org]

    Consider: where before only Apple could claim a single common docking port, now everyone may have one. End result: you can dock any device to any TV without worrying about carrying the right cable. Should make parties more interesting, I suspect.

  • Re:Okay then (Score:5, Interesting)

    by ledow ( 319597 ) on Wednesday September 08, 2010 @06:45AM (#33506820) Homepage

    The GP2X is a handheld console with two 200 MHz processors that runs off 2 AA batteries and was released in 2005. It was not that revolutionary, it was not that expensive, it was not that difficult to program. You can program on both chips so long as you don't mind temporarily ditching the built-in MPEG-decoding firmware that runs on one of them (it's reloaded on each bootup) or being very careful where you tread. The Nintendo DS was multiprocessor too, and not in the loose sense of "one CPU and a handful of support chips": two real, full, general-purpose, programmable chips. Most modern games consoles have more than one real chip, many of them using multiple specialised processors (e.g. IBM's Cell in the PS3, which is effectively a nine-core part).

    Massive parallelism has been around forever, and it's been in consumer electronics for quite a while already. It might not be *designed* like that, but you have always had home computers with multiple processors that can be programmed to operate in parallel. There were people misusing floppy drive controller chips and sound processors to do all sorts of things, and the GPU is "another" processor now, one that's extremely good at running lots of things in parallel. Do you think that your CPU could ever keep up with software rendering on a modern game? Or that a single-core "general purpose" GPU could?

    Your professor is right, except for his timescales: parallelism is already here, right down to tiny embedded systems, and has been for years. It's just that hardly anybody uses it without some seriously specialised programming design beforehand. That tells you quite a lot about how expensive it is to use effectively. Hell, to a new programmer, even threading and re-entrancy can be a huge nightmare, and our solution at the moment is blocking locks and things like that (so that we can *think* of them as separate linear instances). If you can become an expert in parallelism you'll probably have a good career ahead of you, but a specialist one. Everyone else is just waiting until they can pass everything off to a dedicated chip and get told when it's finished. They don't *WANT* to properly program parallel programs (tongue twister!); they just want everything to happen serially and faster, or the closest approximation to that that they can get.

    Seriously, when you program any graphics library, you just throw a huge amount of 3D object data into another processor and let it get on with it. You don't care if it runs at 6GHz, or whether it's got a separate internal processor for every pixel of the final image running at 1MHz, so long as it damn-well draws the screen as quickly as possible and tells you when it's done. Parallelism is a hidden facet of modern architecture precisely because it's so necessary and so damn complicated to do it right. Programmers, on the whole, are much happier with linear processes. It's taken games this long to pass off physics, AI, etc. to a separate thread, and we've had threading for DECADES.

    Parallel things are just a placeholder at the moment because we can't up the GHz much, and even if we could, it wouldn't help that much with the tasks we want to do. So even operating systems are handing off everything they can to other chips: Ethernet TCP offloading, GPU, sound processors, RAID, you name it. It's all about making the general-purpose CPUs do as little as possible so the main use of the computer can continue uninterrupted. And parallelism is only used to increase the number of things we can do, not to break tasks into much more efficient subtasks. Most people who had dual-core or above in the early days wanted it so that other things (e.g. antivirus) got out of the way of the real process (e.g. games!) so it could have as much time running in its single thread as possible.

    Parallelism can see fantastic gains but it needs to be done from the design stage each time. Mainstream multi-core products are too much of a moving target to do anything more fancy than run a number of huge, single-threaded programs at the same time. That's *not* the same thing, and never will be. Parallelism is specialised. Games programmers would give their right arm for everyone to have a single 100GHz processor instead.
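
    A small illustration of the two styles described above, using Python's standard concurrent.futures; the workload (summing squares) is a made-up stand-in, and the point is only the difference in design effort, not a benchmark:

        # "Just run it serially and faster" versus actually decomposing the work
        # across cores. The workload is a made-up stand-in for real computation.

        from concurrent.futures import ProcessPoolExecutor

        def sum_squares(lo, hi):
            return sum(i * i for i in range(lo, hi))

        N = 2_000_000

        if __name__ == "__main__":
            # Style 1: serial. What most code still does.
            serial = sum_squares(0, N)

            # Style 2: split into chunks designed up front and run on separate cores.
            chunks = [(i, i + N // 4) for i in range(0, N, N // 4)]
            with ProcessPoolExecutor(max_workers=4) as pool:
                parallel = sum(pool.map(sum_squares, *zip(*chunks)))

            assert serial == parallel   # same answer, very different amount of design work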

  • Re:Killer feature. (Score:2, Interesting)

    by bytta ( 904762 ) on Thursday September 09, 2010 @02:55AM (#33517780)
    Actually, this raises another question:
    Why the hell aren't companies making headless Linux servers using these processors?

    A 1 GHz processor drawing at most 40 mA means that such a device at full load uses well under one watt (for the CPU)!

    That's a couple of orders of magnitude better than the current wall-wart Linux devices available today...
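
    The arithmetic behind that figure, with an assumed core supply voltage since the comment only gives a current (P = V * I):

        # Power and daily energy for the CPU alone. The supply voltage is an
        # assumption; the 40 mA full-load current comes from the comment above.

        CURRENT_A = 0.040              # 40 mA at full load
        VOLTAGE_V = 1.2                # assumed core supply voltage
        HOURS_PER_DAY = 24

        power_w = VOLTAGE_V * CURRENT_A                # ~0.05 W
        energy_wh_per_day = power_w * HOURS_PER_DAY    # ~1.2 Wh per day

        print(f"CPU power at full load: {power_w:.3f} W")
        print(f"Energy per day:         {energy_wh_per_day:.2f} Wh")
        # A typical wall-wart plug computer of the era drew a few watts total,
        # so this is roughly one to two orders of magnitude less, for the CPU alone.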
