Dual-Core CPU Opens Door To 1080p On Smartphones

An anonymous reader writes "Following Qualcomm, Samsung is also close to launching a new smartphone processor with two cores. Based on ARM architecture, the new Orion processor promises five times the graphics performance of current chips and to enable 1080p video recording and playback. Next year, it seems, dual-core smart phones will be all the rage. Apple, which is generally believed to have the most capable processor in the market today, may be under pressure to roll out a dual-core iPhone next year as well."
This discussion has been archived. No new comments can be posted.

  • ...so you can drop calls twice as fast.
    • As an iPhone owner I'd like to make it perfectly clear that we don't need dual cores to drop calls twice as fast. We're doing OK in that department already, TYVM.
  • Okay then (Score:5, Interesting)

    by Jbcarpen ( 883850 ) on Wednesday September 08, 2010 @02:21AM (#33505768)
    My parallel programming professor likes to harp on the fact that nearly all new computers in the future will be multicore. Apparently he's right.
    • "My parallel programming professor likes to harp on the fact that nearly all new computers in the future will be multicore."

      The future is now, then; that's not much of a prediction.
  • Re: (Score:2, Insightful)

    Comment removed based on user account deletion
    • by amRadioHed ( 463061 ) on Wednesday September 08, 2010 @02:29AM (#33505802)

      It seems they are talking about recording 1080p, not viewing 1080p. You don't have to view your recorded videos on the phone.

      • It seems they are talking about recording 1080p, not viewing 1080p. You don't have to view your recorded videos on the phone.

        It would be nice to be able to play it back on a hooked-up TV straight after shooting it... that still requires the device to be able to play back 1080p, which is far easier than encoding it...

      • by jrumney ( 197329 ) on Wednesday September 08, 2010 @02:54AM (#33505914)
        A crap cellphone lens is just as crappy at 1080p as at 720p; if anything, the higher resolution is more likely to magnify the lens's flaws than hide them. 1080p on a cellphone is nothing more than a sales gimmick, just as 20 Mpx is on a compact camera (or 5 Mpx on a cellphone camera).
        • Why? Current phones' sensors have a much higher resolution than 720p, so they can use the extra detail to achieve lossless digital zoom. For example, the Nokia N8 can zoom "up to 3x" without interpolating pixels with this technique.

          What's wrong with using this pixel abundance to shoot 1080p instead? There would still be enough pixels to allow for noise reduction.
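
          As a quick sketch of the arithmetic behind that claim (the numbers are assumptions for illustration: the N8's 12 Mpx sensor is roughly 4000x3000, and the crop geometry is idealized):

              /* Crop-based ("lossless") zoom headroom: zoom factor =
                 sensor width / cropped window width; a native-resolution
                 crop needs no pixel interpolation. */
              #include <stdio.h>

              int main(void) {
                  const int sensor_w = 4000;  /* assumed sensor width in pixels */
                  printf("720p  crop zoom: %.1fx\n", sensor_w / 1280.0);  /* ~3.1x */
                  printf("1080p crop zoom: %.1fx\n", sensor_w / 1920.0);  /* ~2.1x */
                  return 0;
              }

          So even while recording 1080p there would still be roughly 2x of interpolation-free zoom headroom left over.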

          • Re: (Score:3, Insightful)

            How does lossless digital zoom compensate for lens quality? As for noise reduction, wouldn't a lower resolution sensor of the same physical size be better than an algorithm?

            • Re: (Score:2, Interesting)

              by peppepz ( 1311345 )

              How does lossless digital zoom compensate for lens quality?

              I just meant it as an example of the fact that current phones' camera sensors offer more pixels than are needed for 720p recording, so they use them for nice extra things such as digital zoom. So why couldn't the sensors of the near future use them for 1080p recording instead?

              As for noise reduction, wouldn't a lower resolution sensor of the same physical size be better than an algorithm?

              A lower resolution sensor, with the same S/N ratio, would capture less detail. Just dismissing higher resolution sensors as a marketing gimmick ignores the technical advancements in image sensors.

              I suppose that higher resolution sens

              • Re: (Score:3, Interesting)

                by Lonewolf666 ( 259450 )

                I think GP is talking about the fact that you only get the higher resolution with optics of matching quality. All optics have some inherent blurring; only the degree depends on quality (and with very high quality optics, you may run into the diffraction limit: http://en.wikipedia.org/wiki/Diffraction-limited [wikipedia.org])

                So if you combine a high-resolution sensor with a second rate lens, all you achieve is that the blur where a sharp edge should be is spread across more pixels.
                To some degree this is OK because it makes the i
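
                For concreteness, the back-of-envelope diffraction numbers (all values here are illustrative assumptions: an f/2.8 lens, 550 nm green light, and the ~1.4 um pixel pitch common on small phone sensors):

                    /* Rayleigh criterion: smallest resolvable spot ~ 1.22 * lambda * N */
                    #include <stdio.h>

                    int main(void) {
                        const double lambda_um = 0.55;  /* green light */
                        const double f_number  = 2.8;   /* assumed lens aperture */
                        const double pixel_um  = 1.4;   /* assumed pixel pitch */
                        printf("spot %.2f um vs pixel %.2f um\n",
                               1.22 * lambda_um * f_number, pixel_um);  /* ~1.88 vs 1.40 */
                        return 0;
                    }

                With the blur spot already wider than a single pixel, packing in more pixels only spreads the same blur across more of them.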

        • 1080p on a cellphone is nothing more than a sales gimmick

          It's irritatingly commendable that cellphone makers must create a market for a resolution at 5 inches that doesn't even exist at 4 to 7 times that size for living-room enjoyment.

          Where the hell is our overdue 1080p on sub-40-inch TVs? The industry is holding out at the low end: if it weren't for this 3D LCD fad, we'd sooner be seeing 20-inch TVs finally get 1080p so that they, and classroom slide projectors, could finally catch up to 15-year-old desktop resolutions.

          • Where the hell is our overdue 1080p on sub-40-inch TVs?

            On Google [google.com] you lazy ass idiot.

          • classroom slide projectors

            Remove the words "classroom" and "slide" from there; I think those have always been analog. I meant to highlight how 640x480 and 800x600 plagued projectors, so that college presentations meant for 1024x768 and higher were butchered because the prof had to manually lower the resolution just so they could be projected for the class. This sacrificed font clarity and screen real estate, and it still seems to be a problem on our more modern business Smartboards when the video source is on PCs and laptops desi

      • Re: (Score:3, Insightful)

        by Lumpy ( 12016 )

        As an owner of 1080p recording equipment, let me say that any phone that records 1080p will be a giant turd.

        When you record at 1080p, your lens quality becomes very important. Even at 720p you can get away with crappy NTSC lenses, but 1080p will show off bad glass right away.

        1080p in a phone is a ragingly stupid thing. It's stupid in all the consumer camcorders that are under $1800.00 right now as well. (Nasty purple fringing and plain old awful lenses on these things coupled with really low record

    • Re: (Score:2, Interesting)

      by dsanarco ( 1817934 )
      1080p on a smartphone screen is overkill, but it can be output over HDMI to a screen capable of displaying 1080p.
    • DLNA playback [wikipedia.org] is what is important here. Now you can store - and play back (wirelessly, if the phone is so equipped) - HD video from your phone. That's what you need the resolution for: external display playback.
    • by mjwx ( 966435 ) on Wednesday September 08, 2010 @03:10AM (#33505964)

      I have a hard time understanding how 1080p is such a great feature on screens 4" or smaller in diameter.

      The feature is important for 1080p output, which, combined with HDMI, makes a phone compatible with most projectors, LCD/LED TVs and modern monitors. I can easily see myself walking in and displaying a video or presentation stored on my phone. Ideal for impromptu sales pitches or just bringing a movie over to a friend's place.

    • by pspahn ( 1175617 ) on Wednesday September 08, 2010 @03:25AM (#33506032)

      I have a hard time understanding how 1080p is such a great feature on screens 4" or smaller in diameter.

      You raise an interesting question, one which will likely lead to the next big paradigm in smartphones: circular screens.

    • by gl4ss ( 559668 )
      It's not a terribly great feature when you're looking at a regular-sized phone screen, but it is after you hook the phone up to a 1080p screen. It's also a fairly good indicator of available memory bandwidth and processor power, so whether or not a new SoC can do it is a worthy thing to have on a press release either way.

      And this new Samsung SoC design will most definitely find its way into systems like the next-gen Apple TV, plenty of Samsung TVs and cheapo STBs.

      And even with a crappy lens it's still
  • by angry tapir ( 1463043 ) on Wednesday September 08, 2010 @02:28AM (#33505798) Homepage
    LG's new Optimus line will include smartphones running on Nvidia's Tegra 2 dual-core chips [goodgearguide.com.au].
  • ...with one of those bad boys in it. And an eSata interface. All your home server problems solved!

    • "Plug computing" ((c) Marvell) is the successor of that direction. There's stuff like this [globalscal...logies.com] hitting the market, though still single-core for now.

      Once these new Cortex A9s are more in the wild, I'd expect them to find a home in the wall wart machines as well.

  • by Anonymous Coward

    Now I can watch amazing 1080p on a 4.5" screen. My cinema experience is now complete.

    Smart phones don't need dual core. They need more RAM.

    App designers are guaranteed certain resources when their application runs on a phone. This is why a single-tasking paradigm was popular: it simply guaranteed those resources. Multitasking requires sharing of memory, and without swap space enabled, memory may run out quickly. Android has mechanisms for saving a program's state and killing off the least-recently-used ap

    • Re: (Score:3, Interesting)

      by bemymonkey ( 1244086 )

      Yep, I have ~400MB available on my Desire (576MB in there total according to the spec sheets), and it's still not enough.

      My old Milestone (256MB RAM) was constantly killing off applications in the background because it was running out of RAM, sometimes not even saving the app states properly, causing me to lose my place (doesn't sound too bad, but it gets annoying quickly)... my Desire fares better, but there's still the occasional low memory kill when I have a lot of browser tabs open.

      What I want is a giga

  • Killer feature. (Score:3, Insightful)

    by RightSaidFred99 ( 874576 ) on Wednesday September 08, 2010 @02:32AM (#33505818)

    And by killer, I mean battery killer.

    I think smartphones need to go back to basics. I'd take a smartphone that lasted 4 days of normal use on a single charge anytime over a new one that does shit I don't really need anyway 10% (or even 30%) faster.

    Once they've got battery life back under control, get back on performance.

    • by davester666 ( 731373 ) on Wednesday September 08, 2010 @02:44AM (#33505874) Journal

      Fine. So you're a member of the 1% of all cellphone users that doesn't regularly connect their phone to their TV to watch HD movies.

      Maybe you should try joining the rest of us in the 21st century, with chargers at home, at the office and in our cars!

      • you're a member of the 1% of all cellphone users that doesn't regularly connect their phone to their TV to watch HD movies

        Did you mistype, or are you really claiming that 99% of all cellphone users regularly connect their phone to their TV to watch HD movies?

        Or is this a "whoooooosh" moment for me?

        • Re: (Score:3, Interesting)

          by pspahn ( 1175617 )

          Certainly a whoosh.

          OTOH, when I bought my Evo, I read plenty about its poor battery performance. Not a big deal, I thought, as I didn't buy it to make a lot of calls, and I figured that gave me some buffer to work with.

          Unfortunately, what I didn't necessarily predict is that I would be using WiFi tethering as much as I do. This eats up battery pretty quickly. I bring my netbook to all sorts of places so that I am not tied down with school, and can work on and turn in assignments even

    • Re:Killer feature. (Score:5, Interesting)

      by sznupi ( 719324 ) on Wednesday September 08, 2010 @02:45AM (#33505880) Homepage

      Not necessarily; some tests by ARM/Symbian/Nokia strongly suggest that an n-core chip at frequency x is a good way to get considerable energy savings over a single-core chip at frequency n*x. Of course, whether or not it would be used that way is another thing...
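
      The first-order CMOS argument behind results like that (a textbook sketch with idealized scaling, not ARM/Nokia's actual figures): dynamic power goes as

          $$P_{\mathrm{dyn}} = C\,V^{2}\,f$$

      and the supply voltage a core needs scales roughly with its clock, so two cores at f/2 can run at some V' < V. For the same nominal throughput,

          $$P_{\mathrm{dual}} \;\approx\; 2\,C\,V'^{2}\,\tfrac{f}{2} \;=\; C\,V'^{2}\,f \;<\; C\,V^{2}\,f \;=\; P_{\mathrm{single}}.$$

      The squared voltage term is where the considerable savings come from, assuming the workload actually parallelizes across both cores.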

    • Re:Killer feature. (Score:5, Informative)

      by bemymonkey ( 1244086 ) on Wednesday September 08, 2010 @03:22AM (#33506024)

      Battery life is fine if you keep the screen off. I get a standby power draw of roughly 5mA on average on my Desire. That works out to about 280h of standby time, and that's with a bunch of always-connected applications (Google Sync always active, an IM client, SIP client) in the background, and WiFi and Bluetooth on. Turn all that stuff off and I get values more around 3mA... 466h.

      Obviously a screen that draws almost 100x as much (seriously, at full power the AMOLED screen draws close to 300mA!) is going to kill off the battery very quickly.

      In comparison, the SoC uses very little power (full CPU load on the Desire's Snapdragon is 40mA higher than idle - tested with SetCPU's stress test) and scales very well with load. If you really want to increase use time, build more efficient screens... fuck the processor.
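
      Those figures are consistent with the Desire's stock 1400 mAh battery, which is an assumption on my part (the capacity isn't stated above). A minimal sanity check:

          /* hours of runtime = battery capacity (mAh) / average draw (mA) */
          #include <stdio.h>

          int main(void) {
              const double capacity_mah = 1400.0;              /* assumed capacity */
              const double draws_ma[]   = { 5.0, 3.0, 300.0 }; /* standby, minimal, screen on */
              for (int i = 0; i < 3; i++)
                  printf("%6.1f mA -> %6.1f h\n", draws_ma[i], capacity_mah / draws_ma[i]);
              return 0;
          }

      That reproduces the 280 h and 466 h standby figures, and shows why a ~300 mA screen caps you at under five hours of screen-on time.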

      • Re:Killer feature. (Score:4, Interesting)

        by fnj ( 64210 ) on Wednesday September 08, 2010 @03:46AM (#33506124)

        Wow, I want these new cellphone chips in notebooks and network-attached devices! They are far ahead of the watt-sucking crap from Intel and AMD.

        • Well, they ARE already partially available in NAS type devices. As a matter of fact, the cheapo Buffalo LinkStation Lite I have sitting out in the hall runs on an ARM chip of some sort, IIRC...

          Smartbooks and tablets are starting to run on ARM as well. Shouldn't be too long until your wishes are granted ;)

      • by syousef ( 465911 ) on Wednesday September 08, 2010 @04:48AM (#33506442) Journal

        Battery life is fine if you keep the screen off. I get a standby power draw of roughly 5mA on average on my Desire. That works out to about 280h of standby time, and that's with a bunch of always-connected applications (Google Sync always active, an IM client, SIP client) in the background, and WiFi and Bluetooth on. Turn all that stuff off and I get values more around 3mA... 466h.

        Obviously a screen that draws almost 100x as much (seriously, at full power the AMOLED screen draws close to 300mA!) is going to kill off the battery very quickly.

        In comparison, the SoC uses very little power (full CPU load on the Desire's Snapdragon is 40mA higher than idle - tested with SetCPU's stress test) and scales very well with load. If you really want to increase use time, build more efficient screens... fuck the processor.

        So what you're saying is that if you never actually look at your phone, or use it as a phone, you can run idle applications in the background. Colour me impressed.

        • Correct. That's why I never manage more than 12 hours on a single charge, and carry around an external USB power pack with a 4400mAh LiIon battery in it...

          The gist of what I was trying to say was, however, that the CPU is NOT the component that draws so much power on smartphones; it's the screen and, to a lesser extent (contrary to popular belief), the radios.

      • Battery life is fine if you keep the screen off. I get a standby power draw of roughly 5mA on average on my Desire. That works out to about 280h of standby time, and that's with a bunch of always-connected applications (Google Sync always active, an IM client, SIP client) in the background, and WiFi and Bluetooth on. Turn all that stuff off and I get values more around 3mA... 466h.

        Obviously a screen that draws almost 100x as much (seriously, at full power the AMOLED screen draws close to 300mA!) is going to kill off the battery very quickly.

        In comparison, the SoC uses very little power (full CPU load on the Desire's Snapdragon is 40mA higher than idle - tested with SetCPU's stress test) and scales very well with load. If you really want to increase use time, build more efficient screens... fuck the processor.

        So what you're saying is this new dual-core processor will be great for watching 1080p movies with the screen off (or on an e-ink screen)?

        • Re: (Score:3, Interesting)

          by bemymonkey ( 1244086 )

          No, I'm saying that, given the same other hardware, this processor won't affect battery life negatively in a very noticeable way.

          Battery life is already crap, and it's not because of the processors used. All this power optimization should be taking place where it's needed most: crappy AMOLED screens with twice the power draw of LCD when displaying anything remotely useful (i.e. not a mostly black screen), for instance.

  • "Apple, which is generally believed to have the most capable processor in the market today"

    Huh? I thought Apple used the same processor, ARM 11, as Nokia.

    • by Slur ( 61510 ) on Wednesday September 08, 2010 @02:51AM (#33505906) Homepage Journal

      iPhone 4, iPod Touch gen.4, iPad, and Apple TV gen.2 all use the Apple A4 processor, which is an ARM+GPU manufactured by Samsung.

    • If you haven't noticed, everything Apple does is always "brilliant" and "innovative" according to the tech press. Doesn't matter if they are releasing something that is the same as everything else. For example the Apple TV gets praise lavished on it as an amazing on-demand streaming device, even though nearly every Blu-ray player with an ethernet port also does streaming and, of course, plays DVDs and Blu-rays on top of that.

      For that matter, it might even be Apple PR copied verbatim. It is amazing how many

      • by mjwx ( 966435 )

        For that matter, it might even be Apple PR copied verbatim. It is amazing how many press agencies will just reprint PR copy that they like. A PR firm will send out the "OMG t3h new stuffs!" memo, as PR firms do, and sites will pick it up and regurgitate a good bit of it verbatim.

        After that, Apple will write benchmarks for Anandtech, who will publish them without question.

        I instantly know when I hear things like "Snappier", "User Experience", "Superior UI" and "responsive/ness" that the poster has absolutely

        • by pspahn ( 1175617 )

          Totally OT, but I remember visiting Anandtech when the kid was still in high school and thinking, wow, what a bright young dude. The last time I was there, however, it seemed to be nothing but a bunch of what you just mentioned.

          Sorry, though, I don't even have a lawn.

          • by mjwx ( 966435 )

            Totally OT,

            Probably a bit, but I've never seen one of his tests replicated. Plus he tends to leave out a lot of factors (such as the starting and ending dBm for the "Death Grip" tests he did).

            Sorry, though, I don't even have a lawn.

            I used to, then the housemate killed it (lawns do use a lot of water and need a lot of maintenance). However if it makes you feel better you can get off my woodchips.

    • Or Japanese or Swedish or Swiss or something like that. Apple is American.

    • ARM 11 isn't a processor, it is an ISA designation
    • Re: (Score:3, Informative)

      by iJed ( 594606 )
      The last iPhone to have an ARM 11 was the 3G. They have been Cortex A8 based since the 3GS.
    • Re: (Score:3, Informative)

      by TheRaven64 ( 641858 )
      Nope. ARM11 is ARM's two-generations-ago core, implementing the ARMv6 instruction set (confused yet?). The iPhone, like pretty much every other high-end smartphone, uses the Cortex A8 core, which replaced the ARM11 at the top end. ARM11 is used for cheaper devices - Samsung sells a lot of them, because they are much cheaper than the A8. The iPhone uses Apple's A4 chip, manufactured by Samsung, which uses an unmodified Cortex A8 core and a small number of other components. The only thing that makes the A4 in
  • Obligatory XKCD (Score:2, Insightful)

    by CrazyJim1 ( 809850 )
    1080p [xkcd.com]
  • by stephanruby ( 542433 ) on Wednesday September 08, 2010 @02:46AM (#33505886)

    Apple, which is generally believed to have the most capable processor in the market today, may be under pressure to roll out a dual-core iPhone next year as well.

    This is silly. Apple is using Samsung's processor, an OEM version of the Hummingbird (which is not exclusively sold to Apple by any means). So if anyone has "the most capable [mobile] processor in the market today" (and even that statement could be debated), it's Samsung (certainly not Apple).

  • I want one of these, with the ability to: 1) Connect it to a Mouse/Keyboard/Monitor to act as a desktop via HDMI/USB/Bluetooth/whatever. 2) Dock it into a 10" touchscreen to act as a tablet.
    • Well, a 10" touchscreen is not the point. The point is that you want to connect it to ANY 45" TV via an HDMI link. You've got to have a little bit of imagination there. Couple it to a Bluetooth mouse, a BT keyboard and a power adapter and you have a mobile workstation.

      However, if you want to use it as a media player, the market forgets that not every 1080p is equal. The main problem is how to get data INTO the phone. An HD Blu-ray image is 40GB. With current SD card storage you can load one of such 108

  • ...on the phone by the manufacturers and carriers, what's the whole point of having that much power? Recording and watching 1080p video? Pfft... the lack of imagination is pathetic. I have tons of apps that I'd like to work on, if only the phone platform were as open as the PC platform. Laptops just don't have the mobility and form factor required for ubiquitous interaction.

    I just wish more manufacturers would put out high-end mobile devices for the MeeGo platform. Can't wait to get my hands on the

  • We're all worried about smartphone CPUs that can decode 1080p video when none of them have screens that can display it.

    Stop and think about it for a minute.

    Marketspeak has totally infiltrated discussion about display resolutions and I am branded as an idiot when I bring up that 720p involves a lot more than 720 vertical pixels, and that progressive scan doesn't mean shit on LCDs to begin with.

    I would love to be proven wrong here, but I have no idea why progressive v. interlaced is even brought up anymore.

    • Re: (Score:3, Interesting)

      by hitmark ( 640295 )

      HDMI or DisplayPort out? You will likely find more and more products with this port: http://en.wikipedia.org/wiki/PDMI [wikipedia.org]

      Consider: where before only Apple could boast a single docking port, now everyone may have one. End result: you can dock any device to any TV without worrying about carrying the right cable. Should make parties more interesting, I suspect.

  • Misleading Headline? (Score:5, Informative)

    by GaneshAnandTech ( 1896204 ) on Wednesday September 08, 2010 @03:39AM (#33506090) Homepage
    Unfortunately, the dual-core CPU has almost nothing to do with the 1080p encode or decode. These are handled by dedicated IPs (pre-designed blocks which are slotted into the chip) from companies like Imagination Technologies and Chips & Media. They would work just as well with a single-core Cortex-A8 as they do with the Cortex-A9.
  • 1920x1080 pixels x 24-bit depth x 30 fps ~= 1.4 Gbps [wikipedia.org].
    I wonder whether multiple cores are enough, or whether you also need something like DMA [wikipedia.org] or a data crossbar [wikipedia.org] in the hardware architecture, and whether current SD cards can cope at that pace...
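
    A quick recomputation of that figure, with an assumed ~15 Mbit/s H.264 rate thrown in for comparison (a typical 1080p camcorder bitrate; it isn't from the comment above):

        #include <stdio.h>

        int main(void) {
            /* uncompressed 1080p: width x height x bits-per-pixel x frames/s */
            const double raw_bps = 1920.0 * 1080.0 * 24.0 * 30.0;
            printf("raw: %.2f Gbit/s (%.2f Gibit/s)\n",
                   raw_bps / 1e9, raw_bps / 1073741824.0);  /* ~1.49 / ~1.39 */
            printf("assumed H.264: ~15 Mbit/s, roughly a 100x reduction\n");
            return 0;
        }

    Only the path from the sensor to the encoder ever has to carry the raw rate; what hits the SD card is two orders of magnitude smaller.
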
    • by Chrisq ( 894406 )
      There are "high speed SD cards" that some camcorders need
      • by Chrisq ( 894406 )
        Of course, what I missed is your assumption that they are storing uncompressed 1080p on the card. Nobody does that; it would be compressed greatly by H.264 or a similar codec.
        • Not storing ... transferring from the CCD to the compression engine.
          Compression, I understand, is mandatory for storage because of external memory bandwidth constraints ...
    • Nobody but the camera module would see that stream *uncompressed*. Just handling it would probably kill a Core i7, let alone h264-compressing it on the fly.
      • ... you mean the H.264 compression is done in the CCD?
        If it runs on any of the CPUs/GPUs, then the stream needs to be transferred to the CPU/GPU RAM ... at 1.4 Gbps!
  • The original Nintendo DS has two ARM CPUs (an ARM9 plus an ARM7, though not on one dual-core die). Why is dual-core such a big deal in a smartphone that costs several times as much, more than five years later?
  • I'm not entirely certain, but I thought that the HTC Tilt 2 (AT&T and others) had two processors. Granted, one was for the phone subsystem, and the other for the OS, but still. Maybe someone can elaborate.

    Either way, I've been thinking we need to add more processing power to phones for a long time. They are way too slow for the things we want to do with them. But the offset is the battery life. :(

  • Wake me up when smart "phones" do 1080p 3D (two CCDs) @ 60fps, for 8h on one battery.

  • David Lynch (Score:3, Interesting)

    by xororand ( 860319 ) on Wednesday September 08, 2010 @04:43AM (#33506404)
  • Comment removed based on user account deletion
  • by yabos ( 719499 ) on Wednesday September 08, 2010 @11:44AM (#33509110)
    The iOS SDK has Grand Central Dispatch, which is Apple's easy way of dispatching and managing multiple threads. If you program your application with these APIs, as soon as a dual core iPhone comes out your application will take advantage of the 2nd core.
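
    A minimal sketch of that pattern using the C-level GCD API with Apple's blocks extension (compile with clang -fblocks; the process_frame work function is hypothetical):

        #include <dispatch/dispatch.h>
        #include <stdio.h>
        #include <stdlib.h>

        static void process_frame(void) { /* hypothetical CPU-bound work */ }

        int main(void) {
            /* A concurrent global queue: GCD spreads its blocks across
               however many cores the device actually has. */
            dispatch_queue_t bg =
                dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

            dispatch_async(bg, ^{
                process_frame();
                /* Hop back to the main queue for UI-style work. */
                dispatch_async(dispatch_get_main_queue(), ^{
                    printf("done on the main queue\n");
                    exit(0);
                });
            });

            dispatch_main();  /* drain the main queue */
            return 0;
        }

    Code written against these queues picks up a second core with no changes, which is the parent's point.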
