
48-Core Chips Could Redefine Mobile Devices 285

CWmike writes "Intel researchers are working on a 48-core processor for smartphones and tablets, but it could be five to 10 years before it hits the market. Having a 48-core chip in a small mobile device would open up a whole new world of possibilities. 'If we're going to have this technology in five to 10 years, we could finally do things that take way too much processing power today,' said analyst Patrick Moorhead. 'This could really open up our concept of what is a computer... The phone would be smart enough to not just be a computer but it could be my computer.' Enric Herrero, a research scientist at Intel Labs in Barcelona, explained that with the prototype chip someone could, for instance, be encrypting an email while also working on other power-intensive apps at the same time — without hiccups. Same for HD video. Intel's Tanausu Ramirez said it could also boost battery life. 'The chip also can take the energy and split it up and distribute it between different applications,' he said. Justin Rattner, Intel's CTO, told Computerworld that a 48-core chip for small mobile devices could hit the market 'much sooner' than the researchers' 10-year prediction."
This discussion has been archived. No new comments can be posted.

  • Desktop (Score:5, Insightful)

    by Anonymous Coward on Tuesday October 30, 2012 @12:37PM (#41819239)

    Let's put a 48-core processor on a desktop or laptop before we talk about tablets or phones...

    • Re:Desktop (Score:5, Interesting)

      by tom17 ( 659054 ) on Tuesday October 30, 2012 @12:44PM (#41819345) Homepage

      Why not concentrate on tablets and phones first?

      1. They are the fastest growing segment and "everyone will have one"
      2. This will then be your primary computing device that follows you around. It's with you when you need it, because...
      3. It's easier to use a fully mobile device as a workstation device (just add a keyboard/monitor and fashionable pointing device) than it is to use a workstation device as a pocket computer.
      4. Power savings from this kind of architecture are more critical on pocket devices
      5. ...
      6. PROFIT! (Sorry)

      I think it's a good way forwards. But that's just opinion so...

      • Re:Desktop (Score:5, Insightful)

        by ByOhTek ( 1181381 ) on Tuesday October 30, 2012 @12:55PM (#41819483) Journal

        Because desktops have one less criterion to meet than tablets and phones - they don't have nearly as small a power envelope.

        The desktop, therefore, could be seen as a logical step in the progression toward getting it onto the phone/tablet.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          And I don't want my 48-core machine to be fucking mobile. It means I would then be expected to take my work everywhere with me. Fuck that. Until we learn how to respect the sanctity of vacation time in the US and bring up the average vacation length for workers to something near European standard I would rather see this in a non-mobile version first.

          • Re: (Score:2, Interesting)

            by Anonymous Coward

            I guess it works for me as I can easily separate my working life from my personal life (Yes, I live in North America but I derived my only-work-at-work mentality from the other side of the pond).

          • Re:Desktop (Score:5, Funny)

            by Mike Buddha ( 10734 ) on Tuesday October 30, 2012 @02:48PM (#41820905)

            And I don't want my 48-core machine to be fucking mobile. It means I would then be expected to take my work everywhere with me. Fuck that. Until we learn how to respect the sanctity of vacation time in the US and bring up the average vacation length for workers to something near European standard I would rather see this in a non-mobile version first.

            I don't think you have anything to worry about. They're not making a portable deep fryer, this is a computer.

      • Re:Desktop (Score:5, Interesting)

        by jonadab ( 583620 ) on Tuesday October 30, 2012 @01:13PM (#41819715) Homepage Journal
        > Why not concentrate on tablets and phones first?

        Because people expect significantly more from desktops than from phones.

        The article says this:
        > The phone would be smart enough to not just be a computer but it could be my computer.

        That would only make sense if, in addition to processing power, the phone also had multiple gigabytes of primary memory, could utilize multiple peripherals for input (keyboard, mouse, etc.) and output (monitors, printers, speaker systems), store hundreds of gigabytes of data, connect to multiple networks (including high-speed wired ones), and run desktop applications.

        Traditionally, even the smartest phones aren't expected to do any of that. It isn't mostly the processing power that's holding them back.
        • by wisty ( 1335733 )

          SSH + HDMI + USB + EC2 + Wifi + a fibre.

          Too much lag for gaming / interactive 3D work though.

        • by fatphil ( 181876 )
          TV out and HDMI out have been on phones for a while.
          USB host mode, under the guise of OTG, likewise - so that's your peripherals and wired network.
          They've had a gig of main memory for a while too (and still swap!). Moore's law would make one expect that many gigabytes will be coming soon.
          64GB flash memory storage has been on phones for ages, there's no reason to not expect that to grow rapidly too.

          There's nothing about modern phone technology that isn't as capable as a typical desktop 5 years ago. There's n
        • That would only make sense if, in addition to processing power, the phone also had multiple gigabytes of primary memory, could utilize multiple peripherals for input (keyboard, mouse, etc.) and output (monitors, printers, speaker systems), store hundreds of gigabytes of data, and connect to multiple networks (including high-speed wired ones).

          Let's see: 2GB primary memory (a little light, but hey, it's almost 6 months old), 96GB nonvolatile storage, dual-core 1.5GHz, 100+Mbit A/B/G/N WiFi plus 100Mbit LTE (wires are for old people)... full monitor output via HDMI, plus Bluetooth or USB peripherals. I'm sure in another year or two they will be able to add whatever it is that is still missing...

      • Re:Desktop (Score:5, Insightful)

        by Type44Q ( 1233630 ) on Tuesday October 30, 2012 @01:15PM (#41819741)

        This will then be your primary computing device that follows you around.

        This will then be your primary computing device that:

        A) you leave on the roof of your car.
        B) gets dropped in the toilet.
        C) you spill your beverage on.
        D) gets chewed up by your dog.
        E) you get mugged for.
        F) you leave in your hotel room.
        G) you have confiscated by the authorities (should you find yourself at the wrong place/time)
        H) gets reverse-engineered/stress-tested by your toddler

        Shall I continue? Seriously; fuck all this smartphone nonsense; give me a borderline-disposable Nokia 2600-series and I'll happily call it a day! :p

        • When they said "computing device that follows you around," they might've meant it as "your own personal terminator, follows you around and protects you" :-)

          • by oPless ( 63249 )

            "your own personal terminator, follows you around and protects you" :-)

            Okay, I'll get right on it.

            What size would you like? 4ft, 5ft, 5ft 5in, 6ft, 6ft 6in, or 7ft?

        • Great point, though; these are devices that have to be bought a second time far more often than a desktop. I hadn't even thought about the potential value to Intel in that fact.
        • Likely we'll either all have moved to cloud storage, or a better form of 'store at home, access anywhere via the internet' will have been developed for normal users.

        • This is why I think the ones that come out with a $50 disposable smartphone are gonna make a fucking killing, because all it takes is seeing what a 2 year old can do to an iPhone ONE time to make you realize that having insanely expensive small devices is a BAD idea for families. In that vein I've seen a lot of my customers giving up their iPhones for those sub $100 Android phones at Walmart, they don't get locked into a contract and if the phone gets flushed or stomped you don't have a damned coronary like
        • I don't mean to be a dick here... but..

          I've never had any of that stuff happen, and I've been a smartphone user since before they were called smartphones.

          If you don't want one, you don't want one... fine. There's no need to rationalize it.

        • Re:Desktop (Score:4, Funny)

          by argStyopa ( 232550 ) on Tuesday October 30, 2012 @03:19PM (#41821327) Journal

          This will then be your primary computing device that:

          A) you leave on the roof of your car.
          B) gets dropped in the toilet.
          C) you spill your beverage on.
          D) gets chewed up by your dog.
          E) you get mugged for.
          F) you leave in your hotel room.
          G) you have confiscated by the authorities (should you find yourself at the wrong place/time)
          H) gets reverse-engineered/stress-tested by your toddler

          You say that as if such destruction wouldn't cause you to (insurance-covered or not) buy ANOTHER phone thus stimulating market activity, as if that's not good?
          Look guys, we need to work together to rescue the economy. /Keynes

      • Re: (Score:3, Insightful)

        by jythie ( 914043 )
        Tablets and phones will likely continue to represent scaled down devices.

        48 core chips would be silly to release for small devices first. Developing software for a large number of cores is non-trivial, and is often inefficient or unstable... it is one of those things like AI or fusion that has always been 'right around the corner' for general purpose computing, but has never really arrived.... college classes still teach the techniques poorly and people who 'learn it on their own' also usually produce poor a
      • Because desktops are where you work out the power problems, and they're the test bed for the parts that end up in mobile? It's no coincidence that the parts start on desktops, and the GPUs start as discretes and IGPs, because you don't have to worry about hitting the insanely low heat and power requirements of mobiles while you refine the process, and you still get paid.

        This is one advantage AMD and Intel have had over their competitors, because they don't have to get the power and cooling right off the bat, they c

    • by tomhath ( 637240 )
      I'm pretty sure that's kind of the plan here. A single device that's powerful enough to replace what we have today. You can put it on your desktop if you want, or slip it into your pocket.
    • Re: (Score:3, Insightful)

      by Lord Lode ( 1290856 )

      Fully agree:

      5 years ago, the top-of-the-line Intel consumer desktop CPUs had 4 cores.
      Today, the top-of-the-line Intel consumer desktop CPUs have 4 cores.

      So, get to it please! :)

      • No... 5 years ago the top of the line *mobile* CPUs had 2 cores.
        Today the top of the line mobile CPUs have 4 much more powerful cores.
        It turns out that the mobile CPUs are now used on most desktops too, but there are still more powerful desktop CPUs.

        If, however, you want the top of the line *desktop* CPU, the 3930K has 6 cores and wipes the floor with any quad core CPU in existence... assuming you are running software that can actually take advantage of all those cores. Since a whole lot of even "multithre

        • Also, IIRC, weren't the quad core desktop CPUs from 4 years ago hyperthreaded, meaning 2 physical and 2 logical? That would make this example, by comparison, a 12-core, as I believe Intel has sadly gone back to hyperthreading their CPUs.

        • The 3930K seems to be the one exception, but it's Sandy Bridge, and the LGA 2011 platform is borderline consumer! Ivy Bridge, the newer architecture, only goes up to 4 cores. And for the next architecture, Haswell, not even released yet, Wikipedia has the following to say: "Mainstream up to quad-core.".

          I'm curious why Intel went from 1 core to 4 cores in a short period of time, and then remains stuck at 4 cores for more than half a decade.

          • The same reason they're stuck at 3-4GHz. Heat and power usage. They have to shrink to an even smaller manufacturing process to avoid those issues.

        • by PRMan ( 959735 )

          Actually, the core multitasking engine, as rewritten by Mark Russinovich and company, now supports multiple threads very well in Windows 7. Run a program that is not written for multiple threads (even a batch file) and watch as all your CPUs level out to nearly equal usage (within ~10%, usually). This happens with nearly any process these days.

          So, 6 cores would work just fine for people. I think the real reason for the limit is that Intel doesn't want to cannibalize the overpriced virtualizat

      • Your top of the line must be different than mine.

        Everyone always says that Apple makes consumer electronics, yet they have a 12-core Mac Pro that they've been selling for 2+ years now...

        • Yes, and the Mac Pros typically have TWO processors, not one. We are talking about cores per CPU here. Try to keep up.

      • by Kjella ( 173770 )

        today, the top of the line Intel consumer desktop CPU's have 4 cores.

        More like 6. And the LGA2011 platform is also nice if for some reason you need 64GB (8x8GB) of RAM. The real issue for Intel is that most of the things "enthusiasts" care about don't scale that well. Sure, there are always people who need 3D rendering or whatever that could scale to 8+ cores easily, but they're more the "workstation" market. You're much more likely to find enthusiasts with a 3770K overclocked to 4-5 GHz and a few 2000+ MHz sticks of RAM than anything the LGA2011 offers. It's been that way s

      • 640 cores should be enough for anybody
      • Speaking of 5 years ago, we're still waiting for this [] to go commercial.

    • Re:Desktop (Score:5, Informative)

      by Z00L00K ( 682162 ) on Tuesday October 30, 2012 @12:48PM (#41819395) Homepage

      Any desktop with a decent GPU has more than that already.

      But the difference between a desktop and a phone makes it harder to get good performance on the desktop with many cores - it's memory bandwidth that's the bottleneck. On a phone you can dedicate cores to certain well-defined tasks and optimize them for that.

      • Programming for the GPU is also trickier - it's a specialised task, and so hiring a programmer experienced in the field is going to be a bit trickier and more expensive than hiring a programmer who can work with general-purpose processors.
        • Not really. GPUs and CPUs are both general purpose processors, just optimised for different classes of algorithm. If you have an algorithm that makes sense on a GPU, then it's pretty easy to write OpenCL C to implement it. It's often much easier to implement efficiently than the CPU version. Getting the most possible performance out of a GPU is not so easy, but the same is true for a CPU.
    • by Jamu ( 852752 )
      You probably already have that many. Except they're in your graphics card.
    • No point. Windows/Microsoft doesn't have a clue how to harness and use that many cores anyway.
    • ...explained that with the prototype chip someone could, for instance, be encrypting an email while also working on other power-intensive apps at the same time — without hiccups. Same for HD video.

      It sounds to me like the LG Optimus G [] is already there with its quad-core Snapdragon.

      I just played with one last week and I was quite amazed. The Optimus G really blew my Jelly Bean Nexus 7 and Jelly Bean Galaxy Nexus out of the water with its super-smooth UI and its HD video multitasking capabilities.

    • It's already been here since last year: []

      Oops, that was just Intel's promise in 2006. Nvm, carry on.

    • Re:Desktop (Score:4, Interesting)

      by Anonymous Coward on Tuesday October 30, 2012 @02:10PM (#41820387)

      From Rob Pike []:

      Twenty years ago, you expected a phone to be provided everywhere you went, and that phone worked the same everywhere. At a friend's house, or a restaurant, or a hotel, or a pay phone, you could pick up the receiver and make a call. You didn't carry a phone around with you; phones were part of the infrastructure. Computers, well, that was a different story. As laptops came in, people started carrying computers around with them everywhere. The reason was to have the state stored on the computer, not the computer itself. You carry around a computer so you can access its disk.

      In summary, it used to be that phones worked without you having to carry them around, but computers only worked if you did carry one around with you. The solution to this inconsistency was to break the way phones worked rather than fix the way computers work.

    • People think too one-dimensionally, I think. They figure out that 2 or 4 cores means something is faster, therefore 48 cores would be even better (and then they get so excited they say something incredibly stupid like "redefine mobile devices"). But think in more than one dimension. Maybe make the base CPU faster instead, add more memory, optimize the bottlenecks, etc.

      48 cores is a really bad idea given the clumsy shared memory model current implementations use.

      And what slows down smart phones

  • excessive (Score:2, Insightful)

    by fizzer06 ( 1500649 )
    will it include a car battery?
    • by tom17 ( 659054 )

      I think this way forward likely HELPS power usage if anything.

      Let's take 2 power-equivalent processors. Both use 1W at the lowest non-idle power setting. Processor A is a 2 core model, B is a 48 core.

      Each core on the 2 core model A uses approx 0.5W (ignoring overheads etc)
      Each core on the 48 core model B uses approx 0.02W (ignoring overheads etc)
      In this instance, it's likely that each of the model B cores will be one 24th as computationally powerful as a model A core.

      If you have some trivial tasks (i.e. mos
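
      The parent's back-of-the-envelope numbers can be put in a few lines. This is a toy model in Python: the 1 W budget and the 2-vs-48 core counts come from the comment above, the even power split and "performance tracks power" assumption are illustrative, and the helper name is made up.

```python
# Toy model of the comparison above: two chips with the same 1 W budget,
# one with 2 cores (A) and one with 48 cores (B). Overheads (interconnect,
# caches, coordination) are deliberately ignored, as the parent notes.

BUDGET_W = 1.0

def watts_per_core(total_watts, cores):
    """Split a fixed power budget evenly across all cores."""
    return total_watts / cores

per_core_a = watts_per_core(BUDGET_W, 2)   # 0.5 W per core
per_core_b = watts_per_core(BUDGET_W, 48)  # ~0.021 W per core

# If per-core performance tracked per-core power linearly, each B core
# would be 1/24th as powerful as an A core (2 / 48 = 1/24).
ratio = per_core_b / per_core_a
print(f"A: {per_core_a} W/core, B: {per_core_b:.3f} W/core, ratio 1/{1 / ratio:.0f}")
```

      Whether trivial tasks actually fit on such slow cores is exactly the question the replies below raise.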

      • (ignoring overheads etc)

        I think I found the problem with your assumptions. Ignoring overheads can get you into a lot of trouble in parallel computing.

      • That's not how those things scale, though. In your example, for your device to be borderline usable, you'd have to use about 18 cores, minimum, at most times, assuming perfectly threaded, infinitely scalable software. Sleep state is a different beast, but then you can simply add an extra low-power core that takes over in such cases, like the Tegra 3. Consider that there are tons of ARM clusters around. When power consumption is equal, a system with dozens of cores is incredibly inefficient against an Intel Core

  • by tom17 ( 659054 ) on Tuesday October 30, 2012 @12:38PM (#41819251) Homepage

    And we can stop then because, well, you know, 640 cores should be enough for anyone.

  • It's just Larrabee (Score:4, Insightful)

    by Balial ( 39889 ) on Tuesday October 30, 2012 @12:40PM (#41819277) Homepage

    Keep floggin' that dead Larrabee horse, Intel.

  • Maybe...but not soon (Score:5, Interesting)

    by Overzeetop ( 214511 ) on Tuesday October 30, 2012 @12:41PM (#41819293) Journal

    Sure, you could put the power of today's typical desktop in a phone with such a beast, but by then we should have desktop boxes with an order of magnitude more power than that - and we'll find a way to "need" that extra power on the desktop. It's not just about chip capability and battery life - I mean, you can always plug in a phone and run it at full power 24/7. But you start running up against the limits of thermal dissipation. It's no surprise that maximum TDP per processor has not changed a whole lot on the desktop. We're still limited by the ability to air-cool a chip that's really a 130-watt (give or take) heater. That won't change. Remember also that until you standardize a dock with a real video connector (or crazy-fast wireless video - not this compressed crap we use for movies), you're still limited to that little tiny window on the portable device.

    The biggest potential saving is if they can shut down 47 cores and run one thread at low power when I'm not "using" it, so the battery lasts as long as possible.

    • A while back I was working on site with a customer... a major featurephone manufacturer, who will remain nameless so as to protect the incompetent. We discovered that the phone would crash if our software ran the CPU flat-out, even if it was at a lower priority thread than anything else.

      It turns out that the sorry excuse for an operating system this thing ran was doing system-critical tasks in the idle thread, so if the CPU didn't idle some message queues would overflow and the OS would reboot. When I ask

  • by Anonymous Coward
    Hmm. 48 cores. More processing power. Better battery life. And how large of a backpack will I need to transport those long-life batteries? Just trying to plan ahead.
  • by ducomputergeek ( 595742 ) on Tuesday October 30, 2012 @12:42PM (#41819303)

    had to be said.

    • by slim ( 1652 )

      +1 Obligatory

    • You're joking, but half-seriously: imagine if a manufacturer made a cheap 'tablet' without a display, with physical network ports, and with just enough ventilation that you could stack multiple tablets as high as you please without overheating. Imagine a Beowulf cluster of those!

      I imagine the power adapters and the power strips connected in series would look really silly.

  • by CajunArson ( 465943 ) on Tuesday October 30, 2012 @12:43PM (#41819317) Journal

    Just before the AMD board executed him, Dirk Meyer screamed out: "MOAR COARZ!" And today we have Bulldozer.

    The moral of the story is, MOAR != more all of the time. Especially in a freakin' cellphone, where, despite what some Slashdotters think, the primary use case is *not* performing massively parallel scientific simulations.

  • 10 years? (Score:3, Informative)

    by Omnifarious ( 11933 ) * on Tuesday October 30, 2012 @12:46PM (#41819367) Homepage Journal

    Hah! The Parallella Kickstarter project was from a group that already had a 64-core CPU that consumed only a watt or 4 of power when running full-tilt. If this takes Intel 10 years to design, they'll be left in the dust.

    BTW, they wanted the money to fund developing the mask for the fab process needed to mass-produce them for $100 apiece instead of the few hundred apiece they now cost to manufacture. The chips already exist.

  • The phone would be smart enough to not just be a computer but it could be my computer.

    Anyone have a clue what he means by "my computer"? All the computers I currently own and use are my computer already.

    • Currently your PC is a rather dumb tool - a useful tool, but dumb. The fantasy is that stuff like Siri makes computers actually useful for something other than finding cat pictures. Because outside of work-related coding and some gaming, I don't use my PC for much that other devices couldn't also do. Yes, I watch movies on it, but I can watch movies on a projector if they still sell them. I can play music on a wax cylinder.

      Yes, there are some small advantages but not nearly as many as you would like.

      Play romantic music com

  • 48 Cores? (Score:5, Informative)

    by QuantumHack ( 58048 ) * on Tuesday October 30, 2012 @12:47PM (#41819377) Homepage

    Not useful without a serious change in computer architecture.

    Amdahl's law []. It's a bitch, baby.

    • Re:48 Cores? (Score:5, Informative)

      by c0lo ( 1497653 ) on Tuesday October 30, 2012 @01:53PM (#41820213)
      I see your Amdahl's Law and raise you a Gustafson's law [].
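      For anyone curious how the two laws differ at 48 cores, here's a minimal Python sketch; the 95% parallel fraction is an assumed figure for illustration, not something from the thread.

```python
# Amdahl's law: fixed workload, speedup is capped by the serial fraction.
# Gustafson's law: workload scales with core count, so speedup keeps growing.
# p = fraction of the work that parallelizes (0.95 here is an assumption).

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

def gustafson_speedup(p, n):
    return n - (1.0 - p) * (n - 1)

p, n = 0.95, 48
print(f"Amdahl:    {amdahl_speedup(p, n):.1f}x of a possible {n}x")  # ~14.3x
print(f"Gustafson: {gustafson_speedup(p, n):.1f}x")                  # ~45.6x
```

      Even a 5% serial fraction caps a fixed workload's speedup well below 48x; if the workload grows with the hardware instead, the scaled speedup stays close to linear.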
  • by bradley13 ( 1118935 ) on Tuesday October 30, 2012 @12:47PM (#41819391) Homepage

    Haven't we already been here? When multi-core processors first became widely available, I recall a study that showed that anything over 8 cores was counterproductive. First, very few people have enough background stuff running to need more processing power than that. Second, coordinating multi-tasking on multiple cores requires a lot of complex work by the operating system, unless you just dedicate one to each process (not to each thread - that opens up problems with cache and data consistency). The benchmarks on desktop computers showed that adding more than 8 cores to a general purpose system actually slowed the system down due to added OS overhead.

    About the only way this many cores can be useful is for graphics processing (or, in TFA, video processing): many simple cores work in parallel for the same process, on different parts of the same data. This, of course, is what graphics chips already do for a living.

    • ...unless you just dedicate one to each process (not to each thread - that opens up problems with cache and data consistency).

      How so? Any issues with cache consistency have to do with each core having its own L1/L2 caches but sharing the same memory. This is what hardware-based cache coherence protocols like MESI were invented for, and it has nothing to do with running multiple processes vs. multiple threads. Are you referring to the fact that threads in the same process share the same address space? Care has to be taken to serialize access to critical sections (such as using a lock based on a mutex), and while blocking threa

    • Haven't we already been here? When multi-core processors first became widely available, I recall a study that showed that anything over 8 cores was counterproductive.

      This assumes he means equally-powerful cores.

      I can easily see having 8 high-speed cores for games and intensive processing, 4 super-low-speed ones for background tasks during idle moments with the screen turned off, and some medium-speed cores for normal use with the screen on.

      As far as efficiently using all 48 cores for one task goes -- the programming capable of doing that would be horrendously difficult or embarrassingly easy without much room in between. Amdahl's law [] shows that even with mythical 99% sc

  • Already here, kinda (Score:5, Interesting)

    by AmiMoJo ( 196126 ) * on Tuesday October 30, 2012 @12:48PM (#41819397) Homepage Journal

    Modern phones already have many, many cores in them. They are just not general purpose cores available for use by the OS or applications.

    For example the radios in phones are software defined, meaning they have an RF front end that just feeds the signal to a dedicated signal processing core to do decoding. They have at least one for the mobile network radio, one for WiFi, one for Bluetooth, one for NFC, one for GPS. The audio codec will have a signal processor that can do mixing, digital filtering and parametric equalization. The cameras will each have processors to handle some image processing before the data is handed to the main CPU.

    You could combine all these cores into a single CPU, and then you would have the advantage of being able to use them for other things when they are not busy handling the aforementioned hardware. The problem is that these more general-purpose cores tend to use a lot more power than dedicated ones designed specifically for one application, but presumably Intel thinks it will get the power budget down to something acceptable in 10 years' time (or batteries will get a lot better).

    • Most of those cores are DSPs. x86 variants are crap for DSP related operations. TBH x86 is crappy for pretty much everything.

      The way we should be going is reconfigurable logic. For example, when an mp3 is played, the device is reconfigured to contain a hardware codec. This can work on an async clock so it will only tick on data availability. When it's not being played, it turns off the macrocells that it was built on. There should be analogue and digital macrocells which can take on the RF and computational

  • by banbeans ( 122547 ) on Tuesday October 30, 2012 @12:49PM (#41819407)

    Want to impress the pointy-heads? Talk about more cores.
    Throwing more cores at a problem is not the solution to everything, and it actually opens up a whole bunch of new problems.
    I wish more attention were spent solving real problems rather than trying to be buzzword-compliant.

  • That assumes the O/S supports the additional resources. Who has an O/S today that has a proven roadmap to get there?
  • by Speare ( 84249 ) on Tuesday October 30, 2012 @12:53PM (#41819459) Homepage Journal

    Until you revise the whole way people write software, adding cores is useful to a very limited point. Today's software can be split at one core per thread, or one core per process. If you try to get two cores to work on the same thread, you just increase serial contention, not decrease it.

    Even thread-happy Java is only working on maybe 3-5 threads at a time; the rest are sleeping until a device wakes them or until a certain time has elapsed. A new compiler may be able to help a little bit, but it's just going to be creating very short-lived micro-threads when it detects those few opportunities for them.

    Graphics hardware is great for many parallel cores, because it's the same tight problem with different data, endlessly repeated. Multiply these 4x4s, please. Fill these pixels, please. Endlessly. The same goes for encryption, and maybe a few bits of video-game AI logic. Not much other software naturally fits using many cores.

  • Unless there is some breakthrough in battery life in the next 5-10 years, having a 48-core CPU run on a mobile device for 5 minutes before depleting the battery is worthless.
    • Who says it has to run all 48 cores all the time?
      Let's assume this does become your primary computing device. It could run 2 or 4 cores on battery, then when you dock it and need some real power, it ramps up to all 48 cores.
      That is just off the top of my head. I am sure there are many other viable options. Also:
      Batteries will probably get better.
      Power needed by each core will probably reduce.
  • Why does a cell phone need 48 processor cores? How about we force all the shitty mobile app developers to make their shit work on one core before we throw 47 more at them to abuse with their terrible resource management skills.
  • by crow ( 16139 ) on Tuesday October 30, 2012 @12:58PM (#41819535) Homepage Journal

    Current phones do just fine with HD video and multitasking. Sure, some may glitch occasionally, but that's more due to software design than lack of CPU. This will do little to nothing for the things we use our phones for today.

    What it will enable is new classes of features, such as real-time video and voice processing. With that sort of CPU power, you can do voice recognition without sending the audio over the network for analysis. Who knows what people will think of doing for video analysis?

    Though for the most part, the added cores will be powered down, doing nothing but putting a good bragging number on the spec sheet.

  • projected uses (Score:5, Insightful)

    by Anonymous Coward on Tuesday October 30, 2012 @01:02PM (#41819585)

    Core 1-12 : DRM
      - these cores will check all audio/video/ebook files for copyright infringement

    Core 13-24 : TPM
      - these cores will implement TPM and secure the DRM portion

    Core 25-37 : Genuine Advantage Checking
      - these cores will check that the system state is valid, and all license keys are valid and updated

    Core 38-40 : Virus Checking
      - these cores will implement malware checks and virus checks

    Core 41-47 : OS and Sandboxing/Security
    - these cores will run the base os, and run all applications in sandbox mode

    Core 48 : User Application
    - this core will be available for running user applications in the performance reduced sandbox mode. Priority is given to cores 1-47, in order of decreasing priority.

  • by Todd Knarr ( 15451 ) on Tuesday October 30, 2012 @01:06PM (#41819637) Homepage

    My phone won't become my computer because it's not about the CPU power. Hasn't been for ages now. The average phone already has more CPU power than the average desktop user makes use of. It's more about:

    • Memory. My computer has something like 16x as much RAM as my phone, which means it can run a lot more stuff simultaneously. It can run all those system-tray programs, without breaking a sweat. But RAM means power, and putting 8GB of RAM into a phone increases the power draw (modern memory requires not just constant power but constant refresh access) and decreases the battery life.
    • Size. My computer has 2 27" monitors on it, making for a lot more screen real-estate to work on. And I need it when I'm simultaneously writing a document, referring to a spreadsheet, several e-mail messages and a couple of Web sites, keeping track of a couple of IM conversations, and let's not forget Visual Studio with a dozen files open in it. You can't have that kind of screen space on a phone, you're physically limited by the size that can be conveniently held in one hand.
    • Peripherals. I mentioned monitors. There's also my good Model M keyboard (you don't appreciate a good keyboard unless you're working in a job where you're typing nearly constantly for hours on end), the high-end mouse, the small input tablet, the good 5.1 speaker set for the music, the scanner for getting paper documents into electronic form... You can't attach all that to a phone in any useful way.
    • Portability. Yes you can solve all the shortcomings of a phone with a dock and attached peripherals, but why? By the time you're done, you've removed the things you wanted from a phone: the ability to carry it in one hand, and to not have it tied down with wires. We're seeing with Win8 what the downsides are of trying to design a system for both phone/tablet and desktop uses, and you end up not being satisfactory for either so the idea of grabbing the phone out of the dock and going ends up hamstrung by that.
    • Price. One reason desktops are cheap is that they can use commodity parts and have enough open space in the case that they don't have to worry about power so much. Phones are almost 100% custom-built with a lot of work going in to designing parts that can be packed into that small a package with no ventilation at all in 100-degree ambient heat and not incinerate themselves. They're going to inevitably be more expensive than a desktop just because of that. And while Microsoft may be willing to hemorrhage money on hardware because it supports other revenue streams, phone manufacturers can't.

    So while more cores may help phones do phone-like things better, especially combined with nifty ideas like Google Glass, it's not going to help the phone replace my desktop.

    • I really don't mind that they keep trying to add different new and improved widgets to every gadget we have. They clearly don't have a foolproof way to truly understand how something would be useful to an end user without stumbling on it anyway. I see some of the things they do in SciFi (minority report for example) as someone trying to take a stab at what people will want to do/benefit from in the not to distant future. While some of it ends up being mimicked in reality, that often results in the real
  • by CCarrot ( 1562079 ) on Tuesday October 30, 2012 @01:06PM (#41819639)

    Sure, but how *fast* would these micro-cores run? Will we wind up with a single core not being able to handle anything on its own?

    Plus, adding more cores for shared tasks increases the management overhead. It's like a project: more resources can help get the job done faster...if they're properly co-ordinated, communicating properly, and everyone isn't waiting for one particular resource to finish its task before the others can proceed. All of this takes significant overhead time IRL; I can't imagine it would be much different for flocks of cores...

    I can see this allowing more simultaneous processes to run without bumping into one another, but basic speed for each process might go way, way down.
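    The coordination-overhead point has a classic formalization, Amdahl's law: if a fraction p of a task can be parallelized, n cores give a speedup of at most 1 / ((1 - p) + p/n). A quick check of what that means for 48 cores:

    ```python
    def amdahl_speedup(p, n):
        """Upper bound on speedup with fraction p parallel on n cores."""
        return 1.0 / ((1.0 - p) + p / n)

    # Even with 90% of the work parallelizable, 48 cores top out
    # well short of 48x:
    print(round(amdahl_speedup(0.90, 48), 1))  # ~8.4x
    print(round(amdahl_speedup(0.50, 48), 1))  # ~2.0x
    ```

    So unless a workload is almost entirely parallel, most of those 48 cores sit idle waiting on the serial portion.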

  • ownership (Score:4, Insightful)

    by vlm ( 69642 ) on Tuesday October 30, 2012 @01:09PM (#41819673)

    The phone would be smart enough to not just be a computer but it could be my computer.

    As trends continue it will not be "my computer"; it'll remain my service provider's computer, whose capabilities they graciously let me use a small fraction of for a monthly fee.

  • While on the surface it may seem that "Big Brother" (or the competition, or China - assuming that they're not one and the same) is less likely to have his nose in your PC given that the intent of a phone is to maintain external communications links, one would do well to remember that those teeny commo chips that fit onto a phone's dinky mobo would fit onto a massive PC mobo, too. Which is something that has probably occurred to anybody who remembers the Tandy 1000 TX - and so remembers anonymous custom chi
  • by TheSkepticalOptimist ( 898384 ) on Tuesday October 30, 2012 @01:25PM (#41819909)

    I'm still waiting for the 80-core PC.

    How about realizing one of your predictions before creating another. You never achieved an 80-core PC and barely made a dent in the mobile CPU market, so don't bother predicting anything for the mobile market until you are a player.

    Also, Intel, it's time to change your architecture. Packing billions of transistors into a core to support legacy instructions is not an efficient design anymore. I don't want 4, 48, or 80 cores with defunct MMX and obsolete x86 instructions. Stop making CPUs where every core contains the entire history of Intel.

  • by Funk_dat69 ( 215898 ) on Tuesday October 30, 2012 @01:25PM (#41819921)

    Intel likes to throw claims like this out there to try to win mind-share.
    It means nothing, but sounds impressive in a vague, buzz-wordy way. It's just marketing.

    I'm not surprised their vague future predictions are aimed at mobile now. They desperately need mind-share in that segment.

    "Look! We're relevant in mobile! We'll have FORTY-EIGHT cores! All with frickin' laser beams attached to their heads!"

  • I'm pretty sure that if you told someone 10 years ago we'd have quad-core processors running at over 1GHz with loads of graphics, RAM, etc. on a smart mobile phone, they'd ask why you can't use it as your main computer. In 5-10 years, by the time this 48-core chip would be feasible, desktop computing will have grown/changed as well. I'm not sure our mobile devices will ever be our main computers. A link to them perhaps, but it's far easier to offload the work to a remote station than do it all onboard.
    • With even the cheap desktops coming with mobile CPUs now, we're just seeing a widening gap between power users and the mainstream computer owner. I upgraded my wife's computer to a wimpy Sempron 145 and she couldn't be happier with finally having 8GB of RAM (not to mention it being DDR3) and not really needing more CPU power. While I on the other hand am upgrading to an Ivy Bridge i7 this week and am not totally sure if I went high enough.

  • The issue they're trying to solve isn't "how do we make more powerful phones?", it's "how can we lower power usage without sacrificing power?".

    If you try to improve performance the same way desktops did (higher clocks, great gobs of cache, and more execution units inside the core), you'd raise performance, but you'd also massively increase your power draw without giving you an easy way to lower it when it's not in use. Underclocking only gets you so far - ideally, you'd need to be able to shut down executio

  • by CBravo ( 35450 ) on Tuesday October 30, 2012 @03:06PM (#41821175)
    There are two issues with all this multi-core speak we've been hearing:

    There is hardly any code, other than a few optimized libraries, that uses multi-core processors. They try to make it sound unimportant, but it is the largest hurdle. There is a reason people don't think of a 3930K as being 50% faster than a 3820! Other than some failed libraries like OpenMP, there are no valid programming models for using multithreading and/or networking transparently (since it should not make any functional difference). Ergo: developers are still writing single-threaded code.

    The advantage of multi-core processors can be that you can lower your clock speeds and/or memory speeds on secondary tasks. Hardly any research is done on that (I have done some myself, but let's call that an 'academic' proof of concept).
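    The parent's point is that parallelism is opt-in: a serial loop uses one core until the developer explicitly restructures it. A minimal Python illustration of that restructuring burden, using `concurrent.futures` as a rough analogue of what an OpenMP parallel-for does in C (the `expensive` function is just a made-up stand-in for CPU-bound work):

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def expensive(n):
        # Stand-in for a CPU-bound computation.
        return sum(i * i for i in range(n))

    inputs = [10_000, 20_000, 30_000, 40_000]

    # Single-threaded version: what most code looks like today.
    serial = [expensive(n) for n in inputs]

    # Parallel version: same results, but the developer had to
    # restructure the loop and guarantee the work items are independent.
    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            parallel = list(pool.map(expensive, inputs))
        assert parallel == serial
    ```

    Nothing in the language makes the second form happen automatically, which is why extra cores so often go unused.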
  • by Rexdude ( 747457 ) on Tuesday October 30, 2012 @03:28PM (#41821459)

    Reminds me of an old Soviet Russia joke(more here []):

    A Polish tourist comes back home after visiting the USSR. He carries two very large and heavy suitcases. On his wrist is a new Soviet-made watch. He tells the customs man: "This is a new Soviet watch. It's a wonder unknown in the capitalist countries. You see, it shows time, the rate of your pulse beats, the phases of the Moon, the weather in Warsaw, Moscow, and New York, and more and more!"

    "Yes, it's a wonder," the customs man agrees. "And what is it you have in these big suitcases?"

    "Oh, it's just the batteries for that watch."

  • by nurb432 ( 527695 ) on Tuesday October 30, 2012 @04:52PM (#41823015) Homepage Journal

    Ya, with the nano battery life these things will have, people might actually go out and experience the real world again, while their phone/tablet charges.
