
Intel Declares Independence From PC, Prioritizes Cloud, IoT and 5G Efforts

A week after announcing 12,000 job cuts, Intel CEO Brian Krzanich has shared his vision for the company, hinting at a shift in its primary focus away from the PC business. In a blog post, Krzanich said the company will be actively growing its data center business. The chip maker also plans to focus on chips and technologies for IoT devices. "The biggest opportunity in the Internet of Things is that it encompasses just about everything in our lives today -- it's ubiquitous," Krzanich said. The company also plans to boost its memory-chip business and push toward using those chips in data centers and various cloud services. Intel said it has made several investments in this area, noting the $16 billion acquisition of Altera last year. The company says it will play a big role in the move to 5G connectivity. "Connectivity is fundamental to every one of the cloud-to-thing segments we will drive," Krzanich writes.

Over the years, Intel has struggled to keep up with Moore's Law, the observation that semiconductor density doubles about every two years. The company previously stretched that timeframe to about 2.5 years, but Krzanich assures customers that the company is working to make further advances in order to meet the goal. "Moore's Law is fundamentally a law of economics, and Intel will confidently continue to harness its value," Krzanich said. PCWorld has reported on this extensively.
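To see what the stretched cadence costs, a quick compounding calculation (Python; the ten-year horizon is just an illustrative choice):

```python
# Density multiplier over a span of years = 2 ** (years / doubling_period)
for period_years in (2.0, 2.5):
    multiplier = 2 ** (10 / period_years)
    print(f"doubling every {period_years} years -> "
          f"~{multiplier:.0f}x density after a decade")
# every 2.0 years -> ~32x; every 2.5 years -> ~16x
```

Slipping from a two-year to a 2.5-year cadence halves the density gain you can expect over a decade.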
Comments:
  • The microcomputer first replaced the low-end minicomputer and word-processing systems.
    Next they replaced larger minicomputers like the PDP-11.
    Later they replaced super-minicomputers like the VAX.
    The cell phone will probably replace the PC.
    Imagine a Windows Phone that you put on your desk and hook up to your monitor with a USB 3 connector that provides power and a network connection to your phone, and video to the display. Mouse and keyboard can be USB-connected to the monitor or Bluetooth. You could even us…

    • by Junta ( 36770 )

      Much hinges on whether MS can deliver a mobile experience that is good and *perceived* to be good. As it stands today, people prefer to have two distinct devices in order not to run MS on their phone. Or it could be because MS is being obstinate about phones being ARM, limiting the 'desktop' versatility.

      • It's not even hatred of MS that doomed their phones. I tried one in a store, and it was actually pretty usable. Arguably better than the contemporary Android phones.

        But they had no apps. Back then, Google Maps was the top dog, and it wasn't on Windows Phone. Maybe one app out of a dozen that I wanted actually ran on it. And I couldn't find replacements for half of them.

        If people don't develop for the platform, it's not gonna live long. Even the perception of "no apps" is deadly because it makes people---bot…

    • And your watch will be your next phone.
    • It was a good post until "Microsoft may just win the mobile phone market." That market has already been won by Android (and to a certain extent by Apple). Microsoft can't win something they have already lost - especially when they have hardly even begun the race while everyone else is already at home watching the highlights on the 11 o'clock news.

      • Everyone thought Netscape was king of the hill. Mind you, that was a very different situation, but it shows that any platform that is on top can fall, and underdogs just may take the fallen crown.

        • Netscape v Microsoft was a David/Goliath matchup when you look at the companies behind the software. Plus Microsoft used illegal means to promote IE over Navigator.

          The phone OS market is entirely different. Both Android and iOS are supported by large corporations. Both are entrenched with an ecosystem of apps dependent on them, and users' money sunk into those apps.

          Navigator actually did have some unique plugins, but other browsers eventually implemented its plugin API. And users didn't have their money i…

          • I know Netscape lost, but did IE really win long-term? I mean, from my perspective, no - most people I know only use IE when forced to...

            • Yes. They had the best and most popular browser for a long time.

              To be quite frank, browsers really aren't that much more capable than they were in 2000. Take AJAX: you could do the exact same shit with IFRAMEs. Except your audience was running 300 MHz processors, so really, you couldn't do shit.

              Not that improvements haven't been made, but it's really the hardware that made the difference.

              • I think V8 made some measurable gains (turning a giant steaming pile into a slightly smaller, slightly cooler steaming pile...)

                Too bad "we can't be trusted" to develop plugins like the NaCl project anymore. A quick search suggests PNaCl is trying to address that... but I smell JavaScript again...

      • by LWATCDR ( 28044 )

        You mean like how Nintendo and Sega were the winners in the console war... until Sony showed up. Then it was Nintendo and Sony, until Microsoft showed up...
        Or how DEC and IBM were real computer companies while Apple made toys...
        Or how IBM won the PC war?
        It may be over for now, but not forever.

      • Except the mobile phone market hasn't been replaced yet, so the race to the end of that particular technology isn't over, and no one has finished. If they think they are done and are relaxing, eating chips in front of the tube, then someone else does have an opportunity.

    • What kind of FPS can I get running The Division on my phone at 1440p?

      • That's the thing. For casual use, maybe your phone will be the "next computer," but as long as there's a PC gaming industry, they will keep pushing PC performance boundaries.
        • by ranton ( 36917 )

          That's the thing. For casual use, maybe your phone will be the "next computer," but as long as there's a PC gaming industry, they will keep pushing PC performance boundaries.

          No reason video acceleration needs to be done in the cell phone itself. That could happen in some module connected to the monitor/TV. This could easily be the duty of the next generation of consoles, with them simply being external GPUs that give your mobile devices better gaming experiences.

          • True, and I believe I saw some external video cards being made recently. But in terms of performance, I don't think an external video card receiving data via USB-C or the like will ever match or exceed one plugged directly into a motherboard. But who knows.
            • by ranton ( 36917 )

              True, and I believe I saw some external video cards being made recently. But in terms of performance, I don't think an external video card receiving data via USB-C or the like will ever match or exceed one plugged directly into a motherboard. But who knows.

              I would have agreed with that, but after looking up the throughput of PCIe 3.0 and USB-C, it looks like they have very similar speeds: USB-C (USB 3.1 Gen 2) provides 1.25 GB/s, while PCIe 3.0 is about 1 GB/s per lane. There is the distance factor, but it appears the speeds could be similar (see the unit conversion sketched after the reply below).

              • by Holi ( 250190 )
                Not USB 3.1 - Thunderbolt 3, which at 40 Gbps seems to be fine for current video cards.
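                To put numbers on this exchange, a quick back-of-envelope conversion (Python; these are published link rates, not measured throughput, so real-world figures will be lower):

                ```python
                # Published link rates, converted from Gbps to GB/s (8 bits per byte).
                def gbps_to_gbytes(gbps: float) -> float:
                    return gbps / 8

                usb31_gen2 = gbps_to_gbytes(10)    # USB 3.1 Gen 2 over USB-C: 1.25 GB/s
                thunderbolt3 = gbps_to_gbytes(40)  # Thunderbolt 3 over USB-C: 5.0 GB/s

                # PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding.
                pcie3_lane = gbps_to_gbytes(8 * 128 / 130)  # ~0.985 GB/s per lane
                pcie3_x16 = 16 * pcie3_lane                 # a GPU slot: ~15.75 GB/s

                print(f"USB 3.1 Gen 2: {usb31_gen2:.2f} GB/s")
                print(f"Thunderbolt 3: {thunderbolt3:.2f} GB/s")
                print(f"PCIe 3.0 x1:   {pcie3_lane:.3f} GB/s")
                print(f"PCIe 3.0 x16:  {pcie3_x16:.2f} GB/s")
                ```

                The ~1 GB/s figure matches a single PCIe 3.0 lane; a full x16 graphics slot is an order of magnitude faster, which is why Thunderbolt 3's 40 Gbps is the more realistic comparison for external GPUs.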
        • by LWATCDR ( 28044 )

          Unless streaming games take over. I also doubt that the cell phone will take over CAD/CAM, video production, or other HPC tasks. But for the masses who get on Facebook, use Quicken, and run Office, a phone may be good enough. With an x86 chip, monitor, mouse, and keyboard, a phone running Windows 10 could also run all those legacy applications that a lot of people cannot live without.

      • by LWATCDR ( 28044 )

        Today, probably not all that fast - but in two years? Besides, gaming and other HPC tasks will not go away, but for the "I need to run Office and Quicken" crowd, phones are getting very close to good enough.

        • by Anonymous Coward

          Today, probably not all that fast - but in two years? Besides, gaming and other HPC tasks will not go away, but for the "I need to run Office and Quicken" crowd, phones are getting very close to good enough.

          So when everybody's done spending money on one of these new phones and a monitor that supports this scheme, they get to sit down to an experience that is inferior to a half-decade-old Sandy Bridge laptop they could have bought off eBay for 50 dollars. Wow. The Future(TM).

    • The car of the future: Intel Inside!

      Drive up to 40% faster while using 20% less energy! And with Turbo-boost technology, win those street races every time! Multi-core technology allows you to be more productive on and off the road!

      *** Warning: Overclocking may lead to engine overheating and -- Your car has encountered an unknown error and

    • Yeah, yeah, the PC is dead, we know. I have heard the same thing repeated countless times over the last couple of decades. To make such a claim you also have to claim that our thin-client capability has magically evolved so that we can all work on a 4" screen. Oh wait, that has not happened. So people are once again claiming "MAGIC IS REAL" and "REALITY IS FOR PEOPLE WHO CAN'T FANTASIZE LIKE WE DO!".

      Could you claim, "hey, lots of people work with smaller screens"? Sure. Many people can only afford to…

      • by LWATCDR ( 28044 )

        No, but it will be a different form factor. Did you read what I wrote?
        Just what would you call a phone with an x86 chip running Windows, using a full-sized monitor, keyboard, and mouse? It is just a small-form-factor PC.

        • If I put a PC in a 4x24x22 frame, or an 8x12x12 frame, or a 2x2x2 frame, it's still a PC. Why? The easy but hard-to-fathom answer might be that the instruction set the chip uses, etc., is what we call a PC. Probably more importantly, it's not a usable PC because of the form factor alone; it's usable because of the peripheral devices we use to access it. I'm still connecting an external FULL-sized keyboard, some type of tracking device (mouse), and at least one reasonably sized (24") monitor to be able to use the PC.

          The…

          • by LWATCDR ( 28044 )

            I didn't say the PC was dead; I said the phone would replace your PC. It is a bit different from just a new chassis, since it will also be your phone, have long battery life, and fit in your pocket. Replacing the PC does not mean the PC will go away completely. IBM still makes minicomputers, and while the company formerly known as Sun probably calls them servers, they are closer to minicomputers in many ways.
            Traditional desktops will still be used by gamers, for CAD/CAM, and for other HPC uses, but I can see the p…

      • Agreed that PCs are not going away. The shift in development spending toward data centers makes sense if you believe that market growth will be in data centers. This is likely, given that the world is saturated with smartphones (and PCs), which are access terminals to applications like social media that run in data centers.

    • I doubt that Apple wants to do that since it is not all that interested in the enterprise market.

      That's not exactly true. They just aren't interested in the SERVER market. But in the past few years, they have taken a renewed and serious interest in the enterprise market overall.

      Dig down into this deceptively-fluffy page [apple.com]. There's actually a lot of information there.

  • So Intel, which has focused on the performance-enthusiast crowd, now sees dollar signs in putting its chips on all your home devices. They're sticking with the one existing market they have a chance at making money from. What a shock.

    It seems strange to go from having one main competitor to having many in the embedded world.
    • They see that traditional computing devices are on the decline. They are changing with the times and positioning themselves to ride the wave of smaller devices and emerging technologies.

      • by Mashiki ( 184564 )

        They're on the decline in the Americas and parts of Europe. That's mainly because your average person doesn't need a desktop to check the news or their email. That doesn't hold true, of course, for programmers, gamers, game developers, and so on. It also doesn't hold true in many other parts of Europe, Asia, the Middle East, or Africa, where the PC is gaining market share because there is no saturation.

    • The elephant in the room is that this is the year that Intel falls behind in semiconductor process technology.

      Intel has been enjoying a comfortable lead, but this year TSMC opens a 10nm fab. Intel recently disclosed that they won't be able to open a 10nm fab until at least the end of NEXT year.

      Don't believe their press-release bullshit hype. Intel forecasts big trouble for itself from here on out, because they just don't have the lead in technology anymore.

      Things would be different if Intel was ru…
  • by NotDrWho ( 3543773 ) on Wednesday April 27, 2016 @03:22PM (#51999867)

    "Don't worry, we'll be happy to take over."

    • by Anonymous Coward

      This truly sounds like a good opportunity for AMD to take some market share from Intel. If Intel guts research and development with massive layoffs, spends billions on buying crappy companies (McAfee, Altera, etc.), and focuses just on trendy fads (cloud, "experience," and IoT), they are doomed. The current Intel CEO seems to be following in Microsoft's footsteps from market dominance to quick destruction.

      • by ranton ( 36917 )

        This truly sounds like a good opportunity for AMD to take some market share from Intel. If Intel guts research and development with massive layoffs, spends billions on buying crappy companies (McAfee, Altera, etc.), and focuses just on trendy fads (cloud, "experience," and IoT), they are doomed. The current Intel CEO seems to be following in Microsoft's footsteps from market dominance to quick destruction.

        Yeah, and companies that focused on fads such as personal computers in the early '80s were doomed as well. We have a lot to learn from visionaries like IBM's John Akers, who dismissed PCs as a fad best handled by companies with no vision, like Microsoft and Intel.

        • The point was that laying off your own core-competency engineers and buying up companies to plug together some quick business model has had... problems in the past, as demonstrated over and over and over.

  • Trumped (Score:2, Insightful)

    by Anonymous Coward

    "Moore's Law is fundamentally a law of economics, and Intel will confidently continue to harness its value," Krzanich said.

    Sorry, but laws of physics inevitably trump "laws" of economics.

    • by Bob_Who ( 926234 )

      I agree, but we might need to find a better verb than "trump".

      As for economics, laying off 12,000 people really sucks when it's so easy to low-ball the next graduating class and quietly hire others as needed. It's a human-resources style that swaps employees in and out as if they were silicon chips and not carbon-based life forms.

        As for economics, laying off 12,000 people really sucks when it's so easy to low-ball the next graduating class and quietly hire others as needed.

        That's not what Intel is doing. They are laying people off because their own forecasts are grim. They lost the lead in fab process, and by the time they finally get a 10nm fab up and running, they will only be catching up to the two other semiconductor manufacturers that have already had 10nm fabs turning out chips for a year.

        If you have Intel stock, it's time to sell.

  • Hype (Score:5, Insightful)

    by Alomex ( 148003 ) on Wednesday April 27, 2016 @03:31PM (#51999931) Homepage

    The Internet of Things is mostly hype, as people with Nests at home have learned. Sure, you can talk to your thermostat at home via the internet, but why would you want to?

    • What good does it do you? What problem does it solve, and does it solve that problem in an economically advantageous manner?

      As far as I can see, it doesn't. Would it be nice to have my fridge tell me everything in it? Certainly. Would it be nice if it printed out a list of everything it is now missing? Again, yes. But so what? That saves me all of five minutes, because I am certainly not going to leave the next part, the shopping, to some third party like Amazon or Google or Walmart. This isn't me being a Lud…

      • My "connected" thermostat has sensors in multiple rooms, that enables better control of temperature in the whole house, not just the one place the thermostat is mounted. Since installing it, we are more comfortable (less over and under cooling at various times), and the electric bill dropped enough to "pay back" cost of the thermostat and extra sensors in just about 1 year.

        All in all, I think it is economically advantageous to us, and probably would be for most people in a typical 3+ bedroom house, especia

        • by JanneM ( 7445 )

          What does the IoT bit have to do with it, though? You could just have a local thermostat with sensors in every room, and optionally passively download prices from your electricity provider if you want to get really fancy with scheduling your water heater.

          You don't need the thing to be accessible over the net, and it most certainly doesn't need to synchronize all your usage data with some IT company somewhere. Our fridge manages to save us quite a lot of money simply by learning what time of day we're likely to…
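          For what it's worth, the purely local loop described here is not much code. A minimal sketch (Python; the sensor and relay objects are hypothetical stand-ins for whatever hardware interface you actually have):

          ```python
          import time

          SETPOINT_C = 21.0
          HYSTERESIS_C = 0.5  # dead band to avoid rapid on/off cycling

          def average_temp(sensors) -> float:
              # Average every room instead of trusting one hallway reading.
              return sum(s.temperature_c() for s in sensors) / len(sensors)

          def control_loop(sensors, heater):
              while True:
                  avg = average_temp(sensors)
                  if avg < SETPOINT_C - HYSTERESIS_C:
                      heater.on()
                  elif avg > SETPOINT_C + HYSTERESIS_C:
                      heater.off()
                  time.sleep(60)  # re-evaluate once a minute
          ```

          Everything here runs on the LAN; no data ever leaves the house.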

          • Yep, if there were a non-"cloud-dependent" alternative, I would have bought that. At the moment, I needed these features, and this is all that's on offer.

            Spot me $25M and I'll start a company to develop a cloud-free thermostat; it should be on the market within 48 months after the money hits my account - I promise.

        • by Holi ( 250190 )
          I want that; I just want the brains that control it to be in my house, not on some company's servers, where they take all that data about my house and sell it to data miners.
          • I hear you, and agree - I really want mine to offer the connectivity it does and just "serve" the data from itself instead of the cloud - perhaps with an option where the company compensates me $5/month for sharing my data with them for as long as they are interested in collecting it at that price.

            "The cloud" makes configuration a little simpler for people who want to see what temperature it is in their house when they are away - but not enough to force the whole thing to run through their servers with…

    • Exactly. I've been disconnecting things from the Internet lately, not connecting them.

      Once the novelty wears off, you're left with "why?" I haven't found a good answer yet.

      • I've got one "connected" light switch - it turns the outside lights on at dusk and off at 10pm... and they can be controlled from our phones. It works, works well, and I'd probably install a half dozen more of the things around the house if they weren't $50 each.
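        The dusk-on/10pm-off schedule is easy to reproduce with purely local logic. A rough sketch using the astral library to compute dusk (pip install astral; the coordinates and the switch-toggling step are placeholder assumptions, not anyone's actual setup):

        ```python
        import datetime
        from zoneinfo import ZoneInfo

        from astral import LocationInfo
        from astral.sun import sun

        # Placeholder location; substitute your own coordinates and time zone.
        city = LocationInfo("Seattle", "USA", "America/Los_Angeles", 47.6, -122.3)
        tz = ZoneInfo(city.timezone)

        def todays_schedule():
            # astral computes dusk for the given observer and date.
            dusk = sun(city.observer, date=datetime.date.today(), tzinfo=tz)["dusk"]
            off_at = datetime.datetime.now(tz).replace(hour=22, minute=0,
                                                       second=0, microsecond=0)
            return dusk, off_at

        on_at, off_at = todays_schedule()
        print(f"lights on at {on_at:%H:%M}, off at {off_at:%H:%M}")
        # A real controller would sleep until each time and toggle the switch.
        ```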

    • Until IoT runs its course, it will sustain a bunch of businesses that make connected (and spying) toys.

      After enough bad things happen, the fad will die down, and remote access and remote control will still be here (which is good), but hopefully the spying and data mining will stop.

      That's my hope.

      I like IoT, but I don't like the spying and phoning home. I DIY it, so I know my own stuff isn't doing anything I didn't tell it to. Most people don't DIY it, though.

    • by b0bby ( 201198 )

      I got an Ecobee, mostly because I can tell the fan to run a certain number of minutes per hour, and it has a remote sensor, so it can tell that my upstairs is getting too hot (the thermostat is on the ground floor). The remote access is a plus; I can set it to vacation mode after I have left, for example. Not worth doing just for that, but once you're buying a little wifi computer for your wall, why not have remote access?

    • I have a different "IoT" thermostat, and my best use for it is to set it to "vacation mode" from my phone after I've already left on vacation and forgotten to do it before leaving. It's also handy to have the thermostat interface on my phone so I don't have to walk my lazy butt over to the thermostat to interact with it. The temperature/humidity and system-operation logging and graphing functions are kind of informative, to look at and compare how long the system runs when set at 74 vs. 76 degrees - a couple of…

    • Your specific applications for IoT are mostly hype. Replace "IoT" with "sensor network", which is what it was called in the '90s, and you'll realise your entire life has benefited from IoT already. It's more than just some overpriced thermostat and a lightbulb you can control from your phone.

  • Obviously the market is slowing down, and that's reflected in my own PC builds: I've gone longer and longer between refreshes as the technology has held up longer. But if they're going to shift development to appliances, it sounds like they're giving up on the enthusiast market altogether. I can be happy with AMD too, but that's gonna affect the whole market of motherboard manufacturers as well.

    Also, I think the IoT market has reached saturation. I don't want or need internet access to my refrigerator, and how…

    • Indeed. I built a new Haswell-based machine a couple of years back for gaming, and it still has more power than I need. I don't see myself building another any time in the foreseeable future.
    • I'd buy a new machine, but the significant performance increase I'm looking for just isn't available. I'd like to see the single-thread instruction rate for moderate-sized programs (something that can be held in on-chip cache) double. There are optimizations that can be done to allow one core to go really fast, and I don't think they're being done.
  • Most of what Intel is pushing (Edison, Galileo, etc.) in terms of IoT hardware is more expensive than existing solutions (Raspberry Pi, Arduino), which is pretty antithetical to many IoT applications, where what you want is low cost. So I'm a bit skeptical that they are really serious about moving into that space - or at least, if they are, they need to up their game. The Pi Zero, or the recently Kickstarted C.H.I.P. ($9 all-in-one computer with wifi and li-poly charging circuit), are much more aligned with IoT applications.

    • Most of what Intel is pushing (Edison, Galileo, etc.) in terms of IoT hardware is more expensive than existing solutions (Raspberry Pi, Arduino), which is pretty antithetical to many IoT applications, where what you want is low cost. So I'm a bit skeptical that they are really serious about moving into that space - or at least, if they are, they need to up their game. The Pi Zero, or the recently Kickstarted C.H.I.P. ($9 all-in-one computer with wifi and li-poly charging circuit), are much more aligned with IoT applications.

      To get most business cases to close, you have big problems to address with respect to cost, power, and connectivity, and from what I'm seeing Intel isn't really fielding anything competitive.

      The last time Intel was seriously in the embedded game was with the 8051. And even that had some ridiculous design quirks compared to microcontrollers from Motorola, NEC, ST, etc.

    • The issue with Edison and Galileo is that they tend to be crippled platforms. It's been a while since I last looked at them, but I always ask the same questions: Does it have Ethernet? WiFi? A 5V/3.3V serial port? A bunch of pins for miscellaneous I/O? A SATA port or SD card? Enough RAM to run a standard version of Linux? HDMI and keyboard ports for testing?

      The latest Raspberry Pi has most of that. However, there always seems to be a catch with Intel products. Intel doesn't want the Edison or Galileo to comp…

  • Sort of wrong (Score:4, Interesting)

    by BlueCoder ( 223005 ) on Wednesday April 27, 2016 @03:43PM (#52000015)

    Yes, Intel should refocus on servers - but on the "home server" market, maybe even the home cluster.

    The market for desktops has mostly gone away, replaced by laptops and tablets. Those buyers only needed a PC for spreadsheets, simple word processing, and running a web browser.

    The gaming-machine market is the same size, if not bigger.

    Professional content creation workstations are bigger than ever.

    The same money is being spent, just on different form factors.

    There is one area that isn't being exploited and marketed enough: the private cloud, i.e. home servers.

    Particularly when mixed with virtual machines, it's something that needs to happen more. Store all your media at home in one place. Use the online cloud only for immediate stuff and for backups. There is huge potential for streaming: a home server can do transcoding on the fly and a dozen other things, all at the same time (see the sketch below).

    It's more about the demise of the low-end desktop, which at the least has been replaced by cigar-box systems.
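    As a rough illustration of the transcoding-on-the-fly point above, a home server can simply shell out to ffmpeg (assuming it is installed; the paths and quality settings below are placeholders, not a recommendation):

    ```python
    import subprocess

    def transcode(src: str, dst: str) -> None:
        # Re-encode video to H.264 and audio to AAC, a combination
        # most streaming clients can play.
        subprocess.run(
            ["ffmpeg", "-i", src,
             "-c:v", "libx264", "-preset", "veryfast", "-crf", "23",
             "-c:a", "aac", "-b:a", "128k",
             dst],
            check=True,
        )

    transcode("/media/movies/input.mkv", "/tmp/stream.mp4")
    ```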

    • It would be awesome to see an emphasis on home servers and server clusters. I run ownCloud, and I recently clustered it (two web-server VMs, three MySQL Galera VMs, two pfSense/HAProxy VMs). All of this runs on hardware with Intel chips (one Xeon E3, one i7, one NUC - all Haswell). I might even get another server, I have so many VMs. :-)

      Unfortunately, the biggest barrier to entry for running a home server is the internet connection, not the hardware. Dynamic IPs can cause your cloud to be unavailable until…
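      On the dynamic-IP point, the usual workaround is a dynamic-DNS updater. A minimal sketch (Python; api.ipify.org is a real what's-my-IP service, but the update URL is a hypothetical stand-in for whatever your DDNS provider exposes):

      ```python
      import time
      import urllib.request

      # Hypothetical endpoint; substitute your DDNS provider's update URL.
      DDNS_UPDATE_URL = "https://ddns.example.invalid/update?host=mycloud&ip={ip}"

      def public_ip() -> str:
          # api.ipify.org returns the caller's public IP as plain text.
          with urllib.request.urlopen("https://api.ipify.org") as resp:
              return resp.read().decode().strip()

      last_ip = None
      while True:
          ip = public_ip()
          if ip != last_ip:          # only push an update when the IP changes
              urllib.request.urlopen(DDNS_UPDATE_URL.format(ip=ip))
              last_ip = ip
          time.sleep(300)            # re-check every five minutes
      ```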

    • A "home server" is just a PC. Servers in the "cloud" are optimized for extremely heavy workloads. But when a server serves just one home, the workload is super light, and any computer can handle it.

      • Only in the USA. In much of the rest of the world, the home server is whatever uses the least power - quite different from "a PC".

        The home server is turning into an appliance. These appliances feature ARM quite heavily.


    • Yes, Intel should refocus on servers - but on the "home server" market, maybe even the home cluster.

      There is no money in that whatsoever, because people expect those machines to be cheap, and they are not disappointed: there are craploads of tiny little computers that do that job already, and they are already cheap.

  • My trusty i7 920 @ 3.8 GHz was pretty much the plateau for Intel chips. Sure, I could "upgrade" to 4.5 GHz for $1800, but there's really no reason to. I kept waiting for a 5 GHz stock chip to upgrade to, but it never materialized. So, goodbye Intel. It was fun. Go Zen.
  • Intel has already lost the embedded game in every way. Intel's chips use too much power, cost too much, and are too closed off compared to the competition. Their obsession with x86 has made their failure a foregone conclusion.

  • by Anonymous Coward

    Moving from a de facto monopoly position in expensive CPUs to competing with dozens of companies producing the cheapest possible crap for IoT devices. Yeah, that's a good plan.

  • "The chip maker also plans to focus on chips and technologies for IoT devices"

    Before connecting 'things' to the Internet, how about fixing the defective MMU [intel.com] that comes in most Wintel [computerworld.com] personal computers?
  • As we've seen time and time again, Intel is a one-trick pony. Granted, they do that one trick very well. But in order to have any hope of success in any venture other than x86, they really need to spin off a separate company and allow it to run independently.

    These large multi-billion-dollar corporations are simply unable to innovate due to their corporate cultures and stifling hierarchies. All innovation happens with the small, nimble players who, once they have a viable product, get gobbled up by the big guys.
