Intel Declares Independence From PC, Prioritizes Cloud, IoT and 5G Efforts
A week after announcing 12,000 job cuts, Intel CEO Brian Krzanich has shared his vision for the company, hinting at a shift in its prime focus away from the PC business. In a blog post, Krzanich said that the company will be actively growing its data center business. The chip maker also plans to focus on chips and technologies for IoT devices. "The biggest opportunity in the Internet of Things is that it encompasses just about everything in our lives today -- it's ubiquitous," Krzanich said. The company also plans to boost its memory chip business and push toward using those chips in data centers and various cloud services. Intel said that it has made several investments in this field, noting the $16 billion acquisition of Altera last year. The company says it will play a big role in the move to 5G connectivity. "Connectivity is fundamental to every one of the cloud-to-thing segments we will drive," Krzanich writes.
Over the years, Intel has failed to keep up with Moore's Law, the axiom that semiconductor density will double about every two years. The company previously extended that timeframe to 2.5 years, but Krzanich assures customers that Intel is working to make further advances in order to meet the goal. "Moore's Law is fundamentally a law of economics, and Intel will confidently continue to harness its value," Krzanich said. PCWorld has extensively reported on this.
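For concreteness, the two-versus-2.5-year cadence compounds quickly. A minimal sketch of the doubling arithmetic (normalized density, no process-specific numbers assumed):

```typescript
// Transistor density after `years`, normalized to 1.0 today, assuming
// density doubles every `period` years (the plain reading of Moore's Law).
function density(years: number, period: number): number {
  return 2 ** (years / period);
}

// Over one decade: a 2-year cadence compounds to 2^5 = 32x the density,
// while the stretched 2.5-year cadence compounds to only 2^4 = 16x.
console.log(density(10, 2.0)); // 32
console.log(density(10, 2.5)); // 16
```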
Your phone is the next PC. (Score:2)
The microcomputer first replaced the low-end minicomputer and word-processing systems.
Next they replaced larger minicomputers like the PDP-11.
Later they replaced superminicomputers like the VAX.
The cell phone will probably replace the PC.
Imagine a Windows phone that you put on your desk and hook up to your monitor with a single USB 3 connector that provides power and a network connection to the phone and video to the display. The mouse and keyboard can be USB-connected to the monitor or Bluetooth. You could even us
Re: (Score:2)
Much hinges on whether MS can deliver a good, and *perceived* to be good, mobile experience. As it stands today, people prefer to have two distinct devices, in order not to run MS on their phone. Or it could be because MS is being obstinate about phones being ARM, limiting their 'desktop' versatility.
Re: (Score:3)
It's not even hatred of MS that doomed their phones. I tried one in a store, and it was actually pretty usable. Arguably better than the contemporary Android phones.
But they had no apps. Back then, Google Maps was the top dog, and it wasn't on Windows Phone. Maybe one app out of a dozen that I wanted actually ran on it. And I couldn't find replacements for half of them.
If people don't develop for the platform, it's not gonna live long. Even the perception of "no apps" is deadly because it makes people---bot
Re: (Score:1)
The problem is... where do you put that huge heat sink and fan for the Extreme editions of the new Intel Mobile Core i9 CPU on your new smartphone?
Re: (Score:2)
Dump the heat into your bloodstream via the watchband... Now, where do you get the power for all that heat?
Re: (Score:1)
Zero point energy collectors.
Re: (Score:2)
From blood glucose. Or cholesterol.
Re: (Score:3)
It was a good post until "Microsoft may just win the mobile phone market." That market has already been won by Android (and to a certain extent by Apple as well). Microsoft can't win something they have already lost, especially when they've hardly even begun the race while everyone else is already at home watching the highlights on the 11 o'clock news.
Re: (Score:1)
Everyone thought Netscape was king of the hill. Mind you, that was a very different situation, but it shows that any platform that is on top can fall, and underdogs just may take the fallen crown.
Re: (Score:3)
Netscape v Microsoft was a David/Goliath matchup when you look at the companies behind the software. Plus Microsoft used illegal means to promote IE over Navigator.
The phone OS market is entirely different. Both Android and iOS are supported by large corporations. Both are entrenched with an ecosystem of apps dependent on them, and users' money sunk into those apps.
Navigator actually did have some unique plugins, but other browsers eventually implemented their plugin API. And users didn't have their money i
Re: (Score:2)
I know Netscape lost, but did IE really win long term? I mean, from my perspective, it didn't - most people I know only use IE when forced to...
Re: (Score:2)
Yes. They had the best and most popular browser for a long time.
To be quite frank, browsers really aren't that much more capable than they were in 2000. Take AJAX: you could do the exact same shit with IFRAMEs. Except your audience was running 300MHz processors, so really, you couldn't do shit.
Not that improvements haven't been made, but it's really the hardware that made the difference.
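For readers who never saw the pre-XMLHttpRequest trick the comment alludes to, here is a minimal sketch of IFRAME-based "AJAX." The /data endpoint and the handleData convention are hypothetical; the idea is that navigating a hidden frame acts as the request, and the page the server returns calls back into its parent:

```typescript
// Hidden-IFRAME "AJAX", circa 2000: no XMLHttpRequest involved.
// Assumes the (hypothetical) server page at `url` contains something like
//   <script>parent.handleData("...payload...")</script>
function iframeRequest(url: string, onData: (payload: string) => void): void {
  (window as any).handleData = (payload: string) => {
    onData(payload);
    document.body.removeChild(frame); // clean up the invisible frame
  };
  const frame = document.createElement("iframe");
  frame.style.display = "none"; // the user never sees it
  frame.src = url;              // navigating the frame "sends" the request
  document.body.appendChild(frame);
}

iframeRequest("/data", (payload) => {
  document.getElementById("out")!.textContent = payload;
});
```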
Re: (Score:2)
I think V8 made some measurable gains (turning a giant steaming pile into a slightly smaller, slightly cooler steaming pile...)
Too bad "we can't be trusted" to develop plugins like the NaCl project anymore. A quick search suggests PNaCl is trying to address that... but I smell JavaScript again...
Re: (Score:1)
You mean like how Nintendo and Sega were the winners in the console war... until Sony showed up. Then it was Nintendo and Sony, until Microsoft showed up...
Or how DEC and IBM were real computer companies while Apple made toys...
Or IBM won the PC war?
It may be over for now but not forever.
Re: (Score:2)
Except the mobile phone market hasn't been replaced by anything yet, so the race to the end of that particular technology isn't at an end, and no one has finished. If they think they are done and are relaxing, eating chips in front of the tube, then someone else does have an opportunity.
Re: (Score:2)
What kind of FPS can I get running The Division on my phone at 1440p?
Re: (Score:2)
That's the thing. For casual use, maybe your phone will be the "next computer," but as long as there's a PC gaming industry, they will keep pushing PC performance boundaries.
There's no reason video acceleration needs to be done in the cell phone itself. That could happen in some module connected to the monitor or TV. This could easily be the duty of the next generation of consoles, with them simply being external GPUs that give your mobile devices better gaming experiences.
Re: (Score:2)
True, and I believe I saw some external video cards being made recently. But in terms of performance, I don't think an external video card receiving data via USB-C or the like will ever match or exceed one plugged directly into a motherboard. But who knows.
I would have agreed with that, but after looking up the throughput of PCIe 3.0 and USB-C, it looks like they can be in the same ballpark per lane: USB 3.1 Gen 2 over USB-C provides about 1.25 GB/s, while PCIe 3.0 is about 1 GB/s per lane (so a x16 GPU slot is roughly 16 GB/s). There is the distance factor, but lane for lane the speeds are similar.
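The comparison only works per lane, so here is the arithmetic spelled out, using the published line rates and encoding overheads (a back-of-the-envelope sketch, not a benchmark):

```typescript
// Effective throughput = line rate x encoding efficiency / 8 bits per byte.
// PCIe 3.0: 8 GT/s per lane with 128b/130b encoding.
// USB 3.1 Gen 2 (over USB-C): 10 Gbit/s with 128b/132b encoding.
const GB = 1e9;

const pcie3Lane = (8e9 * (128 / 130)) / 8;  // ~0.98 GB/s per lane
const pcie3x4   = 4 * pcie3Lane;            // ~3.9 GB/s (Thunderbolt 3 class)
const pcie3x16  = 16 * pcie3Lane;           // ~15.8 GB/s (a GPU slot)
const usb31Gen2 = (10e9 * (128 / 132)) / 8; // ~1.21 GB/s

console.log(`PCIe 3.0 x1:  ${(pcie3Lane / GB).toFixed(2)} GB/s`);
console.log(`PCIe 3.0 x4:  ${(pcie3x4 / GB).toFixed(2)} GB/s`);
console.log(`PCIe 3.0 x16: ${(pcie3x16 / GB).toFixed(2)} GB/s`);
console.log(`USB 3.1 Gen2: ${(usb31Gen2 / GB).toFixed(2)} GB/s`);
```

So an external GPU over USB-C sees roughly one lane's worth of bandwidth, versus sixteen lanes in a desktop slot.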
Re: (Score:2)
Unless streaming games takes over. I also doubt that the cell phone will take over CAD/CAM, video production, or any other HPC tasks. But for the masses that get on Facebook, use Quicken, and run Office, a phone may be good enough. With an x86, a monitor, a mouse, and a keyboard, a phone running Windows 10 could also run all those legacy applications that a lot of people cannot live without.
Re: (Score:1)
Today, probably not all that fast, but in two years? Besides, gaming and other HPC tasks will not go away, but for the "I need to run Office and Quicken" crowd, phones are getting very close to good enough.
Re: (Score:1)
Today, probably not all that fast, but in two years? Besides, gaming and other HPC tasks will not go away, but for the "I need to run Office and Quicken" crowd, phones are getting very close to good enough.
So when everybody's done spending money on one of these new phones and a monitor that supports this scheme, they get to sit down to an experience that is inferior to a half-decade-old Sandy Bridge laptop they could have bought off eBay for 50 dollars. Wow. The Future(TM).
Re: (Score:1)
The car of the future: Intel Inside!
Drive up to 40% faster while using 20% less energy! And with Turbo-boost technology, win those street races every time! Multi-core technology allows you to be more productive on and off the road!
*** Warning: Overclocking may lead to engine overheating and -- Your car has encountered an unknown error and
Ugh, not this crap again (Score:2)
Yeah yeah, the PC is dead, we know. I have heard this same thing repeated countless times over the last couple of decades. In order to make such a claim, you also have to claim that our thin-client technology has magically evolved so that we can all work on a 4" screen. Oh wait, that has not happened. So people are once again claiming "MAGIC IS REAL" and "REALITY IS FOR PEOPLE WHO CAN'T FANTASIZE LIKE WE DO!"
Could you claim "hey, lots of people work with smaller screens?" Sure. Many people can only afford to
Re: (Score:2)
No, but it will be a different form factor. Did you read what I wrote?
Just what would you call a phone with an x86 running Windows, using a full-sized monitor, keyboard, and mouse? It is just a small-form-factor PC.
Logic much? (Score:2)
If I put a PC in a 4x24x22 frame, or an 8x12x12 frame, or a 2x2x2 frame, it's still a PC. Why? The easy but hard-to-fathom answer might be that the instruction set the chip uses, etc., is what we call a PC. Probably more importantly, it's not a usable PC because of the form factor; it's usable because of the peripheral devices we use to access it. I'm still connecting an external FULL-sized keyboard, some type of tracking device (mouse), and at least one reasonably sized (24") monitor to be able to use the PC.
The
Re: (Score:2)
I didn't say the PC was dead; I said the phone would replace your PC. It is a bit different from just a new chassis, since it will also be your phone, have long battery life, and fit in your pocket. Replacing the PC does not mean that the PC will go away completely. IBM still makes minicomputers, and while the company formerly known as Sun probably calls them servers, they are closer to minicomputers in many ways.
Traditional desktops will still be used by gamers, for CAD/CAM, and for other HPC uses, but I can see the p
Re: (Score:2)
"So now you say, what about a computer that uses a phone-like UI when it's untethered, then switches to a desktop-like UI when it's near its base. "
You mean like Windows 10?
"When the phone has enough power to run photoshop filters while I watch. Or to compile a decent sized codebase."
The first may be sooner than you think, thanks to GPUs, CUDA, and OpenCL.
The second, well, not that many people compile code. Of course, I say that with VS 2015 running on my 32GB Xeon-equipped PC with an SSD. I am really talking ab
Re: (Score:2)
Agreed that PCs are not going away. The shift in development spending toward datacenters makes sense if you believe that market growth will be in datacenters. This is likely, given that the world is saturated with smartphones (and PCs), which are access terminals to applications like social media that run in datacenters.
Re: (Score:2)
I doubt that Apple wants to do that since it is not all that interested in the enterprise market.
That's not exactly true. They just aren't interested in the SERVER market. But in the past few years, they have taken a renewed and serious interest in the enterprise market overall.
Dig down into this deceptively-fluffy page [apple.com]. There's actually a lot of information there.
Not really news (Score:1)
Seems strange to go from having one main competitor to having many in the embedded world.
Re: (Score:1)
They see that traditional computing devices are on the decline. They are changing with the times and positioning themselves to ride the wave of smaller devices and emerging technologies.
Re: (Score:2)
They're on a decline in the Americas and parts of Europe. That's mainly because your average person doesn't need a desktop to check the news or their email. That doesn't hold true, of course, for programmers, gamers, game developers, and so on. It also doesn't hold true in many parts of Europe, Asia, the Middle East, or Africa, where the PC is increasing market share because there is no saturation.
Re: (Score:2)
Intel has been enjoying a comfortable lead, but this year TSMC opens a 10nm fab. Intel recently disclosed that they won't be able to open a 10nm fab until at least the end of NEXT year.
Don't believe their press-release bullshit hype. Intel forecasts big trouble for itself from here on out, because they just don't have the lead in technology anymore.
Things would be different if Intel was ru
Re: (Score:2)
They don't really compete with TSMC in a substantial way though.
When AMD can buy time on TSMC's 10nm fab while Intel can't buy time on anyone's fabs... you are obviously just an ignorant person that doesn't understand the business.
It is technically possible for Intel to buy time on another manufacturer's fabs, but it would be suicide for that company, as they would lose their pure-play or IDM status.
"Sounds like a great idea!" said AMD (Score:5, Insightful)
"Don't worry, we'll be happy to take over."
Re: (Score:1)
This truly sounds like a good opportunity for AMD to take some market share from Intel. If Intel ends research and development through massive layoffs, spends billions on buying crappy companies (McAfee, Altera, etc.), and focuses just on trendy fads (cloud, experience, and IoT), they are doomed. The current Intel CEO seems to be following in Microsoft's footsteps, from market dominance to quick destruction.
Re: (Score:2)
This truly sounds like a good opportunity for AMD to take some market share from Intel. If Intel ends research and development through massive layoffs, spends billions on buying crappy companies (McAfee, Altera, etc.), and focuses just on trendy fads (cloud, experience, and IoT), they are doomed. The current Intel CEO seems to be following in Microsoft's footsteps, from market dominance to quick destruction.
Yeah, and companies that focused on fads such as personal computers in the early '80s were doomed as well. We have a lot to learn from visionaries like IBM's John Akers, who dismissed PCs as a fad best handled by companies with no vision, like Microsoft and Intel.
Re: (Score:2)
The point was that laying off your own core-competency engineers and buying up companies to plug together some quick business model has had... problems in the past, as demonstrated over and over and over.
Trumped (Score:2, Insightful)
"Moore's Law is fundamentally a law of economics, and Intel will confidently continue to harness its value," Krzanich said.
Sorry, but laws of physics inevitably trump "laws" of economics.
Re: (Score:2)
I agree, but we might need to find a better verb than "trump".
As for economics, laying off 12,000 people really sucks when it's so easy to lowball the next graduating class and quietly hire others as needed. It's a human-resources style that interchanges employees as if they were silicon chips and not carbon-based life forms.
Re: (Score:2)
As for economics, laying off 12,000 people really sucks when it's so easy to lowball the next graduating class and quietly hire others as needed.
That's not what Intel is doing. They are laying off people because their own forecasts are grim. They lost the lead in fab process, and by the time they finally get a 10nm fab up and running, they will only be catching up to the two other semiconductor manufacturers that have already had 10nm fabs turning out chips for a year.
If you have Intel stock, it's time to sell.
Hype (Score:5, Insightful)
The Internet of Things is mostly hype, as people with Nests at home have learned. Sure, you can talk to your thermostat at home via the internet, but why would you want to?
More to the point (Score:2)
What good does it do you? What problem does it solve and does it solve the problem in an economically advantageous manner?
As far as I can see, it doesn't. Would it be nice to have my fridge tell me everything in it? Certainly. Would it be nice if it printed out a list of everything it is now missing? Again, yes. But so what? That saves me all of five minutes. Because I am certainly not going to leave the next part, the shopping, to some third party like Amazon or Google or WalMart. This isn't me being a lud
Re: (Score:2)
My "connected" thermostat has sensors in multiple rooms, that enables better control of temperature in the whole house, not just the one place the thermostat is mounted. Since installing it, we are more comfortable (less over and under cooling at various times), and the electric bill dropped enough to "pay back" cost of the thermostat and extra sensors in just about 1 year.
All in all, I think it is economically advantageous to us, and probably would be for most people in a typical 3+ bedroom house, especia
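A rough sketch of the multi-sensor idea, for the curious: instead of switching on the single reading at the thermostat, control on a weighted average of room sensors. The rooms, weights, and thresholds below are made up for illustration:

```typescript
interface Sensor { room: string; tempF: number; weight: number; }

// Decide heat/cool/idle from a weighted average of all sensors, with a
// deadband (hysteresis) to avoid the over- and under-shooting mentioned above.
function controlTemp(sensors: Sensor[], setpointF: number,
                     hysteresisF = 1.0): "heat" | "cool" | "idle" {
  const total = sensors.reduce((sum, s) => sum + s.weight, 0);
  const avg = sensors.reduce((sum, s) => sum + s.tempF * s.weight, 0) / total;
  if (avg < setpointF - hysteresisF) return "heat";
  if (avg > setpointF + hysteresisF) return "cool";
  return "idle";
}

// A hot upstairs bedroom (weighted higher because it's occupied) can
// trigger cooling even when the hallway thermostat reads near setpoint.
console.log(controlTemp(
  [{ room: "hallway", tempF: 74, weight: 1 },
   { room: "bedroom", tempF: 79, weight: 2 }],
  75)); // -> "cool"
```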
Re: (Score:2)
What does the IoT bit have to do with it, though? You could just have a local thermostat with sensors in every room, and optionally passively download prices from your electricity provider if you want to get really fancy with scheduling your water heater.
You don't need the thing to be accessible over the net, and it most certainly doesn't need to synchronize all your usage data with some IT company somewhere. Our fridge manages to save us quite a lot of money simply by learning what time of day we're likely to
Re: (Score:2)
Yep, if there were a non "cloud dependent" alternative, I would have bought that. At the moment, I needed these features and this is all that's on offer.
Spot me $25M and I'll start a company to develop a cloud free thermostat, it should be on the market within 48 months after the money hits my account - I promise.
Re: (Score:2)
I hear you, and agree - I really want mine to offer the connectivity it does and just "serve" the data from itself instead of the cloud - perhaps with an option where the company compensates me $5/month for sharing my data with them for as long as they are that interested in collecting it at that price.
"The cloud" makes configuration a little simpler for people who want to see what temperature it is in their house when they are away - but not enough to force the whole thing to run through their servers with
Re: (Score:3)
Exactly. I've been disconnecting things from the Internet lately, not connecting them.
Once the novelty wears off, you're left with "why?" I haven't found a good answer yet.
Re: (Score:2)
I've got one "connected" light switch - it turns on at dusk and off at 10pm - outside lights... and they can be controlled from our phones. It works, and works well; I'd probably install a half dozen more of the things around the house if they weren't $50 each.
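That dusk-to-10pm behavior is only a couple of lines of logic. A sketch with a hard-coded dusk hour (a real switch would derive it from sunset tables for your location):

```typescript
// Outside lights: on from dusk until 22:00. The dusk hour is a placeholder.
function lightsShouldBeOn(now: Date, duskHour = 19.5): boolean {
  const hour = now.getHours() + now.getMinutes() / 60;
  return hour >= duskHour && hour < 22;
}

console.log(lightsShouldBeOn(new Date(2016, 3, 27, 20, 15))); // true (after dusk)
console.log(lightsShouldBeOn(new Date(2016, 3, 27, 23, 0)));  // false (past 10pm)
```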
Re: (Score:2)
until IoT runs its course, it will sustain a bunch of businesses that make connected (and spying) toys.
after enough bad things happen, the fad will die down and remote access and remote control will still be here (which is good) but hopefully the spying and datamining will stop.
that's my hope.
I like iot, but I don't like the spying and phone-homing. I DIY it so I know my own stuff isn't doing anything I didn't tell it to. most people don't DIY it, though.
Re: (Score:2)
I got an Ecobee, mostly for the fact that I can tell the fan to run a certain number of minutes per hour and it has a remote sensor so it can tell that my upstairs is getting too hot (thermostat is on the ground floor). The remote access is a plus; I can set it to vacation mode after I have left, for example. Not worth doing it just to get that, but once you're buying a little wifi computer for your wall, why not have access remotely?
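The fan-minutes-per-hour feature reduces to a simple duty cycle. One plausible policy as a sketch (the scheduling here is my assumption, not Ecobee's actual algorithm):

```typescript
// Run the circulation fan for the first `minutesPerHour` minutes of each hour.
function fanShouldRun(now: Date, minutesPerHour: number): boolean {
  return now.getMinutes() < minutesPerHour;
}

console.log(fanShouldRun(new Date(2016, 3, 27, 14, 10), 20)); // true: minute 10
console.log(fanShouldRun(new Date(2016, 3, 27, 14, 45), 20)); // false: minute 45
```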
Re: (Score:2)
I have a different "IoT" thermostat, and my best use for it is to set it in "vacation mode" from my phone, after I've already left on vacation and forgot to do it before I left. It's also handy to have the thermostat interface on my phone so I don't have to walk my lazy butt over to the thermostat to interact with it. The temperature/humidity and system operation logging and graphing functions are kind of informative to look at / compare how long the system runs when set at 74 vs 76 degrees - a couple of
Re: (Score:2)
Your specific applications for IoT are mostly hype. Replace IoT with "sensor network" which is what it was called in the 90s and you'll realise your entire life has benefited from IoT already. It's more than just some overpriced thermostat and a lightbulb which you can control from your phone.
What's this mean for gaming PCs? (Score:2)
Obviously the market is slowing down, and that's reflected in my own PC builds: I've gone longer and longer between refreshes as the technology holds up longer. But if they're going to shift development to appliances, it sounds like they're giving up on the enthusiast market altogether. I can be happy with AMD too, but that's gonna affect the whole market of motherboard manufacturers as well.
Also I think the IoT market has reached saturation. I don't want or need internet access to my refrigerator and how
Re: (Score:2)
Oh sure - I didn't think they'd maintain Moore's Law forever, but they don't seem to have any other ideas to get around that or to further the PC market.
Make 'em faster.
Make 'em faster.
Make 'em faster.
Oh... we can't do that anymore? Oh well... let's do something else.
Memory/motherboard bandwidth is still a problem, and Intel hasn't really dug into that area yet. Nor have they looked at getting away from the aged x86 instruction set and using new instruction sets that are more conducive to performance. (But muh Microsoft a
Not impressed by IoT efforts so far (Score:2)
Most of what Intel is pushing (Edison, Galileo, etc.) in terms of IoT hardware is more expensive than existing solutions (Raspberry Pi, Arduino), which is pretty antithetical to many IoT applications, where what you want is low cost. So I'm a bit skeptical that they are really serious about moving into that space, or at least, if they are, they need to up their game. The Pi Zero, or the recently Kickstarted C.H.I.P. ($9 all-in-one computer with wifi and li-poly charging circuit), are much more aligned with IoT applications.
Re: (Score:2)
Most of what Intel is pushing (Edison, Galileo, etc.) in terms of IoT hardware is more expensive than existing solutions (Raspberry Pi, Arduino), which is pretty antithetical to many IoT applications, where what you want is low cost. So I'm a bit skeptical that they are really serious about moving into that space, or at least, if they are, they need to up their game. The Pi Zero, or the recently Kickstarted C.H.I.P. ($9 all-in-one computer with wifi and li-poly charging circuit), are much more aligned with IoT applications.
To get most business cases to close, you have big problems to address with respect to cost, power, and connectivity, and from what I'm seeing Intel isn't really fielding anything competitive.
The last time Intel was seriously in the embedded game was with the 8051. And even that had some ridiculous design quirks compared to microcontrollers from Motorola, NEC, ST, etc.
Re: (Score:2)
The issue with Edison and Galileo is that they tend to be crippled platforms. It's been a while since I last looked at them, but I always ask the same questions. Does it have Ethernet? WiFi? 5V/3.3V serial port? Bunch of pins for miscellaneous I/O? SATA port or SD card? Enough RAM to run a standard version of Linux? HDMI and keyboard ports for testing?
The latest Raspberry Pi has most of that. However, there always seems to be a catch with Intel products. Intel doesn't want the Edison or Galileo to comp
Re: (Score:3)
Even then, though, why get an Atom when you could get something like the TI MSP430? Low power, cheap eval boards, dirt cheap in bulk, and already fairly established. Intel is just putting products out there with the idea that anything with the Intel name on it will sell, but they aren't offering anything better suited to the markets they claim to be targeting. IoT is going to be big, but it's going to be stuff so cheap as to be disposable - think Amazon's buttons, stuff like that. I don't see Intel having m
Re: (Score:2)
Even then, though, why get an Atom when you could get something like the TI MSP430? Low power, cheap eval boards, dirt cheap in bulk, and already fairly established.
The MSP430 is a bit pathetic, because it never managed to build a community, probably in turn because its dev environment is garbage. Why would you even say MSP430 instead of ATmega?
Re: (Score:2)
Well, I was mainly pointing that out because the parent poster was arguing that the Pi/Arduino weren't for anything that would be rolled out en masse, whereas the MSP430 always struck me as a more industry-focused solution, with the dev kit aimed more at that environment than at making things easy for the kid in his garage, as with Arduino. That said, certainly ATmegas and PICs and their ilk are ubiquitous - and the overall point is that's all you need for most IoT applications, and the Atom or anything else Intel is bring
Sort of wrong (Score:4, Interesting)
Yes, it should refocus on servers, but rather on the "home server" market. Maybe even home clusters.
The market for desktops has mostly gone away, replaced by laptops and tablets. These people only needed a PC for spreadsheets, simple word processing, and running a web browser.
The gaming machine market is the same size, if not bigger.
Professional content creation workstations are bigger than ever.
Same money is being spent; just in different form factors.
There is one area that isn't being exploited and marketed enough: the private cloud, i.e., home servers.
Particularly when mixed with virtual machines, it's something that needs to happen more. Store all your media at home in one place; use the online cloud only for immediate stuff and for backups. There is huge potential for streaming. A home server can do transcoding on the fly and a dozen other things, all at the same time.
It's more about the demise of the low-end desktop, which at the least has been replaced by cigar-box systems.
Re: (Score:2)
It would be awesome to see an emphasis on home servers and server clusters. I run ownCloud and I recently clustered it (two web server VMs, three MySQL Galera VMs, two pfSense/HAProxy VMs). All of this runs on hardware with Intel chips (one Xeon E3, one i7, one NUC - all Haswell). I might even get another server, I have so many VMs. :-)
Unfortunately, the biggest barrier to entry for running a home server is the internet connection, not the hardware. Dynamic IPs can cause your cloud to be unavailable unt
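The usual workaround for the dynamic-IP problem is a small dynamic-DNS updater run from cron. A sketch with placeholder endpoints; both URLs are hypothetical stand-ins, not any real provider's API:

```typescript
// Republish the current public IP under a stable DNS name so the home
// server stays reachable after the ISP rotates the address.
async function updateDdns(host: string, token: string): Promise<void> {
  // 1. Discover the current public IP (placeholder echo service).
  const ip = (await (await fetch("https://ip.example.com/")).text()).trim();
  // 2. Point the DNS record at it (placeholder update endpoint).
  await fetch(`https://ddns.example.com/update?host=${host}&ip=${ip}&token=${token}`);
  console.log(`${host} -> ${ip}`);
}

// Run every few minutes from cron or a systemd timer.
updateDdns("cloud.example.net", "secret-token").catch(console.error);
```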
Re: (Score:2)
A "home server" is just a PC. Servers in the "cloud" are optimized for extremely heavy workloads. But when a server serves just one home, the workload is super light, and any computer can handle it.
Re: (Score:2)
Only in the USA. In much of the rest of the world, the home server is whatever uses the least power, which is quite different from "a PC".
The home server is turning into an appliance. These appliances feature ARM quite heavily.
Re: (Score:2)
Yes, it should refocus on servers, but rather on the "home server" market. Maybe even home clusters.
There is no money in that whatsoever, because people expect to get those machines cheap, and they are not disappointed. There are craploads of tiny little computers which do that job already, and they are already cheap.
This Pretty Much Ensures AMD Is My Next Purchase (Score:1)
lost before they started (Score:2)
Intel has already lost the embedded game in every way. Intel's chips suck too much power, cost too much, and are too closed off when you compare them to the competition. Their obsession with x86 has made their failure a foregone conclusion.
Declaring the course of their death spiral (Score:2, Insightful)
Moving from a de facto monopoly position on expensive CPUs to competing with dozens of companies producing the cheapest possible crap for IoT devices. Yeah, that's a good plan.
Intel shifting focus away from PC business .. (Score:1)
Before connecting 'things' to the Internet, how about fixing the defective MMU [intel.com] that comes in most WinTEL [computerworld.com] personal computers?
The one-trick pony tries again... (Score:2)
These large multi-billion-dollar corps are simply unable to innovate, due to their corporate cultures and stifling hierarchies. All the innovation happens with the small, nimble guys who, once they have a viable product, get gobbled up by the big guys.
Re:Meh.... (Score:5, Informative)
That used not to be the case; then it was the case; now it pretty much isn't the case again, particularly for Intel.
Intel's desktop parts only go to 4 cores, a small number of PCIe lanes, no ECC memory support, 2 memory channels, and a single socket, and won't go to very high TDP.
Server parts go to much higher core counts and PCIe lane counts, support ECC memory, 4 memory channels, and more sockets, and will drive TDP through the roof to get more powerful if needed.
Re: (Score:1)
Heh, that kinda is marketing-speak for "Intel is moving out of the low-margin, already saturated consumer market into something with indefinite growth potential."
Re: (Score:2)
That is very true. The majority of business applications, spreadsheets, word processing, etc., can all run pretty well on an average smartphone. All you need is a desktop-size screen, a USB and network port, and the "CPU" can be embedded in the screen.
What I would like to see someday is a smartphone dock, where you can plug the smartphone into a monitor/mouse/keyboard and use the smartphone like a desktop PC.
Re: (Score:1)
Are you sure you aren't just using marketing speak?
Intel defines a Pentium as desktop-not-server (despite whatever server applications a user might come up with) and defines an E5 Xeon as server-not-desktop (despite whatever desktop applications a user might come up with).
The entire product array is whatever-you-want-to-use-it-for and it's Intel who fancifully claims that some are desktop and some are server.
Re: (Score:1)
Not much of a difference between a so-called PC processor and a server CPU. Mostly marketing speak.
You clearly don't work on server or desktop CPUs.
Increased I/O bandwidth, automatic hardware failover, more cores, support for datacenter-relevant interfaces, engineering for full-load, full-temperature, 100%-duty-cycle operation over the lifetime of the device, and parity and error correction everywhere, inside and out. These are the things that make server CPUs worthwhile.
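To make the parity-and-ECC point concrete, here is a toy Hamming(7,4) single-error corrector, the textbook building block behind ECC memory. Server memory paths use the same idea at much wider widths (plus an extra bit for double-error detection); this is an illustration, not Intel's actual circuit:

```typescript
// Encode 4 data bits into a 7-bit Hamming codeword [p1 p2 d1 p3 d2 d3 d4].
function encode([d1, d2, d3, d4]: number[]): number[] {
  const p1 = d1 ^ d2 ^ d4;
  const p2 = d1 ^ d3 ^ d4;
  const p3 = d2 ^ d3 ^ d4;
  return [p1, p2, d1, p3, d2, d3, d4];
}

// Correct any single flipped bit and return the 4 data bits.
function correct(c: number[]): number[] {
  const s1 = c[0] ^ c[2] ^ c[4] ^ c[6]; // parity over positions 1,3,5,7
  const s2 = c[1] ^ c[2] ^ c[5] ^ c[6]; // parity over positions 2,3,6,7
  const s3 = c[3] ^ c[4] ^ c[5] ^ c[6]; // parity over positions 4,5,6,7
  const errorPos = s1 + 2 * s2 + 4 * s3; // syndrome = 1-based error position
  if (errorPos !== 0) c[errorPos - 1] ^= 1; // flip the corrupted bit back
  return [c[2], c[4], c[5], c[6]];
}

const word = encode([1, 0, 1, 1]);
word[4] ^= 1;               // simulate a single-bit memory error
console.log(correct(word)); // -> [1, 0, 1, 1], recovered intact
```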
Re: (Score:1)
Not much of a difference between a so-called PC processor and a server CPU. Mostly marketing speak.
You clearly don't work on server or desktop CPUs.
Increased I/O bandwidth, automatic hardware failover, more cores, support for datacenter-relevant interfaces, engineering for full-load, full-temperature, 100%-duty-cycle operation over the lifetime of the device, and parity and error correction everywhere, inside and out. These are the things that make server CPUs worthwhile.
I will give you increased bandwidth; it's quite expensive to increase bandwidth, and server parts like the Xeon E5 and beyond do have increased bandwidth. As do the Extreme Edition consumer parts, which are essentially Xeon E5s.
Core count is really a profit-optimization thing; larger chips are more expensive because they take up more die area. Considering the profits and manufacturing technology Intel has, they could easily make consumer products at current prices with higher core counts and still make a profit. How do yo
Re: (Score:2)
There's some difference in caching and virtualization support, the latter of which I really think needs to become a client-PC feature anyway. Running a hardware-virtualized OS under Linux is very useful for gaming, but right now both Nvidia and Intel like to think of hardware virtualization as a server feature and charge a bunch of money for it. But given that Windows is no longer the obvious operating system for either everyday use or developer use, hardware virtualization is a lot more important.
Re: (Score:2)
There's some difference in caching and virtualization support, the latter of which I really think needs to become a client-PC feature anyway. Running a hardware-virtualized OS under Linux is very useful for gaming, but right now both Nvidia and Intel like to think of hardware virtualization as a server feature and charge a bunch of money for it. But given that Windows is no longer the obvious operating system for either everyday use or developer use, hardware virtualization is a lot more important.
My Broadwell NUC has virtualization support; hardly a server CPU. Yes, server SKUs all do virtualization, whereas not all the desktop and low-power CPUs do, but if that's what you want, you can get it in a desktop CPU. For me it's great for software development.
Re: (Score:2)
You can. If you get the Skylake generation, i.e. the Pentium G4400, the upcoming Celeron G3900, the i3-6100, etc., then your memory ceiling is 64GB, up from the older 32GB. You can even get ECC on a Celeron/Pentium, but at the cost of an expensive motherboard with a C2xx chipset. (There is IOMMU support too.)
Re: (Score:2)
Nor is there a big gap between a typical laptop chip and the higher-end "mobile" stuff. Or rather, the gap is quickly diminishing. As for desktops, I'm not sure there is much of a low-end market anymore and the high-end can probably use Xeons or their derivatives just fine.
Gamers might have some issues with this, as their need is not so much reliability as raw performance.
Re: (Score:2)
Whomever smelt it dealt it...
Re: (Score:2)
Ignore lists are found on just about every SJW site. Recently, Twitter and Reddit both added/extended tools for SJWs to ignore users who don't agree with them.
Re: (Score:2)
I don't think he heard you.
Re: (Score:2)
Haswell i5s should be under $200 by now. What's the hold-up?
They haven't actually gotten good yields from either of their last two shrinks, and the next shrink isn't scheduled until the end of 2017. Meanwhile, TSMC opens a 10nm fab this year.
Intel is now as fragile as Motorola was.