New submitter nrjperera (2669521) submits news of a new laptop from HP that's in Chromebook (or, a few years ago, "netbook") territory price-wise, but loaded with Windows 8.1 instead. Microsoft has teamed up with HP to make an affordable Windows laptop to beat Google Chromebooks at their own game. German website Mobile Geeks has found leaked information about this upcoming HP laptop, dubbed the Stream 14, including its specifications. According to the leaked data sheet, the HP Stream 14 will share specs similar to those of HP's cheap Chromebook: an AMD A4 Micro processor, 2GB of RAM, 32GB of flash storage, and a display with 1,366 x 768 resolution. Microsoft will likely bundle 100GB of OneDrive cloud storage with the device to offset the limited local storage.
MojoKid (1002251) writes AMD is launching a new family of products today, but unless you follow the rumor mill closely, it's probably not something you'd expect. It's not a new CPU, APU, or GPU. Today, AMD is launching its first line of solid state drives (SSDs), targeted squarely at AMD enthusiasts. AMD is calling the new family of drives the Radeon R7 Series SSD, echoing its popular mid-range line of graphics cards. The new Radeon R7 Series SSDs feature OCZ and Toshiba technology, but with proprietary firmware geared toward write performance and high endurance. Open up one of AMD's new SSDs and you'll see OCZ's Indilinx Barefoot 3 M00 controller on board, the same controller used in the OCZ Vector 150, though it is clocked higher in these drives. That controller is paired with A19nm Toshiba MLC (Multi-Level Cell) NAND flash memory and a DDR3-1333MHz DRAM cache. The 120GB and 240GB drives sport 512MB of cache memory, while the 480GB model is outfitted with 1GB. Interestingly enough, AMD's Radeon R7 Series SSDs are among the highest-performing SATA SSDs tested to date. IOPS performance is among the best seen in a consumer-class SSD, write throughput and access times are highly competitive across the board, and the drives offered consistent performance regardless of the data type being transferred. Read performance is also strong, though not quite as stand-out as write performance.
Lucas123 writes An AMD website in China has leaked information about the upcoming release of a line of SSDs aimed at gamers and professionals that will offer top sequential read/write speeds of 550MB/s and 530MB/s, respectively. AMD confirmed the upcoming release, but no pricing was available yet. The SSDs will come in 120GB, 240GB and 480GB capacities and will use Toshiba's 19-nanometer flash lithography technology. According to IHS, AMD is likely entering the gaming SSD market because desktop SSD shipments are expected to experience a 39% CAGR between now and 2018.
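For a sense of scale, a 39% CAGR compounds quickly over a four-year window. A minimal sketch of the arithmetic; the base shipment figure is hypothetical and purely illustrative:

```python
# Compound annual growth rate (CAGR) arithmetic behind IHS's 39%
# projection. The base shipment figure below is made up.

def project(base, cagr, years):
    """Project a base value forward at a compound annual growth rate."""
    return base * (1 + cagr) ** years

base_shipments = 1_000_000                    # hypothetical 2014 shipments
projected = project(base_shipments, 0.39, 4)  # four years: 2014 -> 2018
print(f"{projected / base_shipments:.2f}x growth")  # roughly 3.73x
```

In other words, the projection implies shipments nearly quadrupling by 2018, which would explain the rush of new entrants.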
MojoKid (1002251) writes "AMD updated its family of Kaveri-based A-Series APUs for desktop systems recently, namely the A10-7800 and the A6-7400K. The A10-7800 has 12 total compute cores, 4 CPU and 8 GPU cores, with average and maximum turbo clock speeds of 3.5GHz and 3.9GHz, respectively. The A6-7400K arrives with 6 total cores (2 CPU, 4 GPU) and the same clock frequencies. ... The AMD A10-7800 APU's performance is somewhat mixed, though it is a decent performer overall. Its Steamroller-based CPU cores do not do much to make up ground versus Intel's processors, so in the more CPU-bound workloads, Intel's dual-core Core i3-4330 competes favorably with AMD's quad-cores. And in terms of IPC and single-thread performance, Intel maintains a big lead. Factor graphics into the equation, however, and the tides turn completely. The GCN-based graphics engine in Kaveri is a major step up over the previous gen, and much more powerful than Intel's mainstream offerings. The A10-7800's power consumption characteristics are also more desirable than those of the Richland-based A10-6800K."
Dputiger (561114) writes "It has been almost two years since AMD launched the FirePro W9000 and kicked off a heated battle in the workstation GPU wars with NVIDIA. AMD recently released the powerful FirePro W9100, a new card based on the same Hawaii-class GPU as the desktop R9 290X but aimed at the professional workstation market. The W9100's GPU features 2,816 stream processors, and the card boasts 320GB/s of memory bandwidth and six mini-DisplayPorts, all of which support DP1.2 and 4K output. The W9100 also carries more RAM than any other AMD GPU, a whopping 16GB of GDDR5 on a single card. Even NVIDIA's top-end Quadro K6000 tops out at 12GB, which means AMD sits in a class by itself in this area. In terms of performance, this review shows that the FirePro W9100 doesn't always outshine its competition, but its price/performance ratio keeps it firmly in the running. If AMD continues to improve its product mix and overall software support, it should close the gap even more in the pro GPU market in the next 18-24 months."
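The 320GB/s figure falls straight out of the memory configuration. Assuming the published Hawaii-class specs (a 512-bit bus with GDDR5 at an effective 5 Gbps per pin), the arithmetic is:

```python
# Peak memory bandwidth from bus width and per-pin data rate.
# 512-bit / 5 Gbps are the published Hawaii-class (R9 290X / W9100)
# numbers, stated here as assumptions.

def memory_bandwidth_gbs(bus_width_bits, effective_rate_gbps):
    """Peak bandwidth (GB/s) = bus width in bytes x per-pin data rate."""
    return bus_width_bits / 8 * effective_rate_gbps

print(memory_bandwidth_gbs(512, 5))  # 320.0 GB/s, matching the quoted spec
```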
MojoKid writes: "When NSA whistleblower Edward Snowden came forth last year with U.S. government spying secrets, it didn't take long to realize that some of the information revealed could bring on serious repercussions — not just for the U.S. government, but also for U.S.-based companies. The latest to feel the hit? None other than Apple, and in a region the company has been working hard to increase market share: China. China, via state media, has today declared that Apple's iPhone is a threat to national security — all because of its thorough tracking capabilities. It has the ability to keep track of user locations, and to the country, this could potentially reveal "state secrets" somehow. It's being noted that the iPhone will continue to track the user to some extent even if the overall feature is disabled. China's iPhone ousting comes hot on the heels of Russia's Industry and Trade Ministry deeming AMD and Intel processors untrustworthy. The nation will instead be building its own ARM-based "Baikal" processor."
redletterdave (2493036) notes that Taiwan Semiconductor Manufacturing Co. (TSMC) has shipped its first batch of microprocessors to Apple as the iPhone maker looks to diversify its overseas suppliers. Apple will continue to rely on Samsung for some of its microprocessors, but as the rivalry between the two companies heats up in the mobile and soon wearable arenas, the deal leaves Apple less reliant on Samsung and gives it more leverage in price negotiations for future chips; TSMC has now supplanted Samsung Electronics as Apple's chief chipmaker for iPhones and iPads. Since 2011, Apple has been striking deals with other display and chip makers around Asia to reduce its dependence on Samsung. Partly as a result of that lost business, Samsung on Monday announced that operating income for its fiscal second quarter had sunk to a two-year low, blaming 'weak' sales of low- and medium-end smartphones, strong competition, and subpar demand.
It may not be a household name like Intel or AMD, but TSMC is the world's biggest contract chip maker by revenue.
An anonymous reader writes with this news from Tass: Russia's Industry and Trade Ministry plans to replace U.S. microchips (Intel and AMD) used in government computers with domestically produced Baikal microprocessors, in a project worth tens of millions of dollars, business daily Kommersant reported Thursday. The article is fairly thin, but does add a bit more detail: "The Baikal microprocessor will be designed by a unit of T-Platforms, a producer of supercomputers, next year, with support from state defense conglomerate Rostec and co-financing by state-run technological giant Rosnano. The first products will be the Baikal M and M/S chips, based on the 64-bit Cortex-A57 core from UK company ARM, running at 2 gigahertz, for personal computers and micro servers."
An anonymous reader writes 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
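The 30 Hz ceiling the quote describes comes down to HDMI 1.4's maximum TMDS pixel clock of 340 MHz. A back-of-the-envelope check, using the standard CEA 4K timing of 4400 x 2250 total pixels (active area plus blanking) for a 3840 x 2160 mode:

```python
# Why HDMI 1.4a tops out at 4K30: the spec's maximum TMDS pixel
# clock is 340 MHz, and a 4K frame (including blanking intervals)
# needs more than that at 60 Hz.

MAX_TMDS_MHZ = 340  # HDMI 1.4 limit; HDMI 2.0 later raised it to 600

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh."""
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(4400, 2250, 30))  # 297.0 MHz: fits under 340
print(pixel_clock_mhz(4400, 2250, 60))  # 594.0 MHz: far over the limit
```

DisplayPort 1.2, by contrast, has the link bandwidth for 4K at 60 Hz, which is why the article steers 4K buyers toward DisplayPort connections.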
An anonymous reader writes "Phoronix last week tested 65 graphics cards on open source drivers under Linux and the best result was generally with the open source AMD Radeon drivers. This week they put out a 35-graphics-card comparison using the proprietary AMD/NVIDIA drivers (with the other 30 cards being too old for the latest main drivers) under Ubuntu 14.04. The winner for proprietary GPU driver support on Linux was NVIDIA, which shouldn't come as much of a surprise given that Valve and other Linux game developers are frequently recommending NVIDIA graphics for their game titles while AMD Catalyst support doesn't usually come to games until later. The Radeon OpenGL performance with Catalyst had some problems, but at least its performance per Watt was respectable. Open-source fans are encouraged to use AMD hardware on Linux while those just wanting the best performance and overall experience should see NVIDIA with their binary driver."
Vigile writes: NVIDIA announced its latest dual-GPU flagship card, the GeForce GTX Titan Z, at the GPU Technology Conference in late March with a staggering price point of $2999. Since that time, AMD announced and released the Radeon R9 295X2, its own dual-GPU card with a price tag of $1499. PC Perspective finally put the GTX Titan Z to the test and found that from a PC gamer's view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also only takes up two slots (though it does have a water cooling radiator to worry about) while the NVIDIA GTX Titan Z is a three-slot design. The Titan Z is quieter and uses much less power, but gamers considering a $1500 or $3000 graphics card selection are likely not overly concerned with power efficiency.
An anonymous reader writes "How good are open source graphics drivers in 2014 given all the Linux gaming and desktop attention? Phoronix has tested 65 different GPUs using the latest open source drivers covering Intel HD Graphics, NVIDIA GeForce, AMD Radeon, and AMD FirePro hardware. Of the 65 GPUs tested, only 50 of them had good enough open source driver support for running OpenGL games and benchmarks. Across the NVIDIA and AMD hardware were several pages of caveats with different driver issues encountered on Linux 3.15 and Mesa 10.3 loaded on Ubuntu 14.04. Intel graphics on Linux were reliable but slow while AMD's open-source Linux support was recommended over the NVIDIA support that doesn't currently allow for suitable graphics card re-clocking. Similar tests are now being done with the proprietary Linux drivers."
Dputiger writes: "Since NVIDIA debuted its GameWorks libraries there have been allegations that they unfairly disadvantage AMD users or prevent developers from optimizing code. We've taken these questions to developers themselves and asked them to weigh in on how games get optimized, why NVIDIA built this program, and whether it's an attempt to harm AMD customers. 'The first thing to understand about [developer/GPU manufacturer] relations is that the process of game optimization is nuanced and complex. The reason AMD and NVIDIA are taking different positions on this topic isn't because one of them is lying, it's because AMD genuinely tends to focus more on helping developers optimize their own engines, while NVIDIA puts more effort into performing tasks in-driver. This is a difference of degree — AMD absolutely can perform its own driver-side optimization, and NVIDIA's Tony Tamasi acknowledged on the phone that there are some bugs that can only be fixed by looking at the source. ... Some of this difference in approach is cultural, but much of it is driven by necessity. In 2012 (the last year before AMD's graphics revenue was rolled into the console business), AMD made about $1.4 billion off the Radeon division. For the same period, NVIDIA made more than $4.2 billion. Some of that was Tegra-related, and it's a testament to AMD's hardware engineering that it competes effectively with NVIDIA on a much smaller revenue share, but it also means that Team Green has far more money to spend on optimizing every aspect of the driver stack.'"
crookedvulture writes "In an article published by Forbes earlier this week, AMD lashed out at NVIDIA's GameWorks program, which includes Watch Dogs and other popular titles, such as Call of Duty: Ghosts, Assassin's Creed IV, and Batman: Arkham Origins. Technical communications lead for PC graphics Robert Hallock alleged that GameWorks deliberately cripples performance on AMD hardware. He also claimed that developers are prevented from working with AMD on game optimizations. The Forbes piece was fairly incriminating, but it didn't include any commentary from the other side of the fence. NVIDIA has now responded to the allegations, and as one might expect, it denies them outright. Director of engineering for developer technology Cem Cebenoyan says NVIDIA has never barred developers from working with AMD. In fact, he claims that AMD's own developer relations efforts have prevented NVIDIA from getting its hands on early builds of some games. AMD has said in the past that it makes no effort to prevent developers from working with NVIDIA. So, we have another round of he said, she said, with gamers caught in the middle and performance in newer titles hanging in the balance."
Via Phoronix comes news that Mesa 10.2 will be released in a few days with several interesting new features. Highlights include OpenGL 2.1 support for Freedreno (the driver for the Qualcomm graphics chips), video encoding and decoding on GCN Radeons using the new OpenMAX state tracker, and initial support for Intel's upcoming Cherryview Atom SoC. Progress is being made toward OpenGL 4 support, and the llvmpipe software rasterizer finally supports OpenGL 3.2. The release won't feature a few things: the Intel Sandybridge driver still does not support OpenGL 3.3, the R9 290 Radeons are still not working (despite claims by AMD a couple of years ago that cards starting with the Radeon 8000 series would be supported by the Free Software driver at hardware release time), and OpenCL support is still experimental.
As reported by Phoronix, the Haiku operating system "has added (untested) support for the newest AMD Radeon graphics cards to its open-source driver for the BeOS-compatible operating system." (Specifically, that support is for the "Mullins" and "Hawaii" graphics processors.) Impressive that this project keeps the BeOS flag raised and continues to modernize; Haiku has been around since 2001 — years longer than Be, Inc. itself lasted.
jfruh writes: "AMD has never been able to match Intel for profits or scale, but a decade ago it was in front on innovation — the first to 1GHz, the first to 64-bit, the first to dual core. A lack of capital has kept the company barely holding on with cheap mid-range chips since then, but now AMD is flush with cash from its profitable gaming-console business and is preparing an ambitious new architecture for 2016, one distinct from the x86/ARM hybrid already announced."
MojoKid (1002251) writes "Over the past nine months, we've seen the beginnings of a revolution in how video games are displayed. First, Nvidia demoed G-Sync, its proprietary technology for ensuring smooth frame delivery. Then AMD demoed its own free standard, dubbed FreeSync, that showed a similar technology. Now, VESA (Video Electronics Standards Association) has announced support for "Adaptive Sync" as an addition to DisplayPort. The new capability will debut with DisplayPort 1.2a. The goal of these technologies is to synchronize output from the GPU and the display to ensure smooth output. When this doesn't happen, the display will either stutter due to a mismatch of frames (if V-Sync is enabled) or may visibly tear if V-Sync is disabled. Adaptive Sync is the capability that will allow a DisplayPort 1.2a-compatible monitor and video card to perform FreeSync without needing the expensive ASIC that characterizes G-Sync. You'll still need a DP1.2a cable, monitor, and video card (DP1.2a monitors are expected to ship by year's end). Unlike G-Sync, however, a DP1.2a monitor shouldn't cost any additional money. The updated ASICs being developed by various vendors will bake the capability in by default."
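The stutter the submission describes can be illustrated with a toy frame-pacing model. This is not any vendor's implementation, and the render times below are invented; the point is only that a fixed 60 Hz refresh forces slow frames onto uneven tick boundaries, while an adaptive display presents each frame the moment it is ready:

```python
# Toy model of V-Sync judder vs. adaptive sync. Render times are
# hypothetical; a real pipeline is far more complicated.

REFRESH_MS = 1000 / 60                    # fixed refresh tick (~16.7 ms)
render_times_ms = [14, 20, 14, 22, 15]    # made-up per-frame GPU times

def vsync_display_times(renders, refresh=REFRESH_MS):
    """With V-Sync, a finished frame waits for the next refresh tick,
    so a 20 ms frame occupies two intervals and the cadence jumps
    between ~16.7 ms and ~33.3 ms (visible stutter)."""
    t, out = 0.0, []
    for r in renders:
        t += r
        ticks = -(-t // refresh)          # ceil: next tick after frame done
        t = ticks * refresh
        out.append(t)
    return out

def adaptive_display_times(renders):
    """With adaptive sync, the display refreshes when each frame is ready,
    so presentation intervals track render times directly."""
    t, out = 0.0, []
    for r in renders:
        t += r
        out.append(t)
    return out

print(vsync_display_times(render_times_ms))     # uneven 16.7/33.3 ms steps
print(adaptive_display_times(render_times_ms))  # steps match render times
```

Running it shows the V-Sync cadence alternating between one and two refresh intervals, which is exactly the mismatch Adaptive Sync is meant to eliminate.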
rcht148 (2872453) writes "Rich Geldreich (game/graphics programmer) has made a blog post on the quality of different OpenGL drivers. Using anonymous titles (Vendor A: Nvidia; Vendor B: AMD; Vendor C: Intel), he plots the landscape of game development using OpenGL. Vendor A, jovially known as the 'Graphics Mafia,' concentrates heavily on performance but won't share its specifications, blocking open source driver implementations as much as possible. Vendor B has the flakiest drivers; it has good technical know-how on OpenGL, but due to an extremely small team (money woes), its drivers are shoddy. Vendor C is extremely rich. It had not taken graphics seriously until a few years ago. It supports open source specifications/drivers wholeheartedly, but it will be a few years before its drivers come up to par with market standards. He concludes that using OpenGL is extremely difficult, and that without the blessings of these vendors it's nearly impossible to ship a major gaming title."
crookedvulture (1866146) writes "AMD just revealed that it has two all-new CPU cores in the works. One will be compatible with the 64-bit ARMv8 instruction set, while the other is meant as an x86 replacement for the Bulldozer architecture and its descendants. Both cores have been designed from the ground up by a team led by Jim Keller, the lead architect behind AMD's K8 architecture. Keller worked at Apple on the A4 and A5 before returning to AMD in 2012. The first chips based on the new AMD cores are due in 2016."