Graphics

Epic's VR Demo Scene For the GTX 980 Now Runs On Morpheus PS4 Headset At 60 FPS 33

Posted by timothy
from the blit-blit-bloop-bleep dept.
An anonymous reader writes: Originally created as an Unreal Engine 4 demo scene to push the limits of VR-capable graphics on the Oculus Rift 'Crescent Bay' prototype VR headset, Showdown is now running flawlessly at 60 FPS on Morpheus, Sony's PS4 VR headset. The demo was previously only able to run at Oculus' 90 FPS target VR framerate on the Nvidia GTX 980, a GPU which costs nearly $200 more than the PS4 itself. To the delight of UE4 developers, the performance improvement comes from general optimizations to UE4 on PS4, rather than specific optimizations to Showdown.
The Media

WSJ Crowdsources Investigation of Hillary Clinton Emails 224

Posted by timothy
from the tag-this-story-recursive dept.
PvtVoid writes: The Wall Street Journal now has a page up that encourages readers to sift through and tag Hillary Clinton's emails on Benghazi. Users can click on suggested tags such as "Heated", "Personal", "Boring", or "Interesting", or supply their own tags. What could possibly go wrong? I'm tagging this story "election2016."
Handhelds

Asus ZenFone 2 Performance Sneak Peek With Intel Z3580 Inside 106

Posted by timothy
from the doesn't-work-with-google-fi-though dept.
MojoKid writes: Asus has finally made its ZenFone 2 available for sale in the US. It's an Intel-powered smartphone running Android Lollipop that's compatible with AT&T, T-Mobile, and other cellular networks that use GSM technology, such as Straight Talk, MetroPCS, and Cricket Wireless. The device packs a quad-core Intel Atom Z3580 (2.3GHz) with PowerVR G6430 graphics and 4GB of RAM, along with Intel 7262 and Intel 2230 modem tech, a 5.5" Full HD screen, a 13MP rear camera, dual-SIM support, and 802.11ac Wi-Fi. The high-end model can be had for only $299, unlocked. A $199 version with 2GB of RAM and a slightly slower Intel Atom Z3560 is also available. In the benchmarks, the ZenFone 2 offers competent, though middling, performance, but considering how aggressively Asus has priced it, it's sure to grab attention at retail from consumers looking for a contract-free option.
AMD

AMD Details High Bandwidth Memory (HBM) DRAM, Pushes Over 100GB/s Per Stack 98

Posted by timothy
from the lower-power-higher-interest dept.
MojoKid writes: Recently, a few details of AMD's next-generation Radeon 300-series graphics cards have trickled out. Today, AMD publicly disclosed new information about the High Bandwidth Memory (HBM) technology that will be used on some Radeon 300-series and APU products. Currently, a relatively large number of GDDR5 chips are necessary to offer sufficient capacity and bandwidth for modern GPUs, which means significant PCB real estate is consumed. On-chip integration is not ideal for DRAM, because DRAM is not size- or cost-effective on a logic-optimized GPU or CPU manufacturing process. HBM instead brings the DRAM as close to the logic die (GPU) as possible. AMD partnered with Hynix and a number of other companies to define the HBM specification, a new type of memory chip with low power consumption and an ultra-wide bus, which was adopted by JEDEC in 2013. AMD also developed a DRAM interconnect called an "interposer" along with ASE, Amkor, and UMC. The interposer brings the DRAM into close proximity with the GPU and simplifies communication and clocking. HBM DRAM chips are stacked vertically, and "through-silicon vias" (TSVs) and "bumps" connect one DRAM chip to the next, then to a logic interface die, and ultimately to the interposer. The end result is a single package on which the GPU/SoC and the High Bandwidth Memory both reside. 1GB of GDDR5 memory (four 256MB chips) requires roughly 672mm² of board space; because HBM is stacked vertically, the same 1GB requires only about 35mm². The bus on an HBM stack is 1024 bits wide, versus 32 bits on a GDDR5 chip. As a result, the HBM interface can be clocked much lower yet still deliver more than 100GB/s per stack, versus roughly 25GB/s for a GDDR5 chip. HBM also runs at a significantly lower voltage, which equates to lower power consumption.
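The bandwidth claim is easy to sanity-check, since peak bandwidth is just bus width times per-pin transfer rate. A rough sketch in Python, using illustrative first-generation HBM and GDDR5 transfer rates (the clocks below are assumptions for the arithmetic, not figures from AMD's disclosure):
```python
# Back-of-the-envelope memory bandwidth: bus_width_bits * gbps_per_pin / 8 bits-per-byte
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s for one memory device."""
    return bus_width_bits * gbps_per_pin / 8

# Assumed figures (illustrative only):
#   first-gen HBM stack: 1024-bit bus, ~1 Gbps per pin (500MHz DDR)
#   GDDR5 chip:          32-bit bus,  ~6 Gbps per pin
hbm_stack  = bandwidth_gb_s(1024, 1.0)   # ~128 GB/s per stack
gddr5_chip = bandwidth_gb_s(32, 6.0)     # ~24 GB/s per chip

print(f"HBM stack:  {hbm_stack:.0f} GB/s")
print(f"GDDR5 chip: {gddr5_chip:.0f} GB/s")
```
Even at a much lower clock, the 32x wider bus is what pushes a single HBM stack past the 100GB/s mark.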
Windows

How Windows 10 Performs On a 12-inch MacBook 240

Posted by Soulskill
from the burning-questions dept.
An anonymous reader writes: As Microsoft prepares for the launch of Windows 10, review sites have been performing all sorts of benchmarks on the tech preview to evaluate how well the operating system will run. But now a computer science student named Alex King has made the most logical performance evaluation of all: testing Windows 10's performance on a 2015 MacBook. He says, "Here's the real kicker: it's fast. It's smooth. It renders at 60FPS unless you have a lot going on. It's unequivocally better than performance on OS X, further leading me to believe that Apple really needs to overhaul how animations are done. Even when I turn Transparency off in OS X, Mission Control isn't completely smooth. Here, even after some Aero Glass transparency has been added in, everything is smooth. It's remarkable, and it makes me believe in the 12-inch MacBook more than ever before. So maybe it's ironic that in some regards, the new MacBook runs Windows 10 (a prerelease version, at that) better than it runs OS X."
Graphics

A Look At GTA V PC Performance and Image Quality At 4K 72

Posted by timothy
from the so-you're-saying-more-is-better dept.
MojoKid writes: Rockstar's Grand Theft Auto series has been wildly successful for many years now, offering some of the edgiest storylines, gameplay tactics, and objectives the gaming industry has ever seen. With psychopathic main characters, you are left in the depraved communities of Los Santos and Blaine County to walk a path few would dare choose in real life. And it's rather entertaining, of course, that you're tasked with leaving a virtual world worse off than you found it, consequences be damned. But what does it take to run GTA V at 4K (3840x2160) resolution? This article takes a look at that, as well as how the game scales across multiple NVIDIA GeForce GTX 980 GPUs, along with some screenshots that examine image quality at Ultra HD resolution. It's safe to say one strong, high-end GPU will get the job done, but two in SLI (or CrossFire on the AMD side) are better, of course, if you want to max out all image quality settings.
Graphics

Oculus Rift Hardware Requirements Revealed, Linux and OS X Development Halted 227

Posted by Soulskill
from the sad-penguin dept.
An anonymous reader writes: Oculus has revealed the baseline hardware requirements for running its Rift virtual reality headset. To no one's surprise, they're fairly steep: an NVIDIA GTX 970 / AMD 290 equivalent or greater, an Intel i5-4590 equivalent or greater, and 8GB+ of RAM. It will also require at least two USB 3.0 ports and "HDMI 1.3 video output supporting a 297MHz clock via a direct output architecture."

Oculus chief architect Atman Binstock explains: "On the raw rendering costs: a traditional 1080p game at 60Hz requires 124 million shaded pixels per second. In contrast, the Rift runs at 2160×1200 at 90Hz split over dual displays, consuming 233 million pixels per second. At the default eye-target scale, the Rift's rendering requirements go much higher: around 400 million shaded pixels per second. This means that by raw rendering costs alone, a VR game will require approximately 3x the GPU power of 1080p rendering." He also points out that PC graphics can afford a fluctuating frame rate — it doesn't matter too much if it bounces between 30-60fps. The Rift has no such luxury, however.
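Binstock's pixel counts fall straight out of resolution times refresh rate. A quick sketch of the arithmetic, where the ~1.3x-per-axis eye-target scale is an assumed illustrative value rather than a figure from Oculus:
```python
# Raw shaded-pixel throughput: width * height * refresh rate
def pixels_per_second(width, height, hz):
    return width * height * hz

flat_1080p  = pixels_per_second(1920, 1080, 60)   # ~124 million/s
rift_native = pixels_per_second(2160, 1200, 90)   # ~233 million/s

# The default eye-target scale renders each eye buffer larger than the panel
# to compensate for lens distortion; ~1.3x per axis is an assumed value here.
eye_target_scale = 1.3
rift_eye_target = rift_native * eye_target_scale ** 2   # ~394 million/s

print(f"1080p @ 60Hz:     {flat_1080p / 1e6:.0f} Mpix/s")
print(f"Rift native:      {rift_native / 1e6:.0f} Mpix/s")
print(f"Rift eye-target:  {rift_eye_target / 1e6:.0f} Mpix/s")
print(f"VR vs 1080p:      {rift_eye_target / flat_1080p:.1f}x")
```
The last ratio works out to roughly 3x, matching the GPU-power estimate quoted above.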

The last requirement is the most contentious: Windows 7 SP1 or newer. Binstock says their development for OS X and Linux has been "paused" so they can focus on delivering content for Windows. They have no timeline for going back to the less popular platforms.
Space

How SpaceX and the Quest For Mars Almost Sunk Tesla Motors 126

Posted by Soulskill
from the rocket-bucks-versus-car-bucks dept.
braindrainbahrain writes: Elon Musk and his rocket company are well known to Slashdotters. This article and book excerpt tell the story of the creation of SpaceX and how it almost sank Musk's other company, Tesla Motors. Musk recalls, "I could either pick SpaceX or Tesla or split the money I had left between them. That was a tough decision. If I split the money, maybe both of them would die. If I gave the money to just one company, the probability of it surviving was greater, but then it would mean certain death for the other company." But then, at the last moment, years of work at SpaceX finally paid off: "[O]n Dec. 23, 2008, SpaceX received a wonderful shock. The company won a $1.6 billion contract for 12 NASA resupply flights to the space station. Then the Tesla deal ended up closing successfully, on Christmas Eve, hours before Tesla would have gone bankrupt. Musk had just a few hundred thousand dollars left and could not have made payroll the next day." Also, it turns out the inspiration for SpaceX was the idea of sending mice to Mars.
Intel

Intel NUC5i7RYH Broadwell Mini PC With Iris Pro Graphics Tested 80

Posted by timothy
from the why-pay-for-big-any-more? dept.
MojoKid writes: In addition to ushering in a wave of new notebooks and mobile devices, Intel's Broadwell microarchitecture has also found its way into a plethora of recently introduced small form factor systems, like the company's NUC platform. The new NUC5i7RYH is a mini PC packing a Core i7-5557U Broadwell processor with Iris Pro graphics, which makes it the most powerful NUC released to date. Inside is a 5th-gen Core i7 CPU (dual-core, quad-thread) that can turbo up to 3.4GHz, an Iris Pro 6100-series integrated graphics engine, support for dual-channel memory, M.2 and 2.5" SSDs, 802.11ac Wi-Fi, and USB 3.0. NUCs are generally barebones systems, so you have to build them up with a drive and memory before they can be used. The NUC5i7RYH is one of the slightly taller NUC systems that can accommodate both M.2 and 9.5mm 2.5" drives, and all NUCs come with a power brick and VESA mount. With a low-power dual-core processor and an on-die Iris Pro 6100-series graphics engine, the NUC5i7RYH won't offer the same kind of performance as systems equipped with higher-powered processors or discrete graphics cards, but for everyday computing tasks and casual gaming, it should fit the bill for users who want a low-profile, out-of-the-way tiny PC.
Graphics

The Decline of Pixel Art 175

Posted by Soulskill
from the one-palette-at-a-time dept.
An anonymous reader writes: Blake Reynolds, lead artist for a pair of popular mobile games, has put up a post about the decline of pixel art in games. He decries the current state of "HD fetishism" in the industry, saying that games with great pixel art get needlessly marked down in reviews for their pixelation, while games that have awful — but high-res — art get glowing praise. He walks through a number of examples showing how pixel art can be well done or poorly done, and how it can be extremely complex despite the lower resolution. But now pixel artists are running into not only the expectation of high-definition content, but technological obstacles as well. "Some devices blur Auro [their game]. Some devices stretch it. Some devices letterbox it. No matter how hard I worked to make the art in Auro as good as I could, there's no way a given person should be expected to see past all those roadblocks. Making Auro with higher-resolution art would have made it more resistant to constantly-changing sizes and aspect ratios of various devices." Reynolds says his studio is giving up on pixel art and embracing the new medium, and recommends other artists do the same. "Don't let the medium come between you and your audience. Speak in a language people can understand so that they can actually see what makes your work great without a tax."
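The device-scaling problem Reynolds describes is easy to reproduce: pixel art only stays crisp when it is scaled by an integer factor with nearest-neighbor sampling, and arbitrary device resolutions rarely allow that. A minimal sketch using Pillow (the file name and target sizes are made up for illustration):
```python
from PIL import Image

# A 320x180 piece of pixel art (file name is illustrative).
art = Image.open("auro_sprite_sheet.png")

# Integer scale + nearest-neighbor keeps every source pixel a crisp square.
crisp = art.resize((art.width * 4, art.height * 4), Image.NEAREST)

# A real device usually forces a non-integer scale (here roughly 3.4x):
# nearest-neighbor then produces unevenly sized pixels, while bilinear
# filtering smears them into a blur instead.
device_size = (1080, 608)
uneven = art.resize(device_size, Image.NEAREST)
blurry = art.resize(device_size, Image.BILINEAR)
```
Higher-resolution source art sidesteps the problem because fractional scaling of fine detail is far less visible than fractional scaling of chunky pixels.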
Security

GPU Malware Can Also Affect Windows PCs, Possibly Macs 49

Posted by samzenpus
from the protect-ya-neck dept.
itwbennett writes: A team of anonymous developers who recently created a Linux rootkit that runs on graphics cards has released a new proof-of-concept malware program that does the same on Windows. A Mac OS X implementation is also in the works. The problem the developers are trying to highlight lies not with the operating systems, such as Windows or Linux, nor with the GPU (graphics processing unit) vendors, but rather with existing security tools, which aren't designed to scan the random access memory used by GPUs for malware code.
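The gap is architectural: data parked in GPU memory lives in a separate address space that host-RAM scanners never walk. A benign sketch of that separation using PyCUDA (illustrative only, not the researchers' code):
```python
import numpy as np
import pycuda.autoinit          # creates a CUDA context on the first GPU
import pycuda.driver as cuda

# Some bytes to park in GPU memory (a harmless stand-in payload).
payload = np.frombuffer(b"just an example payload", dtype=np.uint8).copy()

gpu_buf = cuda.mem_alloc(payload.nbytes)   # allocate device (GPU) memory
cuda.memcpy_htod(gpu_buf, payload)         # copy into the GPU address space
payload[:] = 0                             # wipe the host-side copy

# A scanner walking host RAM now sees only zeros; the data still exists on
# the GPU and can be copied back on demand.
recovered = np.zeros_like(payload)
cuda.memcpy_dtoh(recovered, gpu_buf)
print(recovered.tobytes())
```
A scanner would need to enumerate and read GPU allocations through the driver to see that buffer, which is exactly what today's tools don't do.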
Security

Proof-of-Concept Linux Rootkit Leverages GPUs For Stealth 67

Posted by Soulskill
from the jellyfish-eating-penguins dept.
itwbennett writes: A team of developers has created a rootkit for Linux systems that uses the processing power and memory of graphics cards, rather than CPUs, in order to remain hidden. The rootkit, called Jellyfish, is a proof of concept designed to demonstrate that running malware entirely on the GPU is a viable option. Such threats could be more sinister than traditional malware, the Jellyfish developers say, in part because there are no tools to analyze GPU malware.
AMD

AMD Outlines Plans For Zen-Based Processors, First Due In 2016 166

Posted by samzenpus
from the check-it-out dept.
crookedvulture writes: AMD laid out its plans for processors based on its all-new Zen microarchitecture today, promising 40% higher performance-per-clock from the new x86 CPU core. Zen will use simultaneous multithreading to execute two threads per core, and it will be built using "3D" FinFETs. The first chips are due to hit high-end desktops and servers next year. In 2017, Zen will combine with integrated graphics in smaller APUs designed for desktops and notebooks. AMD also plans to produce a high-performance server APU with a "transformational memory architecture," likely similar to the on-package DRAM being developed for the company's discrete graphics processors. This chip could give AMD a credible challenger in the HPC and supercomputing markets—and it could also make its way into laptops and desktops.
Graphics

My High School CS Homework Is the Centerfold 628

Posted by timothy
from the awfully-thin-skin dept.
theodp writes: To paraphrase the J. Geils Band, Maddie Zug's high school computer science homework is the centerfold. In a Washington Post op-ed, Zug, a student at the top-ranked Thomas Jefferson High School for Science and Technology, argues that a centerfold does not belong in the classroom. "I first saw a picture of Playboy magazine's Miss November 1972 a year ago as a junior at TJ," Zug explains. "My artificial intelligence teacher told our class to search Google for Lena Soderberg (not the full image, though!) and use her picture to test our latest coding assignment...Soderberg has a history with computer science. In the 1970s, male programmers at the University of Southern California needed to test their image-processing algorithm. They scanned what they had handy: the centerfold of a Playboy magazine. Before long, the image became a convention in industry and academia." (Wikipedia has a nice background, too.)
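For context, the use described above is mundane: a standard test image is simply a convenient, widely shared input for exercising image-processing code. A minimal sketch of that kind of test, using scikit-image's bundled astronaut photo instead of the Lena scan (the filter choice is arbitrary):
```python
from skimage import data, color, filters

# Load a bundled, freely licensed test image rather than the Playboy scan.
image = data.astronaut()          # 512x512 RGB test image
gray = color.rgb2gray(image)      # convert to grayscale

# Run the algorithm under test; Sobel edge detection as a stand-in here.
edges = filters.sobel(gray)
print(edges.shape, edges.min(), edges.max())
```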
Graphics

Square Enix Witch Chapter Real-Time CG DX12 Demo Impresses At Microsoft BUILD 87

Posted by timothy
from the killer-graphics dept.
MojoKid writes: Computer-generated graphics have come a long way in the past several years and are starting to blur the line between animation and real actors. One of the more difficult tasks for CG artists is recreating human emotions, especially crying, though you wouldn't know it after watching a tech demo that Square Enix showed off at the Microsoft BUILD Developer Conference. The real-time tech demo is called Witch Chapter 0 [cry] and is part of a research project studying various next-generation technologies. For this particular demo, Square Enix says it put a lot of research into real-time CG technology utilizing DirectX 12, in collaboration with Microsoft and NVIDIA. It's an ongoing project that will help form Square Enix's Luminous Studio engine for future games. The short demo shows some pretty impressive graphics, with an amazing level of detail. As the camera zooms in, you can clearly see imperfections in the skin, along with glistening effects where the face is wet with either tears or water.
Android

LG G4 and Qualcomm's Snapdragon 808 Benchmarked 45

Posted by samzenpus
from the looking-at-the-numbers dept.
MojoKid writes: LG officially lifted the veil on its new G4 flagship Android phone this week, and the buzz has been fairly strong. LG's display prowess is well known, along with its ability to pack a ton of screen real estate into a smaller frame with very little bezel, as it did with the previous-generation G3. However, what's under the hood of the new LG G4 is probably just as interesting as the build quality and display, at least for some. On board the LG G4 is a Qualcomm Snapdragon 808, the six-core little brother of the powerful but power-hungry Snapdragon 810 found in HTC's One M9. The One M9 is currently one of the fastest Android handsets out there, but its battery life suffers as a result. So with a six-core Snapdragon, a slightly tamer Adreno 418 graphics engine, and 3GB of RAM on board, it's interesting to see where the G4 lands performance-wise. It's basically somewhere between the HTC One M9 (Snapdragon 810) and the Snapdragon 805-powered Nexus 6 in CPU-bound workloads, besting even the iPhone 6, but much more middle-of-the-pack in graphics and gaming.
KDE

KDE Plasma 5.3 Released 53

Posted by Soulskill
from the onward-and-upward dept.
jrepin writes: The KDE community has released Plasma 5.3, a major new version of the popular, open source desktop environment. The latest release brings much-enhanced power management, better support for Bluetooth, and improved Plasma widgets. Also available is a technical preview of the Plasma Media Center shell. In addition, Plasma 5.3 represents a big step towards support for the Wayland windowing system. There are also a few other minor tweaks and over 300 bugfixes. Here is the full changelog, and here's the package download wiki page.
Graphics

NVIDIA Quadro M6000 12GB Maxwell Workstation Graphics Tested Showing Solid Gains 66

Posted by samzenpus
from the check-it-out dept.
MojoKid writes: NVIDIA's Maxwell GPU architecture has been well received in the gaming world, thanks to cards like the GeForce GTX Titan X and the GeForce GTX 980. NVIDIA recently brought that same Maxwell goodness over to the workstation market as well, and the result is the new Quadro M6000, NVIDIA's new highest-end workstation platform. Like the Titan X, the M6000 is based on the full-fat version of the Maxwell GPU, the GM200. Also like the GeForce GTX Titan X, the Quadro M6000 has 12GB of GDDR5, 3072 GPU cores, 192 texture units (TMUs), and 96 render outputs (ROPs). NVIDIA says the M6000 beats its previous-generation Quadro K6000 significantly in pro workstation applications, as well as in GPGPU, rendering, and encoding applications that can be GPU-accelerated. One thing that's changed with the launch of the M6000 is that AMD no longer trades shots with NVIDIA for the top pro graphics performance spot. Last time around, there were some benchmarks that still favored team red; now, the NVIDIA Quadro M6000 puts up pretty much a clean sweep.
GUI

Qt Creator 3.4.0 Released 20

Posted by timothy
from the well-that's-cute dept.
jones_supa writes: Qt Creator 3.4.0 has been released with many new features. Qt Creator is a C/C++ IDE with specialized tools for developing Qt applications, and it works great for general-purpose projects as well. The new version comes with a C++ refactoring option to move function definitions out of a class declaration, auto-completion for signals and slots in Qt5-style connects, experimental Qt Test and Qt Quick Tests support in the Professional and Enterprise edition, support for 64-bit Android toolchains, and various other improvements. More details on the new version can be found in the official announcement and the changelog.
Education

Virtual Reality Games Can Improve Memory Retention of Safety Instructions 36

Posted by samzenpus
from the don't-forget dept.
vrml writes: Using a virtual reality (VR) headset to experience risky situations as immersive 3D games improves memory retention of passenger safety instructions, according to research published in IEEE Transactions on Visualization and Computer Graphics and illustrated by a YouTube video. Researchers recruited occasional flyers: half of them played a VR game simulating an airliner water landing and evacuation, while the other half studied a real airline safety card. After one week, passengers who had studied the safety card showed a significant loss of knowledge, while passengers who had played the VR game fully retained the safety knowledge they had gained. The research group has now also made its emergency water landing experience available for the Oculus Rift.