AMD Launches World's First Mobile DirectX 11 GPUs
J. Dzhugashvili writes "Less than 4 months after releasing the first DX11 desktop graphics card, AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture. The new Mobility Radeon HD 5000 lineup includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems. AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards, so performance shouldn't disappoint. The company also intends to follow Nvidia's lead by offering notebook graphics drivers directly from its website, as opposed to relying on laptop vendors to provide updates."
Re:People Still Use DirectX??? (Score:3, Informative)
It's just a tool.
Linux support is coming, we promise! (Score:5, Informative)
Support in the open-source drivers is being written as fast as ATI can verify and declassify docs. Also the r600/r700 3D code should be mostly reusable for these GPUs.
Re:Innovationz!!!! (Score:3, Informative)
Embedded systems may only be using a screen resolution of 640x480 or 800x600 rather than dual monitor 2048x1536. That's one energy/time saving. Then there won't be 900+ stream processors like the high-end gaming cards, there might just be 128 or 256. There's another saving. Anti-aliasing will be disabled as well, so that saves some processing time and power as well.
You'll still have texture mapping and shadowing effects using fragment shaders, just not as many triangles as current gaming engines push with all the effects turned on.
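A back-of-the-envelope sketch of just the fill-rate side of that saving, using the resolutions mentioned above (pixel count is only a rough proxy; real power draw also depends on shaders, clocks, and memory traffic):

```python
# Rough pixel-count comparison: embedded panel vs. dual high-res monitors.
# Numbers come from the resolutions cited above; this is illustrative only.

embedded = 800 * 600            # single low-res embedded screen
desktop = 2 * (2048 * 1536)     # dual 2048x1536 monitors

ratio = desktop / embedded
print(f"The dual high-res setup pushes {ratio:.1f}x the pixels per frame")
```

That works out to roughly 13x the pixels per frame before you even count the extra stream processors or anti-aliasing samples.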
Re:People Still Use DirectX??? (Score:2, Informative)
Ok so...
XBOX 360 RISC
PS3 RISC
PS2 RISC
iPhone RISC
Most, if not all Mobile devices RISC
Wii RISC
Sun systems RISC
Need I go on?
If you need power and efficiency, you use RISC. Always. Try to come up with anywhere near as many examples for CISC.
Re:People Still Use DirectX??? (Score:2, Informative)
And that's why modern x86 processors are basically RISC processors with a decoder on them for legacy x86 instructions. Your comments haven't been insightful for quite some time now.
Re:People Still Use DirectX??? (Score:2, Informative)
So then -- and this is a genuine question -- why are RISC-based devices so much more powerful while using a lower clock speed and consuming less power?
For example, this video was recently referenced in a /. post a few days ago: http://www.youtube.com/watch?v=W4W6lVQl3QA [youtube.com]
There, an Atom processor at 1.6GHz was just about on par with a 500MHz ARM-based processor.
Re:Driver Quality? (Score:2, Informative)
Obviously you have never tried running Linux on a system with an ATI graphics card.
Obviously you have never tried running Linux on a system with an nVidia graphics card.
It's seriously a PITA to get new drivers working on a new kernel with an old card. Anything pre-GeForce 8 may have annoying issues. Not a problem for desktop Linux with a new video card, but if you're setting up a Myth box on that old Athlon XP with a 6600GT, you may be in for a headache.
Avoid distros like Ubuntu with automatic kernel updates. One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.
P.S. I've been jaded by automatic updates.
ATI at it again... (Score:2, Informative)
Just upgraded my brother's laptop over the holiday. Seems ATI dropped support for his GPU in their proprietary driver, so now he has a choice. Option one: use the open-source drivers, which provide no 3D acceleration, making 3D completely unusable. Option two: use an older distribution that has the required version of X, kernel support, and all the dependent software. And with the second option come all the associated security issues of running an old and unsupported distro. He chose to run a current distro and be stuck with 2D-only acceleration. All of the 3D games he had on his laptop are now completely unplayable, with frame rates measured in fractions of a frame per second.
It turns out ATI decided they would simply stop supporting his GPU, and AFAIK they have not released any 3D documentation on it. This is exactly why I've gone out of my way to never buy ATI. They drop support for cards like crazy, leaving users completely stuck. And in something like laptops, which is exactly what this article is about, that means your entire laptop is now obsolete.
I don't care how many ATI fanboys there are that want to bash NVIDIA for providing binary blobs - the fact is, their stuff works and works well, and best of all, they don't leave their users high and dry. The only problems I've had with NVIDIA were years ago when they first started providing 64-bit Linux drivers. So say what you will to support ATI; at the end of the day, they are still doing the same old thing and hurting their customers. Case in point: I have an NVIDIA video card which is older than my brother's laptop and is still supported by NVIDIA's drivers.
So what do you want as a user? Stuff that works year after year or a company (ATI) telling you when your equipment is obsolete and that you need to replace the entire computer?
For Linux there is still only one 3D option - NVIDIA. Period.
Re:People Still Use DirectX??? (Score:3, Informative)
Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?
I know you've specifically excluded Carmack here, but nonetheless, I think his opinion is not exactly irrelevant:
"DX9 is really quite a good API level. Even with the D3D side of things, where I know I have a long history of people thinking I'm antagonistic against it. Microsoft has done a very, very good job of sensibly evolving it at each step—they're not worried about breaking backwards compatibility—and it's a pretty clean API. I especially like the work I'm doing on the 360, and it's probably the best graphics API as far as a sensibly designed thing that I've worked with."
(the original interview that contained that quote seems to be offline, sadly, so I cannot give you the primary source, but googling for that phrase should give plenty of secondary sources)
Re:Driver Quality? (Score:2, Informative)
I have a system with a Radeon 9800 Pro card in it, no problems so far with Ubuntu.
I have a BenQ Joybook with an X300 and a Toshiba Satellite with an HD3470, and I have been running Ubuntu from 7.10 through 9.10 with ATI drivers on these machines. Issues such as flickering video and incompatibility between 3D acceleration and Compiz do exist, you know. I've only been able to run Google Earth on top of Compiz recently (9.04 & 9.10), if I'm not mistaken. Xinerama support, which was excellent in 8.xx, became unusable in 9.04. I couldn't hook the notebook up to a projector during the 8.xx series if Compiz was running.
Re:Driver Quality? (Score:3, Informative)
Well, perhaps it’s BECAUSE THEY STILL ARE!
I have written many lengthy comments about it. At one point they were still using APIs so old that, after being deprecated for a long time, they were taken completely out of the kernel, rendering the drivers useless.
The same thing now happened with Xorg 1.7.
And how long ago did neither compositing, nor xrandr work? One or two months?
Hell, video still does not work. (Oh, it renders it. But unless you want to see huge black and white blots of over- and underexposure at the same time, with huge blocking in that tiny color space in between, you cannot call it “working”.)
Also, acceleration is NIL.
And let’s not forget that I can reproducibly crash the driver by compiling the kernel or a big program in a terminal, or by switching a monitor off while in console mode. Basically anything where that crutch called “atieventsd” does not receive an event.
And don’t even dare to ask about proper OpenGL 3.0 + GLSL support.
And considering the Linux driver is the same piece of shit the Windows driver is, with an emergency compatibility layer wrapped around it by a one-man team (seriously: ATi Linux driver development is one poor guy), that’s still impressive!
I will never again buy an ATi card unless they open-source EVERYTHING! No exceptions. And then I’d wait a year on top of that for the Xorg team to catch up.
You can say what you want about nVidia’s binary blob. But when I could not use my brand-new HD 4850 at all, a year ago, I was very happy that the onboard nVidia chip “just worked”. No hassle. emerge nvidia-drivers, and DONE.
Re:Most of the game world (Score:3, Informative)
Well... it IS easier to buy a video card that says DX10, and know that a game that says DX10 is going to run on it. Trying to keep up with all the extensions your card is going to support or not when you're at the store looking at games on a shelf would be a nightmare.
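The point can be sketched in a few lines of Python (purely illustrative; the extension names are made up): matching a single DirectX level is one integer comparison, while the extension route means checking every requirement against whatever list your card happens to support.

```python
# Hypothetical sketch: why one version number beats an extension checklist.

def dx_card_runs_game(card_level: int, game_level: int) -> bool:
    """DirectX levels are cumulative: a DX10 card also runs DX9 titles."""
    return card_level >= game_level

def gl_card_runs_game(card_exts: set, game_exts: set) -> bool:
    """The extension alternative: every extension the game needs must be present."""
    return game_exts <= card_exts

print(dx_card_runs_game(10, 9))    # DX10 card, DX9 game: compatible
print(gl_card_runs_game({"EXT_a", "EXT_b"}, {"EXT_a", "EXT_c"}))  # missing one
```

One number on the box answers the first question at the store shelf; the second question can't even be asked without reading the card's spec sheet.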