
AMD Launches World's First Mobile DirectX 11 GPUs 169

J. Dzhugashvili writes "Less than 4 months after releasing the first DX11 desktop graphics card, AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture. The new Mobility Radeon HD 5000 lineup includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems. AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards, so performance shouldn't disappoint. The company also intends to follow Nvidia's lead by offering notebook graphics drivers directly from its website, as opposed to relying on laptop vendors to provide updates."
This discussion has been archived. No new comments can be posted.

  • by confused one ( 671304 ) on Thursday January 07, 2010 @04:51PM (#30687330)
    And that makes perfect sense if you're targeting all those different platforms. There may even be perfectly good reasons to use OpenGL over DirectX based on your coding requirements and the APIs. However, if your target audience is Windows and Windows Embedded only, and there are no requirements that are better served by OpenGL, there's no reason not to use DirectX.

    It's just a tool.
  • by MostAwesomeDude ( 980382 ) on Thursday January 07, 2010 @05:10PM (#30687572) Homepage

    Support in the open-source drivers is being written as fast as ATI can verify and declassify docs. Also the r600/r700 3D code should be mostly reusable for these GPUs.

  • Re:Driver Quality? (Score:1, Informative)

    by Anonymous Coward on Thursday January 07, 2010 @05:13PM (#30687616)
    Yes, it was a major issue when the game released, i.e. essentially every 5xxx-series card was getting hardlocks; the recent drivers appear to have more or less fixed it after Gearbox just ignored the issue for a couple of months. There was also a huge issue with Catalyst AI breaking textures, but that was also fixed in the last Catalyst update. I guess it's less of an ATI driver issue and more developers botching it these days; ATI is definitely doing better lately. See: http://gbxforums.gearboxsoftware.com/showthread.php?t=78815&highlight=radeon [gearboxsoftware.com]
  • by mikael ( 484 ) on Thursday January 07, 2010 @05:19PM (#30687672)

    Embedded systems may only be using a screen resolution of 640x480 or 800x600 rather than dual-monitor 2048x1536. That's one energy/time saving. Then there won't be the 900+ stream processors of the high-end gaming cards; there might just be 128 or 256. There's another saving. Anti-aliasing will be disabled as well, so that saves some processing time and power too.

    You will still have texture mapping and shadowing effects using fragment shaders, but just not as many triangles as the current gaming engines with all the effects turned on.
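To put rough numbers on the savings described above, here is a back-of-the-envelope sketch. The per-frame pixel counts follow directly from the resolutions mentioned in the comment; the per-ALU comparison uses the 128 vs. 900+ stream-processor figures it gives. None of this models real GPU behavior; it only illustrates the scale of the workload difference.

```python
# Illustrative arithmetic only: per-frame pixel workload of an embedded
# panel vs. a dual 2048x1536 desktop setup, then scaled by the stream
# processor counts mentioned in the comment (128 vs. ~900).

def pixels_per_frame(width, height, monitors=1):
    """Raw pixels the GPU must shade each frame."""
    return width * height * monitors

embedded = pixels_per_frame(800, 600)               # 480,000 pixels
desktop = pixels_per_frame(2048, 1536, monitors=2)  # 6,291,456 pixels

print(f"desktop/embedded pixel ratio: {desktop / embedded:.1f}x")

# Even with far fewer ALUs, the per-ALU workload gap stays modest:
embedded_load = embedded / 128   # pixels per stream processor
desktop_load = desktop / 900
print(f"per-ALU workload ratio: {desktop_load / embedded_load:.2f}x")
```

So the embedded part pushes roughly a thirteenth of the pixels, which is where most of the power budget goes.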

  • by Nabeel_co ( 1045054 ) on Thursday January 07, 2010 @05:52PM (#30688052) Homepage

    Ok so...
    XBOX 360 RISC
    PS3 RISC
    PS2 RISC
    iPhone RISC
    Most, if not all Mobile devices RISC
    Wii RISC
    Sun systems RISC
    Need I go on?

    If you need power and efficiency, you use RISC. Always. Try to come up with anywhere near as many examples for CISC.

  • by Lunix Nutcase ( 1092239 ) on Thursday January 07, 2010 @05:55PM (#30688114)

    And that's why modern x86 processors are basically RISC processors with a decoder on them for legacy x86 instructions. Your comments haven't been insightful for quite some time now.
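The decoder idea in the comment above can be sketched as a toy. This is not a real x86 decoder; the instruction tuples and micro-op names are made up for illustration. It only shows the shape of the transformation: a CISC-style memory-operand instruction expands into simple load/compute/store micro-ops, while a register-register instruction is already RISC-like.

```python
# Toy CISC-to-micro-op expansion (illustrative, not a real x86 decoder).

def decode(instr):
    """Expand one (op, dst, src) instruction into RISC-like micro-ops."""
    op, dst, src = instr
    micro_ops = []
    if dst.startswith("["):              # memory destination: load/modify/store
        addr = dst.strip("[]")
        micro_ops.append(("load", "tmp", addr))
        micro_ops.append((op, "tmp", src))
        micro_ops.append(("store", addr, "tmp"))
    else:                                # register-register: already simple
        micro_ops.append((op, dst, src))
    return micro_ops

# "add [mem], eax" becomes three simple micro-ops:
print(decode(("add", "[0x1000]", "eax")))
# "add ebx, eax" stays a single micro-op:
print(decode(("add", "ebx", "eax")))
```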

  • by Nabeel_co ( 1045054 ) on Thursday January 07, 2010 @06:20PM (#30688450) Homepage

    So then -- and this is a genuine question -- why are RISC based devices so much more powerful while using a lower clock speed, and consuming less power?

    For example, this video was recently referenced in a /. post a few days ago: http://www.youtube.com/watch?v=W4W6lVQl3QA [youtube.com]

    where an Atom processor at 1.6GHz was just about on par with a 500MHz ARM-based processor.
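The comparison in the comment above is really about performance per watt, and the arithmetic is easy to sketch. The TDP and work-per-clock figures below are assumptions chosen only to make the shape of the argument visible; they are not measured values for either chip.

```python
# Illustrative performance-per-watt arithmetic (all figures assumed,
# not measured for any real Atom or ARM part).

def perf_per_watt(clock_ghz, work_per_clock, tdp_watts):
    """Crude throughput-per-watt metric: (clock * work/clock) / power."""
    return (clock_ghz * work_per_clock) / tdp_watts

atom = perf_per_watt(clock_ghz=1.6, work_per_clock=1.0, tdp_watts=2.5)
arm = perf_per_watt(clock_ghz=0.5, work_per_clock=1.0, tdp_watts=0.3)

# Even at equal work per clock, the lower-power part can come out ahead
# per watt, which is why "on par at a third of the clock" is plausible:
print(f"Atom: {atom:.2f} units/W, ARM: {arm:.2f} units/W")
```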

  • Re:Driver Quality? (Score:2, Informative)

    by BikeHelmet ( 1437881 ) on Thursday January 07, 2010 @06:31PM (#30688586) Journal

    Obviously you have never tried running Linux on a system with an ATI graphics card.

    Obviously you have never tried running Linux on a system with an nVidia graphics card.

    It's seriously a PITA to get new drivers working on a new kernel with an old card. Anything pre-GeForce 8 may have annoying issues. Not a problem for desktop Linux with a new video card, but if you were setting up a Myth box on that old Athlon XP w/ 6600GT, you may be in for a headache.

    Avoid distros like Ubuntu with automatic kernel updates. One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.

    P.S. I've been jaded by automatic updates.

  • ATI at it again... (Score:2, Informative)

    by GooberToo ( 74388 ) on Thursday January 07, 2010 @06:41PM (#30688724)

    Just upgraded my brother's laptop over the holiday. Seems ATI dropped support for his GPU in their proprietary driver, so now he has a choice. Option one: use the open-source drivers, which provide no 3D acceleration. Basically 3D is completely unusable. Option two: use an older distribution which has the required version of X, kernel support, and all dependent software. And with the second option come all the associated security issues of running an old and unsupported distro. He chose to run a current distro and be stuck with 2D-only acceleration. All of the 3D games he had on his laptop are now completely unplayable; framerates are measured in fractions of frames per second.

    It turns out ATI decided they would simply stop supporting his GPU and AFAIK, they have not released any 3D documentation on it. This is exactly the reason I've gone out of my way to never buy ATI. They drop support of cards like crazy leaving users completely stuck. And in something like laptops, which is exactly what this article is about, that means your entire laptop is now obsolete.

    I don't care how many ATI fanboys want to bash NVIDIA for providing binary blobs - the fact is, their stuff works and works well, and best of all, they don't leave their users high and dry. The only problems I've had with NVIDIA were years ago when they first started providing 64-bit Linux drivers. So say what you will to support ATI; at the end of the day, they are still doing the same old thing and hurting their customers. Case in point: I have an NVIDIA video card which is older than my brother's laptop and is still supported by NVIDIA's drivers.

    So what do you want as a user? Stuff that works year after year or a company (ATI) telling you when your equipment is obsolete and that you need to replace the entire computer?

    For Linux there is still only one 3D option - NVIDIA. Period.

  • by shutdown -p now ( 807394 ) on Thursday January 07, 2010 @07:40PM (#30689214) Journal

    Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?

    I know you've specifically excluded Carmack here, but nonetheless, I think his opinion is not exactly irrelevant:

    "DX9 is really quite a good API level. Even with the D3D side of things, where I know I have a long history of people thinking I'm antagonistic against it. Microsoft has done a very, very good job of sensibly evolving it at each step—they're not worried about breaking backwards compatibility—and it's a pretty clean API. I especially like the work I'm doing on the 360, and it's probably the best graphics API as far as a sensibly designed thing that I've worked with."

    (the original interview that contained that quote seems to be offline, sadly, so I cannot give you the primary source, but googling for that phrase should give plenty of secondary sources)

  • Re:Driver Quality? (Score:2, Informative)

    by PixetaledPikachu ( 1007305 ) on Thursday January 07, 2010 @08:06PM (#30689438)

    I have a system with a Radeon 9800 Pro card in it, no problems so far with Ubuntu.

    A BenQ Joybook with an X300, and a Toshiba Satellite with an HD3470. And I have been running Ubuntu from 7.10 through 9.10 with ATI drivers on these machines. Issues such as flickering video and incompatibility between 3D acceleration and Compiz do exist, you know. I could only run Google Earth on top of Compiz without problems fairly recently (9.04 & 9.10), if I'm not mistaken. Xinerama support, which was excellent in 8.xx, became unusable in 9.04. I couldn't hook the notebook to a projector during the 8.xx series if Compiz was running.

  • Re:Driver Quality? (Score:3, Informative)

    by Hurricane78 ( 562437 ) <deleted @ s l a s h dot.org> on Thursday January 07, 2010 @08:38PM (#30689642)

    Well, perhaps it’s BECAUSE THEY STILL ARE!

    I have written many lengthy comments about it. They were still using APIs so old that, after being deprecated for a long time, they were taken completely out of the kernel, rendering the drivers useless.

    The same thing now happened with Xorg 1.7.

    And how long ago did neither compositing, nor xrandr work? One or two months?

    Hell, video still does not work. (Oh, it renders it. But unless you want to see huge black and white blots of over- and underexposure at the same time, with huge blocking in the tiny color space in between, you cannot call it “working”.)
    Also, acceleration is NIL.

    And let’s not forget that I can reproducibly crash the driver by compiling the kernel or a big program in a terminal, or by switching a monitor off when in console mode. Basically anything where that crutch called “atieventsd” does not receive an event.

    And don’t even dare to ask about proper OpenGL 3.0 + GLSL support.

    And considering that the Linux driver is the same piece of shit the Windows driver is, with an emergency layer wrapped around it by a one-man team (seriously: ATI Linux driver development is one poor guy), it’s still impressive!

    I will never again buy an ATi card, unless they open-source EVERYTHING! No exceptions. And then I wait a year on top of that, for the Xorg team to catch up.

    You can say what you want about nVidia’s binary blob. But when I could not use my brand-new HD 4850 at all, a year ago, I was very happy that the onboard nVidia chip “just worked”. No hassle. emerge nvidia-drivers, and DONE.

  • by Arterion ( 941661 ) on Thursday January 07, 2010 @08:50PM (#30689724)

    Well... it IS easier to buy a video card that says DX10 and know that a game that says DX10 is going to run on it. Trying to keep track of all the extensions your card does or doesn't support when you're at the store looking at games on a shelf would be a nightmare.
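The point above reduces to a difference in how compatibility is checked: a DirectX level is a single ordered number to compare, while extension-based matching is set containment over many named capabilities. A minimal sketch (the extension names below are invented for illustration, not real OpenGL extensions):

```python
# One version number vs. a bag of extensions (names are made up).

def dx_compatible(card_level, game_level):
    """DirectX-style check: a single ordered comparison."""
    return card_level >= game_level

def gl_compatible(card_extensions, game_required):
    """Extension-style check: every required capability must be present."""
    return game_required <= card_extensions  # subset test

print(dx_compatible(10, 10))   # True: a "DX10" card runs a "DX10" game

card = {"EXT_framebuffer_object", "ARB_shader_objects"}
needed = {"EXT_framebuffer_object", "ARB_geometry_shader4"}
print(gl_compatible(card, needed))  # False: one required extension missing
```

The shopper only has to compare one number in the first case; the second requires knowing the full capability list of the card in hand.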
