AMD Launches World's First Mobile DirectX 11 GPUs
J. Dzhugashvili writes "Less than 4 months after releasing the first DX11 desktop graphics card, AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture. The new Mobility Radeon HD 5000 lineup includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems. AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards, so performance shouldn't disappoint. The company also intends to follow Nvidia's lead by offering notebook graphics drivers directly from its website, as opposed to relying on laptop vendors to provide updates."
People Still Use DirectX??? (Score:2, Interesting)
Who the hell, other than the poor sods still doing x86 Windows-only game/graphics development, uses that turd of an API, DirectX?
Let's just go over the platforms I work on:
PC graphics development - OpenGL
Linux graphics development - OpenGL
Mac graphics development - OpenGL
Android graphics development - OpenGL ES
iPhone graphics development - OpenGL ES
Embedded ARM based system development - OpenGL ES
...and even some OpenGL for console development.
Re: (Score:3, Informative)
It's just a tool.
Re:People Still Use DirectX??? (Score:4, Interesting)
I read your post and it occurred to me that it illustrates perfectly a key problem with software development today: short-sightedness.
In an age of fast multiprocessing, it only makes sense to do everything you can to create abstraction layers that will ensure:
1. My software will have the widest possible audience regardless of platform. $$$
2. I will be able to extend the application, or create a new one with minimal effort by reusing modules I've already created to do hard things well/fast. $$$ (in form of turn-around time/effort)
3. If a vendor decides to break something in their firmware/hardware - I only have to fix one module that drives the given hardware - *NOT* the application itself. $$$ (ditto)
Flexibility, resiliency, more cash in your pocket... I don't see a downside to taking this approach. On modern gaming rigs in particular, there is no reason NOT to use OpenGL, for all its perceived limitations compared to a tweaked-out DirectX x86 app.
As a gamer myself, I look at it from another angle: I have Linux and Mac machines as well as a high-end Windows game rig. To host games cost-efficiently (I like to create and share my own maps/scenarios in some games), I prefer to use the Linux server and play on my Windows box, using and tweaking WINE in order to run the game. I'm not made of money and can't cost-justify a full complement of Windows servers, which would also waste resources, since I am a *nix developer too. Getting WINE to work with some of the niche games I play is a royal pain. If the developers of said games took my advice, I would be running their games natively under Linux with minimal headaches.
Flexibility and choice are good for the widest audience. Vendor lock-in is bad, and only serves a few types of people (the corporation$$$ and simple gamer-$$$). The funny thing is, these companies stand to make more money than they would under their lock-in strategy if they would think long term and build flexible, extensible applications that benefit the largest audience. Lucky for me, most of the titles I currently enjoy have taken this approach; I will continue to gravitate to those that do, and deny $$$ to those that won't.
Re: (Score:2)
This all sounds good in theory, but I don't think it works in practice.
- The Linux user base is really small, so there's not much point in developing games for them; and worse, we're all tech-savvy, so most Linux gamers have a Windows computer somewhere if there is a game they want to run. So your first point is completely moot.
- I don't think game development benefits nearly as much from code reuse as most other software development.
Anyway, I think the current state of game development speaks in my favor. Take…
Re: (Score:2)
The question is, with all your arguments favoring OpenGL over Direct3D, why are developers still using the latter?
And why do the developers of many portable 3D libraries and game development libraries (Crystal Space, OGRE, etc.) support Direct3D, even though they already have OpenGL support, which would supposedly work just fine?
Re: (Score:2)
There's a marketing window for most projects, whether it's a tie-in to a sport/movie/show or making sure your game isn't considered outdated upon release.
Software as a business is about releasing a product that is "good enough." Spending more time costs money, both in actual cost and potential costs (you could have the same dev team working on another project).
Tweaking may get more eyes, but it doesn't necessarily mean more money.
Re:People Still Use DirectX??? (Score:4, Insightful)
Re: (Score:3, Insightful)
Yeah, those "poor sods" making multi-million-dollar-grossing titles. Seriously, I'm all for OpenGL. I like it because it does make ports easier, and I'd like to see more games available on Linux and Mac.
The snide "are people STILL using technology X?" comments when technology X is the clear market leader are just annoying though.
Re: (Score:2)
As I point out at length in my post above, ironically they would make more money if they did cater to a wider audience. Of course, that would require 2 things:
1. Long Term Thought.
2. Abandoning unmaintainable, hard-coded, monolithic program cores that spin on tweaked-out low-level DirectX hardware APIs...
Unfortunately, everyone wants to be a rock star, so everyone is concerned about the size of their bank account today, without considering the size it would be if they made titles that endure.
Re: (Score:2)
Your "obvious" ideas require a significantly higher level of investment which 99% of the time won't pay off.
Sometimes it's like listening to an idiot harp on.
Most of the game world (Score:4, Interesting)
As well as a good deal of other Windows graphics programs. You can stick your head in the sand and pretend that Microsoft Windows isn't a major player, but you are fooling only yourself. Windows development matters a whole lot, and DX is the native API, so many use it.
However, in this case the reference is to features of the card. See, OpenGL is really bad about staying up to date with hardware. They are always playing catch-up, and often their "support" is just to have the vendors implement their own extensions. So when a new card comes out, talking about it in terms of OpenGL features isn't useful.
Well, new versions of DirectX neatly map to new hardware features. The reason is that MS works with the card vendors: they tell the vendors what they'd like to see, the vendors tell them what they are working on for their next-gen chips, and so on. So a "DX11" card means "a card that supports the full DirectX 11 feature set." This implies many things, like 64-bit FP support, support for new shader models, and so on, and it can be conveniently summed up as DX11. This sets it apart from a DX10 card like the 8800: while that can run with DX11 APIs, it doesn't support the features. Calling it DX10 means it supports the full DX10 feature set.
So that's the reason. If you want to yell and scream how OpenGL should rule the world, you can go right ahead, however the simple fact of the matter is DirectX is a major, major player in the graphics market.
Re: (Score:2)
``However, in this case the reference is to features of the card. See, OpenGL is really bad about staying up to date with hardware.''
How can that be, when it allows vendors to add their own extensions? Add a feature to your hardware, add an extension to OpenGL so programmers can use it. No need for delays.
``They are always playing catchup and often their "support" is just to have the vendors implement their own extensions.''
Is there a problem with that? I mean, yes, it would be nicer if features were immediately…
Re: (Score:3, Informative)
Well... it IS easier to buy a video card that says DX10 and know that a game that says DX10 is going to run on it. Trying to keep track of which extensions your card does or doesn't support while you're at the store looking at games on a shelf would be a nightmare.
X O (Score:2)
Xbox 360 graphics development - DirectX
XNA (Xbox 360 indie games) graphics development - a managed API based on DirectX
Re:People Still Use DirectX??? (Score:5, Insightful)
Only all the AAA games on Windows, but clearly you are far more important than them.
Re: (Score:3, Informative)
Who the hell, other than the poor sods still doing x86 Windows-only game/graphics development, uses that turd of an API, DirectX?
I know you've specifically excluded Carmack here, but nonetheless, I think his opinion is not exactly irrelevant:
"DX9 is really quite a good API level. Even with the D3D side of things, where I know I have a long history of people thinking I'm antagonistic against it. Microsoft has done a very, very good job of sensibly evolving it at each step—they're not worried about breaking backwards compatibility—and it's a pretty clean API. I especially like the work I'm doing on the 360, and it's probabl
Re:People Still Use DirectX??? (Score:4, Funny)
Re: (Score:3, Insightful)
Is this 1990 again? Are we back to RISC vs. CISC? Intel and AMD showed that decoding CISC into RISC micro-ops can be just as fast as pure RISC. They gain some performance advantage in instruction cache hit rate vs. pure RISC, at the expense of some hardware logic. (This only comes into play when compared to very low-power devices.)
Re: (Score:3, Insightful)
(This only comes into play when compared to very low power devices)
Which of course means "this only comes into play when looking at the most widespread devices, shipping at least an order of magnitude more units than x86."
Re: (Score:2)
Well, laptop, actually ;p
But even in desktops, there are possibly more ARM cores than x86 ones: something in the monitor, something in the optical drive, the HDD controller perhaps, or the WiFi controller.
Re: (Score:2, Insightful)
A RISC-based architecture would be much better suited for today's computers.
Is this ignoring the fact that modern x86 chips from Intel are basically RISC chips with a CISC to RISC interpreter bolted on?
Re: (Score:2)
Re: (Score:2, Informative)
Ok so...
XBOX 360 RISC
PS3 RISC
PS2 RISC
iPhone RISC
Most, if not all Mobile devices RISC
Wii RISC
Sun systems RISC
Need I go on?
If you need power and efficiency, you use RISC. Always. Try to come up with anywhere near as many examples for CISC.
Re: (Score:2, Informative)
And that's why modern x86 processors are basically RISC processors with a decoder on them for legacy x86 instructions. Your comments haven't been insightful for quite some time now.
Re: (Score:2, Informative)
So then -- and this is a genuine question -- why are RISC-based devices so much more powerful while using a lower clock speed and consuming less power?
For example, this video was recently referenced in a /. post a few days ago: http://www.youtube.com/watch?v=W4W6lVQl3QA [youtube.com]
Where an Atom processor at 1.6 GHz was just about on par with a 500 MHz ARM-based processor.
Re: (Score:2)
Remember how Macs went from being "twice as fast on the same clock speed" when using PPC to being surprisingly twice as fast when switching to Intel?
Re: (Score:2)
I remember that... those snail ads were just embarrassing to watch, and I was a PC guy who really didn't like Macs. I had to explain to my Mac friend that those ads were so utterly misleading that they really were nothing more than blatant lies.
Comparing i486 to Pentium 4 again? (Score:2)
Wrong. Don't compare Motorola G4 junk to an Intel Core Duo. The G4 has a 133 MHz FSB, for God's sake. Even if the G4 had "bigger MHz" (Sixpack), its 133 MHz FSB would still guarantee the horrible performance.
But... if we compare the first G4 CPUs to the Intel CPUs of that day, we can easily match the 2x speed difference, especially with decent AltiVec instructions. Obviously, you also need a good programmer/developer to use them effectively.
Let's talk about the G5 and the current POWER6, especially the POWER6, whose 4.0 GHz speeds are…
Re: (Score:2)
Re: (Score:2)
That is debatable. But the competition is good. RISC isn't a miracle cure, but I like where ARM has been going the last few years. Hopefully the next year will see some Cortex-A8 or Cortex-A9 chips approach the performance of x86 chips (Atom, at least).
Qualcomm's Snapdragon is based on the Cortex-A8 with a ton of custom development work. I have not really seen many high-performance Cortex-A9 chips yet, but they are supposed to be on the way.
Re: (Score:2)
I'm not sure it has anything specifically to do with Windows. We have and will have x86 on Windows, regardless of any superior advances, due to binary compatibility. I think it has more to do with closed source software than the OS that it runs on. Windows on x86 won out over other home alternatives meaning that with closed source software, it is Windows on x86 for the foreseeable future. Not like you can port the software over when you don't have access to it.
Re: (Score:2)
As nobody (including IBM) can match the huge R&D money Intel spends to maintain that CISC CPU, and Intel became essentially a monopoly after Apple gave up on PowerPC, it is a lock-in issue.
I mean, you can make the world's most advanced, fastest RISC CPU today (compare the G5 (IBM 970) to the Intels of its time), but when Joe Sixpack asks if there is Windows support, or Developer Joe Sixpack wants to use Visual Studio, you are stuck.
Trust me, I am writing this on a Quad G5 (970MP) now. Interestingly, when I learn more…
Re: (Score:2)
One of the early ideas for a name was actually "DirectX Box"
Re: (Score:2)
Most?
Besides id Tech, are there really any engines that use OpenGL?
Re: (Score:2)
Re: (Score:2)
Are you kidding?
Since when have Open Source 3D engines become popular games?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
An engine isn't a game, and very, very few games which use any of those listed engines, barring UE3, are big sellers. In fact, most of the games stink; I know this because I've tried almost all of them under Linux.
Re: (Score:2)
Re: (Score:2)
Very true, but I was thinking more about newer games.
Re: (Score:2)
OpenGL isn't a gaming API. It's an interface to a graphics card, that's it.
Re: (Score:2)
"Microsoft would like you to think it is a gaming API "
The troll is strong in you.
DirectX IS a gaming API, while OpenGL is NOT: DirectX bundles input, audio, and networking alongside graphics, while OpenGL covers graphics alone. It's not really something that is up for debate. You might be able to argue that OpenGL is a better graphics API than DirectX, but even that isn't really true. DirectX is much, much more powerful than OpenGL when doing gaming-related tasks and graphics.
Innovationz!!!! (Score:4, Funny)
DirectX 11 in a mobile device? So the device doubles as a hairdryer?
Re: (Score:3, Informative)
Embedded systems may only be using a screen resolution of 640x480 or 800x600 rather than dual monitor 2048x1536. That's one energy/time saving. Then there won't be 900+ stream processors like the high-end gaming cards, there might just be 128 or 256. There's another saving. Anti-aliasing will be disabled as well, so that saves some processing time and power as well.
You will still have texture mapping and shadowing effects using fragment shaders, just not as many triangles as the current gaming engines will…
Re: (Score:2)
Um, the Mobility 5870 has 800 stream processing units and utilizes 1.04 billion 40 nm transistors.
http://www.amd.com/us/products/notebook/graphics/ati-mobility-hd-5800/Pages/hd-5870-specs.aspx [amd.com]
Re: (Score:2)
Thanks for that link - I'm amazed that the energy demand for all of that is only 50 watts, compared to the 300 watts required in the past for other cards. You could have a cluster of those and still use a standard electric socket.
Re: (Score:2)
I'm amazed that anyone seriously considers an ancillary device that draws 50 watts as something useful in a lightweight, battery-powered device, let alone goes all the way to designing and marketing the thing.
Linux support is coming, we promise! (Score:5, Informative)
Support in the open-source drivers is being written as fast as ATI can verify and declassify docs. Also the r600/r700 3D code should be mostly reusable for these GPUs.
Re: (Score:3, Interesting)
How many years was it again that they promised to produce open-source graphics drivers for Linux? I've lost count, and I have ordered a new motherboard with a silent nVidia-based graphics card because I've just *HAD* it with ATI on Linux. My AMD-chipset motherboard also had a lot of SATA instability under Linux, and I had all kinds of problems getting the system to read any of the CPU's sensors (Phenom X2-based CPU). So I have just ordered an Intel-based CPU/chipset as well.
I've no doubt that AMD is slowly…
Re:Linux support is coming, we promise! (Score:4, Interesting)
How many years was it again that they promised to produce open-source graphics drivers for Linux?
Announced: September 7th, 2007: press release [amd.com]
Since then they've been catching up more and more; the HD5xxx/Evergreen/R800 instruction set was posted before Christmas, so the docs are almost up to date, minus a few things like UVD2. Also, AMD promised to help the open-source community, not write the whole thing themselves. It's making big strides, but there's also a lot of rework going on in Xorg to support a modern desktop.
Re: (Score:2)
Are you that guy from the Debian mailing list? You sound like him - unjustly bashing ATI/AMD, misrepresenting their statements, and exaggerating the problems ATI on Linux has.
The fact is that even the binary drivers (yuck) are much better than they used to be, and the Free drivers are moving along by leaps and bounds. AMD has done very well with their promise to deliver documentation, and the Xorg guys are improving drivers as fast as they can, given limited manpower and a rather large amount of (needed) churn in Xorg (DRI2, KMS, TTM/GEM, Gallium3D) that they need to keep up with.
Re: (Score:2)
Are you that guy from the Debian mailing list? You sound like him - unjustly bashing ATI/AMD, misrepresenting their statements, and exaggerating the problems ATI on Linux has.
No, for a normal user they are unusable. A less advanced person would not have spent that many hours on configuring a graphics card.
The fact is that even the binary drivers (yuck) are much better than they used to be, and the Free drivers are moving along by leaps and bounds. AMD has done very well with their promise to deliver documentation, and the Xorg guys are improving drivers as fast as they can, given limited manpower and a rather large amount of (needed) churn in Xorg (DRI2, KMS, TTM/GEM, Gallium3D) that they need to keep up with.
Oh, I'm not bashing anyone, trust me on this.
I'm just this guy waiting on some kind of normal display/sound drivers on my Linux computers. Currently, doing anything slightly beyond running vesa or nvidia for graphics, plus very basic sound stuff, sucks on Linux (or at least on the last 4 Ubuntu versions I tried). Don't mistake this comment for "Linux sucks". I love the way many things…
Re: (Score:2)
>No, for a normal user they are unusable. A less advanced person would not have spent that many hours on configuring a graphics card.
I don't remember doing much configuration for my r300.
Re: (Score:2)
Well, congratulations for getting a well working configuration. But don't assume your easy configuration is the norm. Things are getting better, but we're not there yet.
His ease of config is absolutely the norm. (Score:2)
All I did was pop in a Fedora 12 livecd and my R500 card started working. Absolutely no configuration. Whatsoever.
Hell, my old roommate uses Gentoo and even he doesn't have to do much of any configuration to get it running, all he does is build X as usual, with radeon support. If you still need to do manual configuration of X on a modern setup, you are failing hard.
Re: (Score:2)
since my first slackware CD's
Slackware comes on a CD now? I can finally get rid of my pallet of floppies!
Re: (Score:3, Insightful)
At first glance, from the subject line, I thought your post was a snide comment about the state of official ATI drivers on linux. I must say though, you guys are doing an excellent job at picking up ATI's slack.
Re: (Score:2)
Not fast enough. I dumped my perfectly fine Radeon 4850 in favor of a somewhat slower nVidia; the reason was that X support was hit and miss: half the 3D functions crashed X, others worked. I then dropped in my nVidia card and everything worked out of the box.
I do not care for how many years we got promises; the Linux drivers suck donkey balls, and probably will forever.
Wake me up when the stability is up to nVidia's offerings, or, shock, the Intel open-source drivers.
Re: (Score:2)
Re: (Score:3, Insightful)
Your post is roughly fourteen months out of date. In the past year, TTM and GEM have both matured and been submitted to the mainline kernel, providing memory management services to nouveau, radeon, intel, and via. GLX 1.4 support is now advertised server-side for DRI2 stacks.
In Mesa, most of GLSL is now supported by the drivers that can accelerate it, and the actual GLSL hooks are now in place for r600 and i965. Additionally, in Gallium, work is underway to provide GL 2.0+ on i915, i965, r300+, and nv30+ (all G…
ATI at it again... (Score:2, Informative)
Just upgraded my brother's laptop over the holiday. Seems ATI dropped support for his GPU in their proprietary driver, so now he has a choice. Option one: use the open-source drivers, which provide no 3D acceleration - basically, 3D is completely unusable. Option two: use an older distribution which has the required version of X, kernel support, and all dependent software. And with the second option come all the associated security issues of running an old and unsupported distro. He chose to run a current distro…
Re: (Score:2)
>Seems ATI dropped support for his GPU in their proprietary driver, so now he has a choice. Option one: use the open-source drivers, which provide no 3D acceleration.
Bullpucky. Any/all cards that are not supported by the binary drivers do have 3D support from the OSS drivers.
>For Linux there is still only one 3D option - NVIDIA. Period.
Funny, my experience with 3D on both Intel and ATI has been great.
Re: (Score:2, Interesting)
Re: (Score:2)
>[r500] radeon open-source drivers get an average of 10 fps in older games such as UT2004, or less powerful games like Touhou 8.
Huh. I admit I have no personal experience with 3D OSS on the r500, I just knew support existed. Obviously not that helpful if Imperishable Night only gets 10fps, though.
I hope r300g and r600g get usable soonish.
Re: (Score:2)
Oddly enough, I just saw that the RadeonProgram wiki page says TH08 is "platinum" with Mesa 7.5 on r500. Something doesn't seem right here...although I guess technically the definition of "platinum" they give doesn't say anything about speed, just correctness. Still, you might think that speed so bad that it is unplayable on low settings might be worth noting.
Re: (Score:2)
Not feeling like replying to the troll posts, but I'll reply to you.
I probably reported the TH08 status; I'm kind of a danmaku fan. It was totally playable on an X1950 on an all-classic setup, but there have been a few regressions since, and I bet the performance has dropped a little bit. We had to sacrifice a bit of performance to add stuff like FBOs and DRI2. It should start getting better soon, though; we've started paying attention to speed and such, so the next couple of months should see big speedups.
Re: (Score:2)
All of the 3d games he had on his laptop are now completely unplayable; measured in fractions of frames per second.
Bullpucky. Any/all cards that are not supported by the binary drivers do have 3D support from the OSS drivers.
That's simply not even close to being true. Does my original quote sound like the open-source drivers are providing 3D acceleration? Proprietary driver: 20-70 fps, depending on the game. Open-source driver, unchanged settings: 0.008 fps. I rounded down the time in the provided fps for the open-source driver, which makes that number even larger than it actually is. In one of the games it took over two minutes to render one frame. No joke, either. Does the latter of the two numbers hint to you that the driver is not…
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Re:Driver Quality? (Score:5, Insightful)
1995 called and wants their "ATI drivers are crap" comment back.
Obviously you have never tried running Linux on a system with an ATI graphics card.
Re:Driver Quality? (Score:5, Interesting)
I have three in my system. :3
Re: (Score:2)
Re: (Score:2, Informative)
Obviously you have never tried running Linux on a system with an ATI graphics card.
Obviously you have never tried running Linux on a system with an nVidia graphics card.
It's seriously a PITA to get new drivers working on a new kernel with an old card. Anything pre-GeForce 8 may have annoying issues. Not a problem for desktop Linux with a new video card - but if you were setting up a Myth box on that old Athlon XP with a 6600GT, you may be in for a headache.
Avoid distros like Ubuntu with automatic kernel updates. One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.
Re: (Score:2)
Seriously? Why did I get modded troll? Every time I mention Linux I get modded troll.
It's well known that you need to keep your drivers up to date to work with new kernel versions. And when those new drivers don't like your videocard anymore, you get screwed over.
I wouldn't have mentioned it if it wasn't common, but it is. It happens all the time.
To be frank, it's about as common as nVidia drivers messing up during an update on Windows - but that at least kicks you to an 800x600x8 desktop, or only BSODs when…
Re: (Score:2)
Obviously you have never tried running Linux on a system with an ATI graphics card.
One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.
P.S. I've been jaded by automatic updates.
Wrong, dkms takes care of automatically (re)compiling the nvidia module if needed. This happens on boot, before X starts. All good.
Re: (Score:2, Informative)
I have a system with a Radeon 9800 Pro card in it, no problems so far with Ubuntu.
A BenQ Joybook with an X300, and a Toshiba Satellite with an HD3470. I have been running Ubuntu from 7.10 to 9.10 with ATI drivers on these machines. Issues such as flickering video and incompatibility between 3D acceleration and Compiz do exist, you know. I could only run Google Earth on top of Compiz fine quite recently (9.04 & 9.10), if I'm not mistaken. Xinerama support, which was excellent in 8.xx, became unusable in 9.04. I couldn't hook the notebook to a projector during the 8.xx series if Compiz was running…
Re: (Score:3, Insightful)
Re: (Score:2)
It was only 3 years ago that I gave up on ATI and switched to nVidia, because ATI's drivers could not handle bad inputs and would crash the entire system. So I had to write my own abstraction layer to ensure that no bad point coordinates and so on could be sent to the driver. I also filed kernel crash bugs with ATI that took forever to get fixed. After I switched to nVidia, I have yet to see a single kernel failure due to programming mistakes. Their drivers are just rock solid. So much better to develop on t…
Re: (Score:3, Informative)
Well, perhaps it's BECAUSE THEY STILL ARE!
I have written many lengthy comments about it - back when they still used APIs so old that, after being deprecated for a long time, they were taken completely out of the kernel, rendering the drivers useless.
The same thing has now happened with Xorg 1.7.
And how long ago was it that neither compositing nor xrandr worked? One or two months?
Hell, video still does not work. (Oh, it renders it. But unless you want to see huge black and white blots of over- and underexposure…
Re: (Score:2)
I'm sure I'll be modded into the toilet for this, in these modern times, but:
If you don't like it, code something better. Can't code? Document it better. Can't document? Organize it better. Can't organize? Pay someone else to do it (or any of the other things, really).
There are lots of ways to help push the open-source ATI drivers along, even for a skill-less schmuck like myself. (I just don't care enough to even bother complaining about it anymore.)
Re: (Score:2)
On the whole, I would at least partially agree with you, but this is a project to make ATI cards worthwhile under Linux. It is therefore reasonable to want AMD to properly support it if they aren't going to provide drivers themselves which are comparable to the Windows ones.
They do (Score:2)
It is therefore reasonable to want AMD to properly support it if they aren't going to provide drivers themselves
ATI/AMD *do* support the Radeonhd project. They provide documentation and test code - although at a very slow pace.
On the other hand, nVidia is completely ignoring the Nouveau project. (At least they don't sue or send DMCA takedowns, either.)
Re: (Score:2)
Actually, if you rely on your laptop manufacturer to provide you updated drivers, they DO suck. The decision to offer the drivers for mobile cards from their own website is an amazing one they should've made years ago.
Re: (Score:2)
The $300 card I bought around 2002 had enough driver issues that I've never bought another ATI card since. So you can add at least 7 years onto that.
Re: (Score:2, Insightful)
Anyway, until then I'll be sticking with nVidia cards.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Are these cards fast enough to run the games in DX11 mode?
Re: (Score:2)
At least the higher-end models will. They have 800, 400, and 80 stream processors, respectively.
Re: (Score:2)
Some of the reviews of "Dirt 2" had suggested that ATI 5xxx cards were up to 50% faster in DX9 mode.
Re: (Score:2)
I wouldn't know, but the 800-stream-processor mobile card looks like it has very similar performance to the desktop 5xxx cards. Even at 75% speed, it should still be playable. Besides, DX11 is brand spanking new; I would expect it to take some time before the drivers mature.
Re: (Score:2)
So, in your opinion, all technical progress should stop at once?
Re: (Score:2)
heh dx10 (Score:2)
Funny thing is, MS never learned their lesson. DirectX 10 was Vista-exclusive (!!!) technology, and all the gamers were running XP! So, except for the usual MS ass-kisser companies, nobody was stupid enough to release a DirectX 10 game.
Guess what? DirectX 11 is a Windows 7-exclusive technology!
I pity the idiots coding DirectX-only in this age, especially after the iPhone and the Intel OS X revolution. How many years must pass before they understand?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I have the same experience with my 200M. On Windows, I can play older games with no problems. On Linux, using the open drivers, I don't get good enough acceleration to use Compiz with no extensions.
Re: (Score:2)
You forgot to mention the Apple Newton. There are people STILL using their Newtons today. It had handwriting recognition years ahead of anything comparable, and communication capabilities. When Steve Jobs went to NeXT, Mr. Sculley (sic) (another corporate droid, in the theme of your post) shut it down.
Over the years, Newton enthusiasts have asked the company many times to release the code so they could port their beloved operating system to newer hardware as their Newtons died of old age. Apple always refused…