Intel Researchers Consider Ray-Tracing for Mobile Devices
An anonymous reader points out an Intel blog discussing the feasibility of Ray-Tracing on mobile hardware. The lower screen resolutions of these devices reduce the required processing power enough that they could realistically run Ray-Traced games. We've discussed the basics of Ray-Tracing in the past. Quoting:
"Moore's Law works in favor of Ray-Tracing, because it assures us that computers will get faster - much faster - while monitor resolutions will grow at a much slower pace. As computational capabilities outgrow computational requirements, the quality of rendering Ray-Tracing in real time will improve, and developers will have an opportunity to do more than ever before. We believe that with Ray-Tracing, developers will have an opportunity to deliver more content in less time, because when you render things in a physically correct environment, you can achieve high levels of quality very quickly, and with an engine that is scalable from the Ultra-Mobile to the Ultra-Powerful, Ray-Tracing may become a very popular technology in the upcoming years."
Inverse Moore's Law (Score:5, Insightful)
Inverse Moore's Law states that the more time that developers spend on making games look 'pretty', the less time they spend on playability.
Re:Inverse Moore's Law (Score:5, Funny)
Re: (Score:2, Informative)
Re: (Score:1)
Re: (Score:3, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2)
Strap it on your noggin and immerse... or pick it up and dial.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re:Inverse Moore's Law (Score:5, Insightful)
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re:Inverse Moore's Law (Score:5, Informative)
Current rasterization approaches use a lot of approximations, it's true, but they can get away with that because in interactive graphics, most things don't need to look perfect. It's true that there's been a lot of cool work done lately with interactive ray tracing, but for anything other than very simple renderings (mostly-static scenes with no global illumination and hard shadows), ray tracers *also* rely on a bunch of approximations. They have to: getting a "perfect", physically correct result is just not a process that scales well. (Check out The Rendering Equation on Wikipedia or somewhere else if you're interested; there's an integral over the hemisphere in there that has to be evaluated, which can recursively turn into a multi-dimensional integral over many hemispheres. Without cheating, the evaluation of that thing is going to kick Moore's law's ass for a long, long time.)
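For reference, the equation being alluded to is Kajiya's rendering equation (the standard form, quoted here for context rather than from TFA):

```latex
L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i
```

The catch is that the incoming radiance L_i at x is itself the outgoing radiance L_o of some other surface point, so evaluating the integral honestly means recursing into another hemisphere integral at every bounce - that's the "multi-dimensional integral over many hemispheres" above.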
By the way, the claim that with a "physically correct environment, you can achieve high levels of quality very quickly" doesn't really make much sense. What's a "physically correct environment" and what is it about rasterization that can't render one? How are we defining "high levels of quality" here? And "very quickly" is just not something that applies much to ray tracers at the moment, especially in the company of "physically correct".
Re: (Score:1)
Re: (Score:2, Informative)
Re:Inverse Moore's Law (Score:4, Informative)
And about scalability, you're right, of course; ray tracing does scale better with scene complexity than rasterization does, and as computing power increases it will make more and more sense to use ray tracing. However, the ray tracing vs. rasterization argument has been going on for decades now, and while ray tracing researchers always seem convinced that ray tracing is going to suddenly explode and pwn the world, it hasn't happened yet and probably won't for the foreseeable future. Part of it is just market entrenchment: there are ray tracing hardware accelerators, sure, but who has them? And although I've never worked with one, I'd imagine they'd have to be a bit limited, just because ray tracing is a much more global algorithm than rasterization... I can't see how it'd be easy to cram it into a stream processor with anywhere near as much efficiency as you could with a rasterizer. On the other hand, billions are invested in GPU design every year, and even the crappiest computers ship with one nowadays. With GPUs getting more and more powerful and flexible by the year, and ray tracing basically having to rely on CPU power alone, the balance isn't going to radically shift anytime soon.
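To put rough symbols on the scalability point (a back-of-the-envelope comparison, assuming the ray tracer uses a BVH or kd-tree, and ignoring the culling/LOD tricks that make real rasterizers sub-linear in practice):

```latex
\text{rasterization: } O(n) \text{ per frame, } n = \text{triangles} \qquad \text{ray tracing: } O(p \log n) \text{ per frame, } p = \text{pixels}
```

For a fixed resolution p, the ray tracer's cost grows logarithmically in scene size while the rasterizer's grows linearly - which is the whole scalability argument, even though the ray tracer's constant factors are far worse today.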
For the record, although I do research with both, I prefer ray tracing. It's conceptually simple, it's elegant, and you don't have to do a ton of rendering passes to get simple effects like refraction (which are a real PITA for rasterization). But when these articles come around (as they periodically do on Slashdot) claiming that rasterization is dead and ray tracing is the future of everything, I have to laugh. That may happen but not for a good long while.
Re: (Score:1)
For the moment rasterization still has a small breathing space as lon
photon mapping (Score:3, Interesting)
Photon mapping is a pretty good way of getting a consistent (though biased) approximation
Re: (Score:1)
Ray tracers don't rely on approximations just for "anything other than very simple renderings"; they rely on approximations for all rendering. For one thing, the model completely ignores the wave nature of the light interacting with the objects being modelled. There was an attempt (published at SIGGRAPH in the '70s, IIRC) to render by directly modelling the wave interactions of light with the modelled objects, but even the supercomputers of the time were inadequate... could be time to revisit that.
Funny, though, the argument about processing power in
Re: (Score:1)
My guess would be lower resolutions (and the article says this too). Also, mobile devices tend not to have powerful GPUs dedicated to doing 3D graphics the traditional polygon way - it's much easier to compete with software rendering than with hardware.
I'm confused by the "Moore's Law works in favor of Ray-Tracing" bit though - Moore's Law works in favour of any computation. And traditional polygon rasterisation
Gamers and their "games"... (Score:2)
However, since Doom, I've never played a game just to pass the time, score on my friends, or get good at Tetris. I actually love the experience of exploring, and going somewhere else that I can't go in real life. In these terms, the bigger the world a game company can create, the more visually stunning and accurate the photorealism is, and
But Will It Run On Intel 915 Chipsets? (Score:1)
Just ask Microsoft and the Vista team in particular.
Re: (Score:2)
The computer graphics inverse of Moore's Law is known as Blinn's Law, and it essentially says that audience expectation rises at the same rate as Moore's Law.
Originally posed for the animation/vfx industries, the actual statement of Blinn's Law is that the amount of time it takes to compute one frame of film is constant over time. The corollary is that it doesn't matt
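In symbols (my paraphrase, not Blinn's original wording): if C(t) is the compute available at time t and W(t) is the work artists choose to spend per frame, the observation is that W scales to match C, so

```latex
T_{\text{frame}}(t) = \frac{W(t)}{C(t)} \approx \text{constant}
```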
Re: (Score:2)
Moore's Law works in favor of Ray-Tracing, because it assures us that computers will get faster - much faster - while monitor resolutions will grow at a much slower pace.
As you know, Moore is one of the founders of Intel, and their research works in Intel's favor too: instead of pushing the boundaries of OpenGL ES or the J2ME extensions for 3D (which are also based on OpenGL), it offloads work that a cell phone's mini GPU could do as a trivial task onto the cell phone CPU, a market where Intel has a significant share.
I loved USB 1/2 technology until I purchased a Firewire hard drive and wandered around the web for an explanation of why 400Mbit Firewire is almost 2x the real-world performan
Sweet!! (Score:2, Funny)
Can't wait for my contacts list at sunset! (Score:3, Interesting)
A popular technology? Like a working filesystem? They're real popular I hear. Or an o
Brilliant! (Score:5, Funny)
"Holy Crap! Mobile gaming devices have tiny screens, imagine how easy it'd be to use advanced raytracing graphics!"
"Brilliant!"
Re: (Score:2)
"Hmmm....."
(long pause)
"What about rendering really small scenes on a big stonking server and then using some sort of 'Network' to make the images appear?"
"That sounds like some kind of magic!"
Fantastic research [acm.org].
"computational requirements" (Score:3, Insightful)
This attitude is why even though our computers are 1000x faster than the ones we had 20 years ago, they actually perform worse overall.
Re: (Score:3, Insightful)
This attitude is why even though our computers are 1000x faster than the ones we had 20 years ago, they actually perform worse overall.
I would say yes and no. It's one thing to have the computer do something simply because it can; I agree that is very wasteful. Raytracing is not needed on a 300x200 screen, especially while playing a game with things moving.
On the other hand, 20 years ago, like today, we compromised and dispensed with things or found ways to "fake it" in cases where the computer could not deliver. It's really not critical that shadows are rendered perfectly on my mobile phone while I am playing Doom57 Mobile Edition. An arch
Re: (Score:2, Insightful)
20 years ago, no one was connected to a 3mbps line, listening to music, with a mail client and an IM client constantly pinging back, watching a video on YouTube in one of twenty Firefox tabs, with vim/emacs/eclipse open, Azureus plugging away at some torrents as fast as it could, on two 1280x1024 screens in real colour, all simultaneously, on a single core bought years ago. I still don't notice significant slowdowns.
Re: (Score:2)
if it has to be explained why the 'older days' of computing were better than today, you are too young and would never understand.
LOADING....... (Score:2)
Re: (Score:2)
Re: (Score:2)
I'm looking at my desktop right now, and I'm running no less than 6 different applications including a web browse
prog10 (Score:5, Funny)
Re:prog10 (Score:5, Informative)
Summary is misleading (Score:3, Informative)
Re: (Score:1)
1993 to 2008 = 15 years = 180 months = 10 Moore's Law cycles (at 18 months per doubling)
Monitor resolution over that period went from 640x480 to 1920x1200 (pulling from my butt, we could argue EGA vs VGA and what percentage of users actually have 1920x1200, but it's a place to start)
Pixels: 640x480 = 307,200 to 1920x1200 = 2,304,000, which is a factor of 7.5.
So, in summary, over the past 15 years Moore's Law has eclipsed monitor growth: ten doublings (roughly a factor of 1000) vs. a factor of 7.5.
I
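A quick check of the parent's arithmetic (assuming one doubling every 18 months):

```latex
2^{180/18} = 2^{10} = 1024 \qquad \text{vs.} \qquad \frac{1920 \times 1200}{640 \times 480} = \frac{2{,}304{,}000}{307{,}200} = 7.5
```

So compute grew by roughly three orders of magnitude over the period, while pixel counts grew by less than one.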
Re: (Score:1)
So, I think it's safe to say the summary ISN'T misleading.
Re: (Score:2)
Re: (Score:3, Informative)
Doubling the number of transistors on an LCD does not double the resolution (as you pointed out); it only multiplies each dimension by the square root of 2. Doubling the number of transistors on a CRT does nothing (well, maybe it gives you a more impressive OSD). But even limiting it to LCDs, it does not hold up. Display resolution
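A concrete example of the parent's point (numbers mine): doubling the pixel count of a 1280x1024 panel gives

```latex
2 \times (1280 \times 1024) = 2{,}621{,}440 \text{ px} \approx 1810 \times 1448, \qquad \frac{1810}{1280} \approx \frac{1448}{1024} \approx \sqrt{2}
```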
Re: (Score:2, Interesting)
Re: (Score:2, Informative)
http://en.wikipedia.org/wiki/Eye#Acuity [wikipedia.org]
http://www.dansdata.com/gz029.htm [dansdata.com]
Piggy-backing on Dan's hand waving, 300 dpi at 1 foot is a decent rule of thumb, and waving my own hands, 1 foot is a reasonable minimum distance for a handheld device (I don't imagine most people holding something any c
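The 300 dpi rule of thumb falls out of the usual ~1 arcminute figure for human visual acuity (the same hand-waving, just with the arithmetic shown):

```latex
\text{pixel pitch at } 12\,\text{in} = 12 \times \tan\!\left(\tfrac{1}{60}^{\circ}\right) \approx 12 \times 0.000291 \approx 0.0035\,\text{in} \;\Rightarrow\; \tfrac{1}{0.0035} \approx 286\,\text{dpi}
```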
Good for Intel, needs more work (Score:5, Interesting)
Lockout chip business model (Score:2)
Re: (Score:1)
The one that first comes to mind is: what happens when someone releases a console that plays old games AND is profitable by itself?
Nintendo consoles typically play only from one generation back. GBA plays GBC games. DS plays GBA games but not GBC games. GameCube with Game Boy Player plays GBA games. Wii plays GameCube games but not GBA games.
The second thing that comes to mind is that the current generation of consoles is already networked and selling downloadable versions of older games for a few bucks each, and this model is profitable right now, so why wouldn't the same model (minus the cost of having to write emulators!) remain profitable for the next generation?
For one thing, the low-dollar games compete with the full retail games. That's part of why Nintendo is taking so long to release its NES back catalog on the Wii Shop Channel, even though other games with the same mapper are available. For another, how long would it take to download one of these game
Ray casting and Java (Score:3, Interesting)
And this is just for Ray Casting, which is much simpler than Ray Tracing.
During my development with Java I discovered that setting a pixel color to 0xFF000000 caused a slowdown. That's right, a black pixel would slow the framerate down. I had to set all pure black
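The comment is cut off, but the usual workaround for a quirk like that is to nudge pure black off 0xFF000000 before writing the pixel. A minimal sketch of that idea (my reconstruction, not the parent's actual code; the slowdown itself is the parent's report, which I can't verify):

```java
import java.awt.image.BufferedImage;

public class PixelFix {
    // Hypothetical workaround: keep 0xFF000000 out of the framebuffer by
    // substituting a visually identical near-black.
    static int sanitize(int argb) {
        return (argb == 0xFF000000) ? 0xFF010101 : argb;
    }

    public static void main(String[] args) {
        BufferedImage frame = new BufferedImage(320, 240, BufferedImage.TYPE_INT_ARGB);
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                int color = 0xFF000000; // whatever the ray caster produced
                frame.setRGB(x, y, sanitize(color));
            }
        }
    }
}
```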
Battery Life vs Graphics (Score:3, Insightful)
Ray-tracing may be possible on my 500MHz smartphone's processor - but damn, I don't want to have to be plugged in to play them.
Imagine a Beowulf cluster of those.... (Score:3, Funny)
I can imagine that, but... (Score:2)
Existing Real-Time Ray-Tracers? (Score:2)
Re: (Score:1)
Re: (Score:2)
It looks like Povray [povray.org] is experimenting with one.
Re: (Score:2)
Real time raytracing with POV-Ray (Score:4, Informative)
Not sure I get their argument (Score:2)
If this is the case, why not just use the increasing processing power to produce better quality graphics using the current optimized techniques?
Am I missing something? Intel's argument seems a bit like saying we should get rid of QuickSort and go back to Bubble so
Re: (Score:2)
Re: (Score:3, Insightful)
rendering equation (Score:2)
The problem is that the scanline rendering techniques we use for real-time graphics aren't an accurate solution to the rendering equation [wikipedia.org]. If you add more processing power you can render more triangles at higher framerates, but there isn't a physically-correct way to deal with the interreflections between objects (i.e. global illumination), and so the output of scanline rendering will never look quite real (it might look pretty close, but only with a lot of manual tweaking).
Plain Whitted Ray tracin
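For readers who haven't run into the term: Whitted ray tracing is the classic recursive algorithm from 1980 - one ray per pixel, local shading at the hit point, then recursion for the mirror (and, in full Whitted, refraction) term. A stripped-down sketch of just the recursion (illustrative only; the Scene/Hit/Vec3 types are invented for the example, the intersection and shadow-ray code is elided, and it needs Java 16+ for records):

```java
public class Whitted {
    static final int MAX_DEPTH = 5;

    // Minimal made-up types for illustration.
    record Vec3(double x, double y, double z) {}
    record Ray(Vec3 origin, Vec3 dir) {}
    record Hit(Vec3 point, Vec3 normal, double reflectivity) {}
    interface Scene {
        Hit intersect(Ray r);      // nearest hit, or null
        Vec3 directLight(Hit h);   // shadow rays to each light
    }

    // The Whitted recursion: direct lighting plus a recursive bounce for
    // the mirror term. A refraction term would be a second recursive call.
    static Vec3 trace(Scene scene, Ray ray, int depth) {
        Hit hit = scene.intersect(ray);
        if (hit == null || depth >= MAX_DEPTH) return new Vec3(0, 0, 0); // background

        Vec3 color = scene.directLight(hit);
        if (hit.reflectivity() > 0) {
            Ray mirror = new Ray(hit.point(), reflect(ray.dir(), hit.normal()));
            color = add(color, scale(trace(scene, mirror, depth + 1), hit.reflectivity()));
        }
        return color;
    }

    static Vec3 reflect(Vec3 d, Vec3 n) {
        double k = 2 * dot(d, n);
        return new Vec3(d.x() - k * n.x(), d.y() - k * n.y(), d.z() - k * n.z());
    }
    static double dot(Vec3 a, Vec3 b) { return a.x() * b.x() + a.y() * b.y() + a.z() * b.z(); }
    static Vec3 add(Vec3 a, Vec3 b) { return new Vec3(a.x() + b.x(), a.y() + b.y(), a.z() + b.z()); }
    static Vec3 scale(Vec3 v, double s) { return new Vec3(v.x() * s, v.y() * s, v.z() * s); }
}
```

Note what's missing: no interreflection between diffuse surfaces, only hard shadows and perfect mirrors - exactly the "very simple renderings" described further up the thread.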
Raytracing is not the holy grail of graphics (Score:5, Informative)
Pixar is not the holy grail of graphics (Score:2, Informative)
Quantum wave tracing is the holy grail (Score:1)
Quantum wave tracing baby, that's where it's at.
What does Pixar have to do with realtime graphics? (Score:3, Interesting)
Pixar has the luxury of controlling every take, and going back after the fact to re-render shots with different settings, or even to use different algorithms (including ray-tracing) to fix any rendering flaws caused by whatever approximations they're using at that point. Realtime graphics do not have that luxury... if there's a problem in a scene, you can't go back and fix it.
So whether raytracing is more or less appropri
Re:Raytracing is not the holy grail of graphics (Score:4, Informative)
Actually Pixar has switched to Ray Tracing. Cars was ray traced [pixar.com] [PDF]. Skimming through the whitepapers on the Pixar site [pixar.com], it's clear ray tracing was also used extensively in Ratatouille.
Even so, what Pixar is doing in feature films isn't particularly relevant to real-time ray tracing on mobile devices.
it's about memory, not performance or realism (Score:4, Informative)
It's worth pointing out (and it's mentioned in the paper you cite) that the main reason Pixar hasn't been doing much ray tracing until now is not performance or realism, but memory requirements. They need to render scenes that are too complex to fit in a single computer's memory. Scanline rendering is a memory-parallel algorithm; ray tracing is not. So they're forced to split the scene up into manageable chunks and render them separately with scanline algorithms.
This isn't an issue for games, which are going to be run on a single machine (perhaps with multiple cores, but they share memory).
Re: (Score:3, Interesting)
Ray tracing alone is not a silver bullet, but if it produces better results with less human effort, it's a net win.
I found this on Pixar's RenderMan page (https://renderman.pixar.com/products/tools/renderman.html):
"Ray Tracing and Global Illumination
The ray tracing and
Re: (Score:2, Interesting)
Definitely.
Pixar used some raytracing for Cars and later described it as a huge mistake. Certain shots took over 200 hours per frame. In terms of performance vs. quality, even in movies, they prefer to go scanline. You won't see games going to raytracing any time soon.
In Transformers, they used cube-maps because raytracing was too slow. Is anyone here seriously going to make the case that Transformers looked bad because the reflections weren't perfect?
Subsurface scattering (Score:2)
You're starting to get close when that 3D human model describes the different layers of the skin and the amount of blood near the surface of the skin.
Intel is nuts. Raytracing is not easier. (Score:2)
Is Raytracing really needed on a tiny mobile device at, say, 300x400?
Easier for whom? (Score:2)
Re: (Score:2)
Won't it always be true that raytracing is slower, uses more RAM, etc.? It's more physically correct, but the alternatives will always be faster. Which is better and more performance-oriented: baking ambient occlusion, or rendering it in real time with raytracing?
Several factors come into play: samples vs. prebaked map resolution. Lots of samples in ambient occlusion can have a dramatic ef
Re: (Score:2)
That's not a given. A few years back, a university in Germany demonstrated a dedicated raytracing engine that produced video-game-quality realtime raytraced scenes. The thing is that this processor was only running at 66 MHz and only had 352 Mb/s of memory bandwidth. Today's GPUs run 8 times as fast, have 100 times as many transistors (which translates to more parallel ray computations), and have many times the memory bandwidth.
By conce
Scientific visualization (Score:2)
I came to browse Slashdot while waiting for some ray tracing of my own. I do atomistic modeling of nanomechanics and I'm rendering movies of how atoms wiggle and move during deformation. Here is a test shot of a 4 nm tall aluminum cylinder rendered at 150 femtoseconds per second of animation:
Aluminum nanocolumn vibration (Quicktime, 14 MB) [umich.edu]
It's amazing how nice ray tracing can look compared to other visualization methods. It took three hours to generate this 1000 frame movie. But as processors add
Re: (Score:1)
Here is a test shot of a 4 nm tall aluminum cylinder rendered at 150 femtoseconds per second of animation: Aluminum nanocolumn vibration (Quicktime, 14 MB)
Ahhhhh spheres, every ray tracer's favorite primitive! :D
Where's the desktop version? (Score:3, Insightful)
If they want a phone to do 256 x 192 raytracing in real time, then a desktop with 1000x the compute power should easily be able to do 720x480 (full res television) in real time. But, oddly enough, there are no such titles out there....
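The pixel arithmetic backs this up (my numbers):

```latex
\frac{720 \times 480}{256 \times 192} = \frac{345{,}600}{49{,}152} \approx 7
```

so a desktop would need only about 7x the phone's throughput for full-res TV, nowhere near 1000x - which makes the absence of desktop titles the more telling observation.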
Re: (Score:2)
They have a version of Quake 4 that can hit 100FPS at 1080p resolution (1920x1080) with 8 cores. That means a top-of-the-line dual-core machine should be able to do 720p with no real problems.
The games will come once someone makes an engine for them, to be honest. It's not QUITE possible right now, but it would be demo-able at the resolution you stated.
Huh? (Score:2)
Is this 1997? (Score:2)
I m
Screw phones (Score:1)
I still don't get the whole cell phone craze. Get a Game Boy for cryin' out loud!
Games get the best stuff? (Score:2, Informative)
I'd prefer companies focus on decent vector graphics for applications before trying to move directly to ray tracing for games.
Really, nothing pushes hardware, er... harder, than games. Application GUI implementation is still in the stone age, even on mobile devices.
Idiots (Score:1)
Ray tracing in games (Score:1)
Infinite battery life instead (Score:1)
What do we really want from our mobile devices? I want "infinite battery life" and no recharging.
Moore's Law (which is really an increase in the number of transistors per given area) could give us much much better battery life for the same performance, IF we don't go the way of the desktop and squander it on bloated software and eye-candy.
There are already displays which take almost no power (less than 1mW): http://www.qualcomm [qualcomm.com]
Re: (Score:1)
Low-res future (Score:1)
Give me res > (raytracing >) size any day
Re: (Score:1)
Re:Ray Tracing is *not* DOA (Score:2)
SRAM [physorg.com] can be made smaller too...
Re: (Score:2)
None of the "enhancements" you're talking about gives ray tracing a leg up on conventional rendering.
I didn't say that ray tracing would ever be faster than dirty hacks that do not accurately model the real world.
Speed of conventional rendering, in good designs, is limited by how fast you can feed the chip geometry. YOU MUST ACCESS EVERY SINGLE PIECE OF VISIBLE GEOMETRY EVERY FRAME. There is no way to avoid it with any rendering strategy, it's just not possible.
I was saying that accessing the geometry is not a bottleneck. All that is needed is more CPU and more broadcast-updated RAM, which (see below)
Moore's law is working against ray tracing, not FOR it.