Graphics Software

Intel Researchers Consider Ray-Tracing for Mobile Devices

An anonymous reader points out an Intel blog discussing the feasibility of Ray-Tracing on mobile hardware. The lower resolutions of these devices reduce the required processing power enough that they could realistically run Ray-Traced games. We've discussed the basics of Ray-Tracing in the past. Quoting: "Moore's Law works in favor of Ray-Tracing, because it assures us that computers will get faster - much faster - while monitor resolutions will grow at a much slower pace. As computational capabilities outgrow computational requirements, the quality of rendering Ray-Tracing in real time will improve, and developers will have an opportunity to do more than ever before. We believe that with Ray-Tracing, developers will have an opportunity to deliver more content in less time, because when you render things in a physically correct environment, you can achieve high levels of quality very quickly, and with an engine that is scalable from the Ultra-Mobile to the Ultra-Powerful, Ray-Tracing may become a very popular technology in the upcoming years."
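As a rough sketch of why resolution dominates the cost (an illustrative toy in Java, not Intel's renderer): a ray tracer fires at least one ray per pixel, so a 320x240 handheld frame needs roughly a thirtieth of the primary rays of a 1920x1200 desktop frame.

    // Minimal sketch: total work is roughly width * height * raysPerPixel.
    // The hard-coded sphere and pinhole camera are illustrative assumptions.
    public class TinyTracer {
        public static void main(String[] args) {
            int width = 320, height = 240;     // handheld-class resolution
            double[] center = {0, 0, -3};      // one sphere, 3 units in front of the camera
            double radius = 1.0;
            int[] pixels = new int[width * height];

            for (int y = 0; y < height; y++) {
                for (int x = 0; x < width; x++) {
                    // Primary ray through pixel (x, y) from an eye at the origin.
                    double aspect = width / (double) height;
                    double px = (2.0 * (x + 0.5) / width - 1.0) * aspect;
                    double py = 1.0 - 2.0 * (y + 0.5) / height;
                    double[] dir = normalize(new double[] {px, py, -1});

                    double t = intersectSphere(center, radius, dir);
                    // White where the ray hits, dark grey where it misses.
                    pixels[y * width + x] = (t > 0) ? 0xFFFFFFFF : 0xFF202020;
                }
            }
            System.out.println("Traced " + (width * height) + " primary rays.");
        }

        // Nearest positive hit distance for a ray from the origin, or -1 on a miss.
        static double intersectSphere(double[] c, double r, double[] d) {
            double b = -2.0 * (d[0] * c[0] + d[1] * c[1] + d[2] * c[2]);
            double cc = c[0] * c[0] + c[1] * c[1] + c[2] * c[2] - r * r;
            double disc = b * b - 4.0 * cc;
            if (disc < 0) return -1;
            double t = (-b - Math.sqrt(disc)) / 2.0;
            return (t > 0) ? t : -1;
        }

        static double[] normalize(double[] v) {
            double len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
            return new double[] {v[0] / len, v[1] / len, v[2] / len};
        }
    }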
This discussion has been archived. No new comments can be posted.

  • by should_be_linear ( 779431 ) on Sunday March 02, 2008 @10:51AM (#22615318)
    As Intel couldn't compete with ATI/nVidia on 3D rendering performance, they simply redefined the rules of the game. Now they seem to be ahead of everyone else in real-time ray tracing, at least based on publicly presented papers. Next, they need to integrate this into some bigger picture of a "new gaming platform". If they manage to integrate this graphics work with the Java JVM in a coherent way, developers could more easily utilize multiple cores in games and, as a bonus, write games once and run them on all platforms and future consoles. That would be a big step toward letting developers focus on gameplay instead of DirectX/OpenGL/PS3/... API generations, extension nuances, tricks for simulating shadows, optimizing polygon counts in big scenes, and so on; ray tracing makes all of this simple without requiring extra effort on the developer's side. Yes, I know Java is a few percent slower than C++, but in Java it is so much easier to utilize multiple cores (especially when it comes to debugging) that I am sure performance will be gained, not lost, on modern CPUs. (A quick sketch of the scanline-parallel idea is below.)
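    A minimal sketch of that multi-core point (purely illustrative; tracePixel() here is just a stand-in for real per-pixel shading): scanlines in a ray tracer are independent, so handing each row to a thread pool spreads the frame across however many cores the CPU has.

        // Illustrative only: parallelize a ray-traced frame by scanline.
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.TimeUnit;

        public class ParallelFrame {
            static int tracePixel(int x, int y) {
                return (x ^ y) & 0xFF;   // placeholder "work" so the example runs
            }

            public static void main(String[] args) throws InterruptedException {
                final int width = 320, height = 240;
                final int[] frame = new int[width * height];

                int cores = Runtime.getRuntime().availableProcessors();
                ExecutorService pool = Executors.newFixedThreadPool(cores);
                for (int row = 0; row < height; row++) {
                    final int y = row;
                    pool.execute(new Runnable() {
                        public void run() {
                            // Each scanline writes a disjoint slice of the frame,
                            // so no locking is needed.
                            for (int x = 0; x < width; x++) {
                                frame[y * width + x] = tracePixel(x, y);
                            }
                        }
                    });
                }
                pool.shutdown();
                pool.awaitTermination(1, TimeUnit.MINUTES);
                System.out.println("Rendered " + height + " scanlines on " + cores + " cores.");
            }
        }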
  • by Anonymous Coward on Sunday March 02, 2008 @12:19PM (#22615756)
    Display resolutions have been getting higher, but the eye is not getting better, so there is a limit to the useful resolution of any display, and we are getting close. For a 24" widescreen at normal viewing distance, you're not going to ever want a resolution much higher than 1920x1200. Instead, you'd like the display to be bigger to take up a larger part of your field of view. But there's a problem with this; in fact your eyes can only take in a small part of the display at once. The eye has high resolution at the center of your field of vision but it quickly drops off, and your peripheral vision is very low resolution. If you render a high resolution image for your entire field of view, you are basically wasting almost all of that effort; only the part your eyes are focused on matters. What we really need is eye tracking to figure out which part of the image to render at high resolution and the rest can be rendered in low res. I think ultra-high-res monitors of the future should have built-in cameras running face recognition and eye tracking software. Incidentally, this would also enable a really cool user interface where you could control your computer by just looking and blinking.
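    As a back-of-the-envelope sketch of that eye-tracking idea (entirely hypothetical; the gaze position and the distance thresholds are made up), spending many samples per pixel only near the gaze point cuts the ray budget to a small fraction of a uniformly sampled frame:

        // Illustrative foveated sample budget: dense near the gaze, sparse elsewhere.
        public class FoveatedBudget {
            // How many rays to spend on pixel (x, y) for a given gaze position.
            static int samplesFor(int x, int y, int gazeX, int gazeY) {
                double dist = Math.hypot(x - gazeX, y - gazeY);
                if (dist < 100) return 16;   // fovea: full quality
                if (dist < 300) return 4;    // near periphery: reduced
                return 1;                    // far periphery: minimal
            }

            public static void main(String[] args) {
                int width = 1920, height = 1200;
                int gazeX = 960, gazeY = 600;   // pretend the tracker reports screen centre
                long totalRays = 0;
                for (int y = 0; y < height; y++)
                    for (int x = 0; x < width; x++)
                        totalRays += samplesFor(x, y, gazeX, gazeY);

                long uniformRays = (long) width * height * 16;
                System.out.printf("foveated: %d rays vs. uniform: %d rays (%.1f%% of the work)%n",
                        totalRays, uniformRays, 100.0 * totalRays / uniformRays);
            }
        }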
  • What does Pixar have to do with realtime graphics? Pixar's not DOING realtime graphics.

    Pixar has the luxury of controlling every take, and going back after the fact to re-render shots with different settings, or even to use different algorithms (including ray-tracing) to fix any rendering flaws caused by whatever approximations they're using at that point. Realtime graphics do not have that luxury... if there's a problem in a scene, you can't go back and fix it.

    So whether raytracing is more or less appropriate for realtime graphics, whether Pixar uses it or not is irrelevant.
  • by rs79 ( 71822 ) <hostmaster@open-rsc.org> on Sunday March 02, 2008 @12:52PM (#22615922) Homepage
    That phones may be able to ray trace is news? Sounds more to me like Intel was tired of reading in the news all week about how inferior their graphics hardware is because of the Microsoft Vista debacle, part eight - and suddenly we have an anonymous tip to a blog at Intel saying that ray tracing on phones is "an opportunity to deliver more content in less time" and "Ray-Tracing may become a very popular technology in the upcoming years".

    A popular technology? Like a working filesystem? They're real popular, I hear. Or an on/off button that actually works.

    Slow news day + intel graphics dept astroturfing = ray tracing on phones is news.

  • by hvm2hvm ( 1208954 ) on Sunday March 02, 2008 @01:54PM (#22616270) Homepage
    Well, I only play games on my mobile when I'm waiting for the bus or something. My point was that I tried some 3D racing games and some kind of 2D Splinter Cell clone, but the only ones I actually feel like playing when I'm bored are a Zuma clone and two other simple games. Maybe it's because I don't need to pay much attention, or because I don't need time to understand how to play them. But I can't see why anyone would want to play a complex game on such a small screen and with those really bad controls.
  • by Coward Anonymous ( 110649 ) on Sunday March 02, 2008 @02:39PM (#22616534)
    Except Pixar has an army of shader developers working for 2 years on tweaking the rendering of practically every scene to ensure its photorealism. Scanline renderers may be faster but the human effort required to achieve photorealism is huge.
    Ray tracing alone is not a silver bullet, but if it produces better results with less human effort, it's a net win.

    I found this on Pixar's RenderMan page (https://renderman.pixar.com/products/tools/renderman.html):

    "Ray Tracing and Global Illumination
    The ray tracing and global illumination features have been integrated with Pixar's highly evolved implementation of the REYES "scanline" rendering algorithm so that you only incur the overhead associated with these effects when and where you need them. RenderMan shader developers can selectively invoke RenderMan's new ray tracing subsystem to invent new solutions to difficult production problems or to achieve physically correct illumination effects."


    My interpretation: If you can't figure out how to manually tweak the scene, throw CPU power at it.
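    A sketch of the "pay only where you need it" idea from that blurb (this is not RenderMan code, just an illustration of selectively invoking ray tracing): diffuse surfaces take the cheap scanline-style path, and a secondary ray is traced only for surfaces flagged as reflective.

        // Illustrative hybrid shading: trace rays only for mirror-like materials.
        public class SelectiveRays {
            enum Material { DIFFUSE, MIRROR }

            static int raysTraced = 0;

            static double shade(Material m, double baseLighting) {
                if (m == Material.MIRROR) {
                    // Only here do we pay for a traced reflection.
                    return 0.2 * baseLighting + 0.8 * traceReflection();
                }
                return baseLighting;          // cheap, non-traced result
            }

            static double traceReflection() {
                raysTraced++;
                return 0.5;                   // stand-in for a real recursive trace
            }

            public static void main(String[] args) {
                // Pretend 5% of the shaded points in a frame land on mirrors.
                for (int i = 0; i < 100000; i++) {
                    Material m = (i % 20 == 0) ? Material.MIRROR : Material.DIFFUSE;
                    shade(m, 1.0);
                }
                System.out.println("Reflection rays traced for 100000 shaded points: " + raysTraced);
            }
        }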
  • photon mapping (Score:3, Interesting)

    by j1m+5n0w ( 749199 ) on Sunday March 02, 2008 @03:10PM (#22616694) Homepage Journal

    ...getting a "perfect", physically correct result is just not a process that scales well. (Check out The Rendering Equation on Wikipedia or somewhere else if you're interested; there's an integral over the hemisphere in there that has to be evaluated, which can recursively turn into a multi-dimensional integral over many hemispheres. Without cheating, the evaluation of that thing is going to kick Moore's law's ass for a long, long time.)

    Photon mapping is a pretty good way of approximating the rendering equation. It's slower than plain ray tracing, but much faster than path tracing. Real-time interactive global illumination isn't as computationally intractable as you are implying; it is likely to follow real-time ray tracing in not too many years.
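    For reference, the hemispherical integral being discussed is the rendering equation, which in its standard form is

        L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

    The recursion comes from the fact that the incoming radiance L_i at x is itself the outgoing radiance L_o of some other surface point, so evaluating it exactly nests this hemispherical integral at every bounce; photon mapping and path tracing are different ways of estimating it stochastically.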

  • Ray casting and Java (Score:3, Interesting)

    by KalvinB ( 205500 ) on Sunday March 02, 2008 @07:14PM (#22618444) Homepage
    Bunnies, http://www.dawnofthegeeks.com/ [dawnofthegeeks.com] (a Wolf3D clone) was originally written in Java. I then started translating it to C# and got about a 50% speed boost. I'm now able to do bump mapping, higher resolutions and still have playable framerates.

    And this is just for Ray Casting which is much simpler than Ray Tracing.

    During my development with Java I discovered that setting a pixel color to 0xFF000000 caused a slowdown. That's right, a black pixel would slow the framerate down. I had to set all pure black pixels to not-quite-black pixels (a sketch of the workaround is at the end of this comment).

    http://www.dawnofthegeeks.com/index.php?page=blog&offset=58 [dawnofthegeeks.com]

    I also found that Java is much slower at doing a "v++" than C.

    Those quirks aren't a big deal when you're not trying to do a lot of math, but they will cripple a Ray Tracer. If Sun could optimize Java better, it might be viable, but for now Ray-Tracing-based games would have to be written at a lower level, even at a small resolution.

    Maybe people don't expect enough out of handhelds to notice that the graphics are "poor" and that they could be better. In that case you could probably get away with Java. People don't expect much out of a console until someone starts really pushing the limit and then everyone has to.
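    A sketch of the black-pixel workaround mentioned above (reconstructed from the description; the exact cause of the slowdown isn't documented here): replace pure opaque black with a value that looks identical but isn't exactly 0xFF000000.

        // Illustrative fix: nudge pure black pixels to near-black before writing them.
        public class NotQuiteBlack {
            static int safeColor(int argb) {
                // 0xFF000000 is pure opaque black; substitute a visually identical near-black.
                return (argb == 0xFF000000) ? 0xFF010101 : argb;
            }

            public static void main(String[] args) {
                int[] framebuffer = {0xFF000000, 0xFF2040FF, 0xFF000000, 0xFFFFFFFF};
                for (int i = 0; i < framebuffer.length; i++) {
                    framebuffer[i] = safeColor(framebuffer[i]);
                }
                System.out.printf("first pixel is now 0x%08X%n", framebuffer[0]);
            }
        }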
  • by big4ared ( 1029122 ) on Sunday March 02, 2008 @09:34PM (#22619332)

    Definitely.

    Pixar used some raytracing for Cars and later described it as a huge mistake. Certain shots took over 200 hours per frame. In terms of performance vs. quality, even in movies, they prefer to go scanline. You won't see games going to raytracing any time soon.

    In Transformers, they used cube-maps because raytracing was too slow. Is anyone here seriously going to make the case that Transformers looked bad because the reflections weren't perfect?
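    To make the cube-map point concrete (an illustrative sketch, not the actual pipeline used on Transformers): a cube-mapped reflection just reflects the view vector and indexes one of six pre-rendered faces, so there is no scene traversal at all, while a traced reflection has to intersect that same direction against the whole scene for every shaded pixel.

        // Illustrative cube-map lookup: reflect the view vector, then pick a face.
        public class CubeMapLookup {
            // Reflect incident direction I about normal N: R = I - 2 (I . N) N.
            static double[] reflect(double[] i, double[] n) {
                double d = 2 * (i[0] * n[0] + i[1] * n[1] + i[2] * n[2]);
                return new double[] {i[0] - d * n[0], i[1] - d * n[1], i[2] - d * n[2]};
            }

            // Pick which of the six faces (+X, -X, +Y, -Y, +Z, -Z) the direction points at.
            static String face(double[] r) {
                double ax = Math.abs(r[0]), ay = Math.abs(r[1]), az = Math.abs(r[2]);
                if (ax >= ay && ax >= az) return r[0] >= 0 ? "+X" : "-X";
                if (ay >= ax && ay >= az) return r[1] >= 0 ? "+Y" : "-Y";
                return r[2] >= 0 ? "+Z" : "-Z";
            }

            public static void main(String[] args) {
                double[] view = {0, -0.7071, -0.7071};   // looking forward and slightly down
                double[] normal = {0, 1, 0};             // a flat, upward-facing surface
                double[] r = reflect(view, normal);
                System.out.println("reflection points at cube-map face " + face(r));
                // A traced reflection would instead intersect this direction with the
                // whole scene -- correct, but far more expensive per pixel.
            }
        }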

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...