EVE General Discussion

 
 

Raytracing in EVE?

CCP Darwin
C C P
C C P Alliance
#21 - 2014-11-19 23:14:20 UTC  |  Edited by: CCP Darwin
Hal Morsh wrote:
What is raytracing?


Raytracing is a technique for rendering images that traces large numbers of the paths that light takes through the scene. It's very accurate but very time consuming. Other rendering techniques simplify the problem by using various approximations that are less accurate but produce a look that's acceptable.

Here's a lot more information on the topic than I can offer here:

Wikipedia article on raytracing for computer graphics
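
For anyone wondering what "tracing the paths of light" looks like in code, here's a bare-bones toy example in Python: one sphere, one light, one ray per pixel, printed as ASCII art. It's purely illustrative (made-up scene and numbers, nothing like a production renderer or EVE's engine), but it shows the basic loop of firing a ray, finding what it hits, and shading from the light direction.

    import math

    WIDTH, HEIGHT = 48, 20
    SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0   # toy scene: one sphere
    LIGHT = (4.0, 4.0, -2.0)                              # one point light

    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def normalize(a):
        length = math.sqrt(dot(a, a))
        return (a[0] / length, a[1] / length, a[2] / length)

    def hit_sphere(origin, direction):
        # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
        oc = sub(origin, SPHERE_CENTER)
        b = 2.0 * dot(oc, direction)
        c = dot(oc, oc) - SPHERE_RADIUS * SPHERE_RADIUS
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0.0 else None

    for y in range(HEIGHT):
        row = ""
        for x in range(WIDTH):
            # One ray per pixel, from an eye at the origin through the image plane.
            ray = normalize((x / WIDTH - 0.5, (0.5 - y / HEIGHT) * 0.6, 1.0))
            t = hit_sphere((0.0, 0.0, 0.0), ray)
            if t is None:
                row += " "                                # missed: empty space
                continue
            hit = (ray[0] * t, ray[1] * t, ray[2] * t)
            normal = normalize(sub(hit, SPHERE_CENTER))
            to_light = normalize(sub(LIGHT, hit))
            shade = max(dot(normal, to_light), 0.0)       # simple Lambert term
            row += ".:-=+*#%"[min(int(shade * 8), 7)]
        print(row)

A real raytracer then layers on shadow rays, reflection and refraction rays, and many samples per pixel, which is where the long render times come from.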

CCP Darwin  •  Senior Software Engineer, Art & Graphics, EVE Online  •  @mark_wilkins

Ralph King-Griffin
New Eden Tech Support
#22 - 2014-11-19 23:21:20 UTC  |  Edited by: Ralph King-Griffin
Hal Morsh wrote:
What is raytracing?

Is it that effect where you can see objects cast shadows in space whilst looking at the sun? The incursion ambiance already does this, on your ship and objects in space.

*drunkenly reminisces about college tutor pulling his hair out*

It's essentially asking your potato to trace individual rays of light from the source(s) in all directions at all times, register the opacity of everything each ray interacts with, and generate shadows that way.

It's really intense for what it does and is one reason (amongst many) render farms are a thing in animation studios, though I'm painfully uninformed (and drunk) about how the games industry handles that sort of thing.


Edit: Yeay!!!! I remembered stuff!

If I remember correctly it can change a 12-minute render into a 12-hour one (for video).
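
(For the curious, the "shadow ray" part of that boils down to something like this little Python check: from a surface point, fire one ray at the light and see whether anything is in the way. Toy geometry and made-up numbers, not how any particular engine, EVE included, actually does it.)

    import math

    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def dot(a, b): return sum(a[i] * b[i] for i in range(3))

    def in_shadow(point, light, spheres):
        """True if any sphere blocks the segment from `point` to the light.
        `spheres` is a list of (center, radius) tuples. Illustrative only."""
        to_light = sub(light, point)
        dist = math.sqrt(dot(to_light, to_light))
        d = [c / dist for c in to_light]                      # unit direction to the light
        origin = [point[i] + d[i] * 1e-4 for i in range(3)]   # nudge off the surface
        for center, radius in spheres:
            oc = sub(origin, center)
            b = 2.0 * dot(oc, d)
            c = dot(oc, oc) - radius * radius
            disc = b * b - 4.0 * c
            if disc < 0.0:
                continue                                      # this sphere is missed
            t = (-b - math.sqrt(disc)) / 2.0
            if 1e-4 < t < dist:                               # blocker between us and the light
                return True
        return False

    # A sphere sitting between the point and the light casts a shadow on it:
    print(in_shadow([0, 0, 0], [0, 10, 0], [([0, 5, 0], 1.0)]))   # True
    print(in_shadow([0, 0, 0], [0, 10, 0], [([5, 5, 0], 1.0)]))   # False
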
Rain6637
GoonWaffe
Goonswarm Federation
#23 - 2014-11-20 00:48:02 UTC
CCP Darwin wrote:

  • energy conservation as an upper limit for scattered light energy from a differential area on the surface

  • translation: an object will not glow brighter than light that is shed on it.

    Overall, PBR is a concept that entails texturing things according to the surface material, true to life rather than in an imaginary way for the sake of looking cool in a specific setting.

    Is this correct?
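
    (For what it's worth, one simplified way that rule shows up in practice is capping the reflected energy so that diffuse plus specular never exceeds the incoming light. A toy Python sketch of the idea, not anything EVE-specific:)

        def energy_conserving_weights(diffuse_albedo, specular_reflectance):
            """Cap diffuse so diffuse + specular never exceeds 1 per channel,
            i.e. the surface never reflects more energy than it receives.
            Toy illustration of the energy-conservation rule, nothing more."""
            out = []
            for kd, ks in zip(diffuse_albedo, specular_reflectance):
                kd = min(kd, 1.0 - ks)        # reflected energy capped at incoming energy
                out.append((max(kd, 0.0), ks))
            return out

        # A very shiny material has to give most of its energy budget to specular:
        print(energy_conserving_weights([0.8, 0.8, 0.8], [0.9, 0.9, 0.9]))
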
    Darth Schweinebacke
    Wings of Fury.
    #24 - 2014-11-20 01:04:03 UTC
    Ralph King-Griffin wrote:


    If I remember correctly it can change a 12 minute render into a 12 hour one(for video).


    Try 24 hours for a single image! Grrrrrrrr.. someone buy me a renderfarm for x-mas pls!
    Cyndrogen
    The Greatest Corp in the Universe
    #25 - 2014-11-20 20:05:39 UTC
    CCP Darwin wrote:
    Cyndrogen wrote:
    When you use the term Physically Based Rendering, aka PBR, what exactly are your lighting models based on? Monte Carlo? Raytracing? The term is vague and I'm not sure how it fits within the framework of eve?

    I know there are game engines that support raytracing but what exactly is PBR in eve online based on? Are you actually calculating lighting bouncing and scattering or cheating the effect?


    Coming from a visual effects and animation background, having recently worked as a digital lighting technical director, I had, like you, presumed that physically-based rendering (PBR) systems generally required raytracing. I was wrong.

    The important components of a PBR system are:

    • Linear high-dynamic-range lighting, which we rolled out as part of an earlier release
    • energy conservation as an upper limit for scattered light energy from a differential area on the surface
    • an angular distribution function that determines how incoming light scatters off the surface at different angles (in our case a Bidirectional Reflectance Distribution Function, or BRDF, though there are also Scattering and Transmissive distribution functions as well)
    • and for a non-raytraced system, an illumination map, for which we use our cube-mapped nebulas.

    What we do not have that you might have seen from raytraced PBR implementations:

    • Any form of "global illumination," meaning light bouncing off one surface to illuminate another. (This includes raytraced reflections of other objects in the scene other than the illumination map, since in a PBR system, "diffuse bounce" is just a reflection with a broader BRDF.)
    • Raytraced shadows. (I'm pretty sure we're using depth map shadows.)
    • Any form of ray-marching volumetric techniques, such as you might use for caustics originating from transparent materials.

    EVE's implementation is an in-house one, and primarily implemented by the shader. Visually, the results are much like those of a raytraced system, but with shadow blur and bounces at zero.
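
    (To make the list above concrete, here's a toy Python sketch of how those ingredients fit together: linear HDR radiance, an energy-conserving Lambertian BRDF, and an "illumination map" boiled down to six axis directions standing in for a cube map. It's only the shape of the idea, not EVE's shader.)

        import math

        # Six-direction stand-in for a cube map: linear HDR radiance per axis.
        ILLUM_MAP = {
            ( 1, 0, 0): (0.2, 0.2, 0.5),    # +X: bluish nebula
            (-1, 0, 0): (0.1, 0.1, 0.1),
            ( 0, 1, 0): (2.0, 1.8, 1.5),    # +Y: bright star; HDR values can exceed 1
            ( 0,-1, 0): (0.05, 0.05, 0.05),
            ( 0, 0, 1): (0.3, 0.2, 0.2),
            ( 0, 0,-1): (0.1, 0.1, 0.1),
        }

        def shade(normal, albedo):
            """Diffuse (Lambertian) shading against the illumination map.
            The 1/pi factor is what makes a Lambertian BRDF energy-conserving
            over the hemisphere; the six-sample sum is just a crude stand-in
            for that integral."""
            out = [0.0, 0.0, 0.0]
            for direction, radiance in ILLUM_MAP.items():
                cos_term = max(sum(normal[i] * direction[i] for i in range(3)), 0.0)
                for c in range(3):
                    out[c] += radiance[c] * cos_term * albedo[c] / math.pi
            return out

        # A grey hull panel facing the bright +Y direction:
        print(shade((0.0, 1.0, 0.0), (0.5, 0.5, 0.5)))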


    Great! Thanks for clarifying the difference! PBR is pretty much the future in visual effects, and older scanline / depth-mapped / caching methods are going the way of the dodo. I have used Arnold, which is an unbiased PBR raytracer, and of course the older Mental Ray, which is biased but supports caustics thanks to photon mapping.

    Are there any plans to implement bounce light? I thought the Walking in Stations demo actually had this technique implemented? Or maybe it was just a tech demo?

    Every day in every way I improve my skills and get better.

    Cyndrogen
    The Greatest Corp in the Universe
    #26 - 2014-11-20 20:08:02 UTC
    CCP Darwin wrote:
    Cyndrogen wrote:
    When you use the term Physically Based Rendering, aka PBR, what exactly are your lighting models based on? Monte Carlo? Raytracing? The term is vague and I'm not sure how it fits within the framework of eve?

    I know there are game engines that support raytracing but what exactly is PBR in eve online based on? Are you actually calculating lighting bouncing and scattering or cheating the effect?


    Coming from a visual effects and animation background, having recently worked as a digital lighting technical director, I had, like you, presumed that physically-based rendering (PBR) systems generally required raytracing. I was wrong.

    The important components of a PBR system are:

    • Linear high-dynamic-range lighting, which we rolled out as part of an earlier release
    • energy conservation as an upper limit for scattered light energy from a differential area on the surface
    • an angular distribution function that determines how incoming light scatters off the surface at different angles (in our case a Bidirectional Reflectance Distribution Function, or BRDF, though there are also Scattering and Transmissive distribution functions as well)
    • and for a non-raytraced system, an illumination map, for which we use our cube-mapped nebulas.

    What we do not have that you might have seen from raytraced PBR implementations:

    • Any form of "global illumination," meaning light bouncing off one surface to illuminate another. (This includes raytraced reflections of other objects in the scene other than the illumination map, since in a PBR system, "diffuse bounce" is just a reflection with a broader BRDF.)
    • Raytraced shadows. (I'm pretty sure we're using depth map shadows.)
    • Any form of ray-marching volumetric techniques, such as you might use for caustics originating from transparent materials.

    EVE's implementation is an in-house one, and primarily implemented by the shader. Visually, the results are much like those of a raytraced system, but with shadow blur and bounces at zero.


    Another question: are your textures also stored in linear space? 8-bit? 16?

    I assume that bump and specular maps are stored as 8-bit, but what about the diffuse maps? Are normal maps in EVE 16- or 32-bit?

    Every day in every way I improve my skills and get better.

    Cyndrogen
    The Greatest Corp in the Universe
    #27 - 2014-11-20 20:25:40 UTC
    Ralph King-Griffin wrote:
    Hal Morsh wrote:
    What is raytracing?

    Is it that effect where you can see objects cast shadows in space whilst looking at the sun? The incursion ambiance already does this, on your ship and objects in space.

    *drunkenly reminisces about college tutor pulling his hair out*

    It's essentially asking your potato to trace individual rays of light from the source(s) in all directions at all times, register the opacity of everything each ray interacts with, and generate shadows that way.

    It's really intense for what it does and is one reason (amongst many) render farms are a thing in animation studios, though I'm painfully uninformed (and drunk) about how the games industry handles that sort of thing.


    Edit: Yeay!!!! I remembered stuff!

    If I remember correctly it can change a 12-minute render into a 12-hour one (for video).



    It calculates many different rays and how those rays interact with materials and lights. With raytracing it's possible to render not just shadows but also refraction, reflection, bounces, etc. One implementation of raytracing is the brute-force Monte Carlo method, which is basically like a gun that just fires random shots into the projection plane / camera plane and "samples" the environment.

    This method is great for outdoor scenes; however, when you deal with interior lighting, Monte Carlo can be very noisy and time-consuming to raytrace. Metropolis light transport is a more efficient way to solve the problem of light traveling through a keyhole in a room and scattering around. Light coming through a keyhole is very difficult to trace with Monte Carlo raytracing at acceptable render quality. This is why render engines like V-Ray cheat by giving the user a "light portal", where the artist can specify the direction of light and force the render engine to sample using light portals as guides.
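
    (A tiny Python sketch of that noise problem: if the light subtends only a couple of degrees, uniformly random rays almost never hit it, so low sample counts give wildly noisy estimates. Made-up numbers, not any renderer's actual sampler.)

        import math
        import random

        random.seed(1)
        LIGHT_HALF_ANGLE = math.radians(2.0)     # a small light, about 4 degrees across

        def random_direction():
            """Uniform random direction on the upper hemisphere."""
            z = random.random()                  # cos(theta), uniform in [0, 1]
            phi = random.random() * 2.0 * math.pi
            r = math.sqrt(max(1.0 - z * z, 0.0))
            return (r * math.cos(phi), r * math.sin(phi), z)

        def sees_light(direction):
            # The light sits straight "up"; a ray finds it only inside a narrow cone.
            return direction[2] > math.cos(LIGHT_HALF_ANGLE)

        for samples in (16, 256, 4096, 65536):
            hits = sum(sees_light(random_direction()) for _ in range(samples))
            # Estimated fraction of the hemisphere that sees the light:
            print(samples, hits / samples)

    Sampling toward the light on purpose (which is roughly what a light portal lets the artist set up) removes most of that noise, at the cost of no longer being strictly brute force.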

    Every day in every way I improve my skills and get better.

    Cyndrogen
    The Greatest Corp in the Universe
    #28 - 2014-11-20 20:37:11 UTC
    CCP Darwin wrote:

  • an angular distribution function that determines how incoming light scatters off the surface at different angles (in our case a Bidirectional Reflectance Distribution Function, or BRDF, though there are also Scattering and Transmissive distribution functions as well)



    Are you also calculating Fresnel as part of the BRDF?

    I would imagine, specifically for metals, that Fresnel would be required to get the right look for reflection / diffuse ratios.

    Every day in every way I improve my skills and get better.

    CCP Darwin
    C C P
    C C P Alliance
    #29 - 2014-11-21 06:25:14 UTC
    Cyndrogen wrote:
    Are you also calculating Fresnel as part of the BRDF?

    I would imagine, specifically for metals, that Fresnel would be required to get the right look for reflection / diffuse ratios.


    There are Fresnel parameters. I haven't looked closely to see how the implementation handles them, but I'd assume they're rolled into the BRDF.
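
    (For reference, a very common way Fresnel gets folded into a real-time BRDF is Schlick's approximation; whether Trinity uses exactly this form I can't say. A quick Python sketch:)

        def fresnel_schlick(cos_theta, f0):
            """Schlick's approximation: f0 is the reflectance at normal incidence,
            per channel. Dielectrics sit around 0.04; metals use their colored
            base reflectance, which is what drives the metal look."""
            return [f + (1.0 - f) * (1.0 - cos_theta) ** 5 for f in f0]

        print(fresnel_schlick(1.0, [0.04, 0.04, 0.04]))   # head-on: just f0
        print(fresnel_schlick(0.1, [0.04, 0.04, 0.04]))   # grazing: far more reflective
        print(fresnel_schlick(1.0, [0.95, 0.64, 0.54]))   # copper-ish metal, head-on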

    CCP Darwin  •  Senior Software Engineer, Art & Graphics, EVE Online  •  @mark_wilkins

    Herzog Wolfhammer
    Sigma Special Tactics Group
    #30 - 2014-11-21 08:10:06 UTC
    I like this thread. I read it and thought in a parallel universe the other Herzog is a graphics programmer who is happy most of the time and even enjoys getting laid too.

    Bring back DEEEEP Space!

    Sol Project
    Shitt Outta Luck - GANKING4GOOD
    #31 - 2014-11-21 08:57:30 UTC
    Who would have thought, a good thread by IZ.

    Ladies of New Eden YC 117 by Indahmawar Fazmarai

    Warning: NSFW! Barely legal girls in underwear!

    Diana Kim > AND THIS IS WHY THE FEDERATION MUST BE DESTROYED!!

    Claud Tiberius
    #32 - 2014-11-21 11:12:32 UTC
    Global illumination and volumetric lighting would be amazing, if I had a supercomputer.

    Once upon a time the Golem had a Raven hull and it looked good. Then it transformed into a plataduck. The end.

    Cyndrogen
    The Greatest Corp in the Universe
    #33 - 2014-11-21 20:17:23 UTC  |  Edited by: Cyndrogen
    Claud Tiberius wrote:
    Global illumination and volumetric lighting would be amazing, if I had a supercomputer.


    I'm pretty sure I remember seeing global illumination for World of Darkness and Walking in Stations. Unity, CryEngine and Unreal all support their own versions of GI, and Octane is pushing quickly into the gaming marketplace. Raytracing is the future of graphics.

    EVE already supports volumetric lighting, in case you have not noticed. :)

    HP already has 64-core machines, so it's only a matter of time before EVE Online is streamed from a cloud of supercomputers onto your old laptop.

    CCP, any plans to offer a streaming EVE Online version at a premium price?

    Every day in every way I improve my skills and get better.

    Cyndrogen
    The Greatest Corp in the Universe
    #34 - 2014-11-21 21:05:24 UTC  |  Edited by: Cyndrogen
    CCP Darwin wrote:

  • Any form of "global illumination," meaning light bouncing off one surface to illuminate another. (This includes raytraced reflections of other objects in the scene other than the illumination map, since in a PBR system, "diffuse bounce" is just a reflection with a broader BRDF.)


    Ahh yes, that was another question I had about the environment maps. Are you using convolution maps to simplify the reflections?

    Basically, in visual effects we actually generate convolution maps, a fancy term for blurring the original image, but it's done in linear space on an HDR image, so the bright hot spots bloom, making it easier to sample the light without having to crank up the anti-aliasing samples or light samples. It's a quick cheat to get a diffuse-looking, noise-free surface out of the render engine.

    You could also generate mip maps, or multi-resolution maps, and just sample the smaller variant to get the diffuse effect. Sounds like EVE might already be doing something similar.
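
    (A rough Python sketch of that blur-in-linear-space idea: a toy 1-D strip of HDR values with one hypothetical bright "star" texel, box-blurred so its energy spreads into the neighbourhood instead of clipping. Not any renderer's actual prefilter.)

        def box_blur(values, radius):
            """Simple box blur over a 1-D strip of linear HDR values."""
            out = []
            for i in range(len(values)):
                lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
                window = values[lo:hi]
                out.append(sum(window) / len(window))
            return out

        # Mostly-dim sky with one very bright star (value well above 1.0, i.e. HDR):
        strip = [0.05, 0.05, 0.05, 8.0, 0.05, 0.05, 0.05]
        print(box_blur(strip, 1))    # the star's energy blooms into its neighbours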

    What is the engine for EVE Online called? Is there a name?

    Every day in every way I improve my skills and get better.

    CCP Darwin
    C C P
    C C P Alliance
    #35 - 2014-11-21 22:39:13 UTC  |  Edited by: CCP Darwin
    Cyndrogen wrote:
    You could also generate mip maps, or multi-resolution maps, and just sample the smaller variant to get the diffuse effect. Sounds like EVE might already be doing something similar.

    What is the engine for EVE Online called? Is there a name?


    EVE's textures are mip-mapped as a level of detail optimization, so if one needs a blurred texture, reading a lower mip level is the easiest way to do it. That said, that wouldn't offer much flexibility in BRDF choice, so I'm not sure what the engine is doing internally.

    As for volumetric effects, we have them, but I believe we're not using a fully-raytraced approach to get there. I probably shouldn't speculate about what alternative we're using, but there are a couple.

    EVE's current graphics engine is called Trinity, and was released in the (wait for it...) Trinity expansion. I suspect it's no accident that our best-equipped conference room is named after that expansion.
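
    (The mip trick above, reduced to a toy 1-D Python sketch: build progressively averaged copies of a texture, then read a lower level when a blurrier lookup is wanted. Illustrative only, not Trinity's code.)

        def build_mips(texels):
            """Level 0 is the full texture; each level below averages pairs."""
            levels = [list(texels)]
            while len(levels[-1]) > 1:
                prev = levels[-1]
                levels.append([(prev[i] + prev[i + 1]) / 2.0
                               for i in range(0, len(prev) - 1, 2)])
            return levels

        def sample(levels, u, blur):
            """u in [0, 1) picks a texel; blur in [0, 1] picks a mip level,
            0 being the sharpest and 1 the blurriest."""
            level = levels[min(int(blur * (len(levels) - 1)), len(levels) - 1)]
            return level[min(int(u * len(level)), len(level) - 1)]

        mips = build_mips([0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 0.0, 0.0])
        print(sample(mips, 0.3, 0.0))   # sharp lookup: hits the bright texel exactly
        print(sample(mips, 0.3, 1.0))   # fully blurred lookup: the overall average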

    CCP Darwin  •  Senior Software Engineer, Art & Graphics, EVE Online  •  @mark_wilkins

    Ralph King-Griffin
    New Eden Tech Support
    #36 - 2014-11-21 23:04:52 UTC  |  Edited by: Ralph King-Griffin
    CCP Darwin wrote:


    EVE's textures are mip-mapped as a level of detail optimization, so if one needs a blurred texture, reading a lower mip level is the easiest way to do it. That said, that wouldn't offer much flexibility in BRDF choice, so I'm not sure what the engine is doing internally.

    As for volumetric effects, we have them, but I believe we're not using a fully-raytraced approach to get there. I probably shouldn't speculate about what alternative we're using, but there are a couple.


    All I can think right now
    Cyndrogen
    The Greatest Corp in the Universe
    #37 - 2014-11-27 00:07:20 UTC
    CCP Darwin wrote:
    Cyndrogen wrote:
    You could also generate mip maps, or multi-resolution maps, and just sample the smaller variant to get the diffuse effect. Sounds like EVE might already be doing something similar.

    What is the engine for EVE Online called? Is there a name?


    EVE's textures are mip-mapped as a level of detail optimization, so if one needs a blurred texture, reading a lower mip level is the easiest way to do it. That said, that wouldn't offer much flexibility in BRDF choice, so I'm not sure what the engine is doing internally.

    As for volumetric effects, we have them, but I believe we're not using a fully-raytraced approach to get there. I probably shouldn't speculate about what alternative we're using, but there are a couple.

    EVE's current graphics engine is called Trinity, and was released in the (wait for it...) Trinity expansion. I suspect it's no accident that our best-equipped conference room is named after that expansion.




    Great soundtrack for Trinity as well. ;)

    Every day in every way I improve my skills and get better.

    Cyndrogen
    The Greatest Corp in the Universe
    #38 - 2014-11-27 00:28:56 UTC
    Darth Schweinebacke wrote:
    Ralph King-Griffin wrote:


    If I remember correctly it can change a 12-minute render into a 12-hour one (for video).


    Try 24 hours for a single image! Grrrrrrrr.. someone buy me a renderfarm for x-mas pls!



    Try www.lagoa.com

    Every day in every way I improve my skills and get better.
