Most Popular (1500 questions)
4 votes, 0 answers

How is glint rendering done in Ray Tracing?

How is glint rendering done with ray tracing in the photo below? Is it just a simple normal map? I found a paper online that describes the exact same method used in the photo above, but I have no idea what's going on, especially with all the…
Arjan Singh • 2,511
4 votes, 1 answer

Are these edges a result of display color accuracy, computational accuracy, or something else?

I was messing with glowing lighting when I noticed an odd artifact in my first try. For some reason I can see three defined edges inside the glowing pattern. I downloaded the image and marked where I see the edges. I am just curious why they…
J.Doe • 1,445
4 votes, 1 answer

Register pressure in Compute Shader

I'm in the process of writing a Ray Tracer using DirectCompute / HLSL. First, eye rays are generated (one per pixel). Then, rays are traced, shaded and reflected in a loop. Also, shadow rays for each light source are generated and tested for…
David Kuri • 2,293
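
For context on the loop described above, here is a rough, hypothetical CPU-side sketch of that structure in C++ (one eye ray per pixel, then a bounded trace/shade/reflect loop with one shadow ray per light). All types and helpers here (Vec3, Hit, trace, occluded, shade, bounce) are illustrative stand-ins, not the asker's HLSL code; in a real compute shader the values kept live across these iterations are what drive register pressure.

    #include <cstdio>

    struct Vec3 { float x, y, z; };
    struct Ray  { Vec3 origin, dir; };
    struct Hit  { bool valid; Vec3 position, normal; };

    static Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    static Hit  trace(const Ray&)                 { return {}; }          // placeholder intersection test
    static bool occluded(const Ray&)              { return false; }      // placeholder shadow test
    static Vec3 shade(const Hit&, const Vec3&)    { return { 1, 1, 1 }; } // placeholder shading
    static Ray  bounce(const Ray& r, const Hit&)  { return r; }           // placeholder reflection ray

    int main()
    {
        const int  width = 4, height = 4, maxBounces = 3;
        const Vec3 lightPos{ 10, 10, 10 };

        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
            {
                Ray  ray{ { 0, 0, 0 }, { float(x), float(y), 1 } };  // eye ray, one per pixel
                Vec3 color{ 0, 0, 0 };
                for (int i = 0; i < maxBounces; ++i)
                {
                    Hit hit = trace(ray);                            // trace
                    if (!hit.valid) break;
                    Ray shadowRay{ hit.position, sub(lightPos, hit.position) }; // shadow ray
                    if (!occluded(shadowRay))
                        color = shade(hit, lightPos);                // shade
                    ray = bounce(ray, hit);                          // reflect and continue
                }
                std::printf("pixel (%d,%d): %g %g %g\n", x, y, color.x, color.y, color.z);
            }
        return 0;
    }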
4 votes, 0 answers

Is it possible to create a forced focus with a dual layer of images in order to avoid eye strain with Virtual and Augmented Reality?

Would eye tracking allow for the alignment of a dual-layered image (a Google Glass-like device and a screen or projection) to provide a way to force the focus of the eyes to a real-life equivalent of the perceived distance in a similar, but…
Ryan • 41
4 votes, 1 answer

Screen coordinates, barycentric coordinates and global coordinates

To continue from my other question, here's the problem. So, I have a slide which I'm trying to understand. Here are my questions. What are the barycentric coordinates of? From my previous question, $(x_1, y_1)$ and $(x_2, y_2)$ are also…
user4718
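
The slide itself isn't shown here, so the following is only the standard definition, which may or may not match the slide's notation: barycentric coordinates are taken with respect to the vertices of a triangle (or, for two points, a segment), and they are the weights expressing a point as an affine combination of those vertices:

$$P = \alpha A + \beta B + \gamma C, \qquad \alpha + \beta + \gamma = 1, \quad \alpha, \beta, \gamma \ge 0 \text{ inside the triangle.}$$

Any per-vertex attribute $f$ then interpolates the same way, $f(P) = \alpha f(A) + \beta f(B) + \gamma f(C)$; along a segment between $(x_1, y_1)$ and $(x_2, y_2)$ the coordinates reduce to $(1 - t,\; t)$ with $t \in [0, 1]$.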
4 votes, 1 answer

Projected points and screen coordinates

I'm trying to understand part of the contents of a slide, but I'm not quite getting it. So, here's the relevant part. I have a few questions. Are the projected points $q_1$ and $q_2$ the points with, respectively, the screen coordinates $(x_1,…
user4718
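
Again without the slide, the usual convention (which the slide may define differently) is that a projected point is the image of a 3D camera-space point under the perspective mapping, and the screen coordinates are what remain after the divide by depth:

$$q_i = (x_i, y_i) = \left( \frac{f\, X_i}{Z_i},\; \frac{f\, Y_i}{Z_i} \right), \qquad i = 1, 2,$$

where $(X_i, Y_i, Z_i)$ is the original point in camera space and $f$ is the focal length (or near-plane distance); a viewport transform then maps these to pixel coordinates.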
4 votes, 1 answer

Non-Polygonal 3D cube rasterization

Given a cube defined as: struct Cube { float min[3]; float size; }; what would be the fastest rasterization method? On the internet I only found methods that use polygons, but I think that's extremely inefficient for drawing only a cube. I…
l'arbre • 153
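
Just to make the data layout above concrete (this does not answer which rasterization method is fastest), here is a small C++ sketch that enumerates the eight corners implied by min and size; bit i of the corner index selects either min or min + size on axis i. The example values are made up.

    #include <cstdio>

    struct Cube { float min[3]; float size; };

    int main()
    {
        Cube c{ { 1.0f, 2.0f, 3.0f }, 2.0f };
        // Enumerate the 8 corners implied by min + size.
        for (int i = 0; i < 8; ++i)
        {
            float x = c.min[0] + ((i & 1) ? c.size : 0.0f);
            float y = c.min[1] + ((i & 2) ? c.size : 0.0f);
            float z = c.min[2] + ((i & 4) ? c.size : 0.0f);
            std::printf("corner %d: (%g, %g, %g)\n", i, x, y, z);
        }
        return 0;
    }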
4 votes, 1 answer

How to implement Texture Baking?

So currently I want to implement radiosity in my engine. After figuring out the amount of light that bounces between elements/disks/patches (I've been told disks are the best - correct me if I'm wrong), how do you bake it into a texture? Do I set the…
Arjan Singh • 2,511
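
One common way to bake a solved per-patch radiosity value into a texture, sketched here under strong simplifying assumptions (each patch owns its own rectangle of lightmap texels; the Patch layout is invented for illustration): the bake is then just a write of each patch's radiosity into its texels. Real bakers instead map each texel to a surface point through the mesh's lightmap UVs, may interpolate between neighbouring patches, and dilate chart edges to avoid bleeding.

    #include <cstdio>
    #include <vector>

    // Hypothetical patch: a rectangle of lightmap texels plus its solved RGB radiosity.
    struct Patch { int x0, y0, x1, y1; float radiosity[3]; };

    int main()
    {
        const int w = 8, h = 8;
        std::vector<float> lightmap(w * h * 3, 0.0f);
        std::vector<Patch> patches = { { 0, 0, 4, 4, { 0.8f, 0.6f, 0.4f } },
                                       { 4, 4, 8, 8, { 0.2f, 0.3f, 0.9f } } };

        // "Baking": write each patch's radiosity into the texels it owns.
        for (const Patch& p : patches)
            for (int y = p.y0; y < p.y1; ++y)
                for (int x = p.x0; x < p.x1; ++x)
                    for (int c = 0; c < 3; ++c)
                        lightmap[(y * w + x) * 3 + c] = p.radiosity[c];

        std::printf("texel (1,1) R = %g\n", lightmap[(1 * w + 1) * 3 + 0]); // expect 0.8
        return 0;
    }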
4 votes, 1 answer

Is there a material that contributes to GI, but isn't directly visible?

I am doing renders of the inside of a room. I would like to show a "bird's eye" view of the room, from above the ceiling (outside the room), but still see the interior of the room. An easy way of doing this is to simply hide or remove the ceiling…
aaaidan • 143
4 votes, 1 answer

Attribute Location in Multiple Shader Programs

I'm trying to switch between the rendering of two different scenes as fast as possible using OpenGL. For all my OpenGL-related stuff I wrote a wrapper class which provides a fairly easy way of adding and updating data. Based on the added data it…
Christian_B • 313
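
One standard OpenGL way to keep attribute locations consistent across several programs (so switching programs never requires respecifying the vertex layout) is to bind the same attribute names to the same indices before linking; the attribute names and the GLEW include below are assumptions for illustration, and GLSL 3.30+ layout(location = N) qualifiers are an alternative.

    #include <GL/glew.h>  // assumes a loader such as GLEW and a current GL context

    // Bind the same attribute names to the same locations in every program,
    // *before* glLinkProgram, so all programs share one vertex layout.
    // The names "in_position" / "in_normal" / "in_texcoord" are hypothetical.
    void bindCommonAttribLocations(GLuint program)
    {
        glBindAttribLocation(program, 0, "in_position");
        glBindAttribLocation(program, 1, "in_normal");
        glBindAttribLocation(program, 2, "in_texcoord");
        glLinkProgram(program);  // the bindings only take effect at link time
    }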
4 votes, 1 answer

Should the normal vector in the Phong illumination model be normalized?

I'm studying illumination these days and I was given a problem which asks me to calculate the intensity using the Phong illumination model. I'm a little confused about whether I should normalize the normal vector before using it or use it as it is.…
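
For reference, the standard formulation of the model (the course's notation may differ) evaluates cosines as dot products of unit vectors, so the normal, together with the light, reflection and view vectors, should be normalized before the dot products are taken:

$$I = k_a\, i_a + k_d\, (\hat{N} \cdot \hat{L})\, i_d + k_s\, (\hat{R} \cdot \hat{V})^{\alpha}\, i_s,$$

where $\hat{N}, \hat{L}, \hat{R}, \hat{V}$ are unit vectors. If $N$ is left unnormalized, $N \cdot L$ is scaled by $\lVert N \rVert$ and no longer equals $\cos\theta$.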
4 votes, 3 answers

Texture coordinates projection

I have some classic texture coordinates, and as normal behaviour they follow the mesh's transformations. I am trying to keep the same texture coordinate behaviour but without it being affected by the mesh's rotation. The result would be…
MaT • 1,229
4 votes, 1 answer

Normal 2D photo to VR-compatible spherical photo

How do I turn a normal photo like this one: into a photo that I can use in my game development platform (Unity3D)? I just wrap a 3D sphere model with the photo and then it can be all around me when I use it in a VR headset. I know it's impossible…
Nani • 41
4 votes, 2 answers

Are there any glTF example datasets available?

I'm currently researching my options for efficiently exchanging data for a WebGL application. I understand the glTF format is still being drafted, but I need some example data to understand whether this format is really useful for my application. I'm…
q9f • 703
4 votes, 1 answer

Material Layering

So after reading this Unreal Engine 4 documentation page: https://docs.unrealengine.com/latest/INT/Engine/Rendering/Materials/LayeredMaterials/ (which describes how to use material layering in UE4), I wanted to know how material layering is…
Arjan Singh • 2,511