
I am trying to make a glow shader using separable Gaussian blurring.

I was recently inspired by the short YouTube video "Computer Color is Broken", and after experimenting with its suggestion for color interpolation, I can say the results are beautiful!

A big point the video makes is that this principle should also be applied to blurring, but here I'm pretty confused: I don't know what to square, and what to take the square root of, when values are being added. My current theory is that each texture sample for the Gaussian blur gets squared, weighted by the bell curve, and added to a running sum as usual; at the end, the square root of the sum is taken. I'm not sure that's correct, though. Could someone please confirm? And would this make an appreciable enough difference to be worth it?
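For reference, the procedure described above can be sketched in plain Python (a 1D blur on values in [0, 1], with hypothetical helper names; a real shader would do this per channel in GLSL/HLSL):

```python
import math

def gaussian_weights(radius, sigma):
    """Normalized Gaussian weights for taps at offsets -radius..radius."""
    w = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    total = sum(w)
    return [x / total for x in w]  # normalize so the weights sum to 1

def blur_sqrt_approx(row, radius=2, sigma=1.0):
    """Blur gamma-encoded values using the square/sqrt approximation:
    square each sample, accumulate weighted, square-root the sum."""
    w = gaussian_weights(radius, sigma)
    out = []
    for x in range(len(row)):
        acc = 0.0
        for i, wi in enumerate(w):
            # clamp sampling at the edges (like GL_CLAMP_TO_EDGE)
            s = row[min(max(x + i - radius, 0), len(row) - 1)]
            acc += wi * s * s      # square: approximate gamma -> linear
        out.append(math.sqrt(acc)) # sqrt: approximate linear -> gamma
    return out
```

Note that a constant row passes through unchanged (the weights sum to 1), while edges between dark and bright regions blur noticeably brighter than a naive gamma-space blur would.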

Nathan Reed
J.Doe

1 Answer


Yes, your theory is correct. A gamma-correct blur entails converting the input pixels to linear color space, performing the blur weighting and accumulation in that space, and then converting back to gamma space at the end.
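That pipeline can be sketched generically (a 1D blur with hypothetical helper names; `decode`/`encode` are whatever transfer functions you choose):

```python
def blur_gamma_correct(row, weights, decode, encode):
    """Gamma-correct 1D blur: decode to linear, weight and accumulate
    in linear space, then re-encode the result."""
    r = len(weights) // 2
    linear = [decode(v) for v in row]  # gamma -> linear, once per pixel
    out = []
    for x in range(len(row)):
        acc = 0.0
        for i, w in enumerate(weights):
            # clamp at the edges
            acc += w * linear[min(max(x + i - r, 0), len(row) - 1)]
        out.append(encode(acc))        # linear -> gamma at the very end
    return out

# A simple power-2.2 pair, another common approximation of sRGB:
decode_22 = lambda c: c ** 2.2
encode_22 = lambda c: c ** (1.0 / 2.2)
```

In a real renderer you usually get the decode/encode steps for free by using sRGB texture formats and an sRGB framebuffer, so the blur shader itself just works on linear values.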

As noted in the comments, the actual transform is not literally squaring and square-rooting; that's just an approximation (and not a particularly good one). For the true sRGB gamma transform, see the equations in this Wikipedia article (look down the page for the equations involving $C_\text{srgb}$ and $C_\text{linear}$).

By the way, some visual comparisons of gamma-correct and gamma-incorrect blurs can be found on this page by Elle Stone, which shows why this whole thing matters.

Nathan Reed