
I am trying to use a Float32Array as an input to GPGPU calculations in the browser.

Converting from Float32Array to Uint8Array works OK, and I am looking for a way to

  1. convert RGBA to float
  2. perform some calculations
  3. convert float to RGBA

in order to retrieve the results from JavaScript.
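For context, the upload side simply views the float buffer as bytes, roughly like this (a simplified sketch; the actual names in my code differ):

// Sketch of the upload: view the Float32Array's bytes as RGBA texels,
// one float (4 bytes) per RGBA pixel. Names here are illustrative.
const values = new Float32Array(numInputs);   // the input floats
const bytes  = new Uint8Array(values.buffer); // zero-copy byte view
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
              numInputs, 1, 0,                // one texel per float
              gl.RGBA, gl.UNSIGNED_BYTE, bytes);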

The fragment shader looks like this:

precision highp float;

uniform sampler2D aValues;
uniform vec2 aDimensions;

// Fetch the texel holding value #index: column = mod(index, width),
// row = index / width, offset to the texel centre and normalised to [0, 1].
vec4 getPointX(in sampler2D tex, in vec2 dimensions, in float index) {
  vec2 uv = (
         vec2(
          floor(mod(index, dimensions.x)),
          floor(index / dimensions.x)) + 0.5
         ) / dimensions;
  return texture2D(tex, uv).rgba;
}
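The same index-to-uv mapping can be mirrored in JavaScript for sanity-checking (a hypothetical helper, not part of the real code):

// CPU-side mirror of getPointX's uv computation, for debugging.
function indexToUV(index, width, height) {
  const x = Math.floor(index % width);
  const y = Math.floor(index / width);
  return [(x + 0.5) / width, (y + 0.5) / height];
}

console.log(indexToUV(5, 4, 4)); // texel (1, 1) -> [0.375, 0.375]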

// Per-channel scale factors for packing a float in [0, 1) into four bytes.
const vec4 bitEnc = vec4(1.0, 255.0, 65025.0, 16581375.0);
const vec4 bitDec = 1.0 / bitEnc;

// Pack a float in [0, 1) into an RGBA colour, one byte of precision per channel.
vec4 EncodeFloatRGBA (float v) {
    vec4 enc = bitEnc * v;
    enc = fract(enc);
    // Subtract the part that the next, finer channel already carries.
    enc -= enc.yzww * vec2(1.0 / 255.0, 0.0).xxxy;
    return enc;
}

// Unpack an RGBA colour produced by EncodeFloatRGBA back into a float.
float DecodeFloatRGBA (vec4 v) {
    return dot(v, bitDec);
}
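The same pack/unpack can be written in JavaScript to test round trips off the GPU; here is a rough mirror (assuming, as this scheme requires, a value in [0, 1)):

// JS mirror of the shader's packing, for checking round trips without WebGL.
// v must be in [0, 1); each channel holds successively finer fractional bits.
function encodeFloatRGBA(v) {
  const enc = [1, 255, 65025, 16581375].map(s => (s * v) % 1); // fract(bitEnc * v)
  for (let i = 0; i < 3; i++) enc[i] -= enc[i + 1] / 255;      // carry correction
  return enc.map(c => Math.round(c * 255));                    // quantise to bytes
}

function decodeFloatRGBA(bytes) {
  const bitDec = [1, 1 / 255, 1 / 65025, 1 / 16581375];
  return bytes.reduce((sum, b, i) => sum + (b / 255) * bitDec[i], 0);
}

console.log(decodeFloatRGBA(encodeFloatRGBA(0.1234))); // ≈ 0.1234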

void main(void) {
  // Fetch this fragment's texel, then round-trip it: RGBA -> float -> RGBA.
  vec4 a = getPointX(aValues, aDimensions, float(gl_FragCoord.x));

  float o = DecodeFloatRGBA(a);
  vec4  t = EncodeFloatRGBA(o);

  gl_FragColor = t;
}

with encoding/decoding as per this answer (the question there is about int, but the answer seems to cover float to RGBA and back).

Retrieving the pixels looks like this:

// Read back the RGBA bytes and view the same buffer as 32-bit floats.
const pixels  = new Uint8Array(numOutputs * 4);
const results = new Float32Array(pixels.buffer);
gl.readPixels(0, 0, numOutputs, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

However, the round trip RGBA => float => RGBA does not seem to work.

For example on an input of

Uint8Array(64) [ 222, 146, 192, 61, 116, 74, 28, 63, 56, 51, … ]

I get

Uint8Array(64) [ 222, 146, 192, 0, 116, 74, 28, 128, 56, 51, … ]

where every fourth element is not what I expect.
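I spot the differing bytes with a quick comparison along these lines (`inputBytes` being the Uint8Array I uploaded; the name is illustrative):

// Report every byte that changed across the GPU round trip.
pixels.forEach((b, i) => {
  if (b !== inputBytes[i]) {
    console.log(`byte ${i}: uploaded ${inputBytes[i]}, read back ${b}`);
  }
});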

If, in the above code, I instead do

gl_FragColor = a;

and skip the decoding/encoding, the round trip works cleanly, so it looks to me like the problem happens in the decoding/encoding step.

Where is the issue, then?

simone
  • Check your blending and alpha settings – LJᛃ Sep 08 '20 at 11:08
  • @LJᛃ the context is set up with `var gl = canvas.getContext("webgl", { alpha : true, antialias : false });` and the textures with twgl with `type: gl.UNSIGNED_BYTE, minMag: gl.NEAREST`, with everything else as default AFAIK. Where else should I be checking? Sorry that I have to ask, I'm still somewhat new to GLSL / GPGPU – simone Sep 08 '20 at 12:25
  • @LJᛃ also: I tested how things would work by feeding the RGBA input directly, as in `gl_FragColor = a`, and the round trip works cleanly, so it looks to me like the problem happens in the decoding/encoding step – simone Sep 08 '20 at 13:03
  • You could find alternative ways for coding/decoding here: https://stackoverflow.com/questions/39041140/how-to-quickly-pack-a-float-to-4-bytes – Sebastian Sep 08 '20 at 13:31

0 Answers