Description
I am getting into some complex WebGL decorating. This started when I needed some bars to use a diagonal hatch pattern while others stay solid. To simplify the issue, I reduced it to the example below, where I just want to use a custom shader to color individual datapoints. However, this code fails and I cannot figure out why: everything renders as black (the default gl_FragColor). Everything points to the shader not recognizing the aColor attribute. I don't know whether that is because the attribute (or the shader) is being created in the wrong place or context. It might also be that, because the program passed to the decorate function isn't strictly a WebGLProgram, the attribute doesn't get properly added. I'm unsure and would love some guidance.
Using d3fc version 15.2.13
const solidBarSeries = fc.seriesWebglBar()
    .equals((a, b) => a === b)
    .defined(() => true)
    .crossValue(d => d.selector)
    .mainValue(d => d.end)
    .baseValue(d => d.start)
    .bandwidth(d => barSize(yScale))
    .decorate((program, data) => {
        const colorsForShader = data.map(d => traceMeta[d.traceIndex].color);
        console.log(colorsForShader);

        // Vertex shader: forward the per-vertex color attribute to a varying
        program.vertexShader()
            .appendHeader(`attribute vec4 aColor;`)
            .appendHeader(`varying lowp vec4 vColor;`)
            .appendBody(`
                vColor = aColor;
            `);

        // Fragment shader: output the interpolated per-vertex color
        program.fragmentShader()
            .appendHeader(`varying lowp vec4 vColor;`)
            .appendBody(`gl_FragColor = vColor;`);

        // Per-datapoint color: build an attribute and apply it to the program as "aColor"
        fc.webglAttribute()
            .size(4)
            .type(gl.FLOAT)
            .data(colorsForShader) // '[[0.8,0.42,0.42,1],[0.35,0.6,0.85,1],[0.42,0.8,0.62,1],[0.8,0.42,0.42,1],[0.35,0.6,0.85,1],[0.33,0.46,0.56,1],[0.35,0.6,0.85,1]]'
            (program, "aColor");
    });
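For context, the effect I am after is the same per-datapoint coloring that the built-in fc.webglFillColor decorator gives. The sketch below is roughly the pattern from the d3fc examples adapted to my data (traceMeta, d.selector, d.start/d.end are from my code above, and I am assuming the helper works with seriesWebglBar the same way it does with seriesWebglPoint in the examples); this is the behavior I am trying to reproduce with a hand-rolled attribute so that I can later swap the fragment shader for a hatch pattern:

// Built-in per-datapoint fill color (pattern from the d3fc examples)
const fillColor = fc.webglFillColor()
    .value(d => traceMeta[d.traceIndex].color) // already a [r, g, b, a] array
    .data(data); // `data` is the same array bound to the series

const coloredBarSeries = fc.seriesWebglBar()
    .crossValue(d => d.selector)
    .mainValue(d => d.end)
    .baseValue(d => d.start)
    .bandwidth(d => barSize(yScale))
    .decorate(program => {
        // apply the prepared color attribute to the program builder
        fillColor(program);
    });

My suspicion is that a custom attribute needs to be registered with the program builder rather than invoked directly, perhaaps via something like program.buffers().attribute('aColor', ...) (guessing from reading the fc.webglFillColor source), but I have not been able to confirm this.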