
It seems to me that one could theoretically use WebGL for computation--such as computing primes or π or something along those lines. However, from what little I've seen, the shader itself isn't written in JavaScript, so I have a few questions:

  1. What language are the shaders written in?
  2. Would it even be worthwhile to attempt to do such a thing, taking into account how shaders work?
  3. How does one pass variables back and forth during runtime? Or if not possible, how does one pass information back after the shader finishes executing?
  4. Since it isn't JavaScript, how would one handle very large integers (like BigInteger in Java, or a ported version in JavaScript)?
  5. I would assume this automatically compiles the script so that it runs across all the cores of the graphics card; can I get a confirmation?

If relevant, in this specific case, I'm trying to factor fairly large numbers as part of a [very] extended compsci project.

EDIT:

  1. WebGL shaders are written in GLSL.
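For anyone else landing here, a minimal fragment shader in GLSL ES 1.00 (the dialect WebGL 1 accepts) looks something like the sketch below; in a compute setting, the colour channels would carry whatever values you want out:

```glsl
// Minimal WebGL 1 fragment shader (GLSL ES 1.00).
// For "compute" use, the r/g/b/a channels carry your result.
precision mediump float;

void main() {
    // one RGBA value per fragment, each channel in [0.0, 1.0]
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```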

2 Answers


I've used fragment shaders as compute shaders from JavaScript in Chrome using WebGL, solving the travelling salesman problem as a distributed set of smaller optimization problems, as well as a few other genetic optimization problems.

Problems:

  1. You can put floats in (r: 1.00, g: 234.24234, b: -22.0) but you can only get 8-bit integers out (r: 255, g: 255, b: 0). This can be overcome by encoding a single float into 4 integer channels as the output per fragment (see the packing sketch after this list). This is such a heavy operation that it almost defeats the purpose for 99% of problems. You're better off solving problems with simple integer or boolean sub-solutions.

  2. Debugging is a nightmare of epic proportions, and at the time of writing the community around this kind of WebGL compute is still small.

  3. Injecting data into the shader as pixel data is VERY slow, and reading it back out is even slower. To give you an example, reading and writing the data to solve a TSP problem takes 200 ms and 400 ms respectively, while the actual 'draw' or 'compute' time on that data is 14 ms. To be usable, your data set has to be large enough, and shaped the right way, for the compute time to dominate the transfer time (see the data-transfer sketch after this list).

  4. JavaScript is weakly typed (on the surface...), whereas OpenGL ES is strongly typed. In order to interoperate we have to use things like Int32Array or Float32Array in JavaScript, which feels awkward and constraining in a language normally touted for its freedoms.

  5. Big number support comes down to using 5 or 6 textures of input data, combining all that pixel data into a single number structure (somehow...), then operating on that big number in a meaningful way. Very hacky, not at all recommended.
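To make problem 1 concrete, here's a sketch of the usual packing trick (illustrative, not the code from my project): pack a float in [0, 1) into the four 8-bit channels of gl_FragColor in the shader, then rebuild it from the bytes that readPixels hands back:

```glsl
// Fragment shader side: pack a float in [0, 1) into 4 x 8-bit channels.
precision highp float;

vec4 packFloat(float v) {
    vec4 enc = vec4(1.0, 255.0, 65025.0, 16581375.0) * v;
    enc = fract(enc);
    enc -= enc.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
    return enc;
}

void main() {
    float result = 0.123456;          // stand-in for the per-fragment result
    gl_FragColor = packFloat(result); // comes back as 4 bytes via readPixels
}
```

```javascript
// JavaScript side: rebuild the float from the 4 bytes readPixels returns
// for that fragment (each byte is 0..255).
function unpackFloat(r, g, b, a) {
  return r / 255
       + g / (255 * 255)
       + b / (255 * 255 * 255)
       + a / (255 * 255 * 255 * 255);
}
```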
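And a sketch of the data path that problems 3 and 4 are complaining about, assuming `gl` is an already-initialised WebGL 1 context and `width`/`height` describe the render target (both are placeholders here, not names from my project):

```javascript
// Input: floats go in as texture data (needs the OES_texture_float extension).
gl.getExtension('OES_texture_float');

var input = new Float32Array(width * height * 4);   // 4 floats per "pixel"
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
              gl.RGBA, gl.FLOAT, input);             // slow part #1

// ... bind the program and draw a full-screen quad so the fragment
// shader runs once per output pixel ...

// Output: only 8-bit RGBA comes back, hence the packing trick above.
var output = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, output); // slow part #2
```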


4 Comments

Would you mind sharing your project? I want to do GPU compute without the usual GPGPU frameworks (e.g. no OpenCL, no CUDA, ...), just for fun and as a challenge. Every time I search for this, the answer is "not worth it, use OpenCL", and it's frustrating. Some code to get started would be nice.
My gists contain a lot of GPU projects: gist.github.com/adrianseeley that may be of some help. Here's a feed forward neural network on the GPU: gist.github.com/adrianseeley/08ca986403368018c1c3, here's a quantum particle swarm on the GPU: gist.github.com/adrianseeley/9fd4e0a28e8f559646c4, here's a GPU compute shader setup to get started from: gist.github.com/adrianseeley/f768fd7a3aab2370eafc - email me [email protected] for questions / consults.
@AdrianSeeley - hijacking these comments, sorry, but I'm wondering how slow "VERY" slow is, relative to data size - w.r.t the TSP taking .2-.4s to read the result.. what was the size of the input/output there please?
It's been a few laptops since I last ran this code, but I remember the timing for getting data in and out being upwards of 10x the timing of the computation on the GPU itself, which made the whole process more of a novelty at the time.

There's a project currently being worked on to do pretty much exactly what you're doing - WebCL. I don't believe it's live in any browsers yet, though.

To answer your questions:

  1. Already answered I guess!
  2. Probably not worth doing in WebGL. If you want to play around with GPU computation, you'll probably have better luck doing it outside the browser for now, as the toolchains are much more mature there.
  3. If you're stuck with WebGL, one approach might be to write your results into a texture and read that back (a minimal sketch follows this list).
  4. With difficulty. Much like CPUs, GPUs can only work with certain size values natively, and everything else has to be emulated.
  5. Yep.
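Roughly, that texture round trip looks like the sketch below in WebGL 1 (assuming `gl`, `width`, and `height` already exist and your "compute" program is compiled elsewhere; the names are placeholders):

```javascript
// Texture the fragment shader will render its results into.
var outTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, outTex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, null);

// Attach it to a framebuffer so drawing goes into the texture, not the canvas.
var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, outTex, 0);

// ... draw a full-screen quad with the "compute" fragment shader ...

// Read the results back into JavaScript.
var result = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, result);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
```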

1 Comment

Indeed, you'd better stick to WebCL and leave WebGL alone. I read somewhere that there's already a working plugin for Firefox made by Nokia. Just google it and give it a shot.
