Tags: flash, math, gpu, gpgpu, molehill

Fast arithmetic using the Flash 3D API?


Some computationally intensive software now uses the GPU to solve mathematical problems. Now that Flash has GPU support, is it possible to use Flash to crunch math problems? How would it be done?

In other words, does Flash expose a low-level enough API to control the behavior of the GPU for such a task?

Example problem: find a message with the following hash: 2987432847298374298374982374


Solution

  • Yes. You can provide textures for random-access input (constant buffers would be nicer, if they were exposed) and render to a BitmapData for output, which are the minimum operations you need for GPGPU; a rough sketch of that setup follows below. There isn't much information on how complete the API, or AGAL's instruction set, is. I haven't seen any examples of integer registers or bitwise operations, which would be incredibly useful for your example problem, so the potential performance might be far worse than a DirectX or OpenGL implementation could achieve, but it should still be far better than ActionScript!

    I should note, however, that this would be old-school GPGPU, which is even harder to do than GPGPU with the newer OpenCL and DirectCompute APIs.
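
    For concreteness, here is a minimal sketch of that old-school, render-to-texture style of GPGPU with the Stage3D/AGAL API. The function name runGpuAdd and the overall structure are illustrative rather than taken from any library; it assumes you already have a Context3D (obtained via stage.stage3Ds[0].requestContext3D()) and Adobe's AGALMiniAssembler utility class on the classpath, and it only performs an element-wise add on 8-bit BGRA channel data, since that is roughly what the fragment "kernel" can express without integer or bitwise instructions.

        // Sketch only: element-wise addition of two data sets on the GPU via Stage3D.
        import flash.display.BitmapData;
        import flash.display3D.*;
        import flash.display3D.textures.Texture;
        import com.adobe.utils.AGALMiniAssembler; // Adobe's AGAL helper, not part of the runtime

        function runGpuAdd(context3D:Context3D, inputA:BitmapData, inputB:BitmapData):BitmapData {
            var w:int = inputA.width;
            var h:int = inputA.height;
            context3D.configureBackBuffer(w, h, 0, false);

            // Upload the two inputs as textures. BGRA means 8 bits per channel, so values
            // are effectively bytes unless you pack/unpack wider numbers yourself.
            // (Stage3D textures must have power-of-two dimensions.)
            var texA:Texture = context3D.createTexture(w, h, Context3DTextureFormat.BGRA, false);
            var texB:Texture = context3D.createTexture(w, h, Context3DTextureFormat.BGRA, false);
            texA.uploadFromBitmapData(inputA);
            texB.uploadFromBitmapData(inputB);

            // Full-screen quad: x, y, z, w, u, v per vertex (UV orientation may need flipping).
            var verts:Vector.<Number> = Vector.<Number>([
                -1, -1, 0, 1,  0, 1,
                 1, -1, 0, 1,  1, 1,
                 1,  1, 0, 1,  1, 0,
                -1,  1, 0, 1,  0, 0]);
            var vb:VertexBuffer3D = context3D.createVertexBuffer(4, 6);
            vb.uploadFromVector(verts, 0, 4);
            var ib:IndexBuffer3D = context3D.createIndexBuffer(6);
            ib.uploadFromVector(Vector.<uint>([0, 1, 2, 0, 2, 3]), 0, 6);

            // Vertex shader: pass clip-space position and UV straight through.
            var vasm:AGALMiniAssembler = new AGALMiniAssembler();
            vasm.assemble(Context3DProgramType.VERTEX,
                "mov op, va0\n" +
                "mov v0, va1");

            // Fragment shader: the "kernel". One invocation per output pixel;
            // here it just samples both inputs and adds them channel-wise.
            var fasm:AGALMiniAssembler = new AGALMiniAssembler();
            fasm.assemble(Context3DProgramType.FRAGMENT,
                "tex ft0, v0, fs0 <2d, nearest, clamp>\n" +
                "tex ft1, v0, fs1 <2d, nearest, clamp>\n" +
                "add ft0, ft0, ft1\n" +
                "mov oc, ft0");

            var program:Program3D = context3D.createProgram();
            program.upload(vasm.agalcode, fasm.agalcode);

            context3D.setProgram(program);
            context3D.setTextureAt(0, texA);
            context3D.setTextureAt(1, texB);
            context3D.setVertexBufferAt(0, vb, 0, Context3DVertexBufferFormat.FLOAT_4);
            context3D.setVertexBufferAt(1, vb, 4, Context3DVertexBufferFormat.FLOAT_2);

            context3D.clear();
            context3D.drawTriangles(ib);

            // Read the result back to the CPU instead of presenting it to the screen.
            var output:BitmapData = new BitmapData(w, h, true, 0);
            context3D.drawToBitmapData(output);
            return output;
        }

    To do real work you would replace the add instruction with your own computation and encode inputs and outputs across the RGBA channels; the readback through BitmapData is the slow part, so the approach only pays off when the per-pixel arithmetic is heavy enough to amortize it.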