Tags: c++, node.js, cuda

How can I use CUDA with Node.js?


CUDA is an NVIDIA-provided API that lets C/C++ code use the GPU for certain kinds of computation. I don't know exactly what those are, but I would like to find out; from what I've seen, the gains are remarkable. Note that CUDA only works on NVIDIA GPUs.

There is an existing module for Node.js, but it only supports the 64-bit version of Windows, even though CUDA exists for 32-bit as well, so the missing piece is the C++ binding/extension from Node.js to CUDA. There is also no sign of documentation for that module anywhere on GitHub or the wider internet, and the last commits were half a year or more ago.

If this is at all possible, it would be great: Node.js could use the GPU for operations, putting it on a whole new level for web work and other applications. Also, given the parallel nature of Node.js, it seems a natural fit for the GPU's parallel nature.

Suppose no such module exists right now. What are my choices?

It's been done already by someone else: http://www.cs.cmu.edu/afs/cs/academic/class/15418-s12/www/competition/r2jitu.com/418/final_report.pdf


Solution

  • The proper way to do this is to use the NVIDIA CUDA Toolkit to write your CUDA app in C++ and then invoke it as a separate process from Node (a minimal sketch follows below). This way you can get the most out of CUDA and draw on the strengths of Node for controlling that process.

    For example, if you have a CUDA application and you want to scale it to, say, 32 computers, you would write the application in fast C or C++ and then use Node to push it out to all the PCs in the cluster and handle communication with each remote process over the network. Node shines in this area. Once each CUDA app instance finishes its job, you join all the data with Node and present it to the user.
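
    To make the pattern concrete, here is a minimal sketch of the Node side. The binary name ./vector_add and its --n flag are hypothetical stand-ins for whatever CUDA app you compile with the toolkit; Node simply launches it as a child process and collects its output:

        // run-cuda.js — a minimal sketch, assuming a pre-built CUDA binary
        // at ./vector_add (hypothetical) that prints its result to stdout.
        const { execFile } = require('child_process');

        // Launch the CUDA app as a separate process and capture its output.
        execFile('./vector_add', ['--n', '1000000'], (err, stdout, stderr) => {
          if (err) {
            console.error('CUDA process failed:', stderr);
            return;
          }
          // Node takes over from here: parse the output, aggregate results
          // across machines, or serve them over HTTP.
          console.log('GPU result:', stdout.trim());
        });

    From here, scaling out to the 32-machine case is a matter of running this launch step on each box and sending the stdout back over a socket or HTTP, which is exactly the kind of I/O coordination Node is good at.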