Is there a way to measure the execution time of a given compute shader in Unity?
I thought of artificially adding a compute buffer to this compute shader and calling `GetData` on it, because I know this function blocks the CPU until the GPU calculation is over. But that seems like a rough method... moreover, how do I know how long the `GetData` itself will take in this case?
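For reference, here is a minimal sketch of that approach. The `shader`, `kernel`, and `resultBuffer` variables are assumed to be set up elsewhere, and the thread-group counts and buffer size are purely illustrative:

```csharp
using System.Diagnostics;
using UnityEngine;

// Assumed to exist: a ComputeShader `shader`, a kernel index `kernel`,
// and a ComputeBuffer `resultBuffer` of 1024 floats bound to the kernel.
var stopwatch = Stopwatch.StartNew();

shader.Dispatch(kernel, 64, 1, 1);

// GetData blocks the CPU until the GPU work is finished, so the
// elapsed time includes the dispatch, the shader execution, AND the
// readback transfer itself -- which is why the measurement is rough.
var results = new float[1024];
resultBuffer.GetData(results);

stopwatch.Stop();
UnityEngine.Debug.Log($"Dispatch + readback: {stopwatch.Elapsed.TotalMilliseconds} ms");
```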
Another solution I tried was to use a GraphicsFence and check when its `passed` property was set to true, but I can only do that in a coroutine, which is resumed once per frame, so the estimated execution time will always be greater than the duration of a frame...
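The fence approach looks roughly like the following sketch (again, `shader` and `kernel` are assumed, and the group counts are illustrative), which makes the per-frame quantization problem visible:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Rendering;

public class ComputeFenceTimer : MonoBehaviour
{
    // Assumed to be assigned in the Inspector or elsewhere.
    public ComputeShader shader;
    public int kernel;

    IEnumerator MeasureWithFence()
    {
        float start = Time.realtimeSinceStartup;

        shader.Dispatch(kernel, 64, 1, 1);

        // Insert a fence after the dispatch on the GPU timeline.
        GraphicsFence fence = Graphics.CreateGraphicsFence(
            GraphicsFenceType.AsyncQueueSynchronisation,
            SynchronisationStageFlags.ComputeProcessing);

        // `passed` is only re-checked when the coroutine resumes,
        // i.e. at most once per frame, so the measured time is
        // quantized to whole frame durations.
        while (!fence.passed)
            yield return null;

        float elapsedMs = (Time.realtimeSinceStartup - start) * 1000f;
        Debug.Log($"Dispatch fenced after {elapsedMs} ms (frame-granular)");
    }
}
```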
I've since found that it's possible to retrieve the timing data recorded by the Unity Profiler. So I wrote a small tool that collects this data and computes an average execution time for a given compute shader.
The tool is available at this link: https://github.com/davidAlgis/com.studio-nyx.compute-shader-performance-estimation.
Although slow, it gives a good idea of the average execution time of a compute shader without having to write any code.