SpeciesNet runs slowly and does not seem to use the GPU

After reinstalling the latest version (described as 6.10, but apparently 6.09) and running SpeciesNet as the animal classifier (with Australia as the location) over some test data, GPU use was 0% and processing speed was about 2 images/sec. By comparison, the general classification (using MegaDetector) runs at about 6 images/sec with GPU use above 0%.

Hi Simon,

You’re right! It doesn’t use the GPU on Windows.

That’s because it uses TensorFlow 2, which has dropped NVIDIA GPU support on native Windows. There is nothing I can do about that from the AddaxAI side. If you really want to use your GPU, I advise installing AddaxAI via the Windows Subsystem for Linux (WSL); that way it will be able to find your NVIDIA GPUs.
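
If you want to verify this on your own machine, a quick diagnostic along these lines (plain TensorFlow, not AddaxAI-specific, and assuming TensorFlow is installed in the active environment) shows whether TensorFlow can see the GPU at all:

```python
# Check whether TensorFlow can see an NVIDIA GPU.
# On native Windows with recent TensorFlow releases this list is typically empty;
# inside WSL (with the NVIDIA driver and CUDA libraries set up) the GPU should appear.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    print("GPU(s) visible to TensorFlow:", gpus)
else:
    print("No GPU visible; inference will fall back to the CPU.")
```

Running the same check natively and inside WSL should make the difference obvious.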

There might be a future release of SpeciesNet that is based on PyTorch, which would be able to use your GPU on native Windows. But who knows how long that might take…

Hope this helps!

Peter

Just in case anyone arrives at this thread in the future, this issue has been resolved: the SpeciesNet weights were ported to PyTorch, and Peter incorporated the updated SpeciesNet code/weights into AddaxAI. Consequently, SpeciesNet can now use the GPU on Windows within AddaxAI. Happy SpeciesNet’ing!
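
If you want to confirm that the PyTorch build can actually reach your GPU before kicking off a big batch, a minimal check like the sketch below works (generic PyTorch, not specific to AddaxAI or SpeciesNet):

```python
# Check that PyTorch can reach a CUDA-capable GPU on native Windows.
import torch

print("PyTorch version:", torch.__version__)

if torch.cuda.is_available():
    print("CUDA device:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device visible; SpeciesNet would run on the CPU.")
```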

I shall leave you with an image of the only bear that ever wandered through my yard (which is right around the corner from a 7-11, hardly in the wilderness), with SpeciesNet output, rendered in AddaxAI after running on the GPU.
