I have been using Paralution in a GPU-based Lagrangian fluid simulation written in C++ for a quarter of a year now, and I am very happy with it! The only problem I am facing is that I have not found a way to print my memory usage during the simulation. It does not matter whether it covers only the solver or everything together. Right now I can only output the sizes of the matrix and the solution vector from outside Paralution, look at nvidia-smi, and treat the remainder as the solver's footprint.
So if you know how to output the memory size of the solver, or of anything else from inside Paralution, I would be very grateful.
Best regards, and have a nice Christmas and New Year's Eve!