Deleting scratch files before a computation finishes

Hello, I am encountering a persistent runtime issue, and I suspect I know what might be causing it. I am using PsiAPI to write a script that calculates the bond energy of a diatomic molecule at various bond lengths using several different methods (SCF and CCSD(T), each with a double-zeta and a triple-zeta basis in turn) and plots the results as a function of bond length. The objective of this exercise is to visually compare the results of the different methods. I implemented this as a for loop that substitutes a new position coordinate for the second atom into a geometry string, passes the result to psi4.geometry(), and then computes the energy with each of the desired methods, saving every result into a NumPy matrix. However, the script never runs to completion, and it yields the following error when it fails:

Fatal Error: PSIOManager cannot get a mirror file handle

Error occurred in file: D:\a\1\s\psi4\src\psi4\libpsio\ on line: 168

I suspect that what is occurring here is that the results of every one of these energy calculations, at every iteration of the loop, are being saved into a temporary folder as numbered scratch files that don't get deleted until the script stops running. Because of the sheer number of calculations the script has to perform, it either runs out of space or runs out of valid numerical names for all these scratch files before the calculation finishes. Is my guess correct here?

Can I tell the script to delete all the scratch files at the end of each loop iteration, rather than at the end of the entire script? Once the loop moves on to the next iteration and the energy values from the previous one have been saved into the matrix, the contents of the old scratch files should never be relevant again. Alternatively, is there a more efficient, less computationally intensive way to accomplish the same task without using a for loop? Please note that my script needs to be written in PsiAPI style, since it makes use of functions from other modules such as NumPy, Matplotlib, and Pandas that I do not know how to incorporate into the code in Psithon style. The version of Psi4 I am using is Psi4 1.4rc2.dev1.
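For reference, the loop I described is structured roughly like this (a simplified sketch, not my actual script: the molecule N2, the bond-length grid, and the method/basis strings are placeholders, the CP/NoCP BSSE variants are omitted for brevity, and the psi4 import is guarded so the sketch can be read without a working installation):

```python
import numpy as np

try:
    import psi4                      # only needed to actually run the scan
except ImportError:
    psi4 = None

# Placeholder molecule and grid: N2 over 27 bond lengths (angstroms).
bond_lengths = np.linspace(0.8, 1.4, 27)
methods = ["scf/cc-pvdz", "scf/cc-pvtz",
           "ccsd(t)/cc-pvdz", "ccsd(t)/cc-pvtz"]   # assumed basis names

def make_geometry(r):
    """Geometry string with the second atom moved out to distance r."""
    return f"N 0.0 0.0 0.0\nN 0.0 0.0 {r}"

# One row per bond length, one column per method.
energies = np.zeros((len(bond_lengths), len(methods)))

if psi4 is not None:
    for i, r in enumerate(bond_lengths):
        psi4.geometry(make_geometry(r))
        for j, method in enumerate(methods):
            energies[i, j] = psi4.energy(method)
```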

I’m new to Psi4 so I apologize if these are trivial questions. I would appreciate any help you can offer me. Thank you sincerely.

The function to clear away all temporary files is psi4.core.clean(). Try using that. If that doesn’t work, the debugging process is going to get a bit more technical.
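Concretely, the call goes at the end of each loop iteration, after that iteration's energies have been stored (a minimal sketch; N2 and the method string are placeholders, and the psi4 import is guarded so the snippet is readable without an installation):

```python
try:
    import psi4
except ImportError:
    psi4 = None

r_values = [0.9, 1.0, 1.1]   # placeholder bond lengths

for r in r_values:
    geom_str = f"N 0 0 0\nN 0 0 {r}"
    if psi4 is not None:
        psi4.geometry(geom_str)
        e = psi4.energy("scf/cc-pvdz")   # plus the other energy calls
        # ... store e in your results matrix ...
        psi4.core.clean()   # delete this iteration's scratch files
```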

That’s unfortunate, I’ve been using psi4.core.clean() this whole time (after each energy calculation). I suppose it must be something else.

What number of calculations are we talking about here?

Do you have any psi4.$PID.clean files after the script crashes?

It is a simple record file that keeps track of most of the files that are created and that will be deleted when clean() runs.
Why Psi4 would crash when opening that file for writing is unclear to me.

There are eight energy calculations in each loop iteration (SCF and CCSD(T) energy methods, with both CP and NoCP BSSE corrections, in both the double-zeta and the triple-zeta basis), and the script is set to loop 27 times, so 216 energy calculations. The script might also be generating scratch files whenever it runs psi4.geometry(), which happens once per iteration before the energy calls, in which case there would be 243 of them.

Where could I check for psi4.$PID.clean files?

The .clean files are written to the current working directory.
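For example, you could list them from Python like this (assuming the files follow the psi4.$PID.clean naming pattern):

```python
import glob
import os

# The .clean files land in the current working directory.
clean_files = sorted(glob.glob(os.path.join(os.getcwd(), "psi4.*.clean")))
for path in clean_files:
    print(path)
```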

That number of calculations should not be too concerning.
We know of an issue where not all memory is freed after an energy call. It is tricky to track down, and it is not a lot of memory, but it could add up over hundreds of calculations.
Is there a potential memory concern on your machine?
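One quick way to check is to log the process's peak resident memory at each iteration and see whether it creeps upward. A Unix-only sketch using only the standard library (on Linux, ru_maxrss is reported in kilobytes; on macOS it is bytes):

```python
import resource

def peak_rss_kb():
    """Peak resident set size of this process so far (kB on Linux)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

for i in range(3):          # stand-in for the energy loop
    # ... run one iteration's energy calls here ...
    print(f"iteration {i}: peak RSS ~ {peak_rss_kb()} kB")
```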

I’ve found the .clean files; there are 69 of them, dating between May 24 and yesterday (June 23). My computer still has 7.89 GB of RAM available, and since the .clean files are at most a few KB each, they clearly aren’t going to pose a problem in that respect. Do you know why the script might be failing to get a mirror file handle, if the number of calculations isn’t the problem?