I started toying around with atomic RAS-CI. Is there something wrong with the following input?
molecule {
0 5
Fe
}
set basis pcseg-0
set reference rohf
set scf_type pk
set mcscf_type conv
# Ar core
set restricted_docc [3, 0, 0, 0, 0, 2, 2, 2]
# spd active space
set ras2 [3, 1, 1, 1, 0, 1, 1, 1]
cas_e, cas_wfn = energy('rasscf', return_wfn=True)
# Dynamical correlation from Ar core
set restricted_docc [0, 0, 0, 0, 0, 0, 0, 0]
set ras1 [3, 0, 0, 0, 0, 2, 2, 2]
ras_e, ras_wfn = energy('detci', ref_wfn=cas_wfn, return_wfn=True)
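For what it's worth, here is a quick standalone sanity check on the orbital bookkeeping (not part of the input itself, and assuming Psi4's usual Cotton ordering of the D2h irreps, Ag, B1g, B2g, B3g, Au, B1u, B2u, B3u): the Ar core should be 9 orbitals (1s 2s 2p 3s 3p) and the spd active space should also be 9 (4s + 3d + 4p), which the arrays do satisfy.

```python
# Per-irrep orbital counts from the input, D2h order assumed:
# Ag, B1g, B2g, B3g, Au, B1u, B2u, B3u
restricted_docc = [3, 0, 0, 0, 0, 2, 2, 2]  # Ar core in the RASSCF step
ras1 = [3, 0, 0, 0, 0, 2, 2, 2]             # same core, moved to RAS1 for DETCI
ras2 = [3, 1, 1, 1, 0, 1, 1, 1]             # spd active space

# Ar core: 1s, 2s, 3s (3 x Ag) + 2p, 3p (2 each in B1u, B2u, B3u) = 9
print(sum(restricted_docc), sum(ras1))  # 9 9

# spd space: 4s + dz2 + dx2-y2 (3 x Ag), dxy/dxz/dyz (B1g/B2g/B3g),
# 4p (B1u/B2u/B3u) = 9
print(sum(ras2))  # 9
```

So at least the per-irrep counts look consistent between the two steps.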
The output from the second part looks very weird:
==> Starting CI iterations <==
H0 Block Eigenvalue = -1259.69953111
Simultaneous Expansion Method (Block Davidson Method)
Using 1 initial trial vectors
Iter Root Total Energy Delta E C RMS
@CI 0: 0 -104.599203359018 -1.0460E+02 1.7097E+00
@CI 1: 0 -114.034313448065 -9.4351E+00 2.8252E+01
@CI 2: 0 -126.656421888822 -1.2622E+01 1.2979E+01
@CI 3: 0 -129.395267421081 -2.7388E+00 9.7215E+00
@CI 4: 0 -131.864369339767 -2.4691E+00 1.1087E+01
@CI 5: 0 -144.156387935524 -1.2292E+01 2.5547E+01
@CI 6: 0 -152.844222654133 -8.6878E+00 1.0207E+01
@CI 7: 0 -154.032962698529 -1.1887E+00 2.7989E+00
@CI 8: 0 -154.146131021761 -1.1317E-01 2.2249E+00
@CI 9: 0 -154.270646128935 -1.2452E-01 2.5911E+00
@CI 10: 0 -158.914747893258 -4.6441E+00 2.6410E+01
@CI 11: 0 -170.712879724983 -1.1798E+01 1.7935E+01
@CI 12: 0 -175.721291161824 -5.0084E+00 1.0286E+01
@CI 13: 0 -176.954792530500 -1.2335E+00 5.8794E+00
@CI 14: 0 -177.398912783427 -4.4412E-01 2.7091E+00
@CI 15: 0 -177.468521952843 -6.9609E-02 1.1821E+00
@CI 16: 0 -177.486922248949 -1.8400E-02 8.1383E-01
@CI 17: 0 -177.510798336187 -2.3876E-02 1.4520E+00
@CI 18: 0 -177.679485706644 -1.6869E-01 3.7443E+00
@CI 19: 0 -178.051070273708 -3.7158E-01 3.6720E+00
@CI 20: 0 -178.247547328802 -1.9648E-01 2.2715E+00
@CI 21: 0 -178.343356506871 -9.5809E-02 1.6707E+00
The H0 block eigenvalue looks alright, but it’s as if detci is mangling its guess vector: even after 21 iterations the energy is still roughly 1080 Hartree above the H0 block eigenvalue…
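Just to quantify that (a trivial check on the numbers pasted above, nothing more):

```python
# H0 block eigenvalue vs. the last Davidson iterate from the log
h0_block = -1259.69953111        # "H0 Block Eigenvalue" line
last_iter = -178.343356506871    # @CI 21 total energy
print(round(last_iter - h0_block, 2))  # 1081.36 -> over a thousand Hartree off
```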