For very large models, memory requirements become demanding. Would it be possible to formally state the scaling laws in the documentation, or even provide functions that estimate absolute memory usage? Of particular interest is the impact of the total number of bands to be solved, whose contribution came as a surprise to us. If an analytical answer is not possible, I would be happy to contribute empirical results from a research cluster toward a proposed test suite.
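To make the request concrete, here is a minimal sketch of the kind of estimator function I have in mind. Everything in it is an assumption on my part: I am guessing that eigenvector storage scales linearly in both the number of grid points and the number of bands, and the `n_copies` working-set factor is a placeholder constant, not a measured value from the actual solver.

```python
def estimate_memory_bytes(n_grid, n_bands, bytes_per_scalar=16, n_copies=4):
    """Hypothetical rough upper bound on solver memory.

    Assumes (not verified against the real implementation):
      - one complex double (16 bytes) per grid point per band,
      - the solver holds n_copies full working copies of the band set
        (e.g. current iterate, residuals, preconditioned vectors).
    """
    per_copy = n_grid * n_bands * bytes_per_scalar  # one full set of eigenvectors
    return n_copies * per_copy


# Example: a 256^3 grid with 40 bands under these guessed constants.
gib = estimate_memory_bytes(256**3, 40) / 2**30
print(f"estimated peak memory: {gib:.1f} GiB")
```

Even a crude formula like this, with the real constants filled in by the developers, would let users size jobs before submitting them. If the true scaling in the number of bands is superlinear (which our observations hint at), that is exactly the term I would like to see documented.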