Splitting jobs on nodes - Memory

Hello,

I'm trying to figure out the most efficient way to split my jobs on the cluster. I have 8-core nodes with 32 GB of RAM each.
How can I calculate how much memory I need for a given job?
Right now, when submitting 200 images of 350x350 pixels to one core, I use about 10 GB of RAM. But I don't know what takes up that space.

Thank you in advance.

Amedee

Formulas for memory requirements were published in the 2007 Frealign paper. If you set FSTAT = F, the memory requirement in bytes should be

M = 24 * N*N*N + 4 * IBLOW*IBLOW*IBLOW * N*N*N + 200 * 1000000

N     = particle dimension
IBLOW = padding factor (can be 1, 2 or 4)

For N = 350 and IBLOW = 4, M = 12.2 x 10^9 bytes, i.e. about 11.4 GiB, which matches the ~10 GB you observe. If you set IBLOW = 1, this reduces to about 1.3 GiB.
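The formula above is easy to evaluate when planning how many jobs fit on a node. A minimal sketch in Python (the function name is mine, not part of Frealign):

```python
# Estimate Frealign memory use per process, following the formula from
# the 2007 Frealign paper with FSTAT = F:
#   M = 24*N^3 + 4*IBLOW^3*N^3 + 200*1000000   (bytes)

def frealign_memory_bytes(n, iblow):
    """Estimated memory in bytes for particle dimension n and padding factor iblow."""
    if iblow not in (1, 2, 4):
        raise ValueError("IBLOW must be 1, 2, or 4")
    return 24 * n**3 + 4 * iblow**3 * n**3 + 200 * 1_000_000

# Worked example from above: N = 350, IBLOW = 4
m = frealign_memory_bytes(350, 4)
print(f"{m / 2**30:.1f} GiB")  # about 11.4 GiB
```

For your nodes, 32 GB across 8 cores is about 4 GB per core, so with IBLOW = 4 you cannot run one process per core; with IBLOW = 1 (about 1.3 GiB per process) all 8 cores fit comfortably.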