From: Robert A. Knop Jr. (robert.a.knop@vanderbilt.edu)
Date: Mon Apr 05 2004 - 05:52:19 PDT
It can be done, but here's the reason I hesitate: the memory
requirements are rather large.
Some math: each 4k x 4k image takes up 64MB of system memory when it is
read in. By default, each subtraction starts with *four* images read in
(new, sub, sub0, ref), or 256MB. Naturally, there are other overheads:
these images are a bit bigger than 4kx4k, and memory is also needed for
generating displays, for keeping object lists, for the code itself, and
for other variables.
As such, the effective memory footprint of searchscan is between 300MB
and 350MB for this search. That's fine on a machine with half a gig of
RAM, but will get slower on a machine with less.
If you want to be able to see the individual images from within
searchscan, it will *double* the memory footprint of searchscan.
This means that (a) you need at least a gig of RAM on any machine that
will do scanning, and (b) not much else should be running on those
machines.
Is the group willing to upgrade the machines to be used for scanning
(including the ones at Vanderbilt) so that we have enough memory to
integrate the individual images into searchscan for the May, July, and
August searches?
More disk space in Berkeley is probably going to be needed too.
-Rob
--
Prof. Robert Knop
Department of Physics & Astronomy, Vanderbilt University
robert.a.knop@vanderbilt.edu
This archive was generated by hypermail 2.1.4 : Mon Apr 05 2004 - 05:52:43 PDT