Gerson says he is still working on the extended Leibundgut curve. He says there are two spots where it isn't perfect; I didn't manage to get the dates of those spots. Gerson is just adding in a parabola, while Don (Gerson tells us) is doing something fancier. The goal of all of this is to come up with a final preferred Type Ia light-curve template, one which does go through all the data.
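For illustration only (the notes don't record Gerson's actual procedure, and all names here are invented), "adding in a parabola" over a chosen stretch of the template might look something like:

```python
import numpy as np

def extend_template_with_parabola(epoch, mag, fit_range):
    """Fit a parabola to the template over fit_range = (t0, t1) and
    return it as a callable, e.g. for patching a spot where the
    template isn't perfect.  A sketch, not Gerson's code."""
    sel = (epoch >= fit_range[0]) & (epoch <= fit_range[1])
    return np.poly1d(np.polyfit(epoch[sel], mag[sel], deg=2))
```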
Mike is trying to find white dwarfs in our data by looking for things with proper motion over a baseline of (e.g.) a year.
Mike got his thesis out in December. The final thing turned out to be a progress report, as he's got a few more months of work to do. Saul suggests that Mike put a copy of the thesis on the web.
So far, he has searched 9 fields, with about 50 images per field. They have candidates, but they need more advanced photometry (i.e. colors). He is also having trouble in that his position errors don't follow nice sigma=1 Gaussians; the distributions are skinnier than that. His errors come from the median residual of the transformations. Saul says that, as a rule of thumb, the fraction of a pixel to which you should be able to transform is one over the square root of the number of photons in the second-brightest pixel. Rob points out (and Mike agrees) that the dominant errors are actually in the transformations, since we're dealing with multiple images; those tend to be quite a bit larger than the single-image centroiding error.
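As a sketch of Saul's rule of thumb (the units and background handling here are assumptions, not from the notes):

```python
import numpy as np

def centroid_precision_rule_of_thumb(stamp):
    """Saul's rule of thumb: the fraction of a pixel to which you
    should be able to transform is roughly 1/sqrt(photons in the
    second-brightest pixel).  `stamp` is assumed to be a 2-D array
    of background-subtracted pixel values in photons."""
    second_brightest = np.sort(stamp, axis=None)[-2]
    return 1.0 / np.sqrt(second_brightest)
```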
He chooses one image to be the "fixed" image, and he transforms the coordinates from every image of that field to that fixed image. He uses all of the objects in a field to figure out the differences between original position and position on the fixed image; that gives an estimate of the error in the compared position (the difference in position) as a function of signal-to-noise. (There's an implicit assumption that most things don't move.)
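A minimal sketch of that error-estimation step, assuming already-matched object lists (the function and variable names are invented; the notes don't describe Mike's actual code):

```python
import numpy as np

def position_scatter_vs_snr(fixed_xy, moved_xy, snr, nbins=10):
    """After transforming an image's coordinates onto the fixed image,
    the residual offsets of the (mostly non-moving) objects give an
    empirical position error as a function of signal-to-noise.

    fixed_xy, moved_xy : (N, 2) matched positions in pixels
    snr                : (N,) signal-to-noise of each object
    """
    dx, dy = (moved_xy - fixed_xy).T
    resid = np.hypot(dx, dy)                  # radial offset per object
    edges = np.quantile(snr, np.linspace(0, 1, nbins + 1))
    idx = np.clip(np.digitize(snr, edges) - 1, 0, nbins - 1)
    # Median residual per S/N bin, since the errors come from the
    # median residual of the transformations.
    med = np.array([np.median(resid[idx == b]) for b in range(nbins)])
    return edges, med
```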
Eventually they plot position versus time and fit a line to it. Over the ensemble of objects, the chi-square per degree of freedom should come out to one, but right now it's coming out lower, so he's overestimating his error bars. Currently he's trying to figure out what's going on; one possibility is correlations in the transformations.
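For concreteness, the check could look like the following: a weighted straight-line fit of position versus time and its chi-square per degree of freedom. This is a sketch under assumed inputs, not Mike's code:

```python
import numpy as np

def reduced_chisq_of_linear_fit(t, x, sigma):
    """Fit x(t) = a + b*t with per-point errors sigma and return
    chi-square per degree of freedom.  Over many (non-moving) objects
    this should average to one if the errors are right; values
    systematically below one mean the error bars are overestimated,
    or the points are correlated (e.g. through shared transformations).
    """
    w = 1.0 / sigma**2
    A = np.vstack([np.ones_like(t), t]).T     # design matrix for a line
    cov = np.linalg.inv(A.T @ (w[:, None] * A))
    coef = cov @ (A.T @ (w * x))              # weighted least squares
    resid = x - A @ coef
    return np.sum(w * resid**2) / (len(t) - 2)
```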
Gerson suggests that some of Mike's methods could be used to search out and do something with AGNs. Saul is particularly interested in this with respect to variability studies (especially with regard to things like GRB candidates).
Susana tells us that they've given the Forest Service the draft of the final plans for the observatory. She says that they're supposed to get back to her by the end of the week.
Lots of things are waiting for the move. The mirror is way dirty; they will wash and realuminize it when the telescope moves.
Alex Lewin is visiting from Davis. She's working with snminuit. Her eventual goal is to try to run the other group's data through our stuff, to see if she can duplicate their results. She will eventually need the help of Robert Quimby for running the cosmology software.
Rob talks about getting the Millennium cluster working for the nearby search.
Robert's been writing support code for the nearby search. LinCal is his program that keeps a full calendar of when there is observing time, etc. He's also written "What's Up", which has a text interface (so one could telnet in to panisse as deep and test it).
Rob mentioned the DPF meeting. He said that .... had complained about our saying that our data indicates that the universe will expand forever....
ROB FILL THIS IN
oops, I never did
Saul talks about the AAS. He says the poster and talks went well. One question they got was: when measuring things which are very faint, how do you know there isn't some sort of systematic observational effect futzing up your data? The biggest worry would be some sort of linearity problem. We don't think that this is likely to be an issue, but it would be nice to be able to prove it.