We start with Dr. Groom's Grammatical Clinic dissecting Saul's new opening paragraph to the Nature paper.
So, yes, the wording needs to be rewritten. How about the overall content? Greg thinks the level is too low; that is, he thinks things are gone over in a bit too much detail. Saul was aiming to explain the context of what we were doing: what is the scientific problem, how has it been addressed so far, and what is new about this work? (This seems to be what Sage wants, all in 180 words.)
Further discussion over just how much background goes into that first paragraph.... Peter thinks we can get rid of all the references to what has been done. Others aren't sure they agree. Don is talking about redshifts. Susanna is the Scientist Formerly Known as Sick. Rob is writing non-sequitur paragraphs, as he feels that is the best way to simulate this discussion.
Finally we get to telling Peter, point by point, the things (concepts) we want in the first paragraph; Peter will take a stab at writing it.
Next issue of the paper: the extinction section. People seem to be happy with the general idea of what we have. (Why we don't think there is extinction; how screwed we would be if there were; HST later will help; we go on ignoring how screwed we might be.)
Last paragraph: Sage asked for a projected errorbar; how well can we do when we have more z=0.8 SNe? Do we really want to do this? This is a troublesome prediction to make. We say we will be able to measure Omega and Lambda independently... what does that mean? How well? How many SNe would it take? Greg suggests we say that our point is that at this distance, with HST, you can get your photometric accuracy down to the level of the intrinsic dispersion of SNe, or better.
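To see why Greg's point matters, the usual quadrature combination of error sources makes it clear: once the photometric error drops below the intrinsic dispersion, further photometric improvement barely helps. A minimal sketch; the dispersion numbers below are illustrative only, not the paper's actual values:

```python
import math

def total_sigma(sigma_phot, sigma_intrinsic):
    """Combine independent error sources in quadrature."""
    return math.sqrt(sigma_phot**2 + sigma_intrinsic**2)

# Illustrative numbers: halving the photometric error from 0.10 to 0.05 mag
# barely changes the total once the intrinsic dispersion (~0.20 mag here,
# an assumed value) dominates.
print(round(total_sigma(0.10, 0.20), 3))  # 0.224
print(round(total_sigma(0.05, 0.20), 3))  # 0.206
```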
This week Greg figured out that HST has yet another problem: short exposures have a different zeropoint from long exposures. Electrons are lost in short exposures, it seems. Somebody told him that the effect could be as much as 0.8 magnitudes for our dimmest SN! Greg isn't sure that this number is accurate, however. He says the numbers aren't really all that well known yet, either. Theoretically, this month they are getting more data to try to determine the size of this effect. The direction of the effect is that currently we think the SN is fainter than it really is (if the effect applies, which Greg suspects it does).
Saul has another take on this. Andy Fruchter told Saul to talk to Peter Stetson. Currently, Saul says, Stetson sees it as a difference in zeropoint between images of different gains and different exposure times. Stetson has zeropoints (unpublished) which are 4%+-1% different. Greg argues that he doesn't think the effect can be stated as a simple zeropoint difference. He wants to know if Stetson has long and short exposures of the same exact field. Greg thinks that Stetson is describing what he can do with his data, but that it doesn't match the effect Greg is talking about. The next thing is that Saul says the correction he found is in the other direction: our SN is fainter than we thought it was. So do we have two effects going in opposite directions?
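For scale, the two numbers being debated can be put on a common footing with the standard relation between magnitude offsets and flux ratios, m = -2.5 log10(F/F0). A quick sketch (the 0.8 mag and 4% figures come from the discussion above; nothing here is a final calibration):

```python
import math

def mag_offset_to_flux_ratio(dm):
    """Flux ratio corresponding to a magnitude offset dm."""
    return 10 ** (dm / 2.5)

def flux_ratio_to_mag_offset(ratio):
    """Magnitude offset corresponding to a flux ratio."""
    return 2.5 * math.log10(ratio)

# Greg's worst case: 0.8 mag on the dimmest SN is a factor ~2 in flux.
print(round(mag_offset_to_flux_ratio(0.8), 2))   # 2.09

# Stetson's 4% zeropoint difference is only ~0.04 mag.
print(round(flux_ratio_to_mag_offset(1.04), 3))  # 0.043
```

So the two claimed effects differ by more than an order of magnitude in size, which is consistent with Greg's suspicion that they aren't the same thing.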
Peter mentions that since we were on a gain of 7, there was yet another correction for "Holtzman's stuff" which we should have applied, but didn't. (Yipes a ripes.) These corrections convert the gain-of-14 zeropoint to the gain-of-7 zeropoint. Er.... The numbers we were given by HST, and the ones that Peter used, were different from the ones Holtzman quoted for the zeropoint. The consensus is that people believe these corrections were already put in.
Re: all of this, rather than starting to kill ourselves to fix all these, Peter suggests that we state in the figure caption what we did, and not worry about corrections that aren't yet characterized. Greg perhaps wants to point out that our photometric errors are already large on the points most affected, so these effects wouldn't throw off our conclusions.
Estimating this is hard because we don't have a physical model of the effect, only an empirical one which may not apply in our regime. So finding the differences in magnitudes is very hard.
Devil's Advocate position: we have not demonstrated in this paper that SNe can be studied with HST with sufficient accuracy to solve this problem. How do we solve it? Peter says we still have the data; we can revise our results as updated HST corrections come in.
A slowly emerging consensus seems to be that in the caption we just say that, hey, we did this based on what we know already about the HST, and that we didn't correct for recently proposed but poorly characterized nonlinearities in WFPC at the low-flux end. Mention that perhaps the two low points would go. This indicates that we are cognizant of the problem, even if we haven't applied a correction yet.
There is a lot of argument against including a Stetson private-communication correction, unless we can find a hard reference for it (which it sounds like we can't). The 4%, even if included as an uncertainty, wouldn't be noticed relative to our other errors (mostly photon noise).
Note that the "long-and-short" problem Greg talked to Stefano about doesn't actually have anything to do with long and short exposures, but rather with the number of counts in a pixel, and doesn't have anything to do with CTE.
All of this sounds very, very much like black magic.
LBL's gonna be switching power during the Christmas shutdown. We have to make sure it doesn't nuke us during the run. Saul is talking to folks in computer services to make sure we have support to keep the network up.
...is not yet APM matching. More fighting with data is necessary.
Susanna tells us that the transfer rate is 17 megabytes in 2 minutes. Much better than CTIO.
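For reference, the quoted rate works out to roughly 0.14 MB/s; a back-of-the-envelope check (assuming the 17 megabytes and 2 minutes above, and 1 MB = 1024 KB):

```python
# Back-of-the-envelope check of the quoted transfer rate.
megabytes = 17
seconds = 2 * 60
rate_mb_per_s = megabytes / seconds
print(round(rate_mb_per_s, 3))         # 0.142  (MB/s)
print(round(rate_mb_per_s * 1024, 1))  # 145.1  (KB/s)
```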
They have a Sparc Ultra set up for data reduction which has between 128 and 156 megs of memory. If you do long exposures, it's reasonable to do online reduction, but they don't recommend that you be doing anything when the chip is reading out.
The other group got screwed with their CFHT search.