SCP Meeting Notes, October 2, 1998

We are talking about cookies.


SN Rise Time

Peter and Don have been working on supernova rise time. Don did this based on a single file of all our supernova light curve points, K-corrected and stretched. (The uber-set.) He fit the right half of a parabola to the very beginning of the light curve, always starting at day -20 (relative to peak).

From there, he stepped his way up the curve, raising the upper limit of the fit window. He showed us the plot for -20 to -10 days. In his mind, it's a very good fit; for the "toe" of the curve, a parabola is good enough to describe the data. It's got physics in it: the fitted quantity is the velocity of expansion times the temperature squared. He even gets something within the reasonable bounds of what you would expect.
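
For concreteness, here's a minimal sketch of this kind of fit -- toy data and made-up numbers, not the actual uber-set points:

    import numpy as np
    from scipy.optimize import curve_fit

    def rise(t, t_expl, a):
        # Half-parabola: zero flux before the explosion time, then
        # flux = a * (t - t_expl)^2.  In the fireball picture the
        # coefficient a goes like (velocity * temperature^2)^2.
        return np.where(t > t_expl, a * (t - t_expl) ** 2, 0.0)

    # Toy early-time photometry: days relative to peak, flux in arbitrary
    # units, chosen to look like a quadratic rise from around day -17.6.
    t = np.array([-20.0, -18.0, -16.0, -14.0, -12.0, -10.0])
    f = np.array([0.00, 0.00, 0.03, 0.13, 0.31, 0.58])
    sig = np.full_like(f, 0.03)

    (t_expl, a), cov = curve_fit(rise, t, f, p0=[-18.0, 0.01], sigma=sig)
    print(f"explosion at {t_expl:+.1f} +- {np.sqrt(cov[0, 0]):.1f} days from peak")

    # "Stepping up the curve" = repeating this fit while raising the upper
    # cutoff of the window (here -10) toward peak.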

He says that doing the fit to the Leibundgut template using the whole data set, he got a stretch of 0.999. For just the data from -4 days on, he found stretch 1.005+-0.0025. For the data up through day 4, he found stretch 0.995+-0.0025. (More or less.) So there is a small difference, but nothing we'd have ever come close to seeing in our individual supernova fits.
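
As an aside, a stretch fit is just a one-parameter rescaling of the time axis. A toy version (made-up Gaussian template and fake data, not the real Leibundgut template):

    import numpy as np
    from scipy.optimize import curve_fit

    def template(t):
        # Toy stand-in for the template light curve (flux vs. days from peak).
        return np.exp(-0.5 * (t / 10.0) ** 2)

    def model(t, s, amp):
        # A stretch fit compares the data to the template evaluated at t/s.
        return amp * template(t / s)

    rng = np.random.default_rng(1)
    t_obs = np.linspace(-15.0, 30.0, 40)
    flux = model(t_obs, 0.98, 1.0) + rng.normal(0.0, 0.01, t_obs.size)

    (s, amp), cov = curve_fit(model, t_obs, flux, p0=[1.0, 1.0])
    print(f"stretch = {s:.3f} +- {np.sqrt(cov[0, 0]):.3f}")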

Just for aesthetics (not to affect any published results, since it wouldn't), he's suggesting taking the 0.005 out of the stretch, and then joining the template to a parabola at the beginning of the lightcurve, to make a new template.

Results from these fits: he gets different results depending on how much of the uber-set he fits. Fits using data up through day -8 and up through day -10 produce slightly different (but less than 1 sigma) results for the time of explosion. The time is -17.6+-0.3 days, although the error isn't inflated to reflect the cutoff sensitivity. There is also that other parameter (temperature^2 * velocity), but I missed his value for it... hmm, something like 6.6, I think. (Temperature is in units of solar temperature, I think.) He says it is maybe 2.5 times as hot as the sun if the velocity is 10,000 km/s. Peter says this is comparable to the velocity of lines seen in the earliest spectra we have.
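
(Quick arithmetic check, if the units are as guessed above -- v in units of 10^4 km/s, T in solar temperatures: T^2 * v = 6.6 with v = 1 gives T = sqrt(6.6) ~ 2.6, which squares with the "maybe 2.5 times as hot as the sun" figure.)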

There is some discussion about the fact that Don is getting a chi-square that is too small. Greg asserts that the single-supernova lightcurve fits now indicate that the errors are about right.

Don says that the one strong statement that can be made is that the data do not support anything more complicated than a parabola. It's nice that it is so simple. What's more, there's an actual physical interpretation (turn-on time, T^2*v). We've got a turn-on time that seems to be stable to half a day.

Peter says that modellers right now are getting 14.5 or 15 days for turn-on times, so this result would affect the modellers. Joe Bob says Publish. Peter's going to work on this, showing how it compares to what various models predict.

Cute. Fun.

--Saul Perlmutter


Greg talks about Keck and the strategy.

The current strategy is to have clear weather at Keck.

--G. Aldering

Two fields, one at 23h, one at 5h. The 5h field has our z=0.86 supernova in it. We have HST followup slated for the 5h field; if we find a SN, we'll follow it, otherwise we'll use the HST to get a final ref for that z=0.86 supernova. (Greg is working on making sure we have backup NICMOS time for that final ref.)

Greg was thinking about using the HST on the 26th to confirm a supernova for the Keck spectroscopy time the next day... but we can't turn the HST data around fast enough. So we'll just gamble on a candidate and hope that it pans out.

Greg has a sketch of what followup looks like, assuming we find these things at maximum. It's an I-band lightcurve, which corresponds to U band in the rest frame. There are two additional NICMOS points. In case we don't get one above a z of 1, he also showed us the spacing for a z=0.8, which still works.

With good weather, Greg predicts we will find 6-10 supernovae, with 3 over z=1. (So the Poisson error, sqrt(3), gets a little scary compared to 3.) But if the seeing is good and it stays clear, we should get a high-z supernova, over z=1 if we're lucky. Hopefully in the 5h field, since we don't get good followup in the 23h field (which would make it just a trophy supernova).

If nothing else, data from clear weather may give us among the first limits on those rates. If we get this data, Greg says, we might want to think about how we do the rates paper... since Reynald's rates paper hasn't come out yet, we might want to find a way to move very quickly on this stuff.

In terms of informing HST of our targets: as soon as we have candidates, we have to ask Doug for guide stars. The deadline is October 20, and the run is on the 14th, though the sooner we get it to him the better. Greg told him we'd have the stuff to him by the 19th.

This raises the question of whether we should try to subtract while we're obtaining data. There seems to be some feeling (Pete, Rob) against doing a real-time search. Saul still thinks perhaps we ought to do it, because, he says, in the past we found things by subtracting that fed back into making the observations go better.


Markus: Software and Adieu

This is the last meeting for Markus. He's leaving next Wednesday; he'll be in on Monday. He's written a nice suite of programs for cosmology, and he's even written some very nice documentation.

Markus wrote a package that, given a redshift and a set of cosmological parameters, calculates lots of things: distance modulus, lookback time, scale factor. You can get apparent magnitude at peak in UBVRIJHK, as long as there is a spectrum which covers that filter. He also has the same thing for the R and I bands using a lightcurve rather than a spectrum. (This also includes stretch.)
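
For a rough idea of what's inside such a package (this is not Markus' actual code; the H0 value and the function names are assumptions for the sketch), the distance modulus and lookback time come down to integrals over the Hubble function:

    import numpy as np
    from scipy.integrate import quad

    C = 299792.458   # speed of light, km/s
    H0 = 70.0        # km/s/Mpc -- an assumed value, just for the sketch

    def E(z, om, ol):
        # Dimensionless Hubble function E(z) = H(z)/H0, curvature included.
        ok = 1.0 - om - ol
        return np.sqrt(om * (1 + z) ** 3 + ok * (1 + z) ** 2 + ol)

    def luminosity_distance(z, om, ol):
        # Comoving distance, then the curvature-dependent transverse
        # distance, then d_L = (1 + z) * d_M.  All distances in Mpc.
        dh = C / H0
        dc = dh * quad(lambda zp: 1.0 / E(zp, om, ol), 0.0, z)[0]
        ok = 1.0 - om - ol
        if ok > 1e-8:
            dm = dh / np.sqrt(ok) * np.sinh(np.sqrt(ok) * dc / dh)
        elif ok < -1e-8:
            dm = dh / np.sqrt(-ok) * np.sin(np.sqrt(-ok) * dc / dh)
        else:
            dm = dc
        return (1 + z) * dm

    def distance_modulus(z, om, ol):
        # m - M = 5 log10(d_L / 10 pc)
        return 5.0 * np.log10(luminosity_distance(z, om, ol) * 1e6 / 10.0)

    def lookback_time_gyr(z, om, ol):
        # t_lb = (1/H0) * integral_0^z dz' / ((1+z') E(z')); 977.8/H0 is
        # the Hubble time in Gyr for H0 in km/s/Mpc.
        th = 977.8 / H0
        return th * quad(lambda zp: 1.0 / ((1 + zp) * E(zp, om, ol)), 0.0, z)[0]

    print(distance_modulus(0.86, 0.3, 0.7))   # e.g. our z=0.86 SN, toy omegas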

These are common questions we ask, so the programs should be useful.

There is a possibility that Markus may have the opportunity to do a Master's and/or PhD thesis, perhaps associated with us somehow.


The Age of the Universe

Robert tried to find out the age of the universe. He ended up doing it with a Monte Carlo distribution: throwing down points in Omega and Lambda, giving each a probability, and filling in the contours. For each point he calculated the age using one of Markus' programs, and made a histogram.

They tried it with a Jacobian and derivatives, but had trouble because it involved very small differences between big numbers.

Sometimes you can get around that by being clever.

--somebody

But we weren't.

--somebody else

So, they settled on Monte Carlo. This age, in the flat case and the general case, is what ended up in the paper. The age is 14.5+-1 Gyr (general) or 14.9+-1 Gyr (flat).
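
A minimal sketch of the Monte Carlo (the Gaussian weight below is a toy stand-in for the real Omega-Lambda probability contours, and the H0 value is assumed; none of these numbers come from the actual fit):

    import numpy as np

    H0 = 70.0            # km/s/Mpc -- assumed for the sketch
    T_H = 977.8 / H0     # Hubble time in Gyr

    def age_gyr(om, ol, n=1000):
        # t0 = T_H * integral_0^1 da / sqrt(om/a + ok + ol*a^2)
        a = np.linspace(1e-4, 1.0, n)
        ok = 1.0 - om - ol
        e2 = om / a + ok + ol * a ** 2
        if np.any(e2 <= 0.0):
            return np.nan  # the "no big bang" corner of parameter space
        y = 1.0 / np.sqrt(e2)
        return T_H * 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(a))  # trapezoid rule

    # Throw down points in (Omega, Lambda), give each a probability,
    # compute an age for each, and histogram/average the results.
    rng = np.random.default_rng(0)
    om = rng.uniform(0.0, 1.5, 5000)
    ol = rng.uniform(0.0, 1.5, 5000)
    w = np.exp(-0.5 * (((om - 0.3) / 0.1) ** 2 + ((ol - 0.7) / 0.2) ** 2))

    ages = np.array([age_gyr(m, l) for m, l in zip(om, ol)])
    good = np.isfinite(ages)
    mean = np.average(ages[good], weights=w[good])
    err = np.sqrt(np.average((ages[good] - mean) ** 2, weights=w[good]))
    print(f"age = {mean:.1f} +- {err:.1f} Gyr")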


Indian Protests of the Automated Search Telescope

Chew's Ridge: hung up because some guy is 1/100000th Native American, and Mother Earth speaks to him, and said that the site is sacred and can't have a telescope placed on it. He's a renegade member of some tribe or another (and the tribal elders all hate him -- even his second cousin doesn't like him). MIRA and (especially) the Forest Service are taking this seriously, which is slowing things down horribly.

This guy complains that when he's up there meditating, he looks up and sees the MIRA building, and it interferes with his religious freedom. (Sheesh, the stuff that's happened while I was in Church...!)

White Guilt is a bad thing.


Don's CCD Report

Don shows us a picture of a CCD. He shows us updates to some calculations documented in the notes from a few weeks ago. One thing that has come out is that using an optical model is fairly unusual; most people use a diffusion model.

He shows us his calculation results for a SITe CCD, complete with fringing. He also plots relative fringing (QE divided by average QE). He also shows us the LRIS model.
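
For reference, the heart of an optical fringing model is thin-film interference in the CCD's active layer. A toy etalon-style version (all numbers made up; this is not Don's actual model):

    import numpy as np

    n_si = 3.6      # rough refractive index of silicon in the far red
    d_um = 15.0     # toy active-layer thickness, microns
    amp = 0.05      # toy 5% fringe amplitude

    wl = np.linspace(7000.0, 9500.0, 2000)          # wavelength, Angstroms
    base_qe = 0.6 - 0.3 * (wl - 7000.0) / 2500.0    # toy falling red QE
    phase = 4.0 * np.pi * n_si * (d_um * 1e4) / wl  # round-trip phase in the layer
    qe = base_qe * (1.0 + amp * np.cos(phase))      # QE with fringes
    rel_fringing = qe / qe.mean()                   # "QE divided by average QE"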