Collaboration Meeting in Paris

Sunday, 1998 May 31

Agenda Determination and Unfocused Startup Talk

So far only 9 of us are here, but we're starting to talk about how we're gonna make a task list. Specifically, right now, we're worrying again about who's going to do all this nearby observing. Isobel doesn't think that covering the runs is going to be a problem; she's more worried about reducing all the scads of data. Re: realtime data reduction, Greg says that the data rate will be slow enough that we could funnel all the data to Berkeley and deal with it there.

We start to mutter about what, perhaps, our agenda should be. Alex wants to discuss this Glashow paper that appeared recently. Apparently, there is a theory about the speed of light not being constant for all particles, and about there being two different kinds of light particles (think neutrino oscillations). Alex thinks that we can put limits on this. Saul wonders how far-out all of this is, and whether we should really be worrying about it. Alex and Reynald will work on this; it sounds like it amounts to two more free parameters in the luminosity distance. Peter tells Alex to e-mail him the formula that includes these new parameters, and Peter will e-mail back the postscript fit files.
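(For concreteness -- and the parametrization here is a guess for illustration, not something written down at the meeting -- the usual fit is of the form

    m_eff(z) = M + 5 log10[ H0 d_L(z; Omega_M, Omega_Lambda) ],

and the modified model would presumably replace d_L with something like d_L(z; Omega_M, Omega_Lambda, alpha_1, alpha_2), with alpha_1 and alpha_2 the two new free parameters to be bounded.)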

One agenda item Saul wants is a formal discussion of what we talked about at dinner last night, so that we can know what still has to go into this paper. Another item is the current HST supernovae. Reynald tells us that R. McMahon wants to talk about data within the collaboration, and how to deal with people who want to do other things with our data. Related to this is the thought of other uses of the data... what other random projects could we do? We also need a task list / priority list. We need to discuss other papers from the 40 SNe, as well as what analysis still needs to be done on the remaining existing z=0.2-0.9 SNe. Ariel would like us to discuss the longer term. Ariel is talking about group communication issues... knowing, for instance, when people in the group are giving talks somewhere.


A Bug in the Paper's Data

Prior to the first topic: Greg found a bug; previously, the Hamuy supernovae didn't include the Finkbeiner extinction. Whereas earlier we were epsilon bluer than the Hamuy supernovae, Greg says that (not cutting out Case 2) we are now 0.044 redder than the Hamuy. Also, they no longer look like quite the same reddening distribution. Saul notes that we ought to cut out the reddened SCP supernovae before doing this comparison.

(Added 1998 June 6: The next day, we found another bug in Gerson's tables. The redshifts for the supernovae were not the latest value from Isobel, and what's more they'd all been rounded to two decimal places. The effects of this will be dinky, but probably it should be done right.)


Current HST Supernovae

First topic: Current HST. Gerson is showing us the publicity images of 9819, which show the striking resolution improvement of the space telescope. Greg notes that in almost all of the images, the supernova distance from the galaxy looks to be large. (There is one exception, with what looks to be an elliptical host.) Greg cautions, however, that to the eye on a WFPC image the faint outer parts of the galaxy are not apparent. He thinks the surface brightness of the outer parts of the galaxy is low in comparison to the readout noise.

Gerson notes that perhaps we don't have to wait for final HST refs. Greg cautions that for, e.g., 9878, we need to make sure that the fuzz around the supernova really is part of the psf. Looking at this one, Isobel notes that it looks a little like the galaxy does extend slightly in the direction of the supernova... it almost looks like the SN could be on the core of a second, interacting galaxy.

Greg notes that there are serious concerns about HST photometry. The photometry of faint stars, he is told by Stefano Casertano at STScI, seems to have a constant offset of something like 200 e- (lost) in a (2 or 5, unremembered) aperture. The effect of this on our faintest lightcurve points can be 100%. Stefano took extensive data in November to try to calculate this. Apparently, this problem didn't exist in ground testing... and apparently the background doesn't matter either, which is very odd for charge transfer problems. The hypothesis is that every single pixel has several small traps in it, due to cosmic ray hits. The charge transfer inefficiency, which we correct for, is separate. For 9784, we used Brad Whitmore's initial results, which imply about a 7% effect. Apparently, this has been increasing in time as well. In short, it seems that the HST performance is degrading with time.
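(A toy calculation to make the scale concrete; the 200 e- loss is the figure quoted above, while the source count levels are made-up illustrative numbers:

    # Toy arithmetic: how a constant per-aperture charge loss propagates
    # into a fractional photometry error. Numbers are illustrative only.
    def cte_fractional_error(source_counts_e, lost_e=200.0):
        """Fraction of the source flux eaten by a constant charge loss."""
        return lost_e / source_counts_e

    print(cte_fractional_error(200.0))   # 1.00 -- a 100% effect on the faintest points
    print(cte_fractional_error(3000.0))  # 0.067 -- a several-percent effect on brighter ones

The same fixed loss that is negligible on a bright star can be the entire signal on a late-time lightcurve point.)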

Ariel wants to know how much data we took. At each point, we typically get two R band and two I band images. As they get fainter, we start going to just one filter. Ariel wonders if we can use non-equal exposure times to check the effects mentioned in the previous paragraph. Greg thinks that we don't have nearly enough signal to noise to check this -- Stefano will have way better data for this sort of thing.

Greg says that the status is that we need some bit of code to do point spread function fitting plus planar fitting (or maybe a de Vaucouleurs law) to take out the galaxy background. Saul notes that there is this Tiny Tim program that gives you a psf. Saul says that Andy Fruchter says that you have to do one more convolution after Tiny Tim. Greg says that there are also slight focus variations that can change the psf and affect the photometry at the few % level. In short, just running Tiny Tim probably won't be enough.
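(A minimal sketch of the fit Greg describes -- not actual collaboration code, and the names are made up. It assumes you already have a psf image, e.g. from Tiny Tim plus the extra convolution, on the same pixel grid as the data stamp. With a planar background the model is linear, so a single least-squares solve does it:

    import numpy as np

    def fit_psf_plus_plane(stamp, psf):
        """Fit model = flux*psf + a + b*x + c*y to an image stamp.

        Returns [flux, a, b, c]."""
        ny, nx = stamp.shape
        y, x = np.mgrid[0:ny, 0:nx]
        # Design matrix: one column per linear model component.
        A = np.column_stack([psf.ravel(),        # supernova
                             np.ones(ny * nx),   # constant background
                             x.ravel(),          # background tilt in x
                             y.ravel()])         # background tilt in y
        coeffs, _, _, _ = np.linalg.lstsq(A, stamp.ravel(), rcond=None)
        return coeffs

Swapping the plane for a de Vaucouleurs profile makes the model nonlinear in the profile parameters, so that version would need a nonlinear minimizer rather than one least-squares solve.)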

Saul also notes that we need to worry more about how to compare ground based to space based data. I.e., one wants to integrate the HST and ground based data, with the correlated errors in there. Also, we probably have almost no overlap in time between ground and HST observations.
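(One standard way to write that down -- a sketch, not a decision from the meeting: do one joint lightcurve fit over all points, ground and HST together, minimizing

    chi^2 = sum_ij [m_i - f(t_i)] (C^-1)_ij [m_j - f(t_j)],

where the covariance matrix C carries the correlated pieces: e.g. all points sharing a ground-based zeropoint get a common off-diagonal term, and likewise for the HST calibration.)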

Greg notes that Shane Burns is working on the NICMOS data. The biggest disappointment is that we were trying to reduce the readout noise by reading out the array n times. However, as the readout noise seems to be correlated, we aren't winning by as much as we had hoped to. (Note added 1998 June 6: Shane tells me (Rob) that things get better with NICMOS when he fixes the pedestal (differential quadrant bias offset variations).) NICMOS is going to be hard.
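(The arithmetic behind "not winning as much": averaging n reads that each have read noise sigma, with pairwise correlation coefficient rho, gives

    sigma_avg^2 = (sigma^2 / n) [1 + (n-1) rho],

so rho = 0 gives the full 1/n improvement, but for rho > 0 the variance bottoms out at rho*sigma^2 no matter how many reads you take. The formula is just standard statistics; the actual NICMOS rho is whatever Shane measures.)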

Greg says as soon as the HST ones are ready, we're going to do a paper. Whatever additional ground based ones are ready will probably be mixed in with this. Peter thinks that in this paper, we should take all of the data for which we have color corrections (by which he means host galaxy reddening) and do it.


Upcoming Papers on our "Finished" SNe

New topic: other papers on the current 40 SNe.

Gerson has the Time Dilation / explosion time / composite lightcurve paper that he's sort of been working on forever. Part of this is the Leibundgut template... the Leibundgut lightcurve fits everything very well. This is based on 34 SNe, not using the very highest and lowest redshift ones, where we don't have rest ~B and rest ~V.

Saul mentions the stretch paper, including perhaps information about U band data (which, Saul says, seems to follow the same relationship, whereas R and I don't seem to completely). Gerson emphasizes (this will go in his paper too) that while Delta-m15 only deals with the post-max lightcurve, stretch works both before and after max. Saul also notes that the day-of-maximum differences for different bands seem to be related to stretch -- although, so far, this is based on a limited data set.
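(For reference, the stretch description boils the lightcurve shape down to a single parameter s that rescales the time axis of one template:

    f(t) = f_template( (t - t_max) / s ),

which is why it constrains both the rise and the decline, while Delta-m15 is read off only from the 15 days after max.)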

Rates

Reynald mentions the rates paper... it's late, but he's now going to use the data through the November/December run. He says there will be bins at z = 0.3, 0.5, and 0.8... although he doesn't use these bins when he parametrizes things as a power of 1+z (or whatever). He's got something like 35 supernovae going into this. He says that the rates seem to go up at higher redshift... but the rate in SNUs is constant, i.e. it just seems to reflect the volume. There is discussion about which cosmology he's used in doing this. There seems to be sentiment for using a cosmology consistent with what we find in the SCP. Greg notes that the Schmidt paper casts aspersions on us, saying that a machine will find supernovae that humans wouldn't accept as real candidates (in terms of fakes and efficiency calculations), thereby overestimating the efficiencies and messing up the rates. Greg says that Reynald ought to look at what they say and address this.
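(The parametrization in question is presumably of the form

    R(z) = R_0 (1 + z)^alpha,

and a SNu is the standard supernova unit: 1 SNu = 1 supernova per 10^10 solar B luminosities per century. A rate that is flat in SNu means the per-unit-luminosity rate isn't evolving; the raw rate goes up with redshift simply because the volume and total luminosity surveyed go up.)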

Saul tries to summarize the result here. For the rates paper, they are working at a higher threshold, and they do not dig into the noise. At the level at which they are working, we believe that our scanning efficiency (the human factor) is better than 95% (or is it 99%?). Another note, re: getting spectra and identifying supernovae: we've only missed one of the brighter supernovae in all of our searches (Rob runs to the board). So, the rates paper is operating in a more conservative regime where we are very efficient.

For the rates, Reynald is more worried about things like host galaxy extinction and other things we don't take into account. Reynald says that we should be careful to note that we assume there is no host galaxy extinction or "inclination" effect. Local SN rates are corrected for this inclination thing in spirals, which comes out to about a factor of two correction. Greg notes that for things in terms of SNUs, it should be OK, because the stars should on average be affected by more or less the same deal.

There is talk of including information about hosts and publishing rates for different host types, ellipticals vs. spirals. Host galaxy colors would come from Greg/Robert (Quimby), as well as host information from the spectra interpreted by R. Ellis. We won't be able to cut very finely -- this could be done again when we have better morphology information.

Re: rates, Pilar brings up the ugly spectre of Type II's. This is all very hard, since we don't follow any of those. Peter doesn't even think that we can put a limit on this -- we don't have a Type II luminosity function that we understand. Saul wonders if there's something even very crude that we can say about this: take all of the brightest things that we don't know what they are, and assume that they are Type II's.

Waidaminnit... numbers are getting bandied about that are inconsistent with what we had before. Reynald has 35 identified Ia's. Now the question is, how many more objects do we have which are high-sigma, in addition to those 35? It sounds like it's only a few.

(Aside)

(Aside (which actually came up a few paragraphs above): Peter has this magic supernova program that can give you a K-correction or a magnitude of a "template" supernova. Saul is asking for documentation on how to run this program. Rob should talk to Peter about this program and set up a multimedia CGI/Java interface that one can run from the web.)

Time Dilation etc.

Oop... now we're backtracking, and Gerson wants to go back to the time dilation / explosion time paper. He wants to spend 10 minutes going through the results. With Peter's help, he got all the points K-corrected to the B band for 34 supernovae. Plotting all of this, you clearly see the time dilation, plotting in two z-intervals (below z=0.45 and above z=0.45). The other thing is to add all of the supernovae together, shifted to line up the day of max. You get three distinct bands: the Hamuy data (no time dilation) and these two other bands. If you divide by 1+z, it more or less boils down to a unique curve. If you then divide each individual supernova by the stretch, the whole set boils down to a single curve. Because sometimes we have some SN light in the ref, we map out the entire early region of the lightcurve -- we can clearly see the lightcurve go all the way down to zero. The initial rise is at -18.8+-0.2 days (times the stretch) before max. This is the first time the stretch has been applied to the early days. (NOTE -- does snminuit still not apply the stretch before -10 days, or whatever it was? We should think about this in the next few weeks.)
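(Spelled out, the composite-lightcurve scaling is: plot each point's flux against

    t_rest = (t_obs - t_max) / [ (1 + z) * s ],

where dividing by 1+z removes the cosmological time dilation and dividing by s removes the intrinsic width differences. The -18.8+-0.2 day rise figure is then in these stretch-scaled rest-frame days.)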

There is confusion about the meaning of this -18 days. Peter says it is how many days before max the first observable point would be.

Saul raises the issue that visual is good, but he also wants to see some quantification of how much better the lightcurve gets after doing the various corrections. "Type Ia Supernovae Lightcurves: Cosmological Time Dilation, Single Parameter Description, and Turn On Time," or something like that, is the name of Gerson's paper. He wants input, and lots of people yelled at him that he should pass out a draft for the world to read.

Spectroscopy

Isobel: spectroscopy paper. There was quite a discussion about it at dinner last night. What should go into it? It will start with just the best supernova spectra, say something about how they fit into the time sequence for Ia spectra, and make a time sequence for our distant supernovae. Isobel wants to come up with something quantitative, which won't be killed by our host galaxy contamination, to show how well these things fit into the time sequence. (Host contamination makes this pretty hard -- plateau/slope offset, as well as the H&K lines eating into the supernova spectrum.)

Isobel shows us a first version of a plot of a time sequence of high redshift supernovae.

Part of the issue is deciding what goes into this paper. She only selected from sets D and E, and she included the ones that she liked. Peter is suggesting that 9851 (the one caught very early) should go in as well. Of course, this opens a can of worms, because including that means that we ought to include lots of other good looking ones from sets F and G.

(Aside: Isobel has a Spectra web page with lots and lots of information.)

Presentation: do we want to have all of the spectra, each with a template (nearby supernova) next to it? Saul would like to have the template not overwriting the distant supernova. What about putting in the galaxy spectra as well? Saul says he likes it.

Note re: template galaxy subtraction: for 975, Isobel tried one and it helped the spectrum an awful lot. One issue is deciding how much to subtract... it's sort of a matter of art. Saul thinks we can just do it, and say it's a demonstration of how striking the effect can be, but caution that it's very hard to quantify the true amount of host (and the tilt) that ought to be subtracted out.
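(A minimal sketch of that subtraction, with made-up names; the scale c and the tilt are exactly the "matter of art" knobs under discussion:

    import numpy as np

    def subtract_host(observed, host_template, c, tilt=0.0):
        """Subtract a scaled, optionally tilted, host spectrum.

        observed and host_template are fluxes on the same wavelength
        grid; c is the host fraction, tilt a linear color slope."""
        x = np.linspace(-1.0, 1.0, observed.size)
        return observed - c * host_template * (1.0 + tilt * x)

In practice one tunes c (and the tilt) until the galaxy features -- the H&K lines, the plateau offset -- stop distorting the supernova spectrum, which is exactly why the true amount is hard to quantify.)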

Since last night, is Isobel happy that she has a paper? She thinks so. Perhaps the time sequence thing is good enough, although she'd like to get some type identification in there as well. Saul questions whether there is a catalog of Ic's to make a time sequence from. Peter says no, but Isobel wants to know if there are one or two at the same date. Peter says it will look completely different... but there isn't a good enough selection of Ic spectra to make a complete set.

Photometry

Rob -- photometry paper. Should the whole search business be included? It seems that this sort of thing ought to be published, but should it be mixed into the photometry paper? The photometry paper will include a complete description of how the photometry is performed, as well as the results of that photometry. Perhaps Sebastien could write a search paper, although he may be overloaded. Lightcurve fits should go into this paper as well.


Important Personnel Announcement

Cake boys have returned.


The Truly Giant Task List

New topic: upcoming tasks.


Data Sharing

New topic: data sharing.

Just say no

-G. Goldhaber

Filippenko took some data on one of our z~0.36 supernovae (97ex), where he got some spectra. We got a good early Keck spectrum; Filippenko got two later. So we have three nice spectra at z~0.36, with dates that match data where we have photometry. Filippenko is going to write a brief thing on time dilation with spectra, comparing these spectra with nearby supernova spectra. No lightcurve data goes into this, only the three spectra. The issue is: is any bit of information (e.g. the time of maximum) from the lightcurve going to go into this paper? That would mean a million names going onto the paper, because of the amount of work that goes into the lightcurve. Filippenko, apparently, doesn't want all of our group on the paper... and if we are on, all of his group goes on too.

There is some confusion over the issue of why this will be a Filippenko et al. paper instead of a Nugent et al. paper. Peter thinks that perhaps if Isobel puts her paper out first, then Filippenko won't want to do the paper. In any event, it seems that the Isobel paper should get out first, and Peter says that he can guarantee that this one will go out first.

Summary: what Alex wants is unacceptable to us, although now it seems that this is mostly because it was Peter who came up with the idea and who did the work on this... so, if a paper goes in, it should be Nugent et al. But, in any event, the Isobel spectroscopy paper should go in.

What about other people who want to use your data? Jeff Willick at Stanford wants to use our data to look for clusters, and might bring in a student (and include anybody in our group who wants to work on it?). Gordon Squires is looking for gravitational lenses (large scale structure weak lensing), and worked with Mike Moyer in our group. (Note that CFHT data cannot be used for weak lensing!) Richard Ellis says that George Efstathiou wants to try to coordinate Ia SNe with the CMB, to do the simultaneous fit. Saul's suggestion is that anybody who wants to do this can -- we've published. (Well, OK, we will be published.) Well, OK, now we're talking about the paper, so we'll come back to this.

Turner and White want to use this, along with the CMBR and such, to bound w, the equation of state, and all of that. This is another one of those cases where we may just publish the data, and Saul (or whoever) will go on the paper if he does anything useful for it.

New topic: databases. Reynald is talking about going to Sybase. Greg says that he's heard from people who use it for large projects that it's incredibly slow.


Communication or Lack Thereof

New topic: communication and talks. First issue: we need to have a few different mailing lists -- a nearsearch list separate from deepnews, separate from Berkeley whining.

Title         Distribution                What for
Deepnews      high redshift search        Information germane to the high-z (z>0.2) work.
Nearnews      full nearby collaboration   Coordination and discussion of the gigantic nearby (z=0.02-0.2) effort.
(Something)   Berkeley people             Rob whining about full disks and the like.

What would be good is a group calendar; Rob will look into this. Additionally, we need a list of talks: who is giving them, who has given them, where, when, proceedings, etc. This should get its own web page as well. Somewhere there should also be a current list of the bloody collaboration.

It is important that the collaboration be informed when new information is going out in talks, so that people will have a clue if they are asked about it.

Re: communications, Reynald would like to have real meetings as well, perhaps twice a year. Perhaps everybody doesn't go to every one, but.... Should we decide on a tentative date for Berkeley: beginning of December? One worry is that the Texas symposium will be in December. Right-o. Next year, where?

There is a conference on astroparticle physics the first weekend in September in Portugal.