SCP Collaboration Meeting, 2000 January 14, 14:30

This is our final plenary session. We're finally in an air-conditioned room. The items to cover are: first, a report on the discussions on rules and procedures; short reports from Ana, Ariel, and Gerson; and, finally, the updated brainstorm on what we'll do with HST given updated information.


Rules Discussion

We came in with a number of proposed rules covering collaboration membership, author lists, procedures for how we handle getting people together before certain deadlines, etc. Instead, the main thing we did was come up with a mechanism to help make these decisions without having to have everybody there at the same time.

There will be an executive committee composed of Saul, Greg, Isobel, Reynald, and Ariel. This committee is charged with coming up with a meeting time approximately once a month, setting an agenda a week in advance. The collaboration will be able to give feedback on how to make those decisions before the meeting.

That's all that got finalized. Everything else is up for discussion using the new mechanism.

Ana on What's Going On in Portugal

In Lisbon, a few students (Ana, Maria, and Paulo) and some others have been working on the reduction of data from the Spring 1999 nearby campaign.

In Spring 1999, YALO was used for a lot of photometric followup. In November 1999, their group received financial support for 2 years; this is supporting the work in Lisbon, and includes some support for members from Stockholm and France. In March 2000, they decided to do the data reduction in Lisbon, rather than sending people to Berkeley to do it. Ariel is getting all the data from the nearby supernovae in Stockholm. Stockholm is starting with the spectroscopy, and Lisbon is doing the photometry. At the same time, it was decided that the Lisbon people would take part in the NOT observations run by the Stockholm people.

In April 2000, there was an ESO meeting in Lisbon. There, the common opinion was that the YALO telescope was important for Portugal, and that it was important to support YALO2. In May 2000, there was a YALO meeting.

They have photometric data for lightcurves in UBVRI, from a lot of telescopes. They want to start analyzing the YALO data as soon as possible. They have last year's data, and final references for the U-band. First they are working on understanding the image quality and understanding the telescope. The second task will be creating the lightcurves.

She shows a display of the SNe we have. They decided to start with the SNe where there are points before or at maximum.

In terms of understanding the quality of the images, they are starting by working with IRAF and C programs. She says that they don't have the raw data in Lisbon, but rather the flatfielded data.

It seems that the only YALO data we got was flatfielded; nobody in the SCP has the raw data, but rather the data processed through a pipeline, which evidently Brad Schaefer set up. It might take some effort to get that raw data....


Spring 1999 Nearby Search Status

Greg notes that Nikola at EROS has been analyzing all the photometry data. Also, Lou Strolger at CTIO has been working on the data for the supernovae that were found jointly with his group. Greg reminds us that we have different data-sharing arrangements with different groups. For instance, the SNe found with the Smith/Strolger group are joint between us and them. There was an understanding that Lou would be able to publish the lightcurves on those supernovae first. Also, we shared data with them, so that they get our followup and we get their followup on those supernovae. With EROS, there was a very different arrangement. Originally, they wanted to become members of the SCP. Instead, as a compromise, we agreed that they would have access to all of the followup data on all of the nearby campaign, but they couldn't publish anything without clearing it through us the first time around. So far, this has worked; Nikola has even screened things that he was going to discuss at conferences.

We also got early notification with a couple of other groups (including Brian Schmidt and Alex Filippenko) who were searching. We agreed we would work jointly on that stuff. This includes 99ao and the two KAIT supernovae. Greg believes that all of the sundry complicated arrangements are in fact self-consistent.

The other groups are in fact proceeding with their own analyses. For instance, Greg says that Lou Strolger has a nice lightcurve of Pompeii.

There are a few supernovae which we have exclusive control over, the ones found with NEAT and Spacewatch. (Does EROS have access to these??)

Greg thinks that publications of lightcurves aren't our principal science goal. The spectra have a lot going on; Strolger, and currently EROS, aren't doing anything with the spectra. Of course, Ariel's students are working on the spectra, and Nikola has expressed an interest in working on the spectra.

Greg discusses manpower issues with Reynald; the basic conclusion is that there is no good way to estimate when things will go out.

Ana says that she has money to hire a postdoc in Lisbon. She notes that this money has a time limit on when it might be spent.


Greg on SN Factory

Saul asks Greg to give a few words on what the SN Factory is. Greg says that, as we'll see from the rules discussion, the SN Factory will be a separate collaboration from the start. It will include people here, but also lots of other people who won't necessarily be members of the SCP.

The SN Factory came out of our surprise at how easily we could find supernovae with NEAT, and also the perception that we were going to get hammered by the evolution question; maybe a larger effort on the supernova game was appropriate. Last spring's goal was to find our own set of "Hamuy" supernovae. The SN Factory has broader goals.

The goal is to get 300 SNe observed with UBVRI and spectroscopic followup, to convince people that SNe at high z are like the ones at low z. We think that there's enough dispersion in the properties of low redshift supernovae that we'll be able to find a wide range; i.e., they won't all be of a certain kind that would be dissimilar to high redshift. We've identified a straightforward way to find them: using NEAT, which has been upgraded to a 1.2m telescope, running 18 nights a month rather than 6. Soon, they're also going to start running at Palomar, ultimately with a rather large imager similar to what QUEST has. Once at full capacity, we should find several supernovae a night if we ingest and process all the raw data.

The problem to solve, then, is followup. We're trying to identify a fairly automated form of followup. Last spring, we sent troupes of people out to telescopes, and it was labor intensive. This time, we'd like to have an IFU spectrograph that a night assistant can run. The IFU would have the advantage of not requiring an astronomer there to make sure the actual object is down the slit. Also, we should be able to do spectrophotometry with the IFU: we ought to be able to do UBVRI photometry with images rebuilt from the spectra.
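
For concreteness, here is a minimal sketch of what the spectrophotometry step could look like: a broadband magnitude synthesized by integrating a flux-calibrated spectrum (rebuilt from the IFU datacube) against a filter curve. This is Python with numpy; the AB convention and the filter table are my own assumptions, not anything from the actual (not yet existing) pipeline.

  # Sketch only: synthetic photometry from a flux-calibrated spectrum.
  # wave is in Angstroms, flux is f_lambda in erg/s/cm^2/A, and
  # (filt_wave, filt_trans) is a hypothetical filter transmission table.
  import numpy as np

  def synthetic_ab_mag(wave, flux, filt_wave, filt_trans):
      """AB magnitude of a spectrum through a filter (photon counting)."""
      trans = np.interp(wave, filt_wave, filt_trans, left=0.0, right=0.0)
      f_nu = flux * wave**2 / 2.998e18            # f_lambda -> f_nu (cgs)
      num = np.trapz(trans * f_nu / wave, wave)   # photon-weighted mean f_nu
      den = np.trapz(trans / wave, wave)
      return -2.5 * np.log10(num / den) - 48.60   # AB zero point

  # e.g., for each epoch's extracted spectrum (FILTERS is hypothetical):
  # mags = {band: synthetic_ab_mag(wave, flux, *FILTERS[band])
  #         for band in "UBVRI"}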

There have been discussions with others about building an IFU, and discussions with ESO and Hawaii about getting some regular telescope time, e.g. an hour a night on some telescope. There's a lot of software required to automatically find the supernovae, to decide what needs to be observed when, and to do the reduction. The discussions are principally with Reynald's group and the group in Lyon. Portugal and Stockholm may participate as well (e.g. NOT).

Not much is very firm yet. We have the search telescope resources, and we've been working with the lab to make sure we'll have the computing resources. Michael Wood-Vasey is working on the software to help clean things up for automated finding. In France, they have manpower and money for building the instrument. At the lab, we have an LDRD with money for buying telescope time and computing resources.

The membership of the SN Factory collaboration is currently rather fluid. Nobody's in the collaboration yet, as things are still very early.

Carl asks if we're planning on starting in 2002. Greg says that the only kind of timeline we have is that it takes 18 months to build a spectrograph, and the funding is available in a certain window. In any event, while work will go on, we won't see this running any time in the next few years.

Greg notes that we're still followup starved. YALO2 will perhaps contribute, because the IFU data may not be enough to get the lightcurve coverage that we want. He says that we can't really slow them down, because in order to make the timescale useful, we'll have to get something like 100 a year.

The reduction will hopefully be very automated. It will be a homogeneous dataset. The datacube from the IFU will be reduced, and we'll get final photometry and spectra out of the automated procedure.

Saul makes the comment that the kind of automation that will be done is exactly the sort of thing that we'll have to do in order to deal with SNAP.

Gerson raises manpower issues. Discussion ensues.


Gerson's Paper

Gerson says that he's "finished" his paper (the same paper he talked about at last year's meeting) just in time for the conference. He wants to give some flavor of the paper, hopefully encouraging people to read the paper and give back comments.

Gerson shows a stacked plot of data, uncorrected, for 35 supernovae (out of the 42; these are the ones where the B band transforms to the R band, i.e. between 0.3 and 0.7 in z). Also plotted are the 18 Hamuy data points, in green. It looks like a mess.

Since the SNe are so similar, he did an average of all points from the 35 SNe within one day, and similarly for the 18 Hamuy SNe. The Hamuy data more or less follow a plotted rest-system curve; the high-z data are clearly a different distribution.

He goes one step further and transforms everything to the rest system (z correction). The data fit each other much better; this establishes the (1+z) factor, rather than "tired light", as responsible for the redshift.

Gerson notes that all the SNe were aligned at the maximum point, in both date and normalization.

Gerson then goes one step further, and divides out not just (1+z), but also adjusts the time axis with stretch. He doesn't have the average plot, but he does have the plot with everything; now they all fit on the curve. You can ask how well they fit the curve; he shows a plot of the residuals (averaged day by day). The Hamuy data fit the curve out to 50 days; there aren't any of the deviant ones that Peter and Greg were mentioning. The SCP data also show very reasonable residuals. The two sets have similar residuals, and stretch works.
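
A minimal sketch of the transformation just described, in Python/numpy; the data arrays and the template() function are placeholders, and the (1+z)*s factor is the whole point:

  # Sketch: put an observed lightcurve on the restframe template time axis.
  # t_max, z, and s come from the lightcurve fit; the arrays are placeholders.
  import numpy as np

  def to_restframe_phase(t_obs, t_max, z, s):
      """Observer-frame date -> stretch-corrected restframe phase (days)."""
      return (t_obs - t_max) / ((1.0 + z) * s)   # (1+z) dilation, then stretch

  # Day-by-day averaged residuals against a template curve would then be:
  # phase = to_restframe_phase(t_obs, t_max, z, s)
  # resid = flux_norm - template(phase)
  # day = np.round(phase).astype(int)
  # avg = {d: resid[day == d].mean() for d in np.unique(day)}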

The template Gerson has been using is the Leibundgut template, which goes down to -5 days. Back in 1995 or 1997 or something, an extended Leibundgut curve was made, doing a linear extrapolation based on two early points. This is the curve that we've used in our published cosmology papers.

The other thing discussed in this paper is the question of the rise time. At conferences, Gerson has quoted an explosion date. To do this, you can't use the linear extrapolation in magnitude, as it doesn't have any starting date. Instead, what he did was use a parabola to get a definite explosion date; Don Groom worked a lot on this. Fitting parabolae to the curve gives an explosion date, which he says is 17.6 days before maximum. Initially, they gave just the error corresponding to the uncertainty in fitting this parabola alone. This is what Adam Riess latched on to in order to show a putative difference between a measurement of this and low-z supernovae. That initiated the work by AKN (Aldering/Knop/Nugent) showing that the errors on the explosion date are much larger, and that there is no significant difference between low-z and high-z supernovae.
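
A toy version of the parabola idea (this is not Don's actual code; scipy and the early-time data arrays are my assumptions):

  # Sketch: estimate an explosion date by fitting a parabola in flux to
  # the early rise.  t_early/f_early are placeholder restframe data.
  import numpy as np
  from scipy.optimize import curve_fit

  def rising_parabola(t, a, t_expl):
      """Zero flux before t_expl, quadratic rise afterward."""
      return a * np.clip(t - t_expl, 0.0, None)**2

  # popt, pcov = curve_fit(rising_parabola, t_early, f_early, p0=[1.0, -18.0])
  # t_expl, stat_err = popt[1], np.sqrt(pcov[1, 1])
  # stat_err is the part that was originally quoted; the parabola-shape
  # and t_max alignment errors come on top of it.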

Simultaneously, Gerson has also worked on this problem of finding errors in a different way. When we first looked at these curves way back when, we realized that the early time was a very uncertain region, and introduced another parameter psi, or "pscrunch", which adjusted the curve in the early region and swept out different explosion times. So, he fit for pscrunch, time of maximum, amplitude, stretch, and perhaps zero offset level. Unfortunately, when you do this, our data don't fit very well; so you cannot determine psi independently from our fit.

Instead, what he did was do fits where psi varied from 1.4 down to 0.8 or so, calculating the chisquare on 30 supernovae. He still found a minimum at -17.6 days (in comparison to the AKN value of -18.2). The Riess value on nearby supernovae was about -20 days (with a small uncertainty; 0.15?). Gerson's uncertainty was about 1.5 days. You still have to add the error on fitting the parabola, and the error on how tmax is determined (i.e. how well the supernovae were lined up). Putting all these together gives -17.6 +1.6 -1.1 days as the time of explosion. He says he couldn't determine the uncertainty on the AKN value; he hasn't figured out if it was 1.2 days or 3.6 days. Greg says we resisted giving a hard quote; it's there as 1.2 days, but the text also states that the systematic errors probably dominate.
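
A sketch of what such a grid scan might look like, assuming that each trial psi implies an explosion date and that a hypothetical fit_chi2(sn, psi) returns the per-supernova fit chisquare:

  # Sketch: profile the summed chisquare over a grid of psi values and
  # take the delta-chisquare = 1 interval as the 1-sigma range.
  import numpy as np

  def scan_psi(supernovae, fit_chi2, psi_lo=0.8, psi_hi=1.4, step=0.05):
      """Return the best psi and its delta-chisquare = 1 interval."""
      psis = np.arange(psi_lo, psi_hi + step / 2, step)
      chi2 = np.array([sum(fit_chi2(sn, p) for sn in supernovae)
                       for p in psis])
      best = psis[np.argmin(chi2)]
      inside = psis[chi2 <= chi2.min() + 1.0]
      return best, (inside.min(), inside.max())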

Greg says that one thing we learned is that the nature of the correlated data from the SCP is that the errorbars will go up. (Is this just the photometric correlations, or the correlations on T_max? The latter might be very important.) Gerson says that he didn't take any of these correlated errors into account.

Conclusion: the time dilation indicates that the universe is expanding; the high-z supernovae fit the low-z supernovae; and, when you do it right, there is no significant difference between the Gerson and Riess values.

Greg notes that the AKN chisquare analysis gave exactly the value that Gerson quotes. The maximum likelihood value is the one that's a little bit different.

Gerson hands out copies of the paper. He hopes to get feedback, and hopes that for once we can send a paper out in a short period of time.


Aside

Saul says that a grad student from Davis, Alex Lewin, is working on a paper about what happens when you measure supernovae using very different methods. He says that the (what?) and the stretch methods agree, but both tend to diverge from MLCS. Saul says that each of these different methods is a "cartoon" model of the range of supernovae, and that they will all diverge a bit as you move away from standard s=1 supernovae. This paper is just trying to figure out what the differences are, not to establish what's right.

Chris points out a paper that Bruno wrote, which came out in April of this year and summarizes Type Ia supernovae. The assertion is that this paper is on astro-ph.


Ariel on Spectral Comparisons

So far we've discussed quantitative ways of comparing nearby lightcurves with high redshift ones. However, for the nearby campaign, we also want to address spectral comparisons. Dan talked yesterday about addressing this with wavelets.

Ariel shows a quick and dirty way of doing the 0th-level comparison that Dan has to beat. He's using the spectral templates from panisse.lbl.gov/~nugent. He allowed for an unreduced background (host galaxy, poor flattening, etc.), and also for under/overcorrection of the atmospheric extinction: 6 parameters in all. He blueshifted the spectrum to the restframe. Finally, he had:

  F = (SNIa(lambda, t)*N + a + b*lambda + c*lambda^-1) * (1 + d*ext(lambda*(1+z)))
Ariel says that there is nothing special about this formula, but it's what he used to fit SN spectra. Rob asks if the SNIa template was a function of stretch. Peter says no; what he has is designed for an s=1 supernova, ideally, though he included everything that wasn't an extreme outlier. Note that ext is the atmospheric extinction term.
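
A direct transcription of the formula into code, with hypothetical snia_template() and atm_ext() functions standing in for the panisse.lbl.gov/~nugent templates and the extinction curve:

  # Sketch: Ariel's fit model.  snia_template() and atm_ext() are
  # placeholders assumed defined elsewhere; z is the known redshift.
  def spectrum_model(lam_rest, t, N, a, b, c, d, z):
      """F = (SNIa(lam,t)*N + a + b*lam + c/lam) * (1 + d*ext(lam*(1+z)))."""
      cont = snia_template(lam_rest, t) * N + a + b*lam_rest + c/lam_rest
      return cont * (1.0 + d * atm_ext(lam_rest * (1.0 + z)))  # ext in obs frame

For a given trial epoch t, a least-squares fit over (N, a, b, c, d) gives one chisquare; scanning t over the template epochs produces the chisquare-versus-day curve that comes up next.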

Ariel shows a spectrum that Isobel gave him for a -8 day, z=0.65 supernova (presumably dated from the lightcurve fit). He shows a chisquare plot as a function of day in the template. The best chisquare is in fact close to where it ought to be. But how do you find the errorbar? He rescaled the chisquare plot so that the minimum reduced chisquare was 1, and then found an uncertainty by going up the right amount in chisquare. (I may not have gotten this correctly.) This chisquare plot assumed just a single variance all across the spectrum, not considering that it changes across the spectrum.
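
As I understood it, the errorbar recipe amounts to the following (a sketch, not Ariel's code; days and chi2 are the grid from the epoch scan, ndof the degrees of freedom of the fit):

  # Sketch: rescale so the minimum reduced chisquare is 1, then read off
  # the interval where the rescaled chisquare rises by 1 (1 sigma).
  import numpy as np

  def epoch_and_error(days, chi2, ndof):
      chi2_r = chi2 * (ndof / chi2.min())       # force min reduced chi2 to 1
      t_best = days[np.argmin(chi2_r)]
      ok = days[chi2_r <= chi2_r.min() + 1.0]   # delta-chisquare = 1 region
      return t_best, t_best - ok.min(), ok.max() - t_best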

He shows another example, z=0.3 or something, where the chisquare plot looked much more parabolic, and nicely lined up at 10 days. He also shows a Beethoven VLT spectrum, done with restframe day -7; he got -7 +0.1 -1.9 days as a 1-sigma range. This apparently agrees very nicely with what Peter did by chi-by-eye. Peter says his chi-by-eye combined the spectrum with the change in the first few data points. Ariel hasn't tried it on the Keck spectrum yet, which Peter estimated was at -13.5 days (rest frame).

He has started playing with this on the nearby data, where we have good S/N. One of the motivations is to find out whether we can say anything about the spectral differences.

This is all very preliminary; mostly, Ariel wants us to know about it. He sees this as the first thing Dan will beat with his wavelet stuff.

Isobel notes that there is sky absorption, which ought to be removed. Peter says that that should probably be removed from the template as well....

Peter makes the comment that he's surprised (pleasantly) that it actually works. He says that he did a lot of stuff to the template to make sure it gave him the right photometry; he didn't work too hard on making sure the features were right. Peter says it was a bit of an art.


HST Brainstorming

If it really is true that we can't use much of ACS and NICMOS in the next proposal, what should we do with HST? It sounds like they're going to tell us that we can only use ACS and NICMOS in a very minor way in a proposal. What science would we say we're going to do in that event? The idea is that there will be a year-long period, and somewhere 9 or 10 months in, ACS and NICMOS may become available. We have no idea what the call for proposals is going to look like, or when it will come out. There is the suggestion to wait for it, but Saul worries that we may be stuck with a very short time period to do the proposals.

Saul wants us to have at least an idea of a plan in case this calamity happens to us. Is there any niche that we can claim to fill? The only thing that anybody can come up with is "more of the same", getting 8-10 more at z=0.85. Perhaps Ariel can come up with a science case. Greg worries that we won't get B-V. Peter thinks we should do IR, even without AO. Chris says that this is hard: in J-band at z=1, it's 15,000 seconds for S/N 20 at the VLT, assuming seeing of 0.6". At z=0.85, how much does that save us in exposure time? Less than a factor of 2. Doing 10 SNe in service mode is out of the question, Chris tells us.
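
A crude check of that factor (my own back-of-envelope, not Chris's numbers): assume a background-limited observation, so exposure time scales as 1/flux^2 and flux as 1/D_L^2, with a flat Omega_m=0.3 cosmology; K-corrections and the shift of the bandpass are ignored, so this is only indicative.

  # Rough exposure-time ratio between z=1 and z=0.85 under the
  # assumptions above; only indicative.
  import numpy as np

  def lum_dist(z, om=0.3, n=10001):        # D_L in units of c/H0
      zp = np.linspace(0.0, z, n)
      ez = np.sqrt(om * (1.0 + zp)**3 + (1.0 - om))
      return (1.0 + z) * np.trapz(1.0 / ez, zp)

  t_ratio = (lum_dist(1.0) / lum_dist(0.85))**4   # t(z=1) / t(z=0.85)
  print(round(t_ratio, 2))   # around a factor of 2 with these crude assumptions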

Perhaps Ariel will tell us that you need the 0.4's to nail w better. We may need to wait and see what we can do with SNe in order to figure out what we want to propose to do. Might we already be pushed up against the intrinsic limit? Greg says it might also be the color correction.

Greg wonders if we can get the colors elsewhere; perhaps we can do it via Keck, rather than VLT service mode.

It sounds like the limiting factor is going to be how much time it takes to do the spectra and the IR. Ariel will need real possibilities for doing simulations.

This may make it a harder job for scheduling how we're going to deal with the proposals.



Assigning Tasks

People have been asking when we get around to assigning or prioritizing tasks. Saul thinks that the best thing for us to do, since we have people coming in, is to take time to digest the tasks. It will then be a matter of collecting who feels that they could do these tasks and who is interested in doing these tasks. Perhaps the executive committee will keep an eye on whether people are picking up tasks.


Meeting Dates

Finally, there will be a rump session for people choosing dates for telescope proposal planning meetings.

The biggest worry is the CFHT proposal, which is due at the end of August (September 1). Reynald says that he can't meet between the last week of July (July 22) and the first week of August (August 7). The current plan is to have it out to the collaboration 1 week before the due date, which would be August 23. The first meeting, then, may be Thursday morning, July 20, at 9:00 PDT. Ariel will send out some ammunition by June 29 at 9AM PDT (there will be a teleconference meeting then). He may just send something written, so that the phone conversation will be very short.