To 1) facilitate investigations of trade-offs such as PSF sampling, effective spectral resolution, data extraction method, wavelength calibration accuracy, and the orientation of the dispersion direction with respect to the serial readout, 2) refine the post-processing gain factor, and 3) optimize parameters for scientific performance, NASA Goddard is providing the Coronagraph Rapid Imaging Spectrograph in Python (CRISPY), a software suite that can simulate WFIRST CGI IFS data and can be used for end-to-end simulations of the IFS in conjunction with a coronagraph PSF model. Examples of how to use the code are also available and are updated at least weekly; note that CRISPY is currently under active development. CRISPY simulations can be used by the community to test home-grown extraction routines during the mission planning stages.
In short, the software goes through the following stages: 1) generation (or ingestion) of a wavelength-resolved astrophysical input cube, 2) propagation of the scene through a coronagraph model, 3) propagation through the IFS, mapping each lenslet onto the detector, 4) application of the detector model and its noise sources, and 5) extraction of the data cube with the reduction pipeline.
Each of these stages is discussed further below.
As the instrument, optics simulations and observing scenarios mature, we will keep this page updated with the latest simulated data and links to the up-to-date simulation code. For basic parameters of the CGI IFS, please see the spacecraft and instrument parameters page.
The IFS software requires a spatial data cube that is also resolved in wavelength, at a resolution higher than the spectrograph's native resolution of R=50. Each "monochromatic" wavelength slice of the input cube is expected to represent the number of photons per second at the lenslet array. CRISPY offers two ways for the user to create such cubes at the entrance of the coronagraph and convert them to coronagraphic images at the lenslet array, but it is fully compatible with user-generated cubes.
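As a concrete illustration, a user-generated input cube can be as simple as a NumPy array with wavelength as the leading axis. The band edges, oversampling factor, and spatial size below are illustrative assumptions for the sketch, not CRISPY defaults:

```python
import numpy as np

# Hypothetical example of a wavelength-resolved input cube for the IFS.
# Axis 0 is wavelength; each slice is a "monochromatic" image holding
# photons per second at the lenslet array. The band edges and the 4x
# oversampling relative to the native R = 50 are illustrative only.
lam_min, lam_max = 0.6, 0.72            # band edges in microns (assumed)
R_native, oversample = 50.0, 4

# Channels spaced logarithmically at an effective resolution of R_native * oversample
n_lam = int(np.ceil(np.log(lam_max / lam_min) * R_native * oversample))
lams = lam_min * np.exp(np.arange(n_lam) / (R_native * oversample))

nx = 128                                # spatial size in lenslet-plane samples
cube = np.zeros((n_lam, nx, nx))
# A flat-spectrum point source at the center stands in for a real scene
cube[:, nx // 2, nx // 2] = 1.0         # photons/s in each wavelength channel
```

A cube shaped this way (wavelength, y, x) can then be handed directly to the IFS propagation step; the unit bookkeeping remains the user's responsibility.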
Using a realistic spatial-spectral data cube as input (under development): Development of this portion of the code ("Haystacks") is led by Dr. Aki Roberge at GSFC. Haystacks provides a suite of realistic extrasolar system models that are spatially and spectrally resolved. The first component of the model is a realistic central star with planets. Next, exozodiacal dust is added to the model, followed by noise (including confusion noise); background galaxies are then added to the scene, together with Galactic dust effects and background stars. The dust structure is modeled by Dr. C. Stark at STScI using N-body simulations, and it includes several dust grain populations and grain-grain collisions. CRISPY uses radiative transfer modeling to convert the effects of exozodiacal dust into flux densities, including starlight scattered off dust and the re-emission of absorbed starlight by dust grains. The code allows for variations in overall intensity, inclination, distance, sampling, zodi emission structures and intensities, etc. Orbital dynamics are included, accounting for planets orbiting the central star and an appropriately rotating dust structure. The background cubes are provided separately and can be shifted to simulate the proper motion of background stars. Currently, two different solar system models are considered (Roberge et al., in preparation): 1) our current solar system and 2) the Endor system, with a Jupiter-like planet and an Earth-like moon. These scene simulations are intended to become benchmarks for future coronagraphs.
Using a fiducial input from J. Krist (experimental): The software is also able to process a time-series simulation from J. Krist, but this is limited to only a star and a planet. Please contact us to obtain a sample time series.
When using Haystacks or a user-generated astrophysical scene, one needs to propagate the entire field of view through a coronagraph simulator. For this we use the WebbPSF Fraunhofer (i.e., far-field limit) SPC model.
The user can directly provide a wavelength-resolved cube at the entrance of the lenslet array, or use one of the methods above to generate it. Propagation through the IFS is a standalone routine that does not depend on the input units, so it is up to the user to do proper bookkeeping. At present, the propagation is assumed to be lossless.
Each input slice is first rotated to match the orientation of the lenslet array, then rebinned to the correct sampling so that the resulting array is a map of the flux within each lenslet. Each lenslet is then remapped onto its detector location using a pre-computed wavelength calibration, and a Gaussian PSFLet is placed at the corresponding location on the detector.
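The rotate/rebin/stamp sequence above can be sketched schematically. This is a toy illustration, not the CRISPY implementation: the rotation angle, the 4x4-pixel lenslet sampling, and the fixed per-lenslet detector offsets standing in for the wavelength calibration are all made up for the example:

```python
import numpy as np
from scipy.ndimage import rotate

def rebin(img, n):
    """Sum n x n pixel blocks so each output value is the flux in one lenslet."""
    h, w = img.shape[0] // n, img.shape[1] // n
    return img[:h * n, :w * n].reshape(h, n, w, n).sum(axis=(1, 3))

def stamp_psflet(detector, y, x, flux, sigma=0.7):
    """Add a unit-normalized Gaussian PSFLet carrying 'flux' centered at (y, x)."""
    yy, xx = np.mgrid[0:detector.shape[0], 0:detector.shape[1]]
    g = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma ** 2))
    detector += flux * g / g.sum()

slice_img = np.zeros((64, 64))
slice_img[32, 32] = 100.0                            # toy monochromatic slice
# Rotate to the lenslet-array orientation (angle is arbitrary here)
slice_img = rotate(slice_img, angle=26.5, reshape=False, order=1)
lenslet_flux = rebin(slice_img, 4)                   # 4x4 pixels per lenslet

detector = np.zeros((256, 256))
for (j, i), f in np.ndenumerate(lenslet_flux):
    if f > 0:
        # Toy mapping: a fixed detector position per lenslet; the real code
        # uses a pre-computed, wavelength-dependent calibration instead.
        stamp_psflet(detector, 10 * j + 5, 10 * i + 5, f)
```

Because each stamped PSFLet is normalized before scaling, the detector image conserves the total lenslet flux in this sketch, mirroring the lossless-propagation assumption.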
Most parameters can be changed by the user; they are kept in a single parameter file.
The software accounts for the various contributors to detector noise: clock-induced charge (CIC), read noise, dark current, and Poisson noise. Quantum efficiency, as well as its degradation as a function of detector traps, is also implemented, following guidelines from JPL. Goddard and MIT are currently collaborating on a fast 2D trap model, based on the high-fidelity models built by JPL; it will be incorporated into the code when ready.
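A schematic version of such a noise model, with placeholder values for the noise parameters (not the JPL-derived numbers used in the software), might look like:

```python
import numpy as np

# Schematic detector noise model for the contributors listed above
# (illustrative only; all rates and the QE value are placeholders).
rng = np.random.default_rng(0)

def add_detector_noise(flux, exptime, qe=0.9, dark=5e-4, cic=0.01, read_noise=3.0):
    """flux: photons/s/pixel -> noisy counts (electrons) for one frame.

    qe: quantum efficiency; dark: dark current (e-/s/pixel);
    cic: clock-induced charge (e-/pixel/frame); read_noise: e- RMS.
    """
    electrons = flux * exptime * qe + dark * exptime       # signal + dark current
    frame = rng.poisson(electrons).astype(float)           # Poisson (shot) noise
    frame += rng.poisson(cic, size=flux.shape)             # clock-induced charge
    frame += rng.normal(0.0, read_noise, size=flux.shape)  # Gaussian read noise
    return frame

flux = np.full((32, 32), 0.2)          # photons/s/pixel at the detector
frame = add_detector_noise(flux, exptime=100.0)
```

QE degradation from charge traps is omitted here, since that depends on the 2D trap model still under development.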
The data reduction pipeline routines made available in this package are adapted from Tim Brandt's CHARIS data reduction package (Brandt et al. 2017, in preparation). CHARIS is an Integral Field Spectrograph for the Subaru telescope. Two reduction methods are offered: "optimal extraction", which uses 1D Gaussian fits for each pixel across each microspectrum, and "least squares extraction", which fits each microspectrum as a sum of pre-computed PSFLets. The latter method is not yet completely operational.
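The idea behind profile-weighted ("optimal") extraction can be illustrated with a toy microspectrum. Here the cross-dispersion profile is assumed Gaussian and known in advance, whereas the actual pipeline fits it per pixel:

```python
import numpy as np

# Toy profile-weighted extraction (Horne-style weighting), illustrating the
# "optimal extraction" concept rather than the pipeline's implementation.
def optimal_extract(microspec, sigma=1.0, var=1.0):
    """microspec: 2D array (cross-dispersion x dispersion) -> 1D spectrum."""
    ny, _ = microspec.shape
    y = np.arange(ny) - (ny - 1) / 2.0
    profile = np.exp(-y ** 2 / (2 * sigma ** 2))
    profile /= profile.sum()
    # Inverse-variance, profile-weighted sum per dispersion column
    w = profile[:, None] / var
    return (w * microspec).sum(axis=0) / (w * profile[:, None]).sum(axis=0)

# Synthetic 5-pixel-tall, noiseless microspectrum with a known Gaussian profile
y = np.arange(5) - 2.0
true_spectrum = np.linspace(10.0, 20.0, 12)
prof = np.exp(-y ** 2 / 2.0)
prof /= prof.sum()
micro = prof[:, None] * true_spectrum[None, :]
spec = optimal_extract(micro)
```

In this noiseless case, with a profile that matches the data exactly, the extracted spectrum recovers the input spectrum; the weighting pays off when read noise and Poisson noise are present.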
This code is presently used by the JPL High Contrast Imaging Lab with the PISCES IFS instrument.
Although no post-processing software is currently included, future updates may include a module such as KLIP.