Software pipelines are under development to handle the enormous data flow produced by the panoramic camera and to maximize the scientific output. The Data Management Software will automatically process the data collected during the night to check whether their quality fulfills the scientific and technical requirements, update the survey's databases, and feed the Scheduler that computes the telescope targets for the following nights.
At the OAJ, after data acquisition, the images will be converted to the FITS format defined as input for the pipelines. At the same time, the image headers will be updated with complementary information from the telescope, the monitors, the meteorological station, and the operation scheduler.
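As a minimal sketch of this header-update step, the metadata merge amounts to adding keyword/value pairs from each subsystem to the image header. The keyword names and the plain-dict representation below are illustrative assumptions; in practice the FITS headers would be edited with a FITS library such as astropy.io.fits:

```python
# Sketch of the header-update step. Keyword names (TELRA, AIRTEMP, ...)
# are hypothetical; real FITS headers would be edited with a FITS library.
def update_header(header, telescope, meteo, scheduler):
    """Merge complementary metadata into a FITS-like header (dict of keywords)."""
    header = dict(header)                      # do not mutate the caller's copy
    header["TELRA"] = telescope["ra"]          # pointing, from the telescope
    header["TELDEC"] = telescope["dec"]
    header["AIRTEMP"] = meteo["temperature"]   # from the meteorological station
    header["HUMIDITY"] = meteo["humidity"]
    header["OBJECT"] = scheduler["target"]     # from the operation scheduler
    return header
```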
Once the raw data are in the main archive of the UPAD, they will be automatically processed by the pipeline, providing calibrated images and astronomical catalogs. During the data treatment, the pipeline uses a management database to store processing information, the parameters used, and quality assessment values.
The pipeline has three main stages:
- Generation of Calibration Frames.
- Daily pipeline.
- Tiling pipeline.
Figure 1 shows the data flow during the processing of the images of the same Tile. This considers the processing of the four individual images of a Tile and their combination into the final high-S/N image, following the Survey Strategy prescription.
The Calibration Frames stage is devoted to generating and validating the master calibration images that will be systematically used to correct the instrumental imprint on the science images. The main master frames are:
- BIAS frames, generated from zero-second exposures and used, together with the overscan and prescan areas, to correct any readout pattern in the images.
- FLAT frames, generated by combining either sky observations during twilight or images taken pointing at the illuminated dome.
- Illumination correction frames, used to correct deviations from homogeneous illumination during the flat-field acquisition.
- Superflat or fringing frames, generated by combining the science images and used to remove low-frequency sky patterns or additive fringing patterns.
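The generation of the two basic master frames can be sketched as a median combination of the individual exposures; this is a simplified illustration (function names and the plain median-combine strategy are assumptions, not the pipeline's exact recipe):

```python
import numpy as np

def make_master_bias(bias_frames):
    """Median-combine a list of zero-second exposures into a master bias."""
    return np.median(np.stack(bias_frames), axis=0)

def make_master_flat(flat_frames, master_bias):
    """Bias-subtract each flat, normalize it to its median level,
    then median-combine into a unit-normalized master flat."""
    corrected = [f - master_bias for f in flat_frames]
    normalized = [c / np.median(c) for c in corrected]
    return np.median(np.stack(normalized), axis=0)
```

The median combination rejects outliers (e.g. cosmic-ray hits or stars in twilight flats) that would survive a plain average.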
The frequency at which a new set of Calibration Frames (CF) is generated will be defined during the instrument commissioning phase, depending on their stability. The pipeline provides access to the Calibration Frames quality assessment information through a Web Portal that displays the information collected in the administrative database and at file level.
In the daily processing, the pipeline corrects the instrumental signature making use of the validated Calibration Frames, masks contaminants (cosmic rays, satellite trails), and calibrates the astrometry and the photometry of the images. A first set of catalogs is created from the individual images. These individual image catalogs will be offered to the astronomical community shortly after the observations. The work-flow of the Daily pipeline is outlined in Figure 2.
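The core instrumental correction of the daily processing reduces to the standard bias subtraction and flat-field division, with bad pixels flagged in a mask. This is a minimal sketch under assumed names and a simple saturation-only mask (the real pipeline also masks cosmic rays and satellite trails):

```python
import numpy as np

def reduce_science_frame(raw, master_bias, master_flat, saturation=65000.0):
    """Apply the instrumental correction: bias subtraction and
    flat-field division.  Returns the reduced image and a boolean
    mask where True marks unusable (here: saturated) pixels."""
    science = (raw - master_bias) / master_flat
    mask = raw >= saturation
    return science, mask
```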
In a second stage, the Tiling pipeline will combine the processed images observed in the same sky area to provide a combined, photometrically calibrated image for each filter. The combined images of a Tile in the whole set of filters compose a Datacube. These Datacubes will require a more detailed revision and will be released only after a significant part of the sky has been observed. Source catalogs will be extracted from the final Tiles and Datacubes and stored in a database system. Figure 3 shows the computations performed and the work-flow of the Tiling pipeline.
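The combination of the (typically four) overlapping exposures of a Tile can be sketched as a mask-aware average, which raises the S/N roughly as the square root of the number of exposures. This sketch assumes the images are already registered on a common pixel grid and uses a plain mean; the actual pipeline may use a weighted or clipped combination:

```python
import numpy as np

def combine_tile(images, masks):
    """Average the overlapping exposures of a Tile pixel by pixel,
    ignoring pixels flagged in the per-image masks (True = bad).
    Pixels masked in every exposure come out as NaN."""
    stack = np.ma.masked_array(np.stack(images), mask=np.stack(masks))
    return stack.mean(axis=0).filled(np.nan)
```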
To store the J-PLUS data, we use the well-known relational database PostgreSQL.
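As an illustration of what a source-catalog table in such a relational database might look like, here is a deliberately simplified, hypothetical schema. The column names are assumptions, and SQLite is used here only as a self-contained stand-in for PostgreSQL:

```python
import sqlite3

# Hypothetical, simplified catalog schema; the production system uses
# PostgreSQL and a considerably richer set of columns.
SCHEMA = """
CREATE TABLE source_catalog (
    source_id INTEGER PRIMARY KEY,
    tile_id   INTEGER NOT NULL,
    filter    TEXT    NOT NULL,
    ra        REAL,            -- degrees
    "dec"     REAL,            -- degrees; quoted: DEC is reserved in PostgreSQL
    mag       REAL,            -- calibrated magnitude
    mag_err   REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.execute("INSERT INTO source_catalog VALUES (1, 42, 'gSDSS', 10.5, -2.3, 18.2, 0.03)")
row = conn.execute("SELECT mag FROM source_catalog WHERE tile_id = 42").fetchone()
```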
To provide access to the Science Databases, a web portal has been developed that offers both the possibility to "navigate" across the images and to query the databases (see the Data Releases section). All the services available through the web portal are also accessible through VO protocols. The user can perform SIAP (Simple Image Access Protocol) queries to request a tile or a cutout of a tile, a cone search, or a TAP (Table Access Protocol) query. This means that the data can be accessed and analyzed using Virtual Observatory tools.
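A synchronous TAP request, for instance, is an HTTP call whose parameters follow the IVOA TAP standard. This sketch only builds the request URL (no network access); the endpoint is a placeholder, not the actual service address:

```python
from urllib.parse import urlencode

# Placeholder endpoint; the real TAP service URL is published by the archive.
TAP_SYNC = "https://example.org/tap/sync"

def tap_query_url(adql, fmt="votable"):
    """Build a synchronous TAP request URL following the IVOA TAP
    standard parameters (REQUEST, LANG, FORMAT, QUERY)."""
    params = {"REQUEST": "doQuery", "LANG": "ADQL", "FORMAT": fmt, "QUERY": adql}
    return TAP_SYNC + "?" + urlencode(params)
```

Any VO-aware client can consume the result, since the response is a standard VOTable.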