Atmospheric Observatory
Live data
Much of the Atmospheric Observatory data comes directly from the field site and is usually, at most, a few minutes old.
Description | Location on labserver | Back-up version on ITS disk |
---|---|---|
5 min measured data | /export/labserver/data/METFiDAS/data/processed/measured | /export/its/labs/RUAOData/METFiDAS/data/processed/measured |
5 min derived data (derived from the measured data) | /export/labserver/data/METFiDAS/data/processed/derived | /export/its/labs/RUAOData/METFiDAS/data/processed/derived |
Minimum and maximum for measured data | /export/labserver/data/METFiDAS/data/processed/min-max | /export/its/labs/RUAOData/METFiDAS/data/processed/min-max |
1 sec data for Met | /export/labserver/data/METFiDAS/data/raw/met/1sec | /export/its/labs/RUAOData/METFiDAS/data/raw/met/1sec |
1 sec data for Research | /export/labserver/data/METFiDAS/data/raw/research/1sec | /export/its/labs/RUAOData/METFiDAS/data/raw/research/1sec |
SkyCam images | /export/labserver/data/SkyCam | /export/its/labs/RUAOData/SkyCam |
CloudCam images | /export/labserver/data/CloudCam | /export/its/labs/RUAOData/CloudCam |
Lightning images | /export/labserver/data/NexStorm | /export/its/labs/RUAOData/NexStorm |
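Because the data is meant to be only a few minutes old, a quick sanity check is to look at the age of the newest file in one of the directories above. The sketch below is illustrative only and not part of the operational system; it assumes each record arrives as a separate file, which is not confirmed on this page.

```python
#!/usr/bin/env python3
"""Report the age of the newest file in a live-data directory.

Illustrative sketch only: the path comes from the table above, but the
assumption that records arrive as individual files is not confirmed here.
"""
import time
from pathlib import Path

# Path from the table above (5 min measured data on labserver).
DATA_DIR = Path("/export/labserver/data/METFiDAS/data/processed/measured")

def newest_file_age_seconds(directory: Path) -> float:
    """Return the age in seconds of the most recently modified file."""
    newest = max(
        (p for p in directory.rglob("*") if p.is_file()),
        key=lambda p: p.stat().st_mtime,
    )
    return time.time() - newest.stat().st_mtime

if __name__ == "__main__":
    age = newest_file_age_seconds(DATA_DIR)
    print(f"Newest file in {DATA_DIR} is {age / 60:.1f} minutes old")
```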
Any new live data is copied from labserver to the ITS disk using the Dependency Perl Code (DPC), which is installed in /home/sws09a/public_html/cgi-bin/update. There may be other ways of doing this, but it was quick and easy for Marc to set up and it works. A full list of the files that are copied can be seen on the Dependency Perl Code web interface for this data. The DPC is run through a cron job on labserver every night.
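The DPC itself is a Perl script and is not reproduced here. Purely to illustrate what the nightly job accomplishes, the Python sketch below performs a one-way mirror of a couple of the source/destination pairs from the table above. The "copy if missing or newer" rule is an assumption about the behaviour, not a description of the actual DPC.

```python
#!/usr/bin/env python3
"""One-way mirror of live data from labserver to the ITS disk.

Illustrative only: the real copy is done by the Dependency Perl Code (DPC)
run from cron each night; the "copy if missing or newer" rule below is an
assumed approximation of what it does, not its actual logic.
"""
import shutil
from pathlib import Path

# Source/destination pairs taken from the table above.
PATH_PAIRS = [
    ("/export/labserver/data/METFiDAS/data/processed/measured",
     "/export/its/labs/RUAOData/METFiDAS/data/processed/measured"),
    ("/export/labserver/data/SkyCam",
     "/export/its/labs/RUAOData/SkyCam"),
    # ... the remaining rows of the table would be listed here.
]

def mirror(src_root: str, dst_root: str) -> None:
    """Copy files that are missing or newer under src_root to dst_root."""
    src_path, dst_path = Path(src_root), Path(dst_root)
    for src in src_path.rglob("*"):
        if not src.is_file():
            continue
        dst = dst_path / src.relative_to(src_path)
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)  # copy2 preserves modification times

if __name__ == "__main__":
    for src, dst in PATH_PAIRS:
        mirror(src, dst)
```

On labserver the equivalent scheduling would be a single cron entry running the job once per night, as the DPC already does.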