Saving EUMetSat cloud images

New - 20th June 2018


Downloading and saving EuMetSat cloud images to aid analysis of NLC events


Introduction

 

Earlier this year I contacted the AIM team to find out if and when they would be publishing their Northern Hemisphere 'Daily Daisies' in 2018. Following a question raised by the AIM team about visibility and cloud cover, I realised I had little information about the extent of cloud cover along the diagonal path through the atmosphere from the observer to the NLC. A search through the weather satellite archives showed that images of midday cloud cover are available going back to the beginning of the 21st century, but the night-time cloud cover images are not archived. In theory it should be possible to reconstruct the night-time cloud images described below, as the images from the individual radiometer channels on the satellite are archived at one-hour intervals. While this may be useful for looking at past weather when NLC occurred, I thought it would be better to record the satellite images at the same time as the NLC.

This page describes the night-time cloud products and then shows how to automate the saving of the files and work around the limitations of the Linux wget command. The scripts that download the images run on the same Raspberry Pi that I use to operate my NLC intervalometer. While I use the scripts for the night-time cloud images, with a little modification they can be used to download other images from EUMetSat, or indeed any other images that are updated at regular intervals and accessible on the web.

 

 

Night Time Cloud images

 

In addition to the images from the several channels of the radiometers on the satellite, EUMetSat publish a number of products that combine several channels to show particular types of cloud. For dusk or night observations of NLC there are two products that show useful data.

 

The first is the night-time cloud and fog product. This is optimised to emphasise the different types of cloud at night; it is not intended to work during daylight hours. The latest images are available on the EUMetSat viewer here, and a detailed technical description of how to interpret the images is here (PowerPoint file). It gives a fairly crude view of the cloud: while it is clear that fog is at or near ground level, the relationship between the image shading and the height of the clouds is more difficult to see. It should show whether there is cloud anywhere along the path length to the NLC, though for instance low cloud several hundred miles away, where the NLC are located, should not be a problem. I use the Western Europe image as this has the best coverage of the UK, but people further east may find the alternative Europe image better, or even further east, the images from the satellite located at 45 degrees East. This product is updated every hour. In the rest of the world you are on your own.

 

The second product that looks useful in NLC analysis is the Cloud Top Height (CTH). While this is intended for aviation use, it has better colour resolution of clouds at different heights. However, only a full-disc image is available, so the resolution over the UK is quite poor, but better than nothing. The latest CTH images are available on the EUMetSat viewer here and a detailed technical description of how to interpret the images is here (PDF file). This product is updated four times an hour.

 

There are a couple of points to note. Since the observations are made from satellites, they can only see the highest layer of cloud, which will mask any lower-level cloud. This is less of a problem local to the observer, as you can see, and hopefully record, the local cloud. Equally, cloud below the NLC which does not show on the satellite images should not be a problem. The problem area is the middle distance, where the observation path could pass through mid-level cloud masked by a layer of high-level cloud.

 

It is not clear how deep into the atmosphere the path of the sunlight illuminating the NLC reaches. If it passes deeply through the air, cloud sunward of the NLC may also be a problem. My belief is that the illuminating rays pass through only the top of the atmosphere; if they went lower, the light would be reddened, resulting in red rather than the characteristic blue NLC.

 

Software requirements

 

EUMetSat place limits on how many times per hour images can be downloaded, to prevent overloading the servers. While the products should be available at the frequencies listed above, the exact time they are issued depends on the reception of the images from the satellite and the subsequent processing to produce the products; in addition, the CTH product uses some ground-based measurements. To be sure of getting every image we need to check whether a new image is available fairly frequently, but we also need to comply with the maximum download frequency described on the EUMetSat site here.

 

It should be noted that all the images for a particular product have the same file name: the latest image simply overwrites the old one on the server. Thus if you fail to fetch a particular version of an image it is essentially gone forever. Past images are shown on the viewers linked above for a couple of days, but they do not seem to be accessible as image files.

 

Software description & function

 

Linux has a command-line tool, wget, that fetches files from websites, and it has a couple of options that look useful for downloading only new copies of files from the EUMetSat server. In its default mode wget downloads a copy of the file from the server every time it is run; if a file of the same name already exists locally, the new copy is saved with a numeric suffix (.1, .2 and so on). As a result multiple copies of the same image pile up, and the duplicates then have to be deleted manually. The -nc (no-clobber) option prevents this by skipping the download entirely if a local file of that name already exists, but that also means newer versions are never fetched. The default mode is also not good for the server loading, as it downloads the file on every run whether or not it has changed. wget also has a -N (timestamping) option to download only new copies of the file: before requesting the download, wget queries the server to determine whether a newer version of the file is available, and exits if not. On the face of it this is exactly what we want, but more detailed investigation shows that when a new file is downloaded it overwrites the old one! It would be fine if the -N and -nc options could be used at the same time, but wget does not allow that combination.

 

My initial attempt to solve this was to rename the 'new' file when it was downloaded by wget. However, the next time wget is run with the -N option, since no local file exists with the same name as the one on the server, it downloads another copy. As a result I ended up with multiple copies of the same file under different names, and possibly an annoyed server owner. I might as well have just let the duplicate copies accumulate.

 

The way I solved the problem was to use wget -N to download only new copies of the file, and then compare the downloaded file with those already saved locally. If it is different, the new file is copied to the archive and given a new name.

 

There are a number of ways the new file can be compared with the archive; modification date and size are two obvious ones, but I rejected modification date in case it got changed in the copying and renaming process, and file size carries the risk that two different images could conceivably have the same size. A better choice is to compute the MD5 checksum of each file: if the files are the same their checksums will agree, and the chance of two different image files having the same checksum is negligible.
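The fetch-and-compare approach can be sketched roughly as follows. This is a minimal sketch, not the actual script: the URL is a placeholder for the real EUMetSat image address, and the function name and archive sub-directory are my own assumptions.

```shell
#!/bin/sh
# Sketch: fetch with wget -N, then archive the file only if its MD5
# sum differs from every image already saved.

fetch_and_archive() {
    url="$1"
    workdir="$2"
    archive="$3"

    mkdir -p "$workdir" "$archive"
    cd "$workdir" || return 1

    # -N: ask the server for timestamps and download only if newer.
    command -v wget >/dev/null && wget -q -N "$url"

    img=$(basename "$url")
    [ -f "$img" ] || return 0          # nothing fetched yet

    new_sum=$(md5sum "$img" | cut -d' ' -f1)
    for f in "$archive"/*; do
        [ -f "$f" ] || continue
        if [ "$(md5sum "$f" | cut -d' ' -f1)" = "$new_sum" ]; then
            return 0                   # identical image already archived
        fi
    done

    # New image: store a timestamped copy in the archive.
    cp "$img" "$archive/$(date +%Y%m%d_%H%M%S)_$img"
}

# Placeholder URL - substitute the real EUMetSat image address.
fetch_and_archive "https://example.com/latest_fog.jpg" \
    "$HOME/Camera_Data/Download/Fog" \
    "$HOME/Camera_Data/Download/Fog/archive"
```

Because the duplicate test is done by checksum rather than by file name, the wget -N overwrite behaviour no longer matters: the archive only ever gains a copy when the image content has actually changed.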

 

The code is written as a shell script - I have two versions, one for the night fog image, available here, and a similar one for the CTH, available here.

 

They are separate scripts because the images are downloaded to two different folders, and the fog image script is run every 10 minutes using cron while the CTH script is run every 5 minutes. The 10-minute interval could probably be increased to 15 or 20 minutes without missing files. The CTH interval is more awkward: cron on a Raspberry Pi can only schedule in whole minutes, and the interval should really be a factor of the 15-minute new-image interval, which only leaves every 3 or 5 minutes.
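For those intervals, the crontab entries might look like the following. The script names and paths here are placeholders, not the actual file names of my scripts.

```
# Edit with `crontab -e`; paths are placeholders.
*/10 * * * * /home/pi/scripts/fetch_fog.sh     # fog image, every 10 minutes
*/5  * * * * /home/pi/scripts/fetch_cth.sh     # CTH image, every 5 minutes
```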

 


 

The Fog & CTH images are stored in their own directories in the user area: ~/Camera_Data/Download/Fog and ~/Camera_Data/Download/CTH. These directories are created by the scripts if they do not exist.

 

My normal procedure is to rename 'Download' to the previous day's yyyymmdd each morning and then copy this to my NAS. Note that this needs to be done before the scripts run that evening. My Python intervalometer programme copies the data from my Canon 10D NLC camera to the Download directory after the NLC imaging completes.
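The morning housekeeping step can be sketched as below. The NAS destination is a placeholder, and `date -d` assumes GNU date as found on Raspberry Pi OS.

```shell
#!/bin/sh
# Sketch of the morning tidy-up: rename Download to yesterday's date,
# recreate an empty Download for tonight, then copy to the NAS.

DATA="$HOME/Camera_Data"
mkdir -p "$DATA/Download"                 # ensure it exists for this sketch

YESTERDAY=$(date -d yesterday +%Y%m%d)    # GNU date, as on Raspberry Pi OS

mv "$DATA/Download" "$DATA/$YESTERDAY"
mkdir "$DATA/Download"                    # fresh directory for tonight's run

# Copy the day's data to the NAS (placeholder destination):
# rsync -a "$DATA/$YESTERDAY" nas:/backup/nlc/
```

Run from cron in the morning, this guarantees the rename happens before the download scripts start writing again in the evening.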

 

How the scripts work is detailed in the comments.

 

Any questions email me.

 

 

© John Murrell 2018

  


 
