Upload data to the Terradue storage
In this scenario, we generate a .properties metadata file that accompanies the result files and upload all the files that make up the result to the Terradue storage.
1. Set the necessary variables
The following section defines all the necessary information as variables, so the code below can be easily reused.
[ ]:
import getpass
# Set the credentials (Ellip username and API key)
username = input("What is your Ellip username? ")
api_key = getpass.getpass("What is your Ellip API key? ")
# Set the name of the destination repository on the Terradue storage
repo_name = input("What is the destination repository name? (press Enter to confirm default [{0}]) ".format(username))
if not repo_name:
    repo_name = username
# Set the destination URL on store (directory)
store_url = "https://store.terradue.com/{0}/data-publication-sample".format(repo_name)
# Set dictionary with file information
files = [
    {'name': 'data-publication-sample.tif', 'content_type': 'image/tiff'},
    {'name': 'data-publication-sample.kmz', 'content_type': 'application/octet-stream'},
    {'name': 'data-publication-sample.properties', 'content_type': 'text/plain'}
]
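The content types above are hard-coded per file. As an alternative sketch, Python's standard `mimetypes` module can guess most of them from the file extension; the `guess_content_type` helper and its fallback default below are illustrative assumptions, not part of the original setup:

```python
import mimetypes

def guess_content_type(filename, default='application/octet-stream'):
    """Guess a MIME type from the file extension; fall back to a default."""
    content_type, _ = mimetypes.guess_type(filename)
    return content_type or default

# .tif resolves to image/tiff; .properties is not a registered MIME
# type, so it falls back to the default
for name in ['data-publication-sample.tif',
             'data-publication-sample.kmz',
             'data-publication-sample.properties']:
    print(name, '->', guess_content_type(name))
```

This keeps the `files` list down to names only, at the cost of an explicit fallback for extensions the platform's MIME registry does not know.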
2. Download sample files
This step is only necessary to have some local files to work with. In a normal processing scenario, the files to be uploaded are already available locally.
[ ]:
import os
import requests
def retrieve_file(url, local_file):
    res = requests.get(url, allow_redirects=True)
    # Fail early instead of silently writing an error page to disk
    res.raise_for_status()
    with open(local_file, 'wb') as f:
        f.write(res.content)

sample_dir = "data-publication-sample"
if not os.path.exists(sample_dir):
    os.mkdir(sample_dir)

for ext in ['tif', 'xml', 'kmz']:
    retrieve_file(
        "https://store.terradue.com/eo-samples/data-publication-sample/data-publication-sample.{0}".format(ext),
        "{0}/data-publication-sample.{1}".format(sample_dir, ext)
    )
3. Generate the metadata file
We create an ordered dictionary containing the metadata values and then write them to a .properties file, one key=value pair per line.
[ ]:
from collections import OrderedDict
metadata = OrderedDict()
metadata['Product_ID'] = 'data-publication-sample'
metadata['Product_format'] = 'GEOTIFF'
metadata['Product_size'] = '31624950'
metadata['Product_url'] = '{0}/data-publication-sample.tif'.format(store_url)
metadata['Bounding_box'] = '40.448583 13.587901 41.234180 14.856833'
metadata['Bounding_box_wkt'] = 'POLYGON((13.587901 40.604756,13.737746 41.234180,14.856833 41.077912,14.695978 40.448583,13.587901 40.604756))'
metadata['License'] = 'https://creativecommons.org/licenses/by/4.0'
metadata['User_ID'] = 'me'
metadata['Software_version'] = 'Sample processor v2.5'
metadata['Date'] = '2017-02-11T05:10:43.490699Z/2018-01-01T05:10:51.413166Z'
metadata['Date_of_measurement_start'] = '2017-02-11T05:10:43.490699Z'
metadata['Date_of_measurement_end'] = '2018-01-01T05:10:51.413166Z'
metadata['Date_of_production'] = '2018-10-05T23:34:01Z'
metadata['Super_master_SAR_image_ID'] = 'S1A_IW_SLC__1SDV_20160902T051142_20160902T051210_012869_014526_541D.SAFE, S1A_IW_SLC__1SDV_20160902T051207_20160902T051234_012869_014526_B4AA.SAFE'
metadata['Master_SAR_image_ID'] = 'S1B_IW_SLC__1SDV_20170211T051043_20170211T051110_004248_0075D9_BCAE.SAFE, S1B_IW_SLC__1SDV_20170211T051108_20170211T051135_004248_0075D9_6A83.SAFE'
metadata['Slave_SAR_image_ID'] = 'S1B_IW_SLC__1SDV_20180101T051051_20180101T051118_008973_010039_B427.SAFE, S1B_IW_SLC__1SDV_20180101T051116_20180101T051143_008973_010039_CF15.SAFE'
metadata['Perpendicular_baseline'] = '10.1852'
metadata['Parallel_baseline'] = '18.1659'
metadata['Along_track_baseline'] = '0.877837'
metadata['Reference_point'] = '14.250889 40.833728'
metadata['Spatial_resolution'] = '37, 37'
metadata['Sensor'] = 'S1'
metadata['Mode'] = 'IW'
metadata['Antenna_side'] = 'Right'
metadata['Relative_orbit_number'] = '22'
metadata['Orbit_direction'] = 'DESCENDING'
metadata['Wavelength'] = '0.055465760'
metadata['Value_unit'] = 'cm'
metadata['Number_of_looks_azimuth'] = '2'
metadata['Number_of_looks_range'] = '10'
metadata['Applied_filter'] = 'No_Filter'
with open("data-publication-sample/data-publication-sample.properties", "w") as properties_file:
    for key in metadata:
        properties_file.write("{0}={1}\n".format(key, metadata[key]))
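A quick way to sanity-check the generated file is to parse it back into a dictionary and compare it against the source metadata. The `read_properties` helper below is a minimal sketch (it handles only plain `key=value` lines, not the full Java properties syntax), demonstrated on a small stand-alone sample rather than the full metadata block above:

```python
from collections import OrderedDict

def read_properties(path):
    """Parse a simple key=value .properties file into an ordered dict."""
    props = OrderedDict()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            key, _, value = line.partition('=')
            props[key] = value
    return props

# Round-trip check with a tiny stand-alone metadata sample
metadata = OrderedDict([('Product_ID', 'data-publication-sample'),
                        ('Sensor', 'S1')])
with open('sample.properties', 'w') as f:
    for key in metadata:
        f.write("{0}={1}\n".format(key, metadata[key]))

assert read_properties('sample.properties') == metadata
```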
4. Define a function for the file upload
To upload a file, we use the HTTP PUT method with basic HTTP authentication. The function also verifies the file integrity by comparing the SHA-256 checksums of the local and remote files.
[ ]:
import requests
import json
import hashlib
import os
def send_file(local_file, content_type, store_url, username, api_key):
    # Read the file to send
    with open(local_file, 'rb') as f:
        content = f.read()
    # Send the file (using the HTTP PUT method with basic authentication)
    res = requests.put(url="{0}/{1}".format(store_url, os.path.basename(local_file)),
                       headers={"Content-Type": content_type},
                       auth=(username, api_key),
                       data=content)
    print("Response status code: {0}".format(res.status_code))
    assert res.status_code == 201
    print('File {0} uploaded'.format(local_file))
    # Verify the content (hash check)
    local_checksum = hashlib.sha256(content).hexdigest()
    remote_checksum = res.json()['checksums']['sha256']
    print("- Local checksum: {0}".format(local_checksum))
    print("- Remote checksum: {0}".format(remote_checksum))
    assert local_checksum == remote_checksum
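`send_file` reads the whole file into memory, which is fine for small products but wasteful for large rasters. The checksum side can be computed in fixed memory by hashing the file in chunks; the sketch below shows only that part, using the standard library (the file name and chunk size are illustrative choices, not from the original):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute a SHA-256 checksum without loading the whole file in memory."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

# The chunked digest matches the in-memory digest used by send_file
with open('chunk-demo.bin', 'wb') as f:
    f.write(b'x' * 100000)
assert sha256_of_file('chunk-demo.bin') == hashlib.sha256(b'x' * 100000).hexdigest()
```

For the upload itself, `requests` also accepts an open file object as `data=`, which streams the body instead of buffering it, if memory use ever becomes a concern.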
5. Upload the files
The output of this cell shows whether each upload succeeded (HTTP status code and checksum comparison).
[ ]:
for file in files:
    try:
        send_file("data-publication-sample/{0}".format(file['name']), file['content_type'], store_url, username, api_key)
    except Exception as e:
        print('ERROR: File {0} NOT uploaded: {1}'.format(file['name'], e))