Ancillary Functions

This module gathers central functions and classes for general pyroSAR applications.

find_datasets – find pyroSAR datasets in a directory based on their metadata

getargs – get the arguments of a function

groupby – group a list of images by a metadata attribute

groupbyTime – group images by their acquisition time difference

hasarg – simple check whether a function takes a parameter as input

multilook_factors – compute multi-looking factors to approximate a square pixel with a defined target ground range pixel spacing

parse_datasetname – parse the name of a pyroSAR processing product and extract its metadata components as a dictionary

seconds – extract the time in seconds from a file name

Lock – file and folder locking mechanism

LockCollection – like Lock but for multiple files/folders

class pyroSAR.ancillary.Lock(target, soft=False, timeout=7200)[source]

Bases: object

File and folder locking mechanism. This mechanism creates lock files indicating whether a file/folder

  1. is being modified (target.lock),

  2. is being used/read (target.used_<uuid.uuid4>) or

  3. was damaged during modification (target.error).

Although these files will not prevent locking by other mechanisms (UNIX locks are generally only advisory), they are respected by all running instances of this class: as long as a lock file exists that is intended to prevent a certain access, no process using this class will succeed in acquiring a conflicting lock. This was implemented because existing solutions like filelock or fcntl do not offer effective means for coordinating parallel jobs on HPC systems.

Hard locks prevent any usage of the data. Damage/error locks work like hard locks except that timeout is ignored and a RuntimeError is raised immediately. Error locks are created (by renaming the hard lock file) if an error occurs while a hard lock is held and target exists. Any number of usage locks may exist at the same time, each with a different random UUID, and no hard lock can be acquired while usage locks exist. On error, usage locks are simply deleted.

Lock files may remain behind when a process is killed by an HPC scheduler like Slurm, because in this case the process is not terminated by Python. Ideally, such leftover hard locks should be renamed to error lock files and usage lock files should be deleted; this cleanup has to be done separately.

Examples

>>> from pyroSAR.ancillary import Lock
>>> target = 'test.txt'
>>> with Lock(target=target):
>>>     with open(target, 'w') as f:
>>>         f.write('Hello World!')
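
A soft lock marks the target only as being used/read, so that no hard lock can be acquired while it exists. A minimal sketch of reading under a soft lock, assuming the file written above already exists:

>>> with Lock(target=target, soft=True):
>>>     with open(target) as f:
>>>         content = f.read()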
Parameters:
  • target (str) – the file/folder to lock

  • soft (bool) – lock the file/folder only for reading (and not for modification)?

  • timeout (int) – the time in seconds to keep retrying to acquire a lock

is_used()[source]

Does any usage lock exist?

Return type:

bool

remove()[source]

Remove the acquired soft/hard lock

class pyroSAR.ancillary.LockCollection(targets, soft=False, timeout=7200)[source]

Bases: object

Like Lock but for multiple files/folders.

Parameters:
  • targets (list[str]) – the files/folders to lock

  • soft (bool) – lock the files/folders only for reading (and not for modification)?

  • timeout (int) – the time in seconds to keep retrying to acquire a lock
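
Examples

A minimal sketch, assuming LockCollection supports the same context manager usage as Lock; the file names are hypothetical:

>>> from pyroSAR.ancillary import LockCollection
>>> targets = ['file1.txt', 'file2.txt']
>>> with LockCollection(targets=targets, soft=True):
>>>     pass  # read the files while they are protected from modification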

pyroSAR.ancillary.find_datasets(directory, recursive=False, **kwargs)[source]

find pyroSAR datasets in a directory based on their metadata

Parameters:
  • directory (str) – the name of the directory to be searched

  • recursive (bool) – search the directory recursively into subdirectories?

  • kwargs – Metadata attributes for filtering the scene list, supplied as key=value, e.g. sensor='S1A'. Multiple allowed options can be provided as tuples, e.g. sensor=('S1A', 'S1B'). Any type other than tuple requires an exact match; e.g. proc_steps=['grd', 'mli', 'geo', 'norm', 'db'] will only be matched if these processing steps are contained in the product name in this exact order. The special attributes start and stop can be used for time filtering where start<=value<=stop. See function parse_datasetname() for further options.

Returns:

the file names found in the directory and filtered by metadata attributes

Return type:

list of str

Examples

>>> selection = find_datasets('path/to/files', sensor=('S1A', 'S1B'), polarization='VV')
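
The special start and stop attributes filter by acquisition time; a sketch with hypothetical time bounds in the YYYYmmddTHHMMSS format used in pyroSAR file names:

>>> selection = find_datasets('path/to/files', sensor='S1A',
>>>                           start='20180101T000000', stop='20180201T000000')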
pyroSAR.ancillary.getargs(func)[source]

get the arguments of a function

Parameters:

func (function) – the function to be checked

Returns:

the argument names

Return type:

list of str
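
Examples

A brief sketch with a hypothetical function; the order of the returned names is not guaranteed here, so the output is sorted:

>>> from pyroSAR.ancillary import getargs
>>> def convert(infile, outfile, overwrite=False):
>>>     pass
>>> sorted(getargs(convert))
['infile', 'outfile', 'overwrite']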

pyroSAR.ancillary.groupby(images, attribute)[source]

group a list of images by a metadata attribute

Parameters:
  • images (list[str]) – the names of the images to be sorted

  • attribute (str) – the name of the attribute used for sorting; see parse_datasetname() for options

Returns:

a list of sub-lists containing the grouped images

Return type:

list[list[str]]
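
Examples

A sketch with hypothetical file names following the pyroSAR naming scheme; grouping by polarization is expected to separate the VV and VH products:

>>> from pyroSAR.ancillary import groupby
>>> images = ['S1A__IW___A_20180101T170000_VV_grd_mli.tif',
>>>           'S1A__IW___A_20180101T170000_VH_grd_mli.tif',
>>>           'S1A__IW___A_20180113T170000_VV_grd_mli.tif']
>>> groups = groupby(images, attribute='polarization')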

pyroSAR.ancillary.groupbyTime(images, function, time)[source]

function to group images by their acquisition time difference

Parameters:
  • images (list[str]) – a list of image names

  • function (function) – a function to derive the time from the image names; see e.g. seconds()

  • time (int or float) – a time difference in seconds by which to group the images

Returns:

a list of sub-lists containing the grouped images

Return type:

list[list[str]]
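
Examples

A sketch with hypothetical file names; with a threshold of 30 seconds the first two acquisitions (15 seconds apart) are expected to fall into one group:

>>> from pyroSAR.ancillary import groupbyTime, seconds
>>> images = ['S1A__IW___A_20180101T170000_VV_grd_mli.tif',
>>>           'S1A__IW___A_20180101T170015_VV_grd_mli.tif',
>>>           'S1A__IW___A_20180113T170000_VV_grd_mli.tif']
>>> groups = groupbyTime(images, function=seconds, time=30)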

pyroSAR.ancillary.hasarg(func, arg)[source]

simple check whether a function takes a parameter as input

Parameters:
  • func (function) – the function to be checked

  • arg (str) – the argument name to be found

Returns:

does the function take this as argument?

Return type:

bool
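
Examples

A brief sketch with a hypothetical function:

>>> from pyroSAR.ancillary import hasarg
>>> def convert(infile, outfile, overwrite=False):
>>>     pass
>>> hasarg(convert, 'overwrite')
True
>>> hasarg(convert, 'resolution')
False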

pyroSAR.ancillary.multilook_factors(source_rg, source_az, target, geometry, incidence)[source]

compute multi-looking factors to approximate a square pixel with defined target ground range pixel spacing.

Parameters:
  • source_rg (int or float) – the range pixel spacing

  • source_az (int or float) – the azimuth pixel spacing

  • target (int or float) – the target pixel spacing of an approximately square pixel

  • geometry (str) – the imaging geometry; either 'SLANT_RANGE' or 'GROUND_RANGE'

  • incidence (int or float) – the angle of incidence

Returns:

the multi-looking factors as (range looks, azimuth looks)

Return type:

tuple[int]

Examples

>>> from pyroSAR.ancillary import multilook_factors
>>> rlks, azlks = multilook_factors(source_rg=2, source_az=13, target=10,
>>>                                 geometry='SLANT_RANGE', incidence=39)
>>> print(rlks, azlks)
4 1
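
The result can be retraced from the imaging geometry: in slant range geometry the ground range pixel spacing equals the slant range spacing divided by the sine of the incidence angle. A sketch of this relation (not the actual pyroSAR implementation):

>>> import math
>>> # ground range pixel spacing derived from the 2 m slant range spacing at 39 degrees incidence
>>> ground_rg = 2 / math.sin(math.radians(39))  # ~3.18 m
>>> # the 13 m azimuth spacing already exceeds the 10 m target, so a single azimuth look is used
>>> # and 4 range looks give an approximately square pixel of ~12.7 m x 13 m
>>> round(4 * ground_rg, 1)
12.7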
pyroSAR.ancillary.parse_datasetname(name, parse_date=False)[source]

Parse the name of a pyroSAR processing product and extract its metadata components as dictionary

Parameters:
  • name (str) – the name of the file to be parsed

  • parse_date (bool) – parse the start date to a datetime object or just return the string?

Returns:

the metadata attributes

Return type:

dict

Examples

>>> meta = parse_datasetname('S1A__IW___A_20150309T173017_VV_grd_mli_geo_norm_db.tif')
>>> print(sorted(meta.keys()))
['acquisition_mode', 'extensions', 'filename', 'orbit',
'outname_base', 'polarization', 'proc_steps', 'sensor', 'start']
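
With parse_date=True the start attribute is expected to be a datetime object rather than a string (a sketch based on the parameter description above):

>>> meta = parse_datasetname('S1A__IW___A_20150309T173017_VV_grd_mli_geo_norm_db.tif', parse_date=True)
>>> type(meta['start'])
<class 'datetime.datetime'>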
pyroSAR.ancillary.seconds(filename)[source]

function to extract the time in seconds from a file name. The time stamp must follow the fixed pattern YYYYmmddTHHMMSS. Images processed with pyroSAR functionalities via the modules snap or gamma will contain this information.

Parameters:

filename (str) – the name of a file from which to extract the time

Returns:

the difference between the time stamp in filename and Jan 01 1900 in seconds

Return type:

float
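
Examples

A brief sketch with a hypothetical file name containing the YYYYmmddTHHMMSS time stamp; the return value is the number of seconds between that time stamp and Jan 01 1900:

>>> from pyroSAR.ancillary import seconds
>>> t = seconds('S1A__IW___A_20150309T173017_VV_grd_mli.tif')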

pyroSAR.ancillary.windows_fileprefix(func, path, exc_info)[source]

Helper function to be used as the onerror argument of shutil.rmtree() to work around Windows' file name length limit of 256 characters.

Parameters:
  • func (function) – the function which raised the exception

  • path (str) – the path name passed to func

  • exc_info (tuple) – the exception information returned by sys.exc_info()

Examples

>>> import shutil
>>> from pyroSAR.ancillary import windows_fileprefix
>>> shutil.rmtree('/path', onerror=windows_fileprefix)