Package ivs :: Package inout :: Module hdf5

Module hdf5


Read and write HDF5 files.

HDF5 is a data model, library, and file format for storing and managing data. It supports an unlimited variety of datatypes, and is designed for flexible and efficient I/O and for high-volume and complex data. HDF5 is portable and extensible, allowing applications to evolve in their use of HDF5. The HDF5 Technology suite includes tools and applications for managing, manipulating, viewing, and analyzing data in the HDF5 format. For the Python interface, see http://alfven.org/wp/hdf5-for-python/

Functions

    Input

read2dict(filename) → dict
    Read the file structure of an HDF5 file into a dictionary.

    Output

write_dict(data, filename, update=True, attr_types=[])
    Write the content of a dictionary to an HDF5 file.
Variables
  logger = logging.getLogger("IO.HDF5")
Function Details

read2dict(filename)


Read the file structure of an HDF5 file into a dictionary.

Parameters:
  • filename (str) - the name of the HDF5 file to read
Returns: dict
dictionary with the file structure that was read
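The recursive group-to-dict mapping can be sketched as follows. This is a minimal illustration assuming the module is built on h5py (suggested by the URL above), not the actual implementation; `read2dict_sketch` is a hypothetical name.

```python
import h5py

def read2dict_sketch(filename):
    """Recursively load an HDF5 file into a nested dictionary.

    A minimal sketch of what read2dict does: HDF5 groups become
    nested dicts, datasets become their stored values/arrays.
    """
    def visit(group):
        out = {}
        for key, item in group.items():
            if isinstance(item, h5py.Group):
                out[key] = visit(item)   # nested group -> nested dict
            else:
                out[key] = item[()]      # dataset -> scalar or numpy array
        return out

    with h5py.File(filename, 'r') as f:
        return visit(f)
```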

write_dict(data, filename, update=True, attr_types=[])


Write the content of a dictionary to an HDF5 file. The dictionary can contain nested dictionaries; this structure is preserved in the saved HDF5 file.

Note that the data type of lists may change when writing to HDF5. Lists are stored as numpy arrays, so all items in a list are converted to the same type: ['bla', 1, 24.5] becomes ['bla', '1', '24.5']. At present there is nothing in place to check for this, or to correct it when reading the HDF5 file back.
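The type coercion described above comes from numpy itself and can be reproduced directly, without any HDF5 file involved:

```python
import numpy as np

# Lists are stored as numpy arrays, which have a single dtype:
mixed = ['bla', 1, 24.5]
arr = np.array(mixed)

# numpy picks a common type -- here a unicode string dtype --
# so the int and float are silently converted to strings.
print(arr.dtype.kind)  # 'U' (unicode string)
print(list(arr))       # ['bla', '1', '24.5']
```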

Parameters:
  • data (dict) - the dictionary to write to file
  • filename (str) - the name of the HDF5 file to write to
  • update (bool) - True to update an existing file, False to overwrite it
  • attr_types (list of types) - the data types that you want to save as an attribute instead of a dataset (by default, everything is saved as a dataset)
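The recursive write, the update/overwrite switch, and the attr_types behaviour can be sketched as below. This again assumes h5py and is a hypothetical illustration (`write_dict_sketch`), not the module's actual code.

```python
import h5py

def write_dict_sketch(data, filename, update=True, attr_types=[]):
    """Recursively write a (nested) dict to an HDF5 file.

    Nested dicts become HDF5 groups; values whose type is listed in
    attr_types become attributes, everything else becomes a dataset.
    """
    def store(group, d):
        for key, value in d.items():
            if isinstance(value, dict):
                store(group.require_group(key), value)  # nested dict -> group
            elif type(value) in attr_types:
                group.attrs[key] = value                # save as attribute
            else:
                if key in group:
                    del group[key]                      # replace when updating
                group[key] = value                      # save as dataset

    mode = 'a' if update else 'w'  # append to existing file, or overwrite
    with h5py.File(filename, mode) as f:
        store(f, data)
```

With `update=True` the file is opened in append mode so existing groups and datasets outside `data` are kept; with `update=False` the file is truncated first.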