Commit 2438422b authored by Mark Hymers

Add Tutorial 2

Signed-off-by: Mark Hymers <>
parent 12eb539a
.. toctree::
   :maxdepth: 2
from anamnesis import AbstractAnam, register_class


class ComplexPerson(AbstractAnam):
    hdf5_outputs = ['name', 'age']
    hdf5_defaultgroup = 'person'
    hdf5_aliases = ['test_classes2.OldComplexPerson']

    def __init__(self, name='Unknown', age=0):
        AbstractAnam.__init__(self)
        self.name = name
        self.age = age


register_class(ComplexPerson)


class ComplexPlace(AbstractAnam):
    hdf5_outputs = ['location']
    hdf5_defaultgroup = 'place'

    def __init__(self, location='Somewhere'):
        AbstractAnam.__init__(self)
        self.location = location


register_class(ComplexPlace)


class ComplexTrain(AbstractAnam):
    hdf5_outputs = ['destination']
    hdf5_defaultgroup = 'train'

    def __init__(self, destination='Edinburgh'):
        AbstractAnam.__init__(self)
        self.destination = destination

    def init_from_hdf5(self):
        print("We have already set destination: %s" % self.destination)


register_class(ComplexTrain)
import h5py
from anamnesis import obj_from_hdf5file
import test_classes2

# Demonstrate reading a file which has the old class name
# in the HDF5 file
s = obj_from_hdf5file('test_script2_aliases.hdf5', 'person')

# Show that we have reconstructed the object
print(s.name, s.age)
import h5py
from anamnesis import obj_from_hdf5file
from test_classes2 import ComplexPerson, ComplexPlace

# Load the classes from the HDF5 file using
# the default hdf5group names
s = ComplexPerson.from_hdf5file('test_script2.hdf5')
l = ComplexPlace.from_hdf5file('test_script2.hdf5')

# Show that we have reconstructed the objects
print(s.name, s.age)
print(l.location)
import h5py
from anamnesis import obj_from_hdf5file
from test_classes2 import ComplexPerson

# Create an example object
p = ComplexPerson('Bob', 75)
p.extra_data['hometown'] = 'Oxford'

# Save the object out (the original save call was lost;
# writing via to_hdf5 into a 'person' group is assumed here)
f = h5py.File('test_script2_extradata.hdf5', 'w')
p.to_hdf5(f.create_group('person'))
f.close()

# Delete our object
del p

# Re-load our object
p = obj_from_hdf5file('test_script2_extradata.hdf5')

# Show that we recovered the object and the extra data
print(p.name, p.age)
print(p.extra_data['hometown'])
import h5py
from anamnesis import obj_from_hdf5file
from test_classes2 import ComplexPerson
# Load the train object and watch for the printed output from the
# init_from_hdf5 function
p = obj_from_hdf5file('test_script2.hdf5', 'train')
import shutil

import h5py

from test_classes2 import (ComplexPerson, ComplexPlace, ComplexTrain)

# Create a person, a place and a train
s = ComplexPerson('Anna', 45)
l = ComplexPlace('York')
t = ComplexTrain('Glasgow')

# Serialise the person, place and train to disk (the original
# save calls were lost; writing via to_hdf5 is assumed here)
f = h5py.File('test_script2.hdf5', 'w')
s.to_hdf5(f.create_group('person'))
l.to_hdf5(f.create_group('place'))
t.to_hdf5(f.create_group('train'))
f.close()

# Serialise the person to disk using a different name
# To do this, we copy the HDF5 file and manually edit it
shutil.copyfile('test_script2.hdf5', 'test_script2_aliases.hdf5')
f = h5py.File('test_script2_aliases.hdf5', 'a')
f['person'].attrs['class'] = 'test_classes2.OldComplexPerson'
f.close()
Tutorial 2 - More advanced serialisation features
=================================================
Anamnesis supports some more advanced serialisation features.
In the main, most people will not require these; however, they are used in
NAF (the project from which anamnesis was extracted).
These features are best described by the names of the member variables
and functions used to configure them.
Note that anamnesis implicitly reserves these names for its own
functionality, and any future additions will use the prefixes
`hdf5_` or `anam_`. To avoid clashes with future versions of anamnesis,
avoid defining variables or functions with these prefixes.
1. `hdf5_defaultgroup` (member variable)
2. `hdf5_aliases` (member variable)
3. `hdf5_mapnames` (member variable)
4. `extra_data` (member variable)
5. `extra_bcast` (member variable)
6. `init_from_hdf5` (member function)
7. `refs` (member variable)
8. `shortdesc` (member variable)
All of the files needed to run these examples are generated by the script
``. This is also where several examples of the actual
usage of the variables within classes can be seen.
.. literalinclude::
:language: python
The `hdf5_defaultgroup` variable is usually used when serialising a single
instance of a class into and out of an HDF5 file. Its use obviates the need
to specify a group name when reading from an HDF5 file using the
`from_hdf5file` function.
E.g., if we have two classes, one of which has an `hdf5_defaultgroup`
set to `person` and the other to `place`, we can load each of the
instances without specifying where they are in the file, as follows:
.. literalinclude::
:language: python
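The lookup itself is simple; as a rough sketch of the idea (the `Person`
class and `resolve_group` helper here are illustrative only, not part of
the anamnesis API):

```python
class Person:
    # Sketch of a class carrying anamnesis' hdf5_defaultgroup attribute;
    # the class itself is hypothetical, not from the library.
    hdf5_defaultgroup = 'person'


def resolve_group(cls, group=None):
    """Hypothetical helper: an explicitly requested group name wins,
    otherwise fall back to the class's hdf5_defaultgroup."""
    return group if group is not None else cls.hdf5_defaultgroup


print(resolve_group(Person))           # person
print(resolve_group(Person, 'staff'))  # staff
```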
`hdf5_aliases` is a list which allows developers to specify additional class
names which should be matched by the given class. As an example, if
`hdf5_aliases` in the `test_classes2.ComplexPerson` class is set
to `['test_classes2.OldComplexPerson']`, any files which were created
using the old class name (`OldComplexPerson`) will now be read using the
`ComplexPerson` class instead:
.. literalinclude::
:language: python
`hdf5_mapnames` is a rather specialised variable for which most users will
not have a use. It allows users to control the mapping of variable names
into and out of the HDF5 file - in other words, it decouples the names
of the groups and attributes in the HDF5 file from those in the Python class.
As a concrete example, let us say that we are using a Python class which
has a variable called `_order` but that, for neatness' sake, we would rather
this was called `order` in the HDF5 file. In this case, we would define
the `hdf5_mapnames` variable as follows.
hdf5_mapnames = {'_order': 'order'}
hdf5_outputs = ['_order']
Note that `hdf5_mapnames` is a dictionary which maps Python variable names
to HDF5 entry names and that we still list the original variable name in
`hdf5_outputs`. You can have as many mappings as you want, but be very
careful not to have a name both in `hdf5_outputs` and as a *target* in
`hdf5_mapnames`. I.e., this is bad (assuming that your class has
member variables `_order` and `myvariable`):
# Don't do this
hdf5_mapnames = {'_order': 'myvariable'}
hdf5_outputs = ['myvariable']
For (hopefully) obvious reasons, this makes no sense as you are attempting
to serialise both the `_order` and `myvariable` variables into the HDF5
entry with name `myvariable`. Don't Do This (TM).
The `extra_data` variable is a dictionary which can be used by users of a class
to serialise and unserialise additional data which is not normally saved by the
class. To use this, simply use `extra_data` as a standard dictionary, for example:
.. literalinclude::
:language: python
The `extra_bcast` variable is a list of member variable names similar to that
in the main `hdf5_outputs` variable. The difference is that variables listed
in `extra_bcast` will be transferred via MPI when the object is sent or broadcast,
but will *not* be placed into the HDF5 file during serialisation/unserialisation.
The most common use of this is when there is some cached information in the class
which you do not want to recompute on every MPI node but do not need to save
into the HDF5 file. In this case, the name of the variable containing the cache
would *not* be listed in `hdf5_outputs` but would be listed in `extra_bcast`.
It is also possible in that instance that you would wish to use the `init_from_hdf5`
function as documented below.
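The division of labour between the two lists can be sketched as follows
(a conceptual model only; the helper functions and attribute names are
illustrative, not anamnesis internals):

```python
hdf5_outputs = ['data']    # saved to HDF5 *and* transferred via MPI
extra_bcast = ['_cache']   # transferred via MPI only, never written to HDF5


def mpi_attributes(outputs, bcast):
    """Attributes transferred when the object is sent or broadcast via MPI."""
    return outputs + bcast


def hdf5_attributes(outputs, bcast):
    """Attributes written during HDF5 serialisation."""
    return list(outputs)


print(mpi_attributes(hdf5_outputs, extra_bcast))   # ['data', '_cache']
print(hdf5_attributes(hdf5_outputs, extra_bcast))  # ['data']
```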
The optional function `init_from_hdf5` is called after the object has had its
members loaded when it is being unserialised from an HDF5 file. This means
that you can perform any post-processing which you find necessary; for instance,
if a class has a cache which needs updating after it is reinitialised (because
it is not necessary to serialise/unserialise it), you can use this function to
do so. To see how this works, look at the example class `ComplexTrain` in
the `` file shown above and examine the output from the
`` script which uses this class:
.. literalinclude::
:language: python
Full use of the `refs` variable requires the addition of anamnesis' report
functionality. This will be ported from NAF soon.
Full use of the `shortdesc` variable requires the addition of anamnesis' report
functionality. This will be ported from NAF soon.