

Things That Can Affect Performance

HDF5 performance, including speed, memory usage, and storage efficiency, can be affected by how an HDF5 file is accessed or stored. Listed below are performance issues that can occur and how to avoid them.

Excessive Memory Usage

Open Objects

Open objects use up memory. The amount of memory used may be substantial when many objects are left open. You should open objects only when they are needed and close them as soon as their use is complete.

There are APIs to determine whether datasets and groups have been left open: H5Fget_obj_count returns the number of open objects in a file, and H5Fget_obj_ids returns a list of the open object identifiers.
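As a minimal C sketch of checking for objects left open (the file name example.h5, the choice of type flags, and the fixed-size name buffer are illustrative assumptions, not part of the API requirements):

    #include <stdio.h>
    #include <stdlib.h>
    #include "hdf5.h"

    int main(void)
    {
        hid_t   file_id;
        ssize_t count, i;
        hid_t  *obj_ids;
        char    name[256];

        /* example.h5 is a placeholder file name */
        file_id = H5Fopen("example.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
        if (file_id < 0)
            return 1;

        /* ... application code that opens groups and datasets ... */

        /* Count the datasets and groups still open in this file */
        count = H5Fget_obj_count(file_id, H5F_OBJ_DATASET | H5F_OBJ_GROUP);
        printf("Open datasets/groups: %ld\n", (long)count);

        if (count > 0) {
            obj_ids = (hid_t *)malloc((size_t)count * sizeof(hid_t));
            H5Fget_obj_ids(file_id, H5F_OBJ_DATASET | H5F_OBJ_GROUP,
                           (size_t)count, obj_ids);

            /* Print the path of each object that was left open */
            for (i = 0; i < count; i++) {
                H5Iget_name(obj_ids[i], name, sizeof(name));
                printf("Still open: %s\n", name);
            }
            free(obj_ids);
        }

        H5Fclose(file_id);
        return 0;
    }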

Metadata Cache

The metadata cache can also affect memory usage. Modify the metadata cache settings to minimize the size and growth of the cache as much as possible without decreasing performance.

By default the metadata cache is 2 MB in size, and it can be allowed to increase to a maximum of 32 MB per file. The metadata cache can be disabled or modified. Memory used for the cache is not released until the datasets or file are closed.

See the H5Pset_cache API for setting the cache, as well as the Information on the Metadata Cache FAQ.
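One possible sketch of limiting the metadata cache through the file access property list uses H5Pget_mdc_config and H5Pset_mdc_config; the function name and the cache sizes below are illustrative choices, not recommendations:

    #include "hdf5.h"

    /* Open a file with a smaller metadata cache; all sizes are illustrative. */
    hid_t open_with_small_mdc(const char *filename)
    {
        hid_t               fapl_id, file_id;
        H5AC_cache_config_t config;

        fapl_id = H5Pcreate(H5P_FILE_ACCESS);

        /* Start from the current configuration, then adjust only the sizes */
        config.version = H5AC__CURR_CACHE_CONFIG_VERSION;
        H5Pget_mdc_config(fapl_id, &config);

        config.set_initial_size = 1;                /* use our initial size       */
        config.initial_size     = 1 * 1024 * 1024;  /* start at 1 MB              */
        config.min_size         = 512 * 1024;       /* do not shrink below 512 KB */
        config.max_size         = 4 * 1024 * 1024;  /* do not grow past 4 MB      */

        H5Pset_mdc_config(fapl_id, &config);

        file_id = H5Fopen(filename, H5F_ACC_RDWR, fapl_id);
        H5Pclose(fapl_id);
        return file_id;
    }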

Memory and Storage Issues Caused by Chunking

There can be a number of issues caused by using chunking inefficiently, such as choosing a chunk size that is too small or a chunk cache that is too small to hold a whole chunk. Please see the advanced topic, Chunking in HDF5, for detailed information regarding the use of chunking. A sketch of one adjustment that may help, enlarging the chunk cache, follows.
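This sketch enlarges the per-dataset raw data chunk cache with H5Pset_chunk_cache before opening a chunked dataset; the dataset path /data and the specific cache parameters are assumptions for illustration:

    #include "hdf5.h"

    /* Open a chunked dataset with a larger raw data chunk cache. */
    hid_t open_with_larger_chunk_cache(hid_t file_id)
    {
        hid_t dapl_id, dset_id;

        dapl_id = H5Pcreate(H5P_DATASET_ACCESS);

        /* 16 MB chunk cache, 12421 hash slots (a prime number, roughly 100x
         * the number of chunks expected to fit in the cache, is often
         * suggested), and w0 = 1.0 so fully read/written chunks are evicted
         * first. */
        H5Pset_chunk_cache(dapl_id, 12421, 16 * 1024 * 1024, 1.0);

        /* "/data" is a placeholder dataset path */
        dset_id = H5Dopen2(file_id, "/data", dapl_id);

        H5Pclose(dapl_id);
        return dset_id;
    }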

Other Issues


Last modified: 07 September 2016