Header Data Units

The ImageHDU and CompImageHDU classes are discussed in the section on Images.

The TableHDU and BinTableHDU classes are discussed in the section on Tables.

PrimaryHDU

class pyfits.PrimaryHDU(data=None, header=None, do_not_scale_image_data=False, ignore_blank=False, uint=True, scale_back=None)

Bases: pyfits.hdu.image._ImageBaseHDU

FITS primary HDU class.

Construct a primary HDU.

Parameters:
 

data : array or DELAYED, optional

The data in the HDU.

header : Header instance, optional

The header to be used (as a template). If header is None, a minimal header will be provided.

do_not_scale_image_data : bool, optional

If True, image data is not scaled using BSCALE/BZERO values when read. (default: False)

ignore_blank : bool, optional

If True, the BLANK header keyword will be ignored if present. Otherwise, pixels equal to this value will be replaced with NaNs. (default: False)

uint : bool, optional

Interpret signed integer data where BZERO is the central value and BSCALE == 1 as unsigned integer data. For example, int16 data with BZERO = 32768 and BSCALE = 1 would be treated as uint16 data. (default: True)

scale_back : bool, optional

If True, when saving changes to a file that contained scaled image data, restore the data to the original type and reapply the original BSCALE/BZERO values. This could lead to loss of accuracy if scaling back to integer values after performing floating point operations on the data. Pseudo-unsigned integers are automatically rescaled unless scale_back is explicitly set to False. (default: None)
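
For example, a minimal sketch that builds a primary HDU from an array and writes it to disk (the file name and the OBJECT card are illustrative):

import numpy as np
import pyfits

# Build a primary HDU from a 10x10 float array; a minimal header is generated.
data = np.arange(100.0).reshape(10, 10)
hdu = pyfits.PrimaryHDU(data=data)
hdu.header['OBJECT'] = ('example', 'illustrative keyword')  # hypothetical card
hdu.writeto('new.fits', clobber=True)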

add_checksum(when=None, override_datasum=False, blocking='standard', checksum_keyword='CHECKSUM', datasum_keyword='DATASUM')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters:
 

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

add the CHECKSUM card only

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

checksum_keyword : str, optional

The name of the header keyword to store the checksum value in; this is typically ‘CHECKSUM’ per convention, but there exist use cases in which a different keyword should be used

datasum_keyword : str, optional

See checksum_keyword

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
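
A sketch of this testing pattern, assuming hdu is an existing HDU and the comment string is arbitrary:

when = 'checksum computed for regression test'   # fixed comment string
hdu.add_datasum(when=when)
hdu.add_checksum(when=when, override_datasum=True)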

add_datasum(when=None, blocking='standard', datasum_keyword='DATASUM')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters:
 

when : str, optional

Comment string for the card; by default the comment represents the time when the checksum was calculated

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

datasum_keyword : str, optional

The name of the header keyword to store the datasum value in; this is typically ‘DATASUM’ per convention, but there exist use cases in which a different keyword should be used

Returns:
 

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the HDU; both header and data are copied.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Returns:
 

dict or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

Key       Value
file      File object associated with the HDU
filemode  Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc    Starting byte location of header in file
datLoc    Starting byte location of data block in file
datSpan   Data size including padding
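
For example, a sketch that prints where the primary HDU sits inside a file (the file name is hypothetical):

import pyfits

hdulist = pyfits.open('example.fits')
info = hdulist[0].fileinfo()
if info is not None:
    print(info['hdrLoc'], info['datLoc'], info['datSpan'])
hdulist.close()
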
fromstring(data, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview.

Parameters:
 

data : str, bytearray, memoryview, ndarray

A byte string containing the HDU’s header and data.

checksum : bool, optional

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool, optional

Ignore a missing end card in the header data. Note that without the end card the end of the header may be ambiguous and result in a corrupt HDU. In this case the assumption is that the first 2880-byte block that does not begin with valid FITS header data marks the beginning of the data.

kwargs : optional

May consist of additional keyword arguments specific to an HDU type–these correspond to keywords recognized by the constructors of different HDU classes such as PrimaryHDU, ImageHDU, or BinTableHDU. Any unrecognized keyword arguments are simply ignored.
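
A sketch of rebuilding an HDU from raw bytes, assuming simple.fits is a single-HDU file:

import pyfits

with open('simple.fits', 'rb') as f:
    raw = f.read()                      # entire header + data of the only HDU
hdu = pyfits.PrimaryHDU.fromstring(raw)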

readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with open() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters:
 

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.
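
A sketch, again assuming a single-HDU file:

import pyfits

with open('simple.fits', 'rb') as f:
    hdu = pyfits.PrimaryHDU.readfrom(f, checksum=True)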

req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

Parameters:
 

keyword : str

The keyword to validate

pos : int, callable

If an int, this specifies the exact location this card should have in the header. Remember that Python is zero-indexed, so this means pos=0 requires the card to be the first card in the header. If given a callable, it should take one argument (the actual position of the keyword) and return True or False. This can be used for custom evaluation. For example, pos=lambda idx: idx > 10 will check that the keyword’s index is greater than 10.

test : callable

This should be a callable (generally a function) that is passed the value of the given keyword and returns True or False. This can be used to validate the value associated with the given keyword.

fix_value : str, int, float, complex, bool, None

A valid value for a FITS keyword to use, if the given test fails, to replace an invalid value. In other words, this provides a default value to use as a replacement if the keyword’s current value is invalid. If None, there is no replacement value and the keyword is unfixable.

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification options for more info.

errlist : list

A list of validation errors already found in the FITS file; this is used primarily for the validation system to collect errors across multiple HDUs and multiple calls to req_cards.

Notes

If pos=None, the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card’s value is also checked using the test argument.
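
A sketch of a typical call, requiring that SIMPLE be the first card with the value True (assumes hdu is an existing HDU; the error list is whatever the surrounding verification code collects into):

errs = []
hdu.req_cards('SIMPLE', 0, lambda v: v is True, True, 'warn', errs)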

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with selected option.

scale(type=None, option='old', bscale=1, bzero=0)

Scale image data by using BSCALE/BZERO.

A call to this method will scale the data and update the BSCALE and BZERO keywords in the HDU’s header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.

Parameters:
 

type : str, optional

Destination data type, given as a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32', etc.). If None, use the current data type.

option : str

How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read or created. If "minmax", use the minimum and maximum of the data to compute the scaling. This option is overridden by any user-specified bscale/bzero values.

bscale, bzero : int, optional

User-specified BSCALE and BZERO values
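
For example, a sketch of scaling floating-point data to 16-bit integers just before writing (assumes hdu is an existing image HDU):

hdu.scale('int16', bzero=32768)       # store as int16 with a pseudo-unsigned offset
hdu.writeto('scaled.fits', clobber=True)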

update_ext_name(*args, **kwargs)

Deprecated since version 3.2: Use the .name attribute or Header.set instead.

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters:
 

value : str

Value to be used for the new extension name

comment : str, optional

To be used for updating, default=None.

before : str or int, optional

Name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

Name of the keyword, or index of the Card after which the new card will be placed in the Header

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.
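
Per the deprecation notice above, a sketch of the recommended replacements (the extension name 'SCI' is illustrative; assumes hdu is an existing extension HDU):

hdu.name = 'SCI'                                     # preferred
hdu.header.set('EXTNAME', 'SCI', 'extension name')   # equivalent via Header.set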

update_ext_version(*args, **kwargs)

Deprecated since version 3.2: Use the .ver attribute or Header.set instead.

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters:
 

value : str

Value to be used for the new extension version

comment : str, optional

To be used for updating; default=None.

before : str or int, optional

Name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

Name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

verify(option='warn')

Verify all values in the instance.

Parameters:
 

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification options for more info.
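
For example, a sketch that applies automatic fixes but still raises on anything unfixable (assumes hdu is an existing HDU):

hdu.verify('fix+exception')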

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters:

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

Returns:

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present

verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters:

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

Returns:

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present
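
A sketch that checks both sums on an HDU whose header carries CHECKSUM and DATASUM cards (assumes hdu is an existing HDU):

if hdu.verify_checksum() == 1 and hdu.verify_datasum() == 1:
    print('checksum and datasum verified')
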
writeto(name, output_verify='exception', clobber=False, checksum=False)

Write the HDU to a new file. This is a convenience method that provides an easier output interface when only one HDU needs to be written to a file.

Parameters:
 

name : file path, file object or file-like object

Output FITS file. If the file object is already opened, it must be opened in a writeable mode.

output_verify : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification options for more info.

clobber : bool

Overwrite the output file if it exists.

checksum : bool

When True, adds both DATASUM and CHECKSUM cards to the header of the HDU when it is written to the file.
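
For example, a sketch that writes a single HDU with checksum cards, overwriting any existing file (assumes hdu is an existing HDU):

hdu.writeto('output.fits', clobber=True, checksum=True)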

data

Image/array data as an ndarray.

Please remember that the order of axes on a Numpy array is the opposite of the order specified in the FITS file. For example, for a 2D image the “rows” or y-axis are the first dimension, and the “columns” or x-axis are the second dimension.

If the data is scaled using the BZERO and BSCALE parameters, this attribute returns the data scaled to its physical values unless the file was opened with do_not_scale_image_data=True.
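
A sketch of this axis-order convention, assuming hdu holds a 2D image:

print(hdu.data.shape)      # (NAXIS2, NAXIS1): rows first, then columns
pixel = hdu.data[3, 5]     # row (y) index 3, column (x) index 5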

section

Access a section of the image array without loading the entire array into memory. The Section object returned by this attribute is not meant to be used directly by itself. Rather, slices of the section return the appropriate slice of the data, and load only that section into memory.

Sections are mostly obsoleted by memmap support, but should still be used to deal with very large scaled images. See the Data Sections section of the PyFITS documentation for more details.
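
For example, a sketch that reads only a cutout, assuming hdu is a sufficiently large 2D image HDU (the pixel ranges are arbitrary):

cutout = hdu.section[100:200, 100:200]   # only this region is read from disk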

shape

Shape of the image array–should be equivalent to self.data.shape.

size

Size (in bytes) of the data portion of the HDU.

GroupsHDU

class pyfits.GroupsHDU(data=None, header=None)

Bases: pyfits.hdu.image.PrimaryHDU, pyfits.hdu.table._TableLikeHDU

FITS Random Groups HDU class.

See the Random Access Groups section in the PyFITS documentation for more details on working with this type of HDU.

Iterates through the subclasses of _BaseHDU and uses that class’s match_header() method to determine which subclass to instantiate.

It’s important to be aware that the class hierarchy is traversed in a depth-last order. Each match_header() should identify an HDU type as uniquely as possible. Abstract types may choose to simply return False or raise NotImplementedError to be skipped.

If any unexpected exceptions are raised while evaluating match_header(), the type is taken to be _CorruptedHDU.

add_checksum(when=None, override_datasum=False, blocking='standard', checksum_keyword='CHECKSUM', datasum_keyword='DATASUM')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters:
 

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

add the CHECKSUM card only

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

checksum_keyword : str, optional

The name of the header keyword to store the checksum value in; this is typically ‘CHECKSUM’ per convention, but there exist use cases in which a different keyword should be used

datasum_keyword : str, optional

See checksum_keyword

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.

add_datasum(when=None, blocking='standard', datasum_keyword='DATASUM')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters:
 

when : str, optional

Comment string for the card; by default the comment represents the time when the checksum was calculated

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

datasum_keyword : str, optional

The name of the header keyword to store the datasum value in; this is typically ‘DATASUM’ per convention, but there exist use cases in which a different keyword should be used

Returns:
 

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the HDU; both header and data are copied.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Returns:
 

dict or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

Key       Value
file      File object associated with the HDU
filemode  Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc    Starting byte location of header in file
datLoc    Starting byte location of data block in file
datSpan   Data size including padding

from_columns(columns, header=None, nrows=0, fill=False, **kwargs)

Given either a ColDefs object, a sequence of Column objects, or another table HDU or table data (a FITS_rec or multi-field numpy.ndarray or numpy.recarray object), return a new table HDU of the class this method was called on, using the column definitions from the input.

This is an alternative to the now deprecated new_table function, and otherwise accepts the same arguments. See also FITS_rec.from_columns.

Parameters:
 

columns : sequence of Column, ColDefs, or other

The columns from which to create the table data, or an object with a column-like structure from which a ColDefs can be instantiated. This includes an existing BinTableHDU or TableHDU, or a numpy.recarray to give some examples.

If these columns have data arrays attached that data may be used in initializing the new table. Otherwise the input columns will be used as a template for a new table with the requested number of rows.

header : Header

An optional Header object to use when instantiating the new HDU. Header keywords specifically related to defining the table structure (such as the “TXXXn” keywords like TTYPEn) will be overridden by the supplied column definitions, but all other informational and data model-specific keywords are kept.

nrows : int

Number of rows in the new table. If the input columns have data associated with them, the size of the largest input column is used. Otherwise the default is 0.

fill : bool

If True, fill all cells with zeros or blanks. If False, copy the data from the input; undefined cells will still be filled with zeros/blanks.

Notes

Any additional keyword arguments accepted by the HDU class’s __init__ may also be passed in as keyword arguments.
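
A sketch of this classmethod, shown on BinTableHDU, which provides the same interface (the column names and values are illustrative):

import pyfits

c1 = pyfits.Column(name='target', format='20A', array=['NGC1', 'NGC2'])
c2 = pyfits.Column(name='flux', format='E', array=[1.5, 2.5])
tbhdu = pyfits.BinTableHDU.from_columns([c1, c2])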

fromstring(data, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview.

Parameters:
 

data : str, bytearray, memoryview, ndarray

A byte string containing the HDU’s header and data.

checksum : bool, optional

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool, optional

Ignore a missing end card in the header data. Note that without the end card the end of the header may be ambiguous and result in a corrupt HDU. In this case the assumption is that the first 2880-byte block that does not begin with valid FITS header data marks the beginning of the data.

kwargs : optional

May consist of additional keyword arguments specific to an HDU type–these correspond to keywords recognized by the constructors of different HDU classes such as PrimaryHDU, ImageHDU, or BinTableHDU. Any unrecognized keyword arguments are simply ignored.

readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with open() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters:
 

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.

req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

Parameters:
 

keyword : str

The keyword to validate

pos : int, callable

If an int, this specifies the exact location this card should have in the header. Remember that Python is zero-indexed, so this means pos=0 requires the card to be the first card in the header. If given a callable, it should take one argument (the actual position of the keyword) and return True or False. This can be used for custom evaluation. For example, pos=lambda idx: idx > 10 will check that the keyword’s index is greater than 10.

test : callable

This should be a callable (generally a function) that is passed the value of the given keyword and returns True or False. This can be used to validate the value associated with the given keyword.

fix_value : str, int, float, complex, bool, None

A valid value for a FITS keyword to use, if the given test fails, to replace an invalid value. In other words, this provides a default value to use as a replacement if the keyword’s current value is invalid. If None, there is no replacement value and the keyword is unfixable.

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification options for more info.

errlist : list

A list of validation errors already found in the FITS file; this is used primarily for the validation system to collect errors across multiple HDUs and multiple calls to req_cards.

Notes

If pos=None, the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with selected option.

scale(type=None, option='old', bscale=1, bzero=0)

Scale image data by using BSCALE/BZERO.

A call to this method will scale the data and update the BSCALE and BZERO keywords in the HDU’s header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.

Parameters:
 

type : str, optional

Destination data type, given as a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32', etc.). If None, use the current data type.

option : str

How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read or created. If "minmax", use the minimum and maximum of the data to compute the scaling. This option is overridden by any user-specified bscale/bzero values.

bscale, bzero : int, optional

User-specified BSCALE and BZERO values

update_ext_name(*args, **kwargs)

Deprecated since version 3.2: Use the .name attribute or Header.set instead.

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters:
 

value : str

Value to be used for the new extension name

comment : str, optional

To be used for updating, default=None.

before : str or int, optional

Name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

Name of the keyword, or index of the Card after which the new card will be placed in the Header

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

update_ext_version(*args, **kwargs)

Deprecated since version 3.2: Use the .ver attribute or Header.set instead.

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters:
 

value : str

Value to be used for the new extension version

comment : str, optional

To be used for updating; default=None.

before : str or int, optional

Name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

Name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

verify(option='warn')

Verify all values in the instance.

Parameters:
 

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters:

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

Returns:

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present

verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters:

blocking : str, optional

“standard” or “nonstandard”; if “standard”, the sum is computed 2880 bytes at a time

Returns:

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present

writeto(name, output_verify='exception', clobber=False, checksum=False)

Write the HDU to a new file. This is a convenience method that provides an easier output interface when only one HDU needs to be written to a file.

Parameters:
 

name : file path, file object or file-like object

Output FITS file. If the file object is already opened, it must be opened in a writeable mode.

output_verify : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". May also be any combination of "fix" or "silentfix" with "+ignore", "+warn", or "+exception" (e.g. "fix+warn"). See Verification options for more info.

clobber : bool

Overwrite the output file if it exists.

checksum : bool

When True, adds both DATASUM and CHECKSUM cards to the header of the HDU when it is written to the file.

data

The data of a random group FITS file will be like a binary table’s data.

parnames

The names of the group parameters as described by the header.

section

Access a section of the image array without loading the entire array into memory. The Section object returned by this attribute is not meant to be used directly by itself. Rather, slices of the section return the appropriate slice of the data, and load only that section into memory.

Sections are mostly obsoleted by memmap support, but should still be used to deal with very large scaled images. See the Data Sections section of the PyFITS documentation for more details.

shape

Shape of the image array–should be equivalent to self.data.shape.

size

Returns the size (in bytes) of the HDU’s data part.

GroupData

class pyfits.GroupData

Bases: pyfits.fitsrec.FITS_rec

Random groups data object.

Allows structured access to FITS Group data in a manner analogous to tables.

Parameters:
 

input : array or FITS_rec instance

input data, either the group data itself (a numpy.ndarray) or a record array (FITS_rec) which will contain both group parameter info and the data. The rest of the arguments are used only for the first case.

bitpix : int

data type as expressed in FITS BITPIX value (8, 16, 32, 64, -32, or -64)

pardata : sequence of arrays

parameter data, as a list of (numeric) arrays.

parnames : sequence of str

list of parameter names.

bscale : int

BSCALE of the data

bzero : int

BZERO of the data

parbscales : sequence of int

list of bscales for the parameters

parbzeros : sequence of int

list of bzeros for the parameters

par(parname)

Get the group parameter values.

data

The raw group data represented as a multi-dimensional numpy.ndarray.
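
A sketch of constructing random-groups data from a plain array; the parameter names and values are purely illustrative:

import numpy as np
import pyfits

imdata = np.arange(100.0).reshape(10, 1, 1, 1, 10)   # 10 groups of 1x1x1x10 images
pdata1 = np.arange(10) + 0.1
pdata2 = np.arange(10) * 2.0
gdata = pyfits.GroupData(imdata, bitpix=-32, parnames=['abc', 'xyz'],
                         pardata=[pdata1, pdata2])
ghdu = pyfits.GroupsHDU(gdata)
print(ghdu.data.par('abc'))                          # values of the first parameter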

Group

class pyfits.Group(input, row=0, start=None, end=None, step=None, base=None)

Bases: pyfits.fitsrec.FITS_record

One group of the random group data.

par(parname)

Get the group parameter value.

setpar(parname, value)

Set the group parameter value.
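
A sketch, assuming ghdu is a GroupsHDU built as in the GroupData example above:

group0 = ghdu.data[0]          # one Group out of the random-groups data
print(group0.par('abc'))
group0.setpar('abc', 3.0)      # update this group's parameter value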

StreamingHDU

class pyfits.StreamingHDU(name, header)

Bases: object

A class that provides the capability to stream data to a FITS file instead of requiring data to all be written at once.

The following pseudocode illustrates its use:

header = pyfits.Header()

for all the cards you need in the header:
    header[key] = (value, comment)

shdu = pyfits.StreamingHDU('filename.fits', header)

for each piece of data:
    shdu.write(data)

shdu.close()

Construct a StreamingHDU object given a file name and a header.

Parameters:
 

name : file path, file object, or file like object

The file to which the header and data will be streamed. If opened, the file object must be opened in a writeable binary mode such as ‘wb’ or ‘ab+’.

header : Header instance

The header object associated with the data to be written to the file.

Notes

The file will be opened and the header appended to the end of the file. If the file does not already exist, it will be created, and if the header represents a Primary header, it will be written to the beginning of the file. If the file does not exist and the provided header is not a Primary header, a default Primary HDU will be inserted at the beginning of the file and the provided header will be added as the first extension. If the file does already exist, but the provided header represents a Primary header, the header will be modified to an image extension header and appended to the end of the file.
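
A more concrete sketch of the pseudocode above, streaming a 100 x 100 float32 image in row chunks (the file name, shape, and chunking are arbitrary):

import numpy as np
import pyfits

# Header fully describing the data to be streamed (primary HDU, 100x100 float32).
header = pyfits.Header()
header['SIMPLE'] = True
header['BITPIX'] = -32
header['NAXIS'] = 2
header['NAXIS1'] = 100
header['NAXIS2'] = 100

shdu = pyfits.StreamingHDU('streamed.fits', header)
for _ in range(10):
    shdu.write(np.zeros((10, 100), dtype=np.float32))   # ten rows per write
shdu.close()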

close()

Close the physical FITS file.

write(data)

Write the given data to the stream.

Parameters:
 

data : ndarray

Data to stream to the file.

Returns:
 

writecomplete : int

Flag that, when True, indicates that all of the required data has been written to the stream.

Notes

Only the amount of data specified in the header provided to the class constructor may be written to the stream. If the provided data would cause the stream to overflow, an IOError exception is raised and the data is not written. Once sufficient data has been written to the stream to satisfy the amount specified in the header, the stream is padded to fill a complete FITS block and no more data will be accepted. An attempt to write more data after the stream has been filled will raise an IOError exception. If the dtype of the input data does not match what is expected by the header, a TypeError exception is raised.

size

Return the size (in bytes) of the data portion of the HDU.