Re: question about HDF5 compound type macros...

Hi Ed,

> >> Ed:
> >> 
> >> I think you are right.
> >
> >     Indeed, yes.  However, it is appropriate because you want to get that
> > machine's native settings for the struct in memory.  If you'd like to
> > generate a "packed" compound type for storing the data on disk (which may
> > not always be the best option, because it may require datatype conversions
> > on more machines), you can use H5Tpack().  To generate a "native" compound
> > datatype for a particular machine from a packed compound datatype, use
> > H5Tget_native_type().
> >
> >     Quincey
> >
> 
> Ah, now I think I am starting to understand why you have H5Tpack.
> 
> But to continue my line of questioning: in the example we've been
> discussing, would it be possible to run identical code on two
> machines and get a different data file as a result?

    Yes, although I believe that the different files would be type-convertible.
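
For concreteness, here's a minimal sketch of why that happens (the struct and
member names are mine, not from this thread): a compound type built with
HOFFSET() inherits whatever padding this machine's compiler chose, so the same
source can produce differently laid-out file datatypes on different platforms.

    #include "hdf5.h"
    #include <stdio.h>

    typedef struct {
        char   c;   /* 1 byte, then compiler-dependent padding */
        int    i;   /* often at offset 4, but not guaranteed   */
        double d;   /* often at offset 8, but not guaranteed   */
    } example_t;

    int main(void)
    {
        /* Build a compound type from this machine's native struct layout. */
        hid_t memtype = H5Tcreate(H5T_COMPOUND, sizeof(example_t));
        H5Tinsert(memtype, "c", HOFFSET(example_t, c), H5T_NATIVE_CHAR);
        H5Tinsert(memtype, "i", HOFFSET(example_t, i), H5T_NATIVE_INT);
        H5Tinsert(memtype, "d", HOFFSET(example_t, d), H5T_NATIVE_DOUBLE);

        /* The size and member offsets are whatever this compiler chose, so
         * a dataset created with this type can differ between platforms. */
        printf("sizeof(example_t) = %zu, offset of d = %zu\n",
               sizeof(example_t), HOFFSET(example_t, d));

        H5Tclose(memtype);
        return 0;
    }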

> I just re-read the H5Tpack documentation. Probably I need to pack all
> my compound types to make sure that they come out the same way every
> time.

    You could do that, but it may make things slower for users on any
particular machine, since reading packed data back into a padded native
struct requires a datatype conversion.
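
For what it's worth, here's a minimal sketch of that approach (the helper
name is mine, not an HDF5 API): pack a copy of the native type and use the
copy as the file datatype, keeping the original for in-memory I/O.

    #include "hdf5.h"

    /* Hypothetical helper: derive a packed on-disk type from a native
     * in-memory compound type. */
    static hid_t make_packed_filetype(hid_t memtype)
    {
        hid_t filetype = H5Tcopy(memtype); /* leave the memory type untouched */
        H5Tpack(filetype);                 /* squeeze out inter-member padding */
        return filetype;                   /* same layout on every machine */
    }

You'd pass the packed copy to H5Dcreate() and the native type to
H5Dread()/H5Dwrite(), and the library converts between the two layouts as
needed.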

> But if I have a packed compound type, and I read it into an array of
> the struct that the compound type represents, then how does all the
> data come out OK? Do you guys really read it member by member, and
> adjust all the byte boundaries behind the scenes?

    Well, if we don't need to adjust anything, we'll read it directly into the
user's buffer.  If we do need to adjust sizes & offsets, we read it into an
internal buffer and then copy it member by member into the user's buffer (as
you say).
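
To make that concrete, here is a minimal read-side sketch (the dataset path,
struct, and member names are hypothetical, and it uses the newer H5Dopen2()
call): passing the native memory type to H5Dread() is what lets the library
either read straight into the buffer or convert member by member.

    #include "hdf5.h"

    typedef struct { char c; int i; double d; } example_t;

    /* Read the whole (hypothetical) "/examples" dataset into an array of
     * native structs; 'buf' must be large enough for the full dataset. */
    int read_examples(hid_t file, example_t *buf)
    {
        hid_t dset = H5Dopen2(file, "/examples", H5P_DEFAULT);
        hid_t memtype = H5Tcreate(H5T_COMPOUND, sizeof(example_t));
        H5Tinsert(memtype, "c", HOFFSET(example_t, c), H5T_NATIVE_CHAR);
        H5Tinsert(memtype, "i", HOFFSET(example_t, i), H5T_NATIVE_INT);
        H5Tinsert(memtype, "d", HOFFSET(example_t, d), H5T_NATIVE_DOUBLE);

        /* If the file type already matches memtype, HDF5 reads directly into
         * 'buf'; if the file type is packed, it converts through an internal
         * buffer, member by member, as described above. */
        herr_t status = H5Dread(dset, memtype, H5S_ALL, H5S_ALL,
                                H5P_DEFAULT, buf);

        H5Tclose(memtype);
        H5Dclose(dset);
        return (status < 0) ? -1 : 0;
    }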

    Quincey
