
[netCDF #OSI-637834]: run_par_tests fails



Hello,

My apologies for the delayed response.  Can you tell me which version of
libhdf5 you are building against?  My guess is that you are using an older
version: the PMPI_Info_dup failure in your log shows MPI_Info_dup being called
on MPI_INFO_NULL, which MPICH rejects as an invalid argument.  I would
recommend making sure you are using the latest release in the 1.8.x or 1.10.x
HDF5 series.  If you already are, can you provide additional information about
how libhdf5 was built and installed (compiler, MPI implementation, configure
options)?
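
For what it's worth, the failing pattern is easy to reproduce outside of HDF5.
The sketch below is illustrative only (it is not HDF5's or netCDF's actual
code); it shows the call your error stack reports and the MPI_INFO_NULL check
that avoids it:

    /* info_dup_sketch.c -- illustrative only; not HDF5 source.
     * Shows why MPI_Info_dup(MPI_INFO_NULL, ...) fails and the guard
     * that avoids the call.  Build with something like:
     *     mpicc info_dup_sketch.c -o info_dup_sketch
     */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Info old_info = MPI_INFO_NULL;  /* no hints set, as in the failing test */
        MPI_Info new_info = MPI_INFO_NULL;

        MPI_Init(&argc, &argv);

        /* MPI_INFO_NULL is not a valid handle to duplicate; passing it to
         * MPI_Info_dup produces exactly the "Null MPI_Info" error in your
         * log.  Duplicate only when a real info object exists. */
        if (old_info != MPI_INFO_NULL)
            MPI_Info_dup(old_info, &new_info);
        else
            printf("info is MPI_INFO_NULL; skipping MPI_Info_dup\n");

        if (new_info != MPI_INFO_NULL)
            MPI_Info_free(&new_info);

        MPI_Finalize();
        return 0;
    }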

Thank you, have a great day,

-Ward
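
P.S. If you are not sure which HDF5 and MPI libraries your build actually
links, a small program along these lines will report both at run time
(MPI_Get_library_version requires an MPI-3 implementation).  The compile line
in the comment is only a guess for your system; adjust the include and library
paths to match your installation:

    /* version_check.c -- report the HDF5 and MPI versions in use.
     * Build with something like: mpicc version_check.c -lhdf5
     * (include/library paths will vary with your installation). */
    #include <stdio.h>
    #include <mpi.h>
    #include <hdf5.h>

    int main(int argc, char **argv)
    {
        unsigned maj, min, rel;
        char mpi_ver[MPI_MAX_LIBRARY_VERSION_STRING];
        int len;

        MPI_Init(&argc, &argv);

        /* Version of the HDF5 library actually linked at run time. */
        H5get_libversion(&maj, &min, &rel);
        printf("HDF5 library: %u.%u.%u\n", maj, min, rel);

        /* Identifies the MPI implementation and its version. */
        MPI_Get_library_version(mpi_ver, &len);
        printf("MPI library: %s\n", mpi_ver);

        MPI_Finalize();
        return 0;
    }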

> I am trying to build netCDF 4.6.1 with parallel I/O support.  I ran the
> configure script, but "make check" fails run_par_tests with the following
> output:
> 
> ============================================================================
> Testsuite summary for netCDF 4.6.1
> ============================================================================
> # TOTAL: 26
> # PASS:  25
> # SKIP:  0
> # XFAIL: 0
> # FAIL:  1
> # XPASS: 0
> # ERROR: 0
> ============================================================================
> See h5_test/test-suite.log
> Please report to address@hidden
> ============================================================================
> 
> make[3]: *** [test-suite.log] Error 1
> make[3]: Leaving directory `/home/sutirtha/NETCDF4.6.1/netcdf-c-4.6.1/h5_test'
> make[2]: *** [check-TESTS] Error 2
> make[2]: Leaving directory `/home/sutirtha/NETCDF4.6.1/netcdf-c-4.6.1/h5_test'
> make[1]: *** [check-am] Error 2
> make[1]: Leaving directory `/home/sutirtha/NETCDF4.6.1/netcdf-c-4.6.1/h5_test'
> make: *** [check-recursive] Error 1
> 
> [sutirtha@comet-ln3 netcdf-c-4.6.1]$ less h5_test/test-suite.log
> 
> 
> ==========================================
> netCDF 4.6.1: h5_test/test-suite.log
> ==========================================
> 
> # TOTAL: 26
> # PASS:  25
> # SKIP:  0
> # XFAIL: 0
> # FAIL:  1
> # XPASS: 0
> # ERROR: 0
> 
> .. contents:: :depth: 2
> 
> FAIL: run_par_tests.sh
> ======================
> 
> Testing parallel I/O with HDF5...
> *** Creating file for parallel I/O read, and rereading it...[cli_0]: aborting job:
> Fatal error in PMPI_Info_dup:
> Invalid argument, error stack:
> PMPI_Info_dup(151): MPI_Info_dup(MPI_INFO_NULL, newinfo=0x7ffd77816ae8) failed
> PMPI_Info_dup(110): Null MPI_Info
> 
> HDF5: infinite loop closing library
> D,T,AC,FD,P,FD,P,FD,P,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD,FD
> 
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   PID 10350 RUNNING AT comet-ln3.sdsc.edu
> =   EXIT CODE: 1
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
> 
> FAIL run_par_tests.sh (exit status: 1)
> 
> 
> Is this an issue with my HDF5 library, or have I perhaps not set something
> else up correctly?
> 
> 
> Any advice/suggestions would be very helpful.
> 
> 
> Thanking you,
> Sutirtha Sengupta.
> 


Ticket Details
===================
Ticket ID: OSI-637834
Department: Support netCDF
Priority: High
Status: Closed
===================
NOTE: All email exchanges with Unidata User Support are recorded in the Unidata 
inquiry tracking system and then made publicly available through the web.  If 
you do not want to have your interactions made available in this way, you must 
let us know in each email you send to us.