[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]

[IDD #MIY-175774]: "WARN" Log Message in Concatenate NEXRAD2 Script



Hi Ziv,

re:
> I am playing around with a bash decompression script to EXECUTE on the
> files as they are written but before they are concatenated.
> 
> However, I am getting the following bzip2 error from test runs on files in
> the .tmp folder:
> 
> bzip2: Can't guess original name for KPOE_20160122_202252 -- using
> KPOE_20160122_202252.out
> 
> bzip2: Compressed file ends unexpectedly;
> perhaps it is corrupted?  *Possible* reason follows.
> bzip2: No such file or directory
> Input file = KPOE_20160122_202252, output file = KPOE_20160122_202252.out
> 
> It is possible that the compressed file(s) have become corrupted.

It is unlikely that the files were corrupted by the LDM relay process.
More likely, the individual chunks contain some information that is not
bzip2 compressed along with some that is (e.g., an uncompressed header
followed by a block of compressed bytes), so running bzip2 directly on a
whole chunk fails.  Since I am not an expert in the use of this type of
data, I will need to forward your question to a more qualified Unidata
staff member.  I will try to get to this tomorrow, but it may slip.
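To illustrate what "an uncompressed header followed by compressed bytes"
would mean in practice, here is a minimal Python sketch.  It assumes a
chunk layout of a fixed-size uncompressed header followed by records of
a 4-byte big-endian length and that many bytes of bzip2 data, with a
negative length flagging the final record; the 24-byte header and that
framing are assumptions drawn from the NEXRAD Level 2 format, not
something verified against your files:

```python
import bz2
import struct

def decompress_chunk(buf, header_len=24):
    """Return the decompressed payload of one Level 2 chunk.

    Assumed layout: an uncompressed header of header_len bytes,
    then repeated records of a 4-byte big-endian length followed
    by that many bytes of bzip2-compressed data.  The 24-byte
    default and the negative length on the last record are
    assumptions about the NEXRAD Level 2 framing.
    """
    out = bytearray()
    pos = header_len
    while pos + 4 <= len(buf):
        (size,) = struct.unpack_from(">i", buf, pos)
        pos += 4
        size = abs(size)  # a negative length flags the final record
        out += bz2.decompress(buf[pos:pos + size])
        pos += size
    return bytes(out)
```

Skipping past the header like this is why plain `bzip2 -d` on the whole
file reports "Compressed file ends unexpectedly": bzip2 starts reading
at byte 0 and never sees a valid stream boundary.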

re:
> You can use the -tvv option to test integrity of such files.
> 
> You can use the `bzip2recover' program to attempt to recover
> data from undamaged sections of corrupted files.
> 
> bzip2: Deleting output file KPOE_20160122_202252.out, if it exists.
> 
> Can you suggest any remedy? I was looking at this previous email
> <https://www.unidata.ucar.edu/support/help/MailArchives/datastream/msg01661.html>
> to ya'll and it has me worried :~/

My best guess is that I was incorrect to characterize the individual
chunks as being completely bzip2 compressed.  Unidata staff member Ryan
May has been working with this data as part of the NOAA Big Data Project,
so he will know exactly what needs to be done to un-bzip2 the pieces,
reassemble the uncompressed pieces into a usable volume scan, and then
gzip the result.  Ryan wrote the Python procedure that is being used to
add real-time Level 2 data to the full Level 2 archive that was moved to
AWS as part of that project.  I will send Ryan a note asking him to work
with you on your effort.

re:
> Thanks!

No worries, and again sorry for the slow response.

Cheers,

Tom
--
****************************************************************************
Unidata User Support                                    UCAR Unidata Program
(303) 497-8642                                                 P.O. Box 3000
address@hidden                                   Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage                       http://www.unidata.ucar.edu
****************************************************************************


Ticket Details
===================
Ticket ID: MIY-175774
Department: Support IDD
Priority: Normal
Status: Closed