[IDD #MIY-175774]: "WARN" Log Message in Concatenate NEXRAD2 Script



Hi Ziv,

re:
> I tried including the flags/directory with the EXEC script.

OK.  Did you remember to adjust the directory hierarchy in the other two
actions to match the directory structure you specified in the '-d' value
you were passing to hhmmssRadarII.pl?  All of the directory hierarchies
need to match, since the workflow is (see the sketch after this list):

- write pieces of a volume scan to disk

- run the re-assembler Perl script to assemble the volume scan pieces
  into a volume scan
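
To make that concrete, here is a schematic of how the actions have to line
up (the patterns and paths here are placeholders, not the literal entries;
the real ones are in the 'pqact.radars.mod' file mentioned below):

  # piece-writing action: FILEs each volume-scan piece under a base directory
  NEXRAD2   <pattern matching a volume-scan piece>
      FILE    -close  /dev/shm/ldm/pub/native/radar/level2/...

  # re-assembly action: the '-d' directory must use that same base directory
  NEXRAD2   <pattern for the product that triggers re-assembly>
      EXEC    perl /home/ldm/util/hhmmssRadarII.pl -d /dev/shm/ldm/pub/native/radar/level2/... -l /dev/shm/ldm/logs ...

If the FILE action writes under one tree and '-d' points at another, the
re-assembler has nothing to work with.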

re:
> However,
> I now get the following error in my ldmd.log file:
> 
> Jan 20 15:49:57 ldm-downstream pqact[26180] ERROR: Child 30182 exited with 
> status 1
> Jan 20 15:49:57 ldm-downstream pqact[26180] ERROR: Deleting failed EXEC entry 
> "perl /home/ldm/util/hhmmssRadarII.pl -d 
> /dev/shm/ldm/pub/native/radar/level2/_DONE_ -l /dev/shm/ldm/logs -v KLGX 
> 20160120 154556 55"

I just set up NEXRAD2 processing on a machine here at Unidata that was not
receiving the feed before.  My procedure was:

- identify where the NEXRAD2 pieces and files can be written

- adjust the NEXRAD2 entries in my copy of the pattern-action file
  'pqact.radars'

  One part of this, which was not absolutely needed, was commenting out the
  action that files model data packets separately.

- add an EXEC line to the ~ldm/etc/ldmd.conf file to have 'pqact' process
  NEXRAD2 products using the actions in ~ldm/etc/pqact.radars

- add a REQUEST line in ~ldm/etc/ldmd.conf to get the NEXRAD2 data feed
  (both ldmd.conf lines are sketched just below)
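
A schematic of the two ldmd.conf additions follows (the upstream hostname
is a placeholder for whichever site feeds you, and the pqact.radars path
should match wherever you keep the file; the stock ldmd.conf uses paths
relative to the LDM home directory):

  # request the NEXRAD Level II feed from an upstream LDM
  REQUEST NEXRAD2 ".*" upstream.example.edu

  # dedicated pqact instance that only processes NEXRAD2 products
  EXEC    "pqact -f NEXRAD2 etc/pqact.radars"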

After restarting the LDM on my target machine, data is being ingested
and processed with no errors.
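
If you want to double-check the same thing on your machine, a couple of
generic LDM checks (run as the 'ldm' user; nothing here is specific to the
radar setup) are:

  ldmadmin watch                     # watch products arrive in the local queue
  notifyme -vl- -f NEXRAD2 -o 3600   # list NEXRAD2 products received in the past hour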

re:
> I also tried it with the directories in quotation marks as that is how they
> appeared in the Perl script. No luck.

I am using the unaltered hhmmssRadarII.pl script (which I put in ~ldm/util)
in my test.  I have put the altered copy of 'pqact.radars' (as
'pqact.radars.mod') out on our anonymous FTP so you can grab it and see
exactly what I did:

machine:   ftp.unidata.ucar.edu
user:      anonymous
pass:      your_email_address
directory: pub/nexrad2
files:     pqact.radars.mod
           hhmmssRadarII.pl
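
If it is easier, the same files can be grabbed non-interactively, e.g. with
'wget' (assuming it is installed on your machine):

  wget ftp://ftp.unidata.ucar.edu/pub/nexrad2/pqact.radars.mod
  wget ftp://ftp.unidata.ucar.edu/pub/nexrad2/hhmmssRadarII.pl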

You will see that I followed the deep directory hierarchy that NEXRAD2
processing was originally designed around (with the exception of writing to
/machine/ldm/data/pub/... instead of /data/ldm/pub/...).  I don't know if
this is absolutely needed, but I did it anyway in case there are some
non-obvious assumptions in hhmmssRadarII.pl.

re:
> As you can see, I am writing the files into /dev/shm which has ~6 GB
> of storage.

This comment makes me think that you intended to attach your latest copy
of 'pqact.radars' to your email.  Since I see no attachments, I have to
assume that it was inadvertently left off.

re:
> Nonetheless, the directory fills up within an hour.

Your processing will need to include a step that moves the reconstituted
volume scans to your desired destination directory.  I say move, not copy,
because moving is what keeps your RAM disk from filling up.
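
A minimal sketch of that kind of move, assuming the finished scans sit under
the level2 tree on the RAM disk and that /data/ldm/nexrad2 stands in for
whatever destination you actually want:

  # move (not copy) completed volume scans off of the RAM disk;
  # the paths and the 10-minute age threshold are placeholders
  find /dev/shm/ldm/pub/native/radar/level2 -type f -mmin +10 \
      -exec mv {} /data/ldm/nexrad2/ \;

Run something like that from cron, or fold the move into whatever
post-processing you already do on each completed scan.  A move across
filesystems is really a copy followed by a delete, but the delete is what
keeps /dev/shm from filling.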

re:
> I will
> focus on getting the scour file to prevent this after I/we figure out
> the above error.

OK.

re:
> Any idea what the error might be? Need any additional information?

I recommend that you review the reworked 'pqact.radars' file (named
'pqact.radars.mod') that I put out on anonymous FTP to see how it differs
from what you are trying to use now.  I further recommend that you
re-download hhmmssRadarII.pl so that you are working with a clean copy.
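
Two other quick checks that may help pin down the "exited with status 1"
failure: run the exact command from your ldmd.log entry by hand as the 'ldm'
user so you can see what the script itself reports, and, if your ldmadmin
has the 'pqactcheck' subcommand, use it to confirm that the edited
pattern-action file still parses:

  # re-run the failing child by hand; this is the command from your log, verbatim
  perl /home/ldm/util/hhmmssRadarII.pl -d /dev/shm/ldm/pub/native/radar/level2/_DONE_ \
      -l /dev/shm/ldm/logs -v KLGX 20160120 154556 55

  # check the syntax of the configured pattern-action file(s)
  ldmadmin pqactcheck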

re:
> Thank you!

No worries.

Cheers,

Tom
--
****************************************************************************
Unidata User Support                                    UCAR Unidata Program
(303) 497-8642                                                 P.O. Box 3000
address@hidden                                   Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage                       http://www.unidata.ucar.edu
****************************************************************************


Ticket Details
===================
Ticket ID: MIY-175774
Department: Support IDD
Priority: Normal
Status: Closed