Re: Unique Filename

Randy,

So your web application generates text files and then another application 
inserts them into LDM and then LDM writes them to another directory?  What 
language is your web application, insert application, post-LDM 
application?  A more precise timestamp may be available during one of 
these steps....

You could always put a csh script in this loop to append a process ID ($$) 
to the timestamp; that would make the filenames unique...
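A minimal sketch of that idea in plain sh (the /tmp/demo directory and
NEWFILE.dat name are stand-ins for your real path, not anything LDM
provides):

```shell
#!/bin/sh
# Sketch: combine the minute+second stamp with this script's process
# ID ($$) so two files landing in the same second still get distinct names.
DIR=/tmp/demo                       # stand-in for /some/file/path
mkdir -p "$DIR"
STAMP=`date +%M%S`                  # same %M%S resolution as the pqact entry
DEST="$DIR/NEWFILE.dat$STAMP.$$"
: > "$DEST"                         # create the file under the unique name
echo "$DEST"
```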

Also, your web application could write the timestamp, check for a 
collision, and then sleep for a second before trying again?
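That check-and-sleep loop might look something like this in plain sh
(write_report and the arguments are made-up names for illustration, not
part of LDM):

```shell
#!/bin/sh
# Sketch: only create a file if no file with that timestamped name
# already exists; otherwise sleep a second so %M%S advances, then retry.
write_report() {
    dir=$1
    base=$2
    while : ; do
        name="$dir/$base`date +%M%S`"
        if [ ! -e "$name" ]; then
            : > "$name"             # claim the unique name
            echo "$name"
            return 0
        fi
        sleep 1                     # wait for the next second's timestamp
    done
}
```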

Daryl

On Fri, 17 Jan 2003, Randy Breeser wrote:

>Thanks for the quick response Steve;
>
>Actually I would not have a problem with a lot of files in the target 
>directory; that directory is a queue for products that will be swept into 
>another system. There will never be a large volume of products; they 
>will be very small text files, but they may stay in the queue for up to 15 
>seconds. These will be severe weather reports from the public via the 
>web, so I need to account for the possibility of two reports hitting the 
>system at the same time.
>
>Also, I cannot change the filename in any other way than to append 
>something to the end.
>
>Thanks...RLB
>
>Steve Emmerson wrote:
>
>>Randy,
>>
>>  
>>
>>>Date: Tue, 14 Jan 2003 16:23:15 -0600
>>>From: "Randy Breeser" <Randy.Breeser@xxxxxxxx>
>>>Organization: NWS La Crosse Wisconsin
>>>To: ldm-users@xxxxxxxxxxxxxxxx
>>>Subject: Unique Filename
>>>    
>>>
>>
>>The above message contained the following:
>>
>>  
>>
>>>First let me say that I am pretty new to LDM and I hope that I am 
>>>posting this question to the right list...if not maybe someone will 
>>>point me to a beginners list.
>>>
>>>I need to generate a unique filename in pqact.conf so that files will 
>>>not overwrite those already in a queue. Below is what I have done so far.
>>>I added "%M%S" here, but this will only resolve to one second. Not quite 
>>>good enough: it is possible that 2 files could be processed during the 
>>>same second. The "NEWFILE.dat" part of the filename cannot change, nor 
>>>can the path.
>>>
>>>EXP     .*(transfile.dat)        FILE    -overwrite      
>>>/some/file/path/NEWFILE.dat%M%S
>>>
>>>Any help would be greatly appreciated.
>>>    
>>>
>>
>>If you're worried about multiple files being processed in the same
>>second, then it seems to me that you're in a bad situation for the
>>following reasons:
>>
>>    1.  You could end-up with thousands upon thousands of files in a
>>      single directory.
>>
>>    2.  The scour(1) facility might not be sufficient to keep the number
>>      of files down to a manageable level.
>>
>>It could be that, with a little thought, a solution could be found that
>>obviates the need for sub-second resolution.  What are these files and
>>what are you trying to do with them?  Does their product-ID contain 
>>nothing that could be useful?
>>
>>Regards,
>>Steve Emmerson   <http://www.unidata.ucar.edu>
>>  
>>
>
>

-- 
/**
 * Daryl Herzmann (akrherz@xxxxxxxxxxx)
 * Program Assistant -- Iowa Environmental Mesonet
 * http://mesonet.agron.iastate.edu
 */


