perl-modperl mailing list archives

From: Boysenberry Payne <>
Subject: Re: Trying to get File and Directory info off of external server quickly
Date: Tue, 02 Aug 2005 17:35:56 GMT
Thank You Everyone,

Now that I know I can use $ftp->ls( "-lR" ), which I couldn't find anywhere
in the Net::FTP docs or the other O'Reilly books I have, I think I can
stick with Net::FTP without it being slow.  What was making my script
take so long was the multiple $ftp->cwd( $directory ), $ftp->ls() and
$ftp->dir( $directory . $file ) calls for each directory in my
directory loop.

Now I do a single cwd into my public html area and one ls( "-lR" ), then
process the returned array, which is a lot faster.  It would be nice to be
able to specify the directory along with the "-lR" without using
cwd( $directory ); does anyone know how to do that?
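
For reference, a minimal sketch of that single-listing approach might look
like the following.  The host, login, and /public_html path are placeholders,
whether the server honours "-lR" as a listing option depends on the FTP
daemon, and the %ok_ext rules are made up for illustration; the zero-size
and extension checks follow the workflow quoted below.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;

    # hypothetical extension rules for the directory being scanned
    my %ok_ext = ( jpg => 1, gif => 1, html => 1 );

    my $ftp = Net::FTP->new( 'ftp.example.com', Timeout => 30 )
        or die "Cannot connect: $@";
    $ftp->login( 'user', 'password' ) or die 'Login failed: ', $ftp->message;

    # one cwd into the top-level area, then a single recursive long listing
    $ftp->cwd( '/public_html' ) or die 'cwd failed: ', $ftp->message;
    my @listing = $ftp->ls( '-lR' );

    my $dir = '.';
    for my $line ( @listing ) {
        next if $line =~ /^\s*$/ or $line =~ /^total /;
        if ( $line =~ /^(.+):$/ ) {     # "subdir:" header marks a new directory
            $dir = $1;
            next;
        }
        next if $line =~ /^d/;          # skip directory entries themselves
        # typical long-listing line: perms links owner group size month day time name
        my ( $size, $name ) = ( split ' ', $line, 9 )[ 4, 8 ];
        next unless defined $name and $size;        # skip zero-size files
        my ( $ext ) = $name =~ /\.([^.]+)$/;
        next unless $ext and $ok_ext{ lc $ext };    # keep only wanted extensions
        print "$dir/$name\t$ext\t$size\n";          # or insert into the database here
    }

    $ftp->quit;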

Thanks for the tips on making my code more efficient too.


On Aug 1, 2005, at 6:28 PM, Randy Kobes wrote:

> On Mon, 1 Aug 2005, Boysenberry Payne wrote:
>> I'm not sure if HEAD would work.
>> Basically, I'm trying to read a directory's files.
>> After I confirm a file exists and doesn't have zero
>> size, I check that it has the appropriate extension
>> for the directory, then I add the directory address,
>> file name, and extension to a table in our database.
> Can you get someone on the remote server to do a
>    cd top_level_directory
>    ls -lR > ls-lR  # or find -fls find-ls
>    gzip ls-lR      # or gzip find-ls
> periodically, and then you can grab and parse ls-lR.gz or find-ls.gz?
> -- 
> best regards,
> randy kobes
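
For completeness, the pre-built listing Randy suggests above could be
fetched and unpacked along these lines; the host, credentials, and file
names are placeholders, and IO::Uncompress::Gunzip is assumed to be
available for the decompression step.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::FTP;
    use IO::Uncompress::Gunzip qw(gunzip $GunzipError);

    my $ftp = Net::FTP->new( 'ftp.example.com' ) or die "Cannot connect: $@";
    $ftp->login( 'user', 'password' ) or die 'Login failed: ', $ftp->message;
    $ftp->binary;                       # gzip data must be transferred in binary mode
    $ftp->get( 'ls-lR.gz', 'ls-lR.gz' ) or die 'get failed: ', $ftp->message;
    $ftp->quit;

    gunzip 'ls-lR.gz' => 'ls-lR'
        or die "gunzip failed: $GunzipError";

    open my $fh, '<', 'ls-lR' or die "open failed: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        # parse exactly as with the live "ls -lR" output in the sketch above
    }
    close $fh;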
