perl-modperl mailing list archives

From "Philip M. Gollucci" <pgollu...@p6m7g8.com>
Subject Re: Trying to get File and Directory info off of external server quickly
Date Mon, 01 Aug 2005 22:54:56 GMT
Boysenberry Payne wrote:
> I'm not sure if HEAD would work.
> Basically, I'm trying to read a directory's files.
> After I confirm a file exists and doesn't have zero
> size, I check that it has the appropriate extension
> for the directory, then I add the directory address,
> file name, and extension to a table in our database.
We actually do something very similar to this with pictures uploaded 
from a digital camera that are eventually published on a website.

Cronjob1:
   Poll the destination directory and move the files to a temp location.
   The destination directory is where the camera puts them.

Cronjob2:
   Poll the temp directory, move the image into its permanent
   location, and insert a row into our "images" table.

It's split into two jobs only so that if one part breaks, uploading from 
the cameras still works and people can keep uploading.  Digital 
cameras [at least the ones the government uses :)] reuse the same 
non-unique file names for each upload, so we have to process each batch 
rather quickly.

I didn't write this, but I can say that in 3 years it's only crashed once, 
and it makes us millions.

[snipped for brevity of course]
Cronjob1:

use Net::FTP;
my $ftp = Net::FTP->new($PEERADDR, Debug => 0, Timeout => 30)
	|| die "Connect to server failed\n";
$ftp->login($USERNAME, $PASSWORD)
	|| die "Cannot login to FTP server\n";
$ftp->binary();

my @files   = $ftp->ls('-R');
foreach my $file (@files) {
	# skip files that don't meet some criteria [snipped]
	$ftp->get("$dir/$file", $localFilename);
}
$ftp->quit();
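
Cronjob2 is snipped too, but a minimal sketch of that step might look 
something like the following.  The paths, database credentials, column 
names, and the .jpg-only check are made up for illustration; the real 
schema and layout are obviously site-specific.

use strict;
use warnings;
use DBI;
use File::Copy qw(move);
use File::Basename qw(fileparse);

# Hypothetical locations and credentials -- not from the real code
my $tmpdir  = '/var/spool/camera/tmp';
my $permdir = '/var/www/images';
my ($DBUSER, $DBPASS) = ('user', 'secret');

my $dbh = DBI->connect('dbi:mysql:database=site', $DBUSER, $DBPASS,
	{ RaiseError => 1, AutoCommit => 1 })
	|| die "Cannot connect to database\n";

my $sth = $dbh->prepare(
	'INSERT INTO images (directory, filename, extension) VALUES (?, ?, ?)');

opendir my $dh, $tmpdir or die "Cannot open $tmpdir: $!\n";
foreach my $file (readdir $dh) {
	next if $file =~ /^\./;             # skip dot entries
	next unless -f "$tmpdir/$file";     # plain files only
	next unless -s "$tmpdir/$file";     # skip zero-size files

	my ($name, undef, $ext) = fileparse($file, qr/\.[^.]+$/);
	next unless lc($ext) eq '.jpg';     # appropriate extension for this dir

	move("$tmpdir/$file", "$permdir/$file")
		or die "Cannot move $file: $!\n";
	$sth->execute($permdir, $name, $ext);
}
closedir $dh;
$dbh->disconnect;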
