archiva-users mailing list archives

From Dan Armbrust <>
Subject Re: Uploading large files?
Date Tue, 26 Apr 2011 20:49:38 GMT
On Tue, Apr 26, 2011 at 12:48 PM, Dan Armbrust
<> wrote:
> What all needs to be done to be able to deploy large files to archiva?
>  As in, ~200 MB?
> I tried making the modification, and that doesn't
> seem to help.

So, after some pilot error, it turns out that changing that property
is all that is necessary.  However, it would sure be nice if there were
a way to make this change without having to modify the file inside the
war file (so that the change is permanent).

Ideally, it would just be in the gui.

On a side note, the performance of the Upload Artifact page is
absolutely dreadful.  And I found out why.

The UploadAction class uses what must be the worst copy file
implementation possible.

It says:

        FileOutputStream out = new FileOutputStream( new File( targetPath, targetFilename ) );
        FileInputStream input = new FileInputStream( sourceFile );

        int i;
        while ( ( i = input.read() ) != -1 )
            out.write( i );

Alternating between reading and writing one byte at a time????  No
wonder it was taking me over half an hour to upload a simple 150
MB file (with the CPU locked at 100% utilization).  Wow.

Please replace that implementation with this:

        BufferedOutputStream out = new BufferedOutputStream( new FileOutputStream( new File( targetPath, targetFilename ) ) );
        BufferedInputStream input = new BufferedInputStream( new FileInputStream( sourceFile ) );

        byte[] buf = new byte[1024];
        int len;
        while ( ( len = input.read( buf ) ) > 0 )
            out.write( buf, 0, len );

'Tis orders of magnitude faster.
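For reference, here is that buffered loop wrapped into a self-contained
sketch you can compile and run.  The class name, the copy() signature, the
try-with-resources blocks (Java 7+), and the round-trip check in main() are
mine, not part of the proposed Archiva patch; only the buffer loop itself
is the same idea.

```java
import java.io.*;
import java.nio.file.*;

public class BufferedCopy {

    // Copies sourceFile into targetDir/targetFilename using a reusable
    // buffer, so each read()/write() call moves up to 1024 bytes instead
    // of a single byte per system call.
    static void copy( File sourceFile, File targetDir, String targetFilename )
            throws IOException {
        try ( BufferedInputStream input =
                  new BufferedInputStream( new FileInputStream( sourceFile ) );
              BufferedOutputStream out =
                  new BufferedOutputStream( new FileOutputStream(
                      new File( targetDir, targetFilename ) ) ) ) {
            byte[] buf = new byte[1024];
            int len;
            while ( ( len = input.read( buf ) ) > 0 )
                out.write( buf, 0, len );
        }
    }

    public static void main( String[] args ) throws IOException {
        // Round-trip a small random payload to show the copy is byte-exact.
        File dir = Files.createTempDirectory( "copy-demo" ).toFile();
        File src = new File( dir, "src.bin" );
        byte[] payload = new byte[200_000];
        new java.util.Random( 42 ).nextBytes( payload );
        Files.write( src.toPath(), payload );

        copy( src, dir, "dest.bin" );

        byte[] back = Files.readAllBytes( new File( dir, "dest.bin" ).toPath() );
        System.out.println( java.util.Arrays.equals( payload, back ) );
    }
}
```

The 1024-byte buffer is the size from the snippet above; anything in the
8-64 KB range would cut the call count further, and the Buffered streams
add their own internal buffering on top.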


