turbine-user mailing list archives

From Adam Sherman <a...@tritus.ca>
Subject Re: BLOBs (Binary Large Objects) & Torque/Turbine
Date Tue, 02 Oct 2001 15:11:05 GMT
On Tue, Oct 02, 2001 at 04:24:36PM +0200, Gunnar Rønning wrote:
> | > The problem with PostgreSQL Large Objects is that they _must_ be handled
> | > inside a transaction. This has been an integration problem for Torque and
> | > the underlying APIs, which assume a different model.
> | 
> | This makes sense, however: does it mean that I cannot use BLOBs with
> | PostgreSQL & Torque/Turbine? Or is there a kludge?
> 
> I think there is a kludge that requires you to patch the JDBC driver. I haven't
> tested it with Turbine myself.
> 
> | My problem is simple: the app will be used over local 100Base-T
> | by 75% of the users and will contain many files of 100 MB or more.
> | This is why I really need BLOBs and the InputStream/OutputStream
> | interfaces they support.
> 
> I understand. Does the object model in Turbine expose the stream interfaces
> to BLOBs, so you don't have to read all into memory first ?

Hmmm, I haven't the faintest idea. (I'm a little new at this.) Maybe
jvanzyl can answer that? (Please!)

I haven't been able to test it at all since the postgresql/db.props
file doesn't map any BLOB types.
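For anyone following along, the pattern under discussion might look roughly like the sketch below. This is illustration only, not Torque's actual API: the `files` table, its columns, and the `storeBlob` helper are invented names, and the JDBC calls assume a driver with working `setBinaryStream` support. The key points are that autocommit is switched off before touching the Large Object (the PostgreSQL requirement mentioned above), and that the chunked copy keeps only one small buffer in memory no matter how large the file is.

```java
import java.io.*;
import java.sql.*;

public class BlobStreamDemo {

    // Hypothetical helper: store a large file as a BLOB without reading
    // it all into memory first. Table and column names are made up.
    static void storeBlob(Connection conn, int id, InputStream in, int length)
            throws SQLException {
        // PostgreSQL Large Objects only work inside a transaction,
        // so autocommit must be off before the BLOB is written.
        conn.setAutoCommit(false);
        try {
            PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO files (id, data) VALUES (?, ?)");
            ps.setInt(1, id);
            ps.setBinaryStream(2, in, length); // streamed, not buffered whole
            ps.executeUpdate();
            ps.close();
            conn.commit();
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        }
    }

    // Chunked copy: this is what a stream interface buys you -- only one
    // 8 KB buffer is ever resident, regardless of the file's size.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Demonstrate the chunked copy with in-memory streams.
        byte[] payload = new byte[100000];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(payload), sink);
        System.out.println("copied " + copied + " bytes");
    }
}
```

If Torque only exposes `byte[]` accessors rather than streams, every 100 MB file would be materialized in the JVM heap on each read, which is exactly the problem Gunnar is asking about.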

Thanks,

A.

-- 
Adam Sherman
President & Technology Architect
Tritus Consultant Group Inc.
+1.613.255.5164
http://www.tritus.ca

---------------------------------------------------------------------
To unsubscribe, e-mail: turbine-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: turbine-user-help@jakarta.apache.org

