db-ddlutils-user mailing list archives

From Jason <jeac...@hardlight.com.au>
Subject Re: postgres bytea broken
Date Wed, 12 Apr 2006 11:24:01 GMT
	I'm currently using pg74.216.jdbc3.jar as the PostgreSQL JDBC driver.
*Note: this driver works fine in my application otherwise.

  I looked at the converter approach, but since I didn't have a ready-to-go 
class I didn't pursue it. Also, I have quite a few tables and many bytea-type 
fields, all of which would need to be individually handled this way, and I 
was hoping to avoid that.

Is there any way to force it to be recognised as a BLOB instead?

Presumably the byte[] returned is complete, which wouldn't be good for 
out-of-memory problems with large data fields (plus the JDBC driver is very 
evil in that it doesn't stream nicely when asked; instead it buffers the 
complete field in RAM at least once, and probably does a few nasty copies 
for fun).

I'll play around a bit more.

Thomas Dudziak wrote:
> On 4/12/06, Jason <jeacott@hardlight.com.au> wrote:
>> I have tried everything I can think of, but I just can't dump data out of
>> bytea fields in postgres with the DdlUtils Ant tasks. Has anyone else had
>> this problem? Does anyone know a workaround/solution?
>> running ant 1.6.5
>> java 1.5
>>>       <target name="database-dump-postgres" description="Dumps the database structure" depends="getddldbinfo">
>>>         <taskdef name="databaseToDdl"
>>>                  classname="org.apache.ddlutils.task.DatabaseToDdlTask">
>>>           <classpath refid="runtime-classpath"/>
>>>         </taskdef>
>>>               <delete dir="${database.backup.path}/src/schema"></delete>
>>>               <mkdir  dir="${database.backup.path}/src/schema"/>
>>>         <databaseToDdl modelName="MyModel" databaseType="postgresql" useDelimitedSqlIdentifiers="true">
>>>          <database url="jdbc:postgresql://${system.database.hostname}/${database.name}"
>>>                     driverClassName="org.postgresql.Driver"
>>>                         username="${database.user}"
>>>                         password="${database.password}"/>
>>>           <writeSchemaToFile
>>>               outputFile="${database.backup.path}/src/schema/${database.backup.path}-schema.xml"/>
>>>           <writeDataToFile
>>>               outputFile="${database.backup.path}/src/schema/${database.backup.path}-data.xml"/>
>>>         </databaseToDdl>
>>>       </target>
>> for a table with def:
>>> CREATE TABLE binarydata (
>>>     binaryid bigint NOT NULL,
>>>     categoryid bigint,
>>>     name character varying(200),
>>>     mimetypeid bigint,
>>>     published boolean,
>>>     properties bytea,
>>>     binarydata bytea,
>>>     currentowner bigint DEFAULT 5,
>>>     modifieddate timestamp with time zone,
>>>     creationdate timestamp with time zone,
>>>     creator bigint DEFAULT 5
>>> );
>> I get this as schema:
>>> <table name="binarydata">
>>>       <column name="binaryid" primaryKey="true" required="true" type="BIGINT"
size="8" autoIncrement="false"/>
>>>       <column name="categoryid" primaryKey="false" required="false" type="BIGINT"
size="8" autoIncrement="false"/>
>>>       <column name="name" primaryKey="false" required="false" type="VARCHAR"
size="50" autoIncrement="false"/>
>>>       <column name="mimetypeid" primaryKey="false" required="false" type="BIGINT"
size="8" autoIncrement="false"/>
>>>       <column name="published" primaryKey="false" required="false" type="BIT"
size="1" autoIncrement="false"/>
>>>       <column name="properties" primaryKey="false" required="false" type="BINARY"
>>>       <column name="binarydata" primaryKey="false" required="false" type="BINARY"
>>>       <column name="currentowner" primaryKey="false" required="false" type="BIGINT"
size="8" default="5" autoIncrement="false"/>
>>>       <column name="modifieddate" primaryKey="false" required="false" type="TIMESTAMP"
size="8" autoIncrement="false"/>
>>>       <column name="creationdate" primaryKey="false" required="false" type="TIMESTAMP"
size="8" autoIncrement="false"/>
>>>       <column name="creator" primaryKey="false" required="false" type="BIGINT"
size="8" default="5" autoIncrement="false"/>
>>>       <unique name="aaaaabinarydata_pk">
>>>         <unique-column name="binaryid"/>
>>>       </unique>
>>>       <index name="categoryid">
>>>         <index-column name="categoryid"/>
>>>       </index>
>>>       <index name="datacatagorytreebinarydata">
>>>         <index-column name="categoryid"/>
>>>       </index>
>>>     </table>
> Mhmm, I wonder whether you get the bytea column as BINARY, not as
> BLOB. What JDBC driver version do you use?
>> I get this as data:
>>>   <binarydata binaryid="13" categoryid="1008" name="banner_info.jpg" mimetypeid="0" published="0" binarydata="[B@1ce669e" currentowner="5" modifieddate="2005-09-09 14:50:12.343" creationdate="2005-09-09 14:50:12.343" creator="5"/>
>> but the actual data in the binarydata field is NOT "[B@1ce669e"
> That's because the BINARY column is returned as a byte[], and there is
> probably no default converter registered for the type BINARY. You can
> specify converters in the Ant task via the converter sub-element
> (http://db.apache.org/ddlutils/ant-tasks.html#Subelement%3A+converter).
> There are a couple of default converters in DdlUtils that you can look
> at for pointers. You might even be lucky in that the
> ByteArrayBase64Converter
> (http://svn.apache.org/viewcvs.cgi/db/ddlutils/trunk/src/java/org/apache/ddlutils/io/converters/ByteArrayBase64Converter.java?view=markup)
> works directly when you specify it for jdbc type BINARY in the Ant
> task.
> Tom
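
For the record, if the converter sub-element works the way that ant-tasks 
page describes, I'd expect the registration to look roughly like the snippet 
below. Untested sketch only: I'm assuming the converter element nests under 
writeDataToFile and takes jdbcType/className attributes, based on my reading 
of the docs Tom linked.

    <databaseToDdl modelName="MyModel" databaseType="postgresql" useDelimitedSqlIdentifiers="true">
      <database url="jdbc:postgresql://${system.database.hostname}/${database.name}"
                driverClassName="org.postgresql.Driver"
                username="${database.user}"
                password="${database.password}"/>
      <writeDataToFile outputFile="${database.backup.path}/src/schema/${database.backup.path}-data.xml">
        <!-- assumed placement and attribute names; this would register the stock
             Base64 converter for every column DdlUtils reads back as JDBC type BINARY -->
        <converter jdbcType="BINARY"
                   className="org.apache.ddlutils.io.converters.ByteArrayBase64Converter"/>
      </writeDataToFile>
    </databaseToDdl>

If that works it would at least cover all the bytea columns in one go; the 
same page seems to also allow scoping a converter to a specific table/column, 
which is the per-field handling I was hoping to avoid.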
