db-derby-dev mailing list archives

From "Manish Khettry (JIRA)" <derby-...@db.apache.org>
Subject [jira] Commented: (DERBY-438) Update triggers on tables with blob columns fail at execution time if the triggered-SQL-statement references the blob column(s).
Date Sun, 21 Aug 2005 23:50:58 GMT
    [ http://issues.apache.org/jira/browse/DERBY-438?page=comments#action_12319533 ] 

Manish Khettry commented on DERBY-438:

Even if the blob column is not referenced, an IOException is thrown. The exception is caused
by the SQL layer trying to read the same stream twice. The test case is much the same
as the attached go.java (reproduced here for convenience):

create table t1 (id int, updated smallint, bl blob(64k));
create trigger tr1 after update on t1 referencing 
                  new as n_row for each row mode db2sql values length(n_row.updated);
update t1 set updated = 1 where id = ?;

The first call to load the stream is in NormalizeResultSet:

        at org.apache.derby.iapi.types.SQLBlob.normalize(SQLBlob.java:110)
        at org.apache.derby.iapi.types.DataTypeDescriptor.normalize(DataTypeDescriptor.java:432)
        at org.apache.derby.impl.sql.execute.NormalizeResultSet.normalizeRow(NormalizeResultSet.java:351)
        at org.apache.derby.impl.sql.execute.NormalizeResultSet.getNextRowCore(NormalizeResultSet.java:207)
        at org.apache.derby.impl.sql.execute.DMLWriteResultSet.getNextRowCore(DMLWriteResultSet.java:124)

The second call is in DMLWriteResultSet.objectifyStreams:

        at org.apache.derby.iapi.types.SQLBinary.getValue(SQLBinary.java:198)
        at org.apache.derby.iapi.types.SQLBinary.loadStream(SQLBinary.java:542)
        at org.apache.derby.impl.sql.execute.DMLWriteResultSet.objectifyStreams(DMLWriteResultSet.java:151)
        at org.apache.derby.impl.sql.execute.DMLWriteResultSet.getNextRowCore(DMLWriteResultSet.java:132)
        at org.apache.derby.impl.sql.execute.UpdateResultSet.collectAffectedRows(UpdateResultSet.java:453)
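The EOFException is consistent with the same java.io.InputStream being drained twice. A minimal plain-Java sketch (no Derby involved, names hypothetical) of why the second read comes up empty:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class DoubleReadDemo {
    // Drain the stream fully, roughly what loading a streamed BLOB does.
    static int drain(InputStream in) throws IOException {
        int count = 0;
        while (in.read() != -1) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        InputStream blobStream = new ByteArrayInputStream(new byte[]{1, 2, 3, 4});

        // First consumer (analogous to NormalizeResultSet.normalizeRow)
        int firstRead = drain(blobStream);   // reads all 4 bytes

        // Second consumer (analogous to DMLWriteResultSet.objectifyStreams)
        int secondRead = drain(blobStream);  // stream already exhausted: 0 bytes

        System.out.println(firstRead + " " + secondRead);
    }
}
```

The second consumer sees an exhausted stream, which in Derby's case surfaces as the EOFException wrapped in XCL30.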

If Derby does not support accessing LOB columns in triggers, there does seem to be code that
thinks otherwise! In particular, look at the constructor for DMLWriteResultSet (the
comments at the end, and the variable 'needToObjectifyStream'). It looks to me that for any DML
on a table which has triggers defined, we'll end up materializing stream columns even if neither the
DML nor the trigger references any stream-storable columns.

So apart from the four "todo's" in the previous comment, I have a couple of questions for those
more familiar with this code.

1. If we don't support referencing blob columns in triggers, do we need the objectifyStream logic
in DMLWriteResultSet? The comment in the constructor refers to several bug numbers which predate
JIRA (2432, 3383, 4896). Is it possible to get information about these?

2. Does NormalizeResultSet need to normalize a column that is not referred to in the update?

> Update triggers on tables with blob columns fail at execution time if the triggered-SQL-statement
references the blob column(s).
> --------------------------------------------------------------------------------------------------------------------------------
>          Key: DERBY-438
>          URL: http://issues.apache.org/jira/browse/DERBY-438
>      Project: Derby
>         Type: Bug
>   Components: SQL
>     Versions:
>     Reporter: A B
>     Assignee: Manish Khettry
>      Fix For:
>  Attachments: go.java
> Suppose I have 1) a table "t1" with blob data in it, and 2) an UPDATE trigger "tr1" defined
on that table, where the triggered-SQL-action for "tr1" references the blob column from the
updated ("new") row. Ex:
> create table t1 (id int, updated smallint, bl blob(32000));
> create trigger tr1 after update on t1 referencing new as n_row for each row mode db2sql
values length(n_row.bl);
> Assuming that t1 has been populated with some data, then attempts to update t1 will fire
the trigger, but the result will be one of the two following errors:
> 1) If blob data is < 32K...
> If the actual data in the table is less than 32K in length, the result will be:
> ERROR XCL12: An attempt was made to put a data value of type 'org.apache.derby.impl.jdbc.EmbedBlob'
into a data value of type 'BLOB'.
> 2) If blob data is > 32K...
> If at least one row in the table has blob data that is longer than 32K (which means that
Derby will stream it, so far as I can tell), then the error will be:
> ERROR XCL30: An IOException was thrown when reading a 'BLOB' from an InputStream.
> ERROR XJ001: Java exception: ': java.io.EOFException'.
> Note that for data larger than 32K, this error will occur regardless of whether or not
the triggered-SQL-statement
> references the blob column.
> Surprisingly, it doesn't (appear to) matter what the trigger statement is actually doing--so
long as it references the blob column at least once, one of these two errors will occur, depending
on the length of the data.  And if the data is greater than 32k, then the error will happen
regardless of what the trigger does or whether or not it references the blob column.
> I looked at the documentation for UPDATE statements and TRIGGER statements, but nowhere
did I see anything saying that either of these will not work with blobs.  So as far as I can
tell, both of the above scenarios should succeed...

This message is automatically generated by JIRA.
