db-derby-user mailing list archives

From "Parker, Patricia \(LNG-CON\)" <patricia.par...@lexisnexis.com>
Subject RE: SQL Exception: Container xxx not found
Date Thu, 10 May 2007 15:55:58 GMT
Hi Stanley,

Thanks for your reply.

I am running Derby 10.2.2 in embedded mode on Windows XP, inside Eclipse.
I have attached both the derby.log file and a snapshot of my Eclipse
console, since the actual SQL error shows up there on line 231. The
corresponding messages in derby.log occur around line 185, where the
rollback starts.

The only mention I have seen of a similar problem in Derby was in the
mail archives here:
<http://mail-archives.apache.org/mod_mbox/db-derby-user/200702.mbox/%3C45DBA554.10809@gmail.com%3E>

The response to that mail seemed to imply that Derby may lose track of
tables, so I have been trying different timings for when I define my
tables relative to when I do the inserts.

Here is more detail on my process. I am running a "graph" process,
where a node process runs on its own thread and bulk-loads a bunch of
data from a flat file into a table that it defines on the fly, using
XML that describes the data in the flat file. After the node thread
completes, a connector thread runs that applies a filter to all the
data that was just bulk-loaded and inserts the filtered data into
another brand-new, defined-on-the-fly table. This is done by performing
an INSERT INTO <new connector table> SELECT * FROM <table created for
node> WHERE <some filter statement>. Since this is a graph process,
many node threads may be processing at once, and many connector threads
may also be processing concurrently. I have sometimes noticed
deadlocking between connectors, but it seems to recover and move on OK.
My problem happens whether or not a deadlock message occurs.
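
To make the connector step concrete, the statement a connector builds looks
roughly like the following (the table names and the filter here are made-up
placeholders, not the real ones generated by my graph):

    // Illustration only: hypothetical table names and filter.
    // (Exception handling omitted in this sketch.)
    String connectorTable = "CONNECTOR_42";
    String nodeTable = "NODE_17";
    String sql = "INSERT INTO " + connectorTable
            + " SELECT * FROM " + nodeTable
            + " WHERE STATUS = 'ACTIVE'";
    Connection conn = getConnection();
    Statement stmt = conn.createStatement();
    stmt.execute(sql); // copies the filtered rows into the new connector table
    stmt.close();
    conn.close();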

My bulk loads have been working fine in the nodes, but I believe the
problem may occur when my program goes back to the bulk-loaded data to
perform the inserts into the new connector tables. Right now, I define
my tables right before loading/inserting into them. I have tried
defining all my tables up front, before running the bulk loads and
inserts. I have also tried sharing connections, but that did not seem
to help; I now get a new connection from the DriverManager each time I
do a bulk load or insert, as the log will show.
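
By "define my tables" I mean executing a CREATE TABLE built from the XML
metadata on a fresh connection, roughly like this (the table name and the
column list are just an example, not the real generated DDL):

    // Illustration only: in the real code the columns come from the XML
    // that describes the flat file. (Exception handling omitted.)
    Connection conn = getConnection();
    Statement s = conn.createStatement();
    s.execute("CREATE TABLE NODE_17 (ID INTEGER, NAME VARCHAR(100), STATUS VARCHAR(20))");
    s.close();
    conn.close();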

I am currently using the TRANSACTION_READ_COMMITTED isolation level, but
I have had the same issue with TRANSACTION_SERIALIZABLE.
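
For reference, switching between the two is just the standard JDBC call on
the connection (the commented line is the variant I also tried):

    conn.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED);
    // conn.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE);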


Thanks again for your help.  

Patty

Below are my methods for the bulk load, the insert, and getting a
connection.

Here is the bulk load method:

    public void importData(String tableName, List<FieldType> columnList,
            List<String> columnIndexList, String inputFile,
            String columnDelimiter, String charDelimiter, boolean replace)
    {
        // Build a comma-separated list of the column names to import.
        StringBuffer sb = new StringBuffer();
        for (FieldType field : columnList)
        {
            sb.append(field.getName().toUpperCase());
            sb.append(",");
        }
        sb.deleteCharAt(sb.length() - 1);

        String insertColumns = sb.toString();
        log.debug("Supposed to bulk load these columns: " + insertColumns);

        String columnIndexes = null;

        Connection conn = getConnection();

        // Convert the boolean 'replace' flag to an integer.
        int intReplace = replace ? 1 : 0;

        PreparedStatement ps = null;
        try
        {
            ps = conn.prepareStatement(
                    "CALL SYSCS_UTIL.SYSCS_IMPORT_DATA (?,?,?,?,?,?,?,?,?)");
            ps.setString(1, null); // schema name
            ps.setString(2, tableName.toUpperCase()); // table name
            log.debug("tableName: " + tableName.toUpperCase());
            ps.setString(3, insertColumns.toUpperCase()); // insert columns
            log.debug("insertColumns: " + insertColumns.toUpperCase());
            ps.setString(4, columnIndexes); // column indexes
            log.debug("columnIndexes: " + columnIndexes);
            ps.setString(5, inputFile); // input file
            log.debug("inputFile: " + inputFile);
            ps.setString(6, columnDelimiter); // column delimiter
            log.debug("columnDelimiter: " + columnDelimiter);
            ps.setString(7, charDelimiter); // character delimiter
            log.debug("charDelimiter: " + charDelimiter);
            ps.setString(8, null); // code set
            ps.setInt(9, intReplace); // if 1, empty the table before loading;
                                      // otherwise add to what is already there
            log.debug("intReplace: " + intReplace);
            ps.execute();
        } catch (Throwable e)
        {
            log.debug("exception thrown:");
            if (e instanceof SQLException)
            {
                printSQLError((SQLException) e);
            }
            else
            {
                e.printStackTrace();
            }
        } finally
        {
            try
            {
                if (ps != null)
                {
                    ps.close();
                }
                conn.close();
            } catch (Throwable e)
            {
                log.debug("exception thrown:");
                if (e instanceof SQLException)
                {
                    printSQLError((SQLException) e);
                }
                else
                {
                    e.printStackTrace();
                }
            }
            // Poll until the connection reports closed before returning.
            try
            {
                while (!conn.isClosed())
                {
                    Thread.sleep(100L);
                }
            }
            catch (SQLException e)
            {
                e.printStackTrace();
            }
            catch (InterruptedException e)
            {
                e.printStackTrace();
            }
        }
    }
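
For reference, a call to the bulk load method looks roughly like this (the
table name, column metadata, file name, and delimiters are placeholders, and
parseXmlMetadata stands in for however the XML is actually read):

    // Illustration only: hypothetical metadata and input file.
    List<FieldType> columns = parseXmlMetadata("node17.xml");
    importData("NODE_17", columns, null, "C:/data/node17.dat",
            "|", "\"", true); // replace=true empties the table first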


Here is my method for doing inserts:

    public void insertToTable(String tableName, String selectStatement)
    {
        Connection conn = getConnection();
        Statement s = null;

        String insertStatement = "INSERT INTO " + tableName + " " + selectStatement;
        log.debug("Insert statement: " + insertStatement);
        try
        {
            s = conn.createStatement();
            s.execute(insertStatement);
        }
        catch (Throwable e)
        {
            log.debug("exception thrown:");
            if (e instanceof SQLException)
            {
                printSQLError((SQLException) e);
            }
            else
            {
                e.printStackTrace();
            }
        }
        finally
        {
            /*
             * End the statement and close the connection.
             */
            try
            {
                if (s != null)
                {
                    s.close();
                }
                //conn.commit();
                conn.close();
                log.debug("closed connection");
            }
            catch (Throwable e)
            {
                log.debug("exception thrown:");
                if (e instanceof SQLException)
                {
                    printSQLError((SQLException) e);
                }
                else
                {
                    e.printStackTrace();
                }
            }
            // Make sure the connection is closed before going on.
            try
            {
                while (!conn.isClosed())
                {
                    Thread.sleep(100L);
                }
            }
            catch (Throwable e)
            {
                log.debug("exception thrown:");
                if (e instanceof SQLException)
                {
                    printSQLError((SQLException) e);
                }
                else
                {
                    e.printStackTrace();
                }
            }
        }
    }
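
A connector thread calls this roughly as follows (again, the table name and
filter are placeholders):

    // Illustration only.
    insertToTable("CONNECTOR_42",
            "SELECT * FROM NODE_17 WHERE STATUS = 'ACTIVE'");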


Method for getting connection...

    private static Connection getConnection()
    {
        System.out.println("DerbyDBManager starting in " + framework + " mode.");
        Connection conn = null;
        try
        {
            Properties props = new Properties();
            props.put("user", username);
            props.put("password", "mine");

            /*
             * The connection specifies create=true to cause the database to
             * be created. To remove the database, remove the directory
             * derbyDB and its contents. The directory derbyDB will be
             * created under the directory that the system property
             * derby.system.home points to, or the current directory if
             * derby.system.home is not set.
             */
            conn = DriverManager.getConnection(
                    "jdbc:derby:C:/Derby881;create=true", props);

            log.debug("Created connection");

            conn.setAutoCommit(true);
            conn.setTransactionIsolation(Connection.TRANSACTION_READ_COMMITTED);
        }
        catch (Throwable e)
        {
            log.debug("exception thrown:");
            if (e instanceof SQLException)
            {
                printSQLError((SQLException) e);
            }
            else
            {
                e.printStackTrace();
            }
        }

        return conn;
    }
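
Not shown above: somewhere at startup the embedded driver gets registered
before the first getConnection() call; the standard explicit load looks like
this:

    // Standard explicit registration of the Derby embedded JDBC driver
    // (wrapped in the appropriate try/catch in real code).
    Class.forName("org.apache.derby.jdbc.EmbeddedDriver").newInstance();
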
-----Original Message-----
From: Stanley Bradbury [mailto:Stan.Bradbury@gmail.com] 
Sent: Wednesday, May 09, 2007 7:59 PM
To: Derby Discussion
Subject: Re: SQL Exception: Container xxx not found

Parker, Patricia (LNG-CON) wrote:
>
> Hi,
>
> I am developing an application that does bulk loading and inserting 
> into tables that are created on-the-fly. I create four tables and 
> bulk-load data into them. Then I run a process that creates four 
> tables and selectively inserts data from the bulk-loaded tables into 
> the newly created tables. The bulk loads seem to go fine, but during 
> the insert process, I am randomly getting the error "SQL Exception: 
> Container xxx not found". The error does not happen on the same table 
> every time, and some times everything runs fine. I am only writing at 
> most eight records to a table, so I have not begun to put a load on 
> the process yet. Can you provide some guidance?
>
> Thanks
>
> Patty
>
Hi Patty -
I haven't heard of problems using Derby as you describe and think some 
additional information would be helpful. There should be detailed 
information in the derby.log file including a stack trace that will be 
helpful in zeroing in on the problem. Posting the whole log will help
show the version and whether other issues are being reported. 
Also, please describe how the processes (two?) that perform 1) the bulk 
load and 2) the inserts are initiated and synchronized.
