hive-user mailing list archives

From Steve Howard <>
Subject ORA-8177 with Hive transactions
Date Fri, 18 Sep 2015 17:54:46 GMT
While troubleshooting an issue with transactions shortly after enabling
them, I noticed the following in a trace from Oracle, which is our
metastore for Hive...

ORA-08177: can't serialize access for this transaction

These were thrown on "insert into HIVE_LOCKS..."

Traditionally in Oracle, if an application actually needs serializable
transactions, the fix is to set INITRANS and MAXTRANS to the number of
concurrent writers.  When I ran the test program below against a table
similar to HIVE_LOCKS, this exception was thrown everywhere.  The fix is to
recreate the table with a higher value for INITRANS (the default is only 1
for INITRANS, while MAXTRANS defaults to 255).  When I did this and re-ran
the test program, the exceptions were no longer thrown.
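Concretely, the recreate step amounts to a couple of DDL statements against the metastore schema. The INITRANS value of 16 here is just an illustration, not a tuned number; printed from a small Java stub so the sketch is self-contained:

```java
public class RebuildHiveLocks {
  public static void main(String[] args) {
    // INITRANS only applies to blocks formatted after the change, so the
    // table has to be rebuilt (MOVE) for existing blocks to pick it up.
    // Note: indexes on the table are left UNUSABLE by MOVE and must be
    // rebuilt afterwards.
    String[] ddl = {
      "ALTER TABLE HIVE_LOCKS INITRANS 16",
      "ALTER TABLE HIVE_LOCKS MOVE"
    };
    for (String stmt : ddl) {
      System.out.println(stmt + ";");
    }
  }
}
```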

Does anyone have any feedback on this performance hint?  The exceptions in
Hive are thrown from the checkRetryable method in the TxnHandler class, but
I couldn't find which class/method actually raises them.  Perhaps the
exceptions are not impactful, but given that the method expects them (it
checks for the string in the exception message), I thought I would ask for
feedback before we recreate the HIVE_LOCKS table with a higher value for
INITRANS.
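For context, the retry behavior I'm inferring looks roughly like the pattern below. This is a sketch, not the actual Hive source; the withRetry helper and the simulated statement are mine:

```java
import java.sql.SQLException;
import java.util.concurrent.Callable;

public class SerializableRetry {
  // Sketch of a retry-on-ORA-08177 loop: retry only when the message
  // contains the serialization-failure text, up to maxAttempts tries.
  static <T> T withRetry(Callable<T> op, int maxAttempts) throws Exception {
    for (int attempt = 1; ; attempt++) {
      try {
        return op.call();
      } catch (SQLException e) {
        boolean retryable = e.getMessage() != null
            && e.getMessage().contains("can't serialize access");
        if (!retryable || attempt >= maxAttempts) {
          throw e;
        }
      }
    }
  }

  public static void main(String[] args) throws Exception {
    int[] calls = {0};
    // Simulated statement that hits ORA-08177 twice, then succeeds.
    String result = withRetry(() -> {
      if (++calls[0] < 3) {
        throw new SQLException(
            "ORA-08177: can't serialize access for this transaction");
      }
      return "ok";
    }, 5);
    System.out.println(result + " after " + calls[0] + " attempts");
  }
}
```

If the retries always succeed eventually, the trace entries would be noise rather than lost work.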

import java.sql.*;

public class testLock implements Runnable {

  public static void main (String[] args) throws Exception {
    for (int i = 1; i <= 100; i++) {
      testLock tl = new testLock();
    }
  }

  public testLock() {
    // each instance runs on its own thread
    Thread t = new Thread(this);
    t.start();
  }

  public void run() {
    try {
      // connection details elided; fill in the metastore JDBC URL/credentials
      Connection conn = DriverManager.getConnection("jdbc:oracle:thin:@...");
      conn.setAutoCommit(false);
      conn.createStatement().execute("alter session set isolation_level = serializable");
      PreparedStatement pst = conn.prepareStatement("update test set a = ?");
      for (int j = 1; j <= 10000; j++) {
        pst.setInt(1, j);
        pst.executeUpdate();
        conn.commit();
      }
    } catch (Exception e) {
      e.printStackTrace();
    }
  }
}
