hive-user mailing list archives

From Hari Subramaniyan <>
Subject Re: Hive over JDBC: Retrieving job Id and status
Date Fri, 29 May 2015 09:01:40 GMT
Hi Kiran,

Which version of Hive are you using?

In the 1.2 release, there is an option to set session-level logging from the client via hive.server2.logging.operation.level.
Setting this parameter to the EXECUTION level should provide the map-reduce job information associated
with the query on the client side, which you should be able to retrieve in a parallel thread
while the query is running. This idea is demonstrated in the following hive-unit test:

More information about the related parameter can be found here:

For the above parameter to work, HiveServer2 must have operation logging enabled, i.e. hive.server2.logging.operation.enabled
should be set to true (the default) when you start HiveServer2.
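As an illustration of the parallel-thread idea above, here is a minimal sketch (not from the original mail, and untested without a live HiveServer2). It assumes the Hive 1.2+ JDBC driver, whose statement implementation org.apache.hive.jdbc.HiveStatement exposes hasMoreLogs() and getQueryLog(); the connection URL, credentials, and query are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.apache.hive.jdbc.HiveStatement;

public class QueryLogPoller {
    public static void main(String[] args) throws Exception {
        // Placeholder URL/credentials -- adjust for your HiveServer2.
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "user", "");
        Statement stmt = conn.createStatement();

        // Session-level override; EXECUTION level includes map-reduce job info.
        stmt.execute("set hive.server2.logging.operation.level=EXECUTION");

        final HiveStatement hiveStmt = (HiveStatement) stmt;
        Thread logThread = new Thread(new Runnable() {
            public void run() {
                try {
                    while (hiveStmt.hasMoreLogs()) {
                        for (String line : hiveStmt.getQueryLog()) {
                            // Lines such as "Starting Job = job_..." carry
                            // the Hadoop job id for the running query.
                            System.out.println(line);
                        }
                        Thread.sleep(500L);
                    }
                } catch (Exception e) {
                    // Statement closed or cancelled; stop polling.
                }
            }
        });
        logThread.start();

        stmt.execute("select count(*) from some_table"); // blocking call
        logThread.join();

        stmt.close();
        conn.close();
    }
}
```

The log thread and the blocking execute() share the same statement, so the operation log can be scanned for the job id while the query runs.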



From: Lonikar, Kiran <>
Sent: Thursday, May 28, 2015 9:23 PM
Subject: Hive over JDBC: Retrieving job Id and status


When a hive query is submitted to hiveserver2 over JDBC, is there a way to get the Hadoop
job id (and status) for the hive query?

The JDBC call "statement.execute(hiveQuery)" is a blocking call. Specifically, is there any
way to execute a query from another thread on the same JDBC connection to learn the job Id?

For now, I am following this approach: Before submitting the actual query, I execute the following
on the same statement:

Here <pid> is the process id of the submitting java process and <currentTime>
is obtained using System.currentTimeMillis().

This sets the job name for the subsequent queries. I can then look up the job Id for this job
name using the JobClient, and then monitor the job status using that job Id.
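If the statement that was lost from the archive was a `set mapred.job.name=...` override (an assumption; the original mail elides it), the lookup half of this workaround might be sketched as below using the classic Hadoop mapred JobClient API. The helper name findJobId and the job-name format are illustrative only:

```java
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.JobStatus;
import org.apache.hadoop.mapred.RunningJob;

public class JobByName {
    /**
     * Hypothetical helper: scan the cluster's jobs for one whose name
     * matches the unique name set before submitting the Hive query,
     * e.g. "hivequery_<pid>_<currentTime>".
     */
    public static String findJobId(String jobName) throws Exception {
        JobClient jobClient = new JobClient(new JobConf());
        for (JobStatus status : jobClient.getAllJobs()) {
            RunningJob job = jobClient.getJob(status.getJobID());
            if (job != null && jobName.equals(job.getJobName())) {
                return job.getID().toString();
            }
        }
        return null; // job not yet submitted or already retired
    }
}
```

Once the job id is known, the same RunningJob handle can be polled for progress and completion status.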

Let me know if there is a better way to proceed.

