accumulo-user mailing list archives

From Benjamin Parrish <benjamin.d.parr...@gmail.com>
Subject Re: Installing with Hadoop 2.2.0
Date Wed, 19 Mar 2014 12:29:25 GMT
I adjusted accumulo-env.sh to use hard-coded values, as shown below.

Are there any logs that could shed some light on this issue?

If it also helps, I am using CentOS 6.5, Hadoop 2.2.0, and ZooKeeper 3.4.6.
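
For reference, a rough way I can scan the Accumulo logs for problems (the directory matches the ACCUMULO_LOG_DIR hard-coded in the script further down; the grep pattern and file globs are only an example) would be something like:

# Scan the Accumulo process logs for recent ERROR/FATAL lines.
# /usr/local/accumulo/logs matches ACCUMULO_LOG_DIR in accumulo-env.sh below;
# adjust the path and globs if your layout differs.
LOG_DIR=/usr/local/accumulo/logs
grep -H -E 'ERROR|FATAL' "$LOG_DIR"/*.log "$LOG_DIR"/*.out 2>/dev/null | tail -n 50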

I also ran across this, which didn't look right:

Welcome to ZooKeeper!
2014-03-19 08:25:53,479 [myid:] - INFO  [main-SendThread(localhost:2181):ClientCnxn$SendThread@975] - Opening socket connection to server localhost/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknown error)
2014-03-19 08:25:53,483 [myid:] - INFO  [main-SendThread(localhost:2181):ClientCnxn$SendThread@852] - Socket connection established to localhost/127.0.0.1:2181, initiating session
JLine support is enabled
[zk: localhost:2181(CONNECTING) 0] 2014-03-19 08:25:53,523 [myid:] - INFO  [main-SendThread(localhost:2181):ClientCnxn$SendThread@1235] - Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x144da4e00d90000, negotiated timeout = 30000

Should ZooKeeper be trying to hit localhost/127.0.0.1 here?

My zoo.cfg looks like this:
tickTime=2000
initLimit=10
syncLimit=5
dataDir=/usr/local/zookeeper/data
clientPort=2181
server.1=hadoop-node-1:2888:3888
server.2=hadoop-node-2:2888:3888
server.3=hadoop-node-3:2888:3888
server.4=hadoop-node-4:2888:3888
server.5=hadoop-node-5:2888:3888
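
To take the localhost default out of the picture, here is a sketch of connecting to the ensemble by hostname instead (the zkCli.sh path is an assumption based on my ZOOKEEPER_HOME; the hostnames are the ones from zoo.cfg above):

# Connect to a named quorum member rather than the default localhost:2181.
/usr/local/zookeeper/bin/zkCli.sh -server hadoop-node-1:2181

# Accumulo should point at the quorum as well, e.g. instance.zookeeper.host in
# conf/accumulo-site.xml set to a comma-separated list such as:
#   hadoop-node-1:2181,hadoop-node-2:2181,hadoop-node-3:2181,hadoop-node-4:2181,hadoop-node-5:2181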

#! /usr/bin/env bash

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

###
### Configure these environment variables to point to your local installations.
###
### The functional tests require conditional values, so keep this style:
###
### test -z "$JAVA_HOME" && export JAVA_HOME=/usr/local/lib/jdk-1.6.0
###
###
### Note that the -Xmx -Xms settings below require substantial free memory:
### you may want to use smaller values, especially when running everything
### on a single machine.
###
if [ -z "$HADOOP_HOME" ]
then
   test -z "$HADOOP_PREFIX"      && export HADOOP_PREFIX=/usr/local/hadoop
else
   HADOOP_PREFIX="$HADOOP_HOME"
   unset HADOOP_HOME
fi
# test -z "$HADOOP_CONF_DIR"       && export HADOOP_CONF_DIR="/usr/local/hadoop/conf"
# hadoop-2.0:
test -z "$HADOOP_CONF_DIR"     && export HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"

test -z "$JAVA_HOME"             && export JAVA_HOME=/usr/lib/jvm/jdk1.7.0
test -z "$ZOOKEEPER_HOME"        && export ZOOKEEPER_HOME=/usr/local/zookeeper
test -z "$ACCUMULO_LOG_DIR"      && export ACCUMULO_LOG_DIR=/usr/local/accumulo/logs
if [ -f /usr/local/accumulo/conf/accumulo.policy ]
then
   POLICY="-Djava.security.manager -Djava.security.policy=/usr/local/accumulo/conf/accumulo.policy"
fi
test -z "$ACCUMULO_TSERVER_OPTS" && export ACCUMULO_TSERVER_OPTS="${POLICY} -Xmx1g -Xms1g -XX:NewSize=500m -XX:MaxNewSize=500m "
test -z "$ACCUMULO_MASTER_OPTS"  && export ACCUMULO_MASTER_OPTS="${POLICY} -Xmx1g -Xms1g"
test -z "$ACCUMULO_MONITOR_OPTS" && export ACCUMULO_MONITOR_OPTS="${POLICY} -Xmx1g -Xms256m"
test -z "$ACCUMULO_GC_OPTS"      && export ACCUMULO_GC_OPTS="-Xmx256m -Xms256m"
test -z "$ACCUMULO_GENERAL_OPTS" && export ACCUMULO_GENERAL_OPTS="-XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -Djava.net.preferIPv4Stack=true"
test -z "$ACCUMULO_OTHER_OPTS"   && export ACCUMULO_OTHER_OPTS="-Xmx1g -Xms256m"
# what to do when the JVM runs out of heap memory
export ACCUMULO_KILL_CMD='kill -9 %p'

# Should the monitor bind to all network interfaces -- default: false
# export ACCUMULO_MONITOR_BIND_ALL="true"
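
After editing, a quick sanity check (just a sketch using the hard-coded paths above) to confirm the variables resolve and the Hadoop 2.2.0 config directory is visible:

# Source the edited env file and print the resolved paths.
. /usr/local/accumulo/conf/accumulo-env.sh
echo "HADOOP_PREFIX=$HADOOP_PREFIX"
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
echo "ZOOKEEPER_HOME=$ZOOKEEPER_HOME"

# The Hadoop 2.2.0 layout keeps its config under etc/hadoop, so these should exist.
ls "$HADOOP_CONF_DIR"/core-site.xml "$HADOOP_CONF_DIR"/hdfs-site.xml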


On Tue, Mar 18, 2014 at 8:58 PM, Sean Busbey <busbey+lists@cloudera.com> wrote:

>
> On Mar 18, 2014 7:51 PM, "Benjamin Parrish" <benjamin.d.parrish@gmail.com>
> wrote:
> >
> > HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop is set in all
> > ~/.bash_profile files as needed.
> >
> >
>
> Can you add to the gist the output of running
>
> $> find $HADOOP_CONF_DIR
>
> As the user who runs the tablet server on the same host you ran the
> classpath command on?
>
> -Sean
>



-- 
Benjamin D. Parrish
H: 540-597-7860
