accumulo-dev mailing list archives

From Jim Klucar <klu...@gmail.com>
Subject Re: [External] Re: Need help getting Accumulo running.
Date Mon, 02 Jul 2012 19:55:06 GMT
chmod a-x removes execute permissions for everyone, which is what is blocking access to that directory.
Just do sudo chmod -R 777 /var/zookeeper to open up the permissions.
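
For example, something along these lines (the chown variant assumes ZooKeeper is being started as the hduser account that shows up later in this thread):

    # quick and dirty: make the ZooKeeper data directory world-writable
    sudo chmod -R 777 /var/zookeeper

    # or, a bit tighter: give the directory to the user that actually runs ZooKeeper
    sudo chown -R hduser:hduser /var/zookeeper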

Sent from my iPhone

On Jul 2, 2012, at 3:28 PM, "Park, Jee [USA]" <Park_Jee@bah.com> wrote:

> Hi, I used sudo chmod a-x /var/zookeeper, and I am still getting permission
> denied.
> How do I make sure /var/zookeeper is writable?
>
> -----Original Message-----
> From: William Slacum [mailto:wilhelm.von.cloud@accumulo.net]
> Sent: Monday, July 02, 2012 1:45 PM
> To: dev@accumulo.apache.org
> Subject: Re: [External] Re: Need help getting Accumulo running.
>
> Make sure that /var/zookeeper is writable by the user you're launching
> Zookeeper as. Alternatively, you can reconfigure zookeeper's zoo.cfg file to
> change the directory to somewhere that is writable.
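>
> For example, the relevant line in zoo.cfg is dataDir; something like this (the path here is only an illustration, point it anywhere the ZooKeeper user can write):
>
>     # conf/zoo.cfg -- dataDir is where snapshots and the pid file are written
>     tickTime=2000
>     clientPort=2181
>     dataDir=/home/hduser/zookeeper-data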
>
> On Mon, Jul 2, 2012 at 1:42 PM, Park, Jee [USA] <Park_Jee@bah.com> wrote:
>
>> Ah, so I realized I wasn't running hadoop or zookeeper. I now have hadoop
>> running, but I cannot get zookeeper to run. Here is what I did:
>>
>> $ $ZOOKEEPER_HOME/bin/zkServer.sh start
>> JMX enabled by default
>> Using config: /usr/lib/zookeeper/bin/../conf/zoo.cfg
>> Starting zookeeper ... /usr/lib/zookeeper/bin/zkServer.sh: 110:
>> /usr/lib/zookeeper/bin/zkServer.sh: Cannot create
>> /var/zookeeper/zookeeper_server.pid: Permission denied
>> FAILED TO WRITE PID
>>
>>
>> -----Original Message-----
>> From: Jim Klucar [mailto:klucar@gmail.com]
>> Sent: Monday, July 02, 2012 1:25 PM
>> To: dev@accumulo.apache.org
>> Subject: Re: [External] Re: Need help getting Accumulo running.
>>
>> Did you verify that zookeeper is running?
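>>
>> For instance, either of these should tell you (assuming the default client port 2181 from the stock zoo.cfg):
>>
>>     $ $ZOOKEEPER_HOME/bin/zkServer.sh status
>>     $ echo ruok | nc localhost 2181    # a running server answers "imok"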
>>
>> On Mon, Jul 2, 2012 at 1:21 PM, Park, Jee [USA] <Park_Jee@bah.com> wrote:
>>> Thanks everyone for the responses!
>>>
>>> So, I got hadoop to run and installed accumulo following Miguel's
>>> email. The problem now is that when I do
>>>
>>> $ bin/accumulo init
>>>
>>> it tries to connect a few times and then times out. Here is what it
>>> prints out.
>>> Just to let you know, I did not change anything in the
>>> accumulo-site.xml file.
>>>
>>> Thanks,
>>> Jee
>>>
>>> hduser@ubuntu:~/accumulo$ bin/accumulo init
>>> 02 10:10:07,567 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
>>> 02 10:10:08,573 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
>>> 02 10:10:09,574 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).
>>> 02 10:10:10,576 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).
>>> 02 10:10:11,578 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).
>>> 02 10:10:12,580 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s).
>>> 02 10:10:13,581 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 6 time(s).
>>> 02 10:10:14,583 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 7 time(s).
>>> 02 10:10:15,585 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 8 time(s).
>>> 02 10:10:16,587 [ipc.Client] INFO : Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).
>>> 02 10:10:16,592 [util.Initialize] FATAL: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
>>> java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
>>>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:743)
>>>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>>>     at $Proxy0.getProtocolVersion(Unknown Source)
>>>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>>>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>>>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>>>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>>>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
>>>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>>>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
>>>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
>>>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
>>>     at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
>>>     at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>     at org.apache.accumulo.start.Main$1.run(Main.java:89)
>>>     at java.lang.Thread.run(Thread.java:722)
>>> Caused by: java.net.ConnectException: Connection refused
>>>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>>     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
>>>     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>>>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
>>>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
>>>     at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>>>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:720)
>>>     ... 20 more
>>> Thread "init" died null
>>> java.lang.reflect.InvocationTargetException
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:601)
>>>     at org.apache.accumulo.start.Main$1.run(Main.java:89)
>>>     at java.lang.Thread.run(Thread.java:722)
>>> Caused by: java.lang.RuntimeException: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
>>>     at org.apache.accumulo.server.util.Initialize.main(Initialize.java:436)
>>>     ... 6 more
>>> Caused by: java.net.ConnectException: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused
>>>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:767)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:743)
>>>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>>>     at $Proxy0.getProtocolVersion(Unknown Source)
>>>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>>>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>>>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>>>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>>>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
>>>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>>>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>>>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
>>>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
>>>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
>>>     at org.apache.accumulo.core.file.FileUtil.getFileSystem(FileUtil.java:554)
>>>     at org.apache.accumulo.server.util.Initialize.main(Initialize.java:426)
>>>     ... 6 more
>>> Caused by: java.net.ConnectException: Connection refused
>>>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>>     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:701)
>>>     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>>>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
>>>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:304)
>>>     at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>>>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:860)
>>>     at org.apache.hadoop.ipc.Client.call(Client.java:720)
>>>     ... 20 more
>>>
>>> -----Original Message-----
>>> From: Miguel Pereira [mailto:miguelapereira1@gmail.com]
>>> Sent: Friday, June 29, 2012 2:59 PM
>>> To: dev@accumulo.apache.org
>>> Subject: [External] Re: Need help getting Accumulo running.
>>>
>>> Hi Jee,
>>>
>>> I used that same guide to install Accumulo, but I used this guide to
>>> install hadoop:
>>>
>>> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>>>
>>> Furthermore, here are the steps I took to install Accumulo, where I
>>> used version 1.4.0 and the standalone conf.
>>> Please note you also need to install the Java JDK and set your
>>> JAVA_HOME; I used JDK 1.7.
>>>
>>> Setting up Accumulo
>>>
>>>
>>>   - git clone git://github.com/apache/accumulo.git
>>>   - cd accumulo
>>>   - git checkout tags/1.4.0 -b 1.4.0
>>>   - mvn package && mvn assembly:single -N   // this can take a while
>>>   - cp conf/examples/512MB/standalone/* conf
>>>   - vi accumulo-env.sh
>>>
>>>
>>> test -z "$JAVA_HOME" && export
>>> JAVA_HOME=/home/hduser/pkg/jdk1.7.0_04
>>> test -z "$HADOOP_HOME" && export
>>> HADOOP_HOME=/home/hduser/developer/workspace/hadoop
>>> test -z "$ZOOKEEPER_HOME" && export
>>> ZOOKEEPER_HOME=/home/hduser/developer/workspace/zookeeper-3.3.5
>>>
>>>   - vi accumulo-site.xml
>>>
>>>     modify user, password, secret, memory (see the sample properties
>>>     after these steps)
>>>
>>>
>>>   - bin/accumulo init
>>>   - bin/start-all.sh
>>>   - bin/accumulo shell -u root
>>>
>>> If you get the shell up, you know you're good.
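>>>
>>> For reference, the settings that step touches live in accumulo-site.xml as properties along these lines (the names come from the 1.4 example configs; the values are only placeholders, not what I actually used):
>>>
>>>   <property>
>>>     <name>instance.secret</name>
>>>     <value>MY_SHARED_SECRET</value>
>>>   </property>
>>>   <property>
>>>     <name>trace.user</name>
>>>     <value>root</value>
>>>   </property>
>>>   <property>
>>>     <name>trace.password</name>
>>>     <value>MY_ROOT_PASSWORD</value>
>>>   </property>
>>>   <property>
>>>     <name>tserver.memory.maps.max</name>
>>>     <value>256M</value>
>>>   </property>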
>>>
>>>
>>> On Fri, Jun 29, 2012 at 2:49 PM, John Vines <john.w.vines@ugov.gov> wrote:
>>>
>>>> We currently don't really support running on Windows. I'm sure
>>>> there are ways to get it running with Cygwin, but our efforts are
>>>> better spent in other directions for now.
>>>>
>>>> As for getting it going in Ubuntu, I haven't seen that guide before.
>>>> Can you let me know where it broke?
>>>>
>>>> For the record, when I was developing ACCUMULO-404, I was working
>>>> in Ubuntu VMs and I used Apache Bigtop and our debians to
>>>> facilitate installation.
>>>> They don't do everything for you, but I think if you use 1.4.1 (not
>>>> sure if I got the debs into 1.4.0), it should diminish the
>>>> installation work you must do to some minor configuration.
>>>>
>>>> John
>>>>
>>>> On Fri, Jun 29, 2012 at 2:28 PM, Park, Jee [USA] <Park_Jee@bah.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I had trouble getting Accumulo to work on a VM instance of Ubuntu
>>>>> (11.04) using this guide: https://gist.github.com/1535657.
>>>>>
>>>>> Does anyone have a step-by-step guide to get it running on either
>>>>> Ubuntu or Windows 7?
>>>>>
>>>>> Thanks!
>>>>>
>>>>
>>
