From: Pawan Agnihotri
Date: Thu, 8 Mar 2018 19:14:40 -0500
Subject: Re: [ERROR] [Storage$] Required repository (METADATA) configuration is missing.
To: user@predictionio.apache.org

Thanks Mars for the response. I tried adding the scheme and setting the port to 9200 for Elasticsearch in conf/pio-env.sh, but I am still getting the same error when checking the status.

Here is my conf for Elasticsearch and other logs. Please let me know if you need more details. Thank you.

PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
#PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=http
PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2

[mapr@valvcshad004vm bin]$ ./pio status
/dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 89: /opt/mapr/spark/spark-2.1.0/mapr-util/generate-classpath.sh: No such file or directory
/dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 90: generate_compatible_classpath: command not found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/dfs/pawan_scala/mapr-predictionio/assembly/pio-assembly-0.10.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] [Console$] Inspecting PredictionIO...
[INFO] [Console$] PredictionIO 0.10.0-SNAPSHOT is installed at /dfs/pawan_scala/mapr-predictionio
[INFO] [Console$] Inspecting Apache Spark...
[INFO] [Console$] Apache Spark is installed at /dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
[INFO] [Console$] Apache Spark 2.1.1 detected (meets minimum requirement of 1.3.0)
[INFO] [Console$] Inspecting storage backend connections...
[INFO] [Storage$] Verifying Meta Data Backend (Source: ELASTICSEARCH)...
[ERROR] [Console$] Unable to connect to all storage backends successfully. The following shows the error message from the storage backend.
[ERROR] [Console$] None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
[ERROR] [Console$] Dumping configuration of initialized storage backend sources. Please make sure they are correct.
[ERROR] [Console$] Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOME -> /dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2, HOSTS -> localhost, PORTS -> 9200, SCHEMES -> http, TYPE -> elasticsearch
[mapr@valvcshad004vm bin]$

------------

[mapr@valvcshad004vm bin]$ cat pio.log
2018-03-08 19:08:54,371 INFO org.apache.predictionio.tools.console.Console$ [main] - Inspecting PredictionIO...
2018-03-08 19:08:54,375 INFO org.apache.predictionio.tools.console.Console$ [main] - PredictionIO 0.10.0-SNAPSHOT is installed at /dfs/pawan_scala/mapr-predictionio
2018-03-08 19:08:54,376 INFO org.apache.predictionio.tools.console.Console$ [main] - Inspecting Apache Spark...
2018-03-08 19:08:54,389 INFO org.apache.predictionio.tools.console.Console$ [main] - Apache Spark is installed at /dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
2018-03-08 19:08:54,427 INFO org.apache.predictionio.tools.console.Console$ [main] - Apache Spark 2.1.1 detected (meets minimum requirement of 1.3.0)
2018-03-08 19:08:54,428 INFO org.apache.predictionio.tools.console.Console$ [main] - Inspecting storage backend connections...
2018-03-08 19:08:54,450 INFO org.apache.predictionio.data.storage.Storage$ [main] - Verifying Meta Data Backend (Source: ELASTICSEARCH)...
2018-03-08 19:08:55,636 ERROR org.apache.predictionio.tools.console.Console$ [main] - Unable to connect to all storage backends successfully. The following shows the error message from the storage backend.
2018-03-08 19:08:55,638 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
        at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
        at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
        at org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
        at org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
        at org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
        at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
        at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
        at org.elasticsearch.action.ActionRequestBuilder.get(ActionRequestBuilder.java:73)
        at org.apache.predictionio.data.storage.elasticsearch.ESEngineInstances.<init>(ESEngineInstances.scala:42)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:306)
        at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
        at org.apache.predictionio.data.storage.Storage$.getMetaDataEngineInstances(Storage.scala:367)
        at org.apache.predictionio.data.storage.Storage$.verifyAllDataObjects(Storage.scala:342)
        at org.apache.predictionio.tools.console.Console$.status(Console.scala:1087)
        at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:737)
        at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:696)
        at scala.Option.map(Option.scala:145)
        at org.apache.predictionio.tools.console.Console$.main(Console.scala:696)
        at org.apache.predictionio.tools.console.Console.main(Console.scala)
2018-03-08 19:08:55,641 ERROR org.apache.predictionio.tools.console.Console$ [main] - Dumping configuration of initialized storage backend sources. Please make sure they are correct.
2018-03-08 19:08:55,644 ERROR org.apache.predictionio.tools.console.Console$ [main] - Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOME -> /dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2, HOSTS -> localhost, PORTS -> 9200, SCHEMES -> http, TYPE -> elasticsearch
[mapr@valvcshad004vm bin]$

On Thu, Mar 8, 2018 at 1:30 PM, Mars Hall wrote:

> We added support for Elasticsearch 5 last year, so current PredictionIO
> uses HTTP/REST protocol on port 9200, not the native protocol on port 9300.
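[Editor's note: the HOSTS, PORTS, and SCHEMES settings discussed in this thread are parallel comma-separated lists, one entry per Elasticsearch node. The sketch below shows how such values combine into endpoint URLs; it is a hypothetical helper for illustration only, not PredictionIO's actual parsing code.]

```python
def es_endpoints(hosts, ports, schemes="http"):
    """Combine PIO-style comma-separated HOSTS/PORTS/SCHEMES values into URLs."""
    host_list = hosts.split(",")
    port_list = ports.split(",")
    scheme_list = schemes.split(",")

    def pick(values, i):
        # Reuse the last value when a list is shorter than HOSTS
        # (an illustrative convention, not necessarily PIO's behavior).
        return values[i] if i < len(values) else values[-1]

    return [
        f"{pick(scheme_list, i)}://{host}:{pick(port_list, i)}"
        for i, host in enumerate(host_list)
    ]

print(es_endpoints("localhost", "9200", "http"))  # ['http://localhost:9200']
```

With the single-node values above this yields http://localhost:9200, which is the REST endpoint that Elasticsearch 5 serves by default.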
>
> Here's the local dev config I have working with PIO 0.12.0 and
> Elasticsearch 5:
>
>     PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
>     PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=$PIO_HOME/vendors/elasticsearch
>     PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=http
>     PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
>     PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200
>
> I believe Elasticsearch 1 is only supported via the native protocol on port
> 9300, which is eventually being removed in a future PIO release.
>
> On Thu, Mar 8, 2018 at 2:50 AM, Pawan Agnihotri wrote:
>
>> Hello Donald and Team,
>>
>> I am working on a POC and I would like to use PredictionIO. I know it is a
>> configuration issue with Elasticsearch, but I am stuck with the error
>> below, so I am reaching out for help.
>>
>> I am in need of a quick hand here as time is running out. Any steps or
>> suggestions you feel I can try out would be helpful.
>>
>> 2018-03-07 21:36:15,602 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
>> org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
>>
>> I am using the steps from http://predictionio.apache.org/install/install-sourcecode/
>>
>> Below are my pio-env.sh and error logs.
>>
>> [mapr@valvcshad004vm conf]$ cat pio-env.sh
>> #!/usr/bin/env bash
>> #
>> # Copy this file as pio-env.sh and edit it for your site's configuration.
>> #
>> # Licensed to the Apache Software Foundation (ASF) under one or more
>> # contributor license agreements. See the NOTICE file distributed with
>> # this work for additional information regarding copyright ownership.
>> # The ASF licenses this file to You under the Apache License, Version 2.0
>> # (the "License"); you may not use this file except in compliance with
>> # the License. You may obtain a copy of the License at
>> #
>> #     http://www.apache.org/licenses/LICENSE-2.0
>> #
>> # Unless required by applicable law or agreed to in writing, software
>> # distributed under the License is distributed on an "AS IS" BASIS,
>> # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>> # See the License for the specific language governing permissions and
>> # limitations under the License.
>> #
>>
>> # PredictionIO Main Configuration
>> #
>> # This section controls core behavior of PredictionIO. It is very likely
>> # that you need to change these to fit your site.
>>
>> # SPARK_HOME: Apache Spark is a hard dependency and must be configured.
>> #SPARK_HOME=$PIO_HOME/vendors/spark-1.5.1-bin-hadoop2.6
>> SPARK_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
>> POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-42.2.1.jar
>> MYSQL_JDBC_DRIVER=$PIO_HOME/lib/mysql-connector-java-5.1.37.jar
>>
>> # ES_CONF_DIR: You must configure this if you have advanced configuration
>> # for your Elasticsearch setup.
>> # ES_CONF_DIR=/opt/elasticsearch
>>
>> # HADOOP_CONF_DIR: You must configure this if you intend to run
>> # PredictionIO with Hadoop 2.
>> # HADOOP_CONF_DIR=/opt/hadoop
>>
>> # HBASE_CONF_DIR: You must configure this if you intend to run
>> # PredictionIO with HBase on a remote cluster.
>> # HBASE_CONF_DIR=$PIO_HOME/vendors/hbase-1.0.0/conf
>>
>> # Filesystem paths where PredictionIO uses as block storage.
>> PIO_FS_BASEDIR=$HOME/.pio_store
>> PIO_FS_ENGINESDIR=$PIO_FS_BASEDIR/engines
>> PIO_FS_TMPDIR=$PIO_FS_BASEDIR/tmp
>>
>> # PredictionIO Storage Configuration
>> #
>> # This section controls programs that make use of PredictionIO's built-in
>> # storage facilities. Default values are shown below.
>> #
>> # For more information on storage configuration please refer to
>> # http://predictionio.incubator.apache.org/system/anotherdatastore/
>>
>> # Storage Repositories
>>
>> # Default is to use PostgreSQL
>> #PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
>> #PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=PGSQL
>>
>> #PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
>> #PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
>>
>> #PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
>> #PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL
>>
>> PIO_STORAGE_REPOSITORIES_METADATA_NAME=predictionio_metadata
>> PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH
>>
>> PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=predictionio_eventdata
>> PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
>>
>> PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_
>> PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS
>>
>> # Storage Data Sources
>>
>> # PostgreSQL Default Settings
>> # Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
>> # Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
>> # PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
>> #PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
>> #PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
>> #PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
>> #PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio
>>
>> # MySQL Example
>> # PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
>> # PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
>> # PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
>> # PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio
>>
>> # Elasticsearch Example
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
>> #PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
>> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2
>>
>> # Local File System Example
>> PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
>> PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models
>>
>> # HBase Example
>> PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
>> PIO_STORAGE_SOURCES_HBASE_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6
>>
>> ERROR in the logs ---
>>
>> 2018-03-07 21:36:14,491 INFO org.apache.predictionio.data.storage.Storage$ [main] - Verifying Meta Data Backend (Source: ELASTICSEARCH)...
>> 2018-03-07 21:36:15,601 ERROR org.apache.predictionio.tools.console.Console$ [main] - Unable to connect to all storage backends successfully. The following shows the error message from the storage backend.
>> 2018-03-07 21:36:15,602 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
>> org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
>>         at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
>>         at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
>>         at org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
>>         at org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
>>         at org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
>>         at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
>>         at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
>>         at org.elasticsearch.action.ActionRequestBuilder.get(ActionRequestBuilder.java:73)
>>         at org.apache.predictionio.data.storage.elasticsearch.ESEngineInstances.<init>(ESEngineInstances.scala:42)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>         at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:306)
>>         at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
>>         at org.apache.predictionio.data.storage.Storage$.getMetaDataEngineInstances(Storage.scala:367)
>>         at org.apache.predictionio.data.storage.Storage$.verifyAllDataObjects(Storage.scala:342)
>>         at org.apache.predictionio.tools.console.Console$.status(Console.scala:1087)
>>         at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:737)
>>         at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:696)
>>         at scala.Option.map(Option.scala:145)
>>         at org.apache.predictionio.tools.console.Console$.main(Console.scala:696)
>>         at org.apache.predictionio.tools.console.Console.main(Console.scala)
>> 2018-03-07 21:36:15,605 ERROR org.apache.predictionio.tools.console.Console$ [main] - Dumping configuration of initialized storage backend sources. Please make sure they are correct.
>> 2018-03-07 21:36:15,607 ERROR org.apache.predictionio.tools.console.Console$ [main] - Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOSTS -> localhost, TYPE -> elasticsearch, HOME -> /dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2, PORTS -> 9300
>> 2018-03-07 21:36:42,649 INFO org.apache.predictionio.tools.console.Console$ [main] - Creating Event Server at 0.0.0.0:7070
>> 2018-03-07 21:36:43,417 ERROR org.apache.predictionio.data.storage.Storage$ [main] - Error initializing storage client for source HBASE
>> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>>         at org.apache.predictionio.data.storage.hbase.StorageClient.<init>(StorageClient.scala:46)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>         at org.apache.predictionio.data.storage.Storage$.getClient(Storage.scala:220)
>>         at org.apache.predictionio.data.storage.Storage$.org$apache$predictionio$data$storage$Storage$$updateS2CM(Storage.scala:251)
>>         at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:212)
>>         at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:212)
>>
>> On Wed, Mar 7, 2018 at 1:21 AM, Pawan Agnihotri <pawan.agnihotri@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> I need your help to configure PredictionIO on Linux 7.2.
>>>
>>> I am using the steps from
>>> http://predictionio.apache.org/install/install-sourcecode/ and have
>>> installed Spark, Elasticsearch, and HBase, but I am getting the error below:
>>>
>>> [mapr@valvcshad004vm bin]$ ./pio status
>>> /dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 89: /opt/mapr/spark/spark-2.1.0/mapr-util/generate-classpath.sh: No such file or directory
>>> /dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 90: generate_compatible_classpath: command not found
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in [jar:file:/dfs/pawan_scala/mapr-predictionio/assembly/pio-assembly-0.10.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in [jar:file:/data/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>>> [INFO] [Console$] Inspecting PredictionIO...
>>> [INFO] [Console$] PredictionIO 0.10.0-SNAPSHOT is installed at /dfs/pawan_scala/mapr-predictionio
>>> [INFO] [Console$] Inspecting Apache Spark...
>>> [INFO] [Console$] Apache Spark is installed at /dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
>>> [INFO] [Console$] Apache Spark 2.1.1 detected (meets minimum requirement of 1.3.0)
>>> [INFO] [Console$] Inspecting storage backend connections...
>>> [WARN] [Storage$] There is no properly configured repository.
>>> [ERROR] [Storage$] Required repository (METADATA) configuration is missing.
>>> [ERROR] [Storage$] There were 1 configuration errors. Exiting.
>>> [mapr@valvcshad004vm bin]$
>>>
>>> Here is my pio-env.sh file:
>>>
>>> [mapr@valvcshad004vm conf]$ cat pio-env.sh
>>> # Default is to use PostgreSQL
>>> #PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
>>> #PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=PGSQL
>>>
>>> #PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
>>> #PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL
>>>
>>> #PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
>>> #PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL
>>>
>>> # Storage Data Sources
>>>
>>> # PostgreSQL Default Settings
>>> # Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
>>> # Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
>>> # PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
>>> #PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
>>> #PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
>>> #PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
>>> #PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio
>>>
>>> # MySQL Example
>>> # PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
>>> # PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
>>> # PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
>>> # PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio
>>>
>>> # Elasticsearch Example
>>> PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
>>> PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch_cluster_name
>>> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
>>> PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
>>> PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2
>>>
>>> # Local File System Example
>>> PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
>>> PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models
>>>
>>> # HBase Example
>>> PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
>>> PIO_STORAGE_SOURCES_HBASE_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6
>>>
>>> [mapr@valvcshad004vm conf]$
>>>
>>> --
>>> Thanks,
>>> Pawan Agnihotri
>>
>> --
>> Thanks,
>> Pawan Agnihotri
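[Editor's note: the java.lang.NoClassDefFoundError for org/apache/hadoop/hbase/HBaseConfiguration quoted above usually means no HBase client jar made it onto the classpath that pio-class builds. A minimal, hypothetical diagnostic sketch for checking whether a matching jar even exists under an HBase home directory; the path and jar-name pattern are assumptions for illustration:]

```python
from pathlib import Path


def find_jars(home, pattern="hbase-common*.jar"):
    """Return jar paths matching pattern under home, or [] if none/missing dir."""
    root = Path(home)
    if not root.is_dir():
        return []
    return sorted(str(p) for p in root.rglob(pattern))


# If this prints [], the class cannot be on the classpath at all;
# if it prints paths, the jar exists but is not being picked up.
print(find_jars("/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6"))
```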
>
> --
> Mars Hall
> 415-818-7039
> Customer Facing Architect
> Salesforce Platform / Heroku
> San Francisco, California

--
Thanks,
Pawan Agnihotri
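[Editor's note: the NoNodeAvailableException throughout this thread generally just means that nothing answered at the configured host and port. A minimal TCP reachability sketch, illustrative only and not part of PredictionIO or Elasticsearch:]

```python
import socket


def can_connect(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds, else False."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Elasticsearch 5 serves REST on 9200 and the native transport on 9300,
# so probing both quickly distinguishes "wrong port" from "ES not running".
for port in (9200, 9300):
    print(port, can_connect("localhost", port))
```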
Thanks Mars for the response. I tried adding the scheme an= d port to 9200 for elasticsearch in conf/pio-env.sh but still getting the s= ame error which checking the status.=C2=A0=C2=A0

here is my conf for elasticsearch and other logs. Please let me know if = you need more details. thank you.=C2=A0

= PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=3Delasticsearch
#PI= O_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=3Delasticsearch
<= b>PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=3Dlocalhost
PIO_= STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=3Dhttp
PIO_STORAGE_S= OURCES_ELASTICSEARCH_PORTS=3D9200
PIO_STORAGE_SOURCES_ELAS= TICSEARCH_HOME=3D/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5= .5.2


[mapr@valvcshad= 004vm bin]$ ./pio status
/dfs/pawan_scala/mapr-predictionio/bin/p= io-class: line 89: /opt/mapr/spark/spark-2.1.0/mapr-util/generate-classpath= .sh: No such file or directory
/dfs/pawan_scala/mapr-predictionio= /bin/pio-class: line 90: generate_compatible_classpath: command not found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4= J: Found binding in [jar:file:/dfs/pawan_scala/mapr-predictionio/assembly/p= io-assembly-0.10.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/opt/mapr/lib/slf4j-log4j12= -1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4= J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
= [INFO] [Console$] Inspecting PredictionIO...
[INFO] [Console$] Pr= edictionIO 0.10.0-SNAPSHOT is installed at /dfs/pawan_scala/mapr-prediction= io
[INFO] [Console$] Inspecting Apache Spark...
[INFO] = [Console$] Apache Spark is installed at /dfs/pawan_scala/mapr-predictionio/= vendors/spark-2.1.1-bin-hadoop2.6
[INFO] [Console$] Apache Spark = 2.1.1 detected (meets minimum requirement of 1.3.0)
[INFO] [Conso= le$] Inspecting storage backend connections...
[INFO] [Storage$] = Verifying Meta Data Backend (Source: ELASTICSEARCH)...
[ERROR] [C= onsole$] Unable to connect to all storage backends successfully. The follow= ing shows the error message from the storage backend.
[ERROR] [Co= nsole$] None of the configured nodes are available: [] (org.elasticsearch.c= lient.transport.NoNodeAvailableException)
[ERROR] [Console$] Dump= ing configuration of initialized storage backend sources. Please make sure = they are correct.
[ERROR] [Console$] Source Name: ELASTICSEARCH; = Type: elasticsearch; Configuration: HOME -> /dfs/pawan_scala/mapr-predic= tionio/vendors/elasticsearch-5.5.2, HOSTS -> localhost, PORTS -> 9200= , SCHEMES -> http, TYPE -> elasticsearch
[mapr@valvcshad004= vm bin]$

------------
[mapr@v= alvcshad004vm bin]$ cat pio.log

2018-03-08 19:08:5= 4,371 INFO=C2=A0 org.apache.predictionio.tools.console.Console$ [main] - In= specting PredictionIO...
2018-03-08 19:08:54,375 INFO=C2=A0 org.a= pache.predictionio.tools.console.Console$ [main] - PredictionIO 0.10.0-SNAP= SHOT is installed at /dfs/pawan_scala/mapr-predictionio
2018-03-0= 8 19:08:54,376 INFO=C2=A0 org.apache.predictionio.tools.console.Console$ [m= ain] - Inspecting Apache Spark...
2018-03-08 19:08:54,389 INFO=C2= =A0 org.apache.predictionio.tools.console.Console$ [main] - Apache Spark is= installed at /dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-ha= doop2.6
2018-03-08 19:08:54,427 INFO=C2=A0 org.apache.predictioni= o.tools.console.Console$ [main] - Apache Spark 2.1.1 detected (meets minimu= m requirement of 1.3.0)
2018-03-08 19:08:54,428 INFO=C2=A0 org.ap= ache.predictionio.tools.console.Console$ [main] - Inspecting storage backen= d connections...
2018-03-08 19:08:54,450 INFO=C2=A0 org.apache.pr= edictionio.data.storage.Storage$ [main] - Verifying Meta Data Backend (Sour= ce: ELASTICSEARCH)...
2018-03-08 19:08:55,636 ERROR org.apache.pr= edictionio.tools.console.Console$ [main] - Unable to connect to all storage= backends successfully. The following shows the error message from the stor= age backend.
2018-03-08 19:08:55,638 ERROR org.apache.predictioni= o.tools.console.Console$ [main] - None of the configured nodes are availabl= e: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
        at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
        at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
        at org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
        at org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
        at org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
        at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
        at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
        at org.elasticsearch.action.ActionRequestBuilder.get(ActionRequestBuilder.java:73)
        at org.apache.predictionio.data.storage.elasticsearch.ESEngineInstances.<init>(ESEngineInstances.scala:42)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:306)
        at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
        at org.apache.predictionio.data.storage.Storage$.getMetaDataEngineInstances(Storage.scala:367)
        at org.apache.predictionio.data.storage.Storage$.verifyAllDataObjects(Storage.scala:342)
        at org.apache.predictionio.tools.console.Console$.status(Console.scala:1087)
        at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:737)
        at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:696)
        at scala.Option.map(Option.scala:145)
        at org.apache.predictionio.tools.console.Console$.main(Console.scala:696)
        at org.apache.predictionio.tools.console.Console.main(Console.scala)
2018-03-08 19:08:55,641 ERROR org.apache.predictionio.tools.console.Console$ [main] - Dumping configuration of initialized storage backend sources. Please make sure they are correct.
2018-03-08 19:08:55,644 ERROR org.apache.predictionio.tools.console.Console$ [main] - Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOME -> /dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2, HOSTS -> localhost, PORTS -> 9200, SCHEMES -> http, TYPE -> elasticsearch
[mapr@valvcshad004vm bin]$


On Thu, Mar 8, 2018 at 1:30 PM, Mars Hall <mars.hall@salesforce.com> wrote:
We added support for Elasticsearch 5 last year, so current PredictionIO uses the HTTP/REST protocol on port 9200, not the native protocol on port 9300.

Here's the local dev config I have working with PIO 0.12.0 and Elasticsearch 5:

    PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
    PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=$PIO_HOME/vendors/elasticsearch
    PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES=http
    PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
    PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9200

I believe Elasticsearch 1 is only supported via the native protocol on port 9300, which is eventually being removed in a future PIO release.
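The scheme/port pairing described above is the crux: the HTTP/REST client must target port 9200, while the old native transport listens on 9300. As a quick sanity check on a pio-env.sh, a small shell helper can flag the common mismatch. This is a hypothetical sketch, not part of PIO; the function name is illustrative:

```shell
# Hypothetical sanity check (not a PIO command): given the values of
# PIO_STORAGE_SOURCES_ELASTICSEARCH_SCHEMES and ..._PORTS, flag the
# common misconfiguration of pointing the HTTP client at port 9300.
check_es_pairing() {
  scheme="$1"
  port="$2"
  if [ "$scheme" = "http" ] && [ "$port" = "9300" ]; then
    echo "MISMATCH: http scheme with native-transport port 9300"
  elif [ "$port" = "9300" ]; then
    echo "NATIVE: port 9300 implies the old native protocol (ES 1.x only)"
  else
    echo "OK"
  fi
}

check_es_pairing http 9200   # -> OK
check_es_pairing http 9300   # -> MISMATCH: http scheme with native-transport port 9300
```

In practice, `curl http://localhost:9200` returning the cluster-info JSON is the simplest confirmation that the Elasticsearch REST endpoint is actually reachable on that port.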

On Thu, Mar 8, 2018 at 2:50 AM, Pawan Agnihotri <pawan.agnihotri@gmail.com> wrote:
Hello Donald and Team,

I am working on a POC and would like to use PredictionIO. I know this is a configuration issue with Elasticsearch, but I am stuck on the error below, so I am reaching out for help.

I could use a quick hand here, as time is running out. Any steps or things you feel I could try would be helpful.


2018-03-07 21:36:15,602 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []



Below are my pio-env.sh and the error logs:

[mapr@valvcshad004vm conf]$ cat pio-env.sh
#!/usr/bin/env bash
#
# Copy this file as pio-env.sh and edit it for your site's configuration.
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# PredictionIO Main Configuration
#
# This section controls core behavior of PredictionIO. It is very likely that
# you need to change these to fit your site.

# SPARK_HOME: Apache Spark is a hard dependency and must be configured.
#SPARK_HOME=$PIO_HOME/vendors/spark-1.5.1-bin-hadoop2.6
SPARK_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
POSTGRES_JDBC_DRIVER=$PIO_HOME/lib/postgresql-42.2.1.jar
MYSQL_JDBC_DRIVER=$PIO_HOME/lib/mysql-connector-java-5.1.37.jar

# ES_CONF_DIR: You must configure this if you have advanced configuration for
#              your Elasticsearch setup.
# ES_CONF_DIR=/opt/elasticsearch

# HADOOP_CONF_DIR: You must configure this if you intend to run PredictionIO
#                  with Hadoop 2.
# HADOOP_CONF_DIR=/opt/hadoop

# HBASE_CONF_DIR: You must configure this if you intend to run PredictionIO
#                 with HBase on a remote cluster.
# HBASE_CONF_DIR=$PIO_HOME/vendors/hbase-1.0.0/conf

# Filesystem paths where PredictionIO uses as block storage.
PIO_FS_BASEDIR=$HOME/.pio_store
PIO_FS_ENGINESDIR=$PIO_FS_BASEDIR/engines
PIO_FS_TMPDIR=$PIO_FS_BASEDIR/tmp

# PredictionIO Storage Configuration
#
# This section controls programs that make use of PredictionIO's built-in
# storage facilities. Default values are shown below.
#
# For more information on storage configuration please refer to

# Storage Repositories

# Default is to use PostgreSQL
#PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
#PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=PGSQL

#PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
#PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL

#PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
#PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL

PIO_STORAGE_REPOSITORIES_METADATA_NAME=predictionio_metadata
PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH

PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=predictionio_eventdata
PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE

PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_
PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS

# Storage Data Sources

# PostgreSQL Default Settings
# Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
# Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
# PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
#PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
#PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
#PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
#$PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio

# MySQL Example
# PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
# PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
# PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
# PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio

# Elasticsearch Example
PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
#PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch

PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2

# Local File System Example
PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models

# HBase Example
PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
PIO_STORAGE_SOURCES_HBASE_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6



ERROR in the logs---

2018-03-07 21:36:14,491 INFO  org.apache.predictionio.data.storage.Storage$ [main] - Verifying Meta Data Backend (Source: ELASTICSEARCH)...
2018-03-07 21:36:15,601 ERROR org.apache.predictionio.tools.console.Console$ [main] - Unable to connect to all storage backends successfully. The following shows the error message from the storage backend.
2018-03-07 21:36:15,602 ERROR org.apache.predictionio.tools.console.Console$ [main] - None of the configured nodes are available: [] (org.elasticsearch.client.transport.NoNodeAvailableException)
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
        at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
        at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
        at org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
        at org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
        at org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
        at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
        at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
        at org.elasticsearch.action.ActionRequestBuilder.get(ActionRequestBuilder.java:73)
        at org.apache.predictionio.data.storage.elasticsearch.ESEngineInstances.<init>(ESEngineInstances.scala:42)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.predictionio.data.storage.Storage$.getDataObject(Storage.scala:306)
        at org.apache.predictionio.data.storage.Storage$.getDataObjectFromRepo(Storage.scala:266)
        at org.apache.predictionio.data.storage.Storage$.getMetaDataEngineInstances(Storage.scala:367)
        at org.apache.predictionio.data.storage.Storage$.verifyAllDataObjects(Storage.scala:342)
        at org.apache.predictionio.tools.console.Console$.status(Console.scala:1087)
        at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:737)
        at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:696)
        at scala.Option.map(Option.scala:145)
        at org.apache.predictionio.tools.console.Console$.main(Console.scala:696)
        at org.apache.predictionio.tools.console.Console.main(Console.scala)
2018-03-07 21:36:15,605 ERROR org.apache.predictionio.tools.console.Console$ [main] - Dumping configuration of initialized storage backend sources. Please make sure they are correct.
2018-03-07 21:36:15,607 ERROR org.apache.predictionio.tools.console.Console$ [main] - Source Name: ELASTICSEARCH; Type: elasticsearch; Configuration: HOSTS -> localhost, TYPE -> elasticsearch, HOME -> /dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2, PORTS -> 9300
2018-03-07 21:36:42,649 INFO  org.apache.predictionio.tools.console.Console$ [main] - Creating Event Server at 0.0.0.0:7070
2018-03-07 21:36:43,417 ERROR org.apache.predictionio.data.storage.Storage$ [main] - Error initializing storage client for source HBASE
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
        at org.apache.predictionio.data.storage.hbase.StorageClient.<init>(StorageClient.scala:46)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.predictionio.data.storage.Storage$.getClient(Storage.scala:220)
        at org.apache.predictionio.data.storage.Storage$.org$apache$predictionio$data$storage$Storage$$updateS2CM(Storage.scala:251)
        at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:212)
        at org.apache.predictionio.data.storage.Storage$$anonfun$sourcesToClientMeta$1.apply(Storage.scala:212)


On Wed, Mar 7, 2018 at 1:21 AM, Pawan Agnihotri <pawan.agnihotri@gmail.com> wrote:
Hello,

I need your help configuring PredictionIO on Linux 7.2.

I am using
http://predictionio.apache.org/install/install-sourcecode/ for the steps, and have installed Spark, Elasticsearch, and HBase, but I am getting the error below.


[mapr@valvcshad004vm bin]$ ./pio status
/dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 89: /opt/mapr/spark/spark-2.1.0/mapr-util/generate-classpath.sh: No such file or directory
/dfs/pawan_scala/mapr-predictionio/bin/pio-class: line 90: generate_compatible_classpath: command not found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/dfs/pawan_scala/mapr-predictionio/assembly/pio-assembly-0.10.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] [Console$] Inspecting PredictionIO...
[INFO] [Console$] PredictionIO 0.10.0-SNAPSHOT is installed at /dfs/pawan_scala/mapr-predictionio
[INFO] [Console$] Inspecting Apache Spark...
[INFO] [Console$] Apache Spark is installed at /dfs/pawan_scala/mapr-predictionio/vendors/spark-2.1.1-bin-hadoop2.6
[INFO] [Console$] Apache Spark 2.1.1 detected (meets minimum requirement of 1.3.0)
[INFO] [Console$] Inspecting storage backend connections...
[WARN] [Storage$] There is no properly configured repository.
[ERROR] [Storage$] Required repository (METADATA) configuration is missing.
[ERROR] [Storage$] There were 1 configuration errors. Exiting.
[mapr@valvcshad004vm bin]$

Here is my pio-env.sh file:

[mapr@valvcshad004vm conf]$ cat pio-env.sh
# Default is to use PostgreSQL
#PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta
#PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=PGSQL

#PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event
#PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=PGSQL

#PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
#PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=PGSQL

# Storage Data Sources

# PostgreSQL Default Settings
# Please change "pio" to your database name in PIO_STORAGE_SOURCES_PGSQL_URL
# Please change PIO_STORAGE_SOURCES_PGSQL_USERNAME and
# PIO_STORAGE_SOURCES_PGSQL_PASSWORD accordingly
#PIO_STORAGE_SOURCES_PGSQL_TYPE=jdbc
#PIO_STORAGE_SOURCES_PGSQL_URL=jdbc:postgresql://localhost/pio
#PIO_STORAGE_SOURCES_PGSQL_USERNAME=pio
#$PIO_STORAGE_SOURCES_PGSQL_PASSWORD=pio

# MySQL Example
# PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc
# PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost/pio
# PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio
# PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio

# Elasticsearch Example
 PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
 PIO_STORAGE_SOURCES_ELASTICSEARCH_CLUSTERNAME=elasticsearch_cluster_name
 PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
 PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
 PIO_STORAGE_SOURCES_ELASTICSEARCH_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/elasticsearch-5.5.2

# Local File System Example
 PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
 PIO_STORAGE_SOURCES_LOCALFS_PATH=$PIO_FS_BASEDIR/models

# HBase Example
 PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
 PIO_STORAGE_SOURCES_HBASE_HOME=/dfs/pawan_scala/mapr-predictionio/vendors/hbase-1.2.6

[mapr@valvcshad004vm conf]$
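The `Required repository (METADATA) configuration is missing` error matches this file: every PIO_STORAGE_REPOSITORIES_* line is commented out, so no repository is mapped to a storage source. A minimal fragment to uncomment and adapt, sketched here as an assumption of intent (repository names are illustrative; each SOURCE value must match a configured PIO_STORAGE_SOURCES_<NAME>_* block in the same file):

```shell
# Map each required repository (METADATA, EVENTDATA, MODELDATA) to one
# of the storage sources configured below in the same pio-env.sh.
# Repository names are illustrative.
PIO_STORAGE_REPOSITORIES_METADATA_NAME=predictionio_metadata
PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH

PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=predictionio_eventdata
PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE

PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model
PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS
```

Note also that several of the active lines in the file above begin with a stray leading (non-breaking) space, which can prevent the variables from being exported correctly when the file is sourced.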


--
Thanks,
Pawan Agnihotri



--
Thanks,
Pawan Agnihotri



--
Mars Hall
Customer Facing Architect
Salesforce Platform / Heroku
San Francisco, California



--
Thanks,
Pawan Agnihotri