From: "Darshan A.N."
Date: Thu, 27 Jul 2017 14:29:20 +0530
Subject: Re: Pio build success with error, pio train is failing.
To: user@predictionio.incubator.apache.org

Hi Tom Chan,

I did try it from the directory you told me to use, but got an error. I know where that error is coming from: as per the document, I myself edited the appName to appId. The error is now:

darshan@darshu:~/PredictionIO/tapster-episode-similar$ pio train
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/darshan/PredictionIO/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/darshan/PredictionIO/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
[INFO] [Runner$] Submission command: /home/darshan/PredictionIO/vendors//spark-1.6.3/bin/spark-submit --class org.apache.predictionio.workflow.CreateWorkflow --jars file:/home/darshan/PredictionIO/lib/mysql-connector-java-5.1.37.jar,file:/home/darshan/PredictionIO/tapster-episode-similar/target/scala-2.10/template-scala-parallel-similarproduct_2.10-0.1-SNAPSHOT.jar,file:/home/darshan/PredictionIO/tapster-episode-similar/target/scala-2.10/template-scala-parallel-similarproduct-assembly-0.1-SNAPSHOT-deps.jar,file:/home/darshan/PredictionIO/lib/spark/pio-data-hbase-assembly-0.11.0-incubating.jar,file:/home/darshan/PredictionIO/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar,file:/home/darshan/PredictionIO/lib/spark/pio-data-jdbc-assembly-0.11.0-incubating.jar,file:/home/darshan/PredictionIO/lib/spark/pio-data-elasticsearch1-assembly-0.11.0-incubating.jar,file:/home/darshan/PredictionIO/lib/spark/pio-data-localfs-assembly-0.11.0-incubating.jar --files file:/home/darshan/PredictionIO/conf/log4j.properties --driver-class-path /home/darshan/PredictionIO/conf:/home/darshan/PredictionIO/lib/mysql-connector-java-5.1.37.jar --driver-java-options -Dpio.log.dir=/home/darshan file:/home/darshan/PredictionIO/lib/pio-assembly-0.11.0-incubating.jar --engine-id org.example.similarproduct.SimilarProductEngine --engine-version 7ff66e8607f7d36b79cb9e9f3b97b53287e553f7 --engine-variant file:/home/darshan/PredictionIO/tapster-episode-similar/engine.json --verbosity 0 --json-extractor Both --env PIO_ENV_LOADED=1,PIO_STORAGE_SOURCES_MYSQL_PASSWORD=pio,PIO_STORAGE_REPOSITORIES_METADATA_NAME=pio_meta,PIO_FS_BASEDIR=/home/darshan/.pio_store,PIO_STORAGE_SOURCES_MYSQL_URL=jdbc:mysql://localhost:3306/pio,PIO_HOME=/home/darshan/PredictionIO,PIO_FS_ENGINESDIR=/home/darshan/.pio_store/engines,PIO_STORAGE_SOURCES_MYSQL_TYPE=jdbc,PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=MYSQL,PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=MYSQL,PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=pio_event,PIO_STORAGE_SOURCES_MYSQL_USERNAME=pio,PIO_FS_TMPDIR=/home/darshan/.pio_store/tmp,PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_model,PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=MYSQL,PIO_CONF_DIR=/home/darshan/PredictionIO/conf
[INFO] [Engine] Extracting datasource params...
[INFO] [WorkflowUtils$] No 'name' is found. Default empty String will be used.
[INFO] [Engine] Datasource params: (,DataSourceParams(null))
[INFO] [Engine] Extracting preparator params...
[INFO] [Engine] Preparator params: (,Empty)
[INFO] [Engine] Extracting serving params...
[INFO] [Engine] Serving params: (,Empty)
[WARN] [Utils] Your hostname, darshu resolves to a loopback address: 127.0.0.1; using 192.168.2.103 instead (on interface wlx001ea6631cc7)
[WARN] [Utils] Set SPARK_LOCAL_IP if you need to bind to another address
[INFO] [Remoting] Starting remoting
[INFO] [Remoting] Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.2.103:45309]
[INFO] [Engine$] EngineWorkflow.train
[INFO] [Engine$] DataSource: org.example.similarproduct.DataSource@168b4cb0
[INFO] [Engine$] Preparator: org.example.similarproduct.Preparator@45545e7a
[INFO] [Engine$] AlgorithmList: List(org.example.similarproduct.ALSAlgorithm@23cbbd07)
[INFO] [Engine$] Data sanity check is on.
[ERROR] [Common$] Invalid app name null
Exception in thread "main" java.lang.IllegalArgumentException: Invalid app name null
    at org.apache.predictionio.data.store.Common$$anonfun$appNameToId$2.apply(Common.scala:50)
    at org.apache.predictionio.data.store.Common$$anonfun$appNameToId$2.apply(Common.scala:48)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.predictionio.data.store.Common$.appNameToId(Common.scala:48)
    at org.apache.predictionio.data.store.PEventStore$.aggregateProperties(PEventStore.scala:108)
    at org.example.similarproduct.DataSource.readTraining(DataSource.scala:31)
    at org.example.similarproduct.DataSource.readTraining(DataSource.scala:18)
    at org.apache.predictionio.controller.PDataSource.readTrainingBase(PDataSource.scala:40)
    at org.apache.predictionio.controller.Engine$.train(Engine.scala:644)
    at org.apache.predictionio.controller.Engine.train(Engine.scala:177)
    at org.apache.predictionio.workflow.CoreWorkflow$.runTrain(CoreWorkflow.scala:67)
    at org.apache.predictionio.workflow.CreateWorkflow$.main(CreateWorkflow.scala:250)
    at org.apache.predictionio.workflow.CreateWorkflow.main(CreateWorkflow.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I am using MySQL as my repository. I followed the article from the official website, but I am still not able to run pio build or pio train. My src/main/scala/DataSource.scala is still exactly as previously posted.
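
[Note for anyone hitting the same thing: "Invalid app name null" and the "DataSourceParams(null)" line above suggest that the datasource params in engine.json now supply "appId", so nothing fills the appName field of DataSourceParams and it comes through as null. Restoring "appName" under datasource params in engine.json should clear this particular exception. A minimal fail-fast sketch, a hypothetical hardening that is not part of the template, which would have flagged the misnamed key at parameter-extraction time instead of deep inside training:

    import org.apache.predictionio.controller.Params

    // Hypothetical hardening of the template's params class: reject a
    // missing or empty "appName" as soon as engine.json is parsed,
    // instead of letting the null reach PEventStore at training time.
    case class DataSourceParams(appName: String) extends Params {
      require(appName != null && appName.trim.nonEmpty,
        "datasource params in engine.json must define a non-empty \"appName\"")
    }
]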
Thanks,
-DAN

On Wed, Jul 26, 2017 at 3:22 PM, Tom Chan wrote:

> darshan@darshu:~/PredictionIO/tapster-episode-similar/src/main/scala$ pio train
>
> Can you try the command from the tapster-episode-similar directory?
>
> Tom
>
> On Jul 26, 2017 2:45 AM, "Darshan A.N." wrote:
>
>> hi team,
>> I am trying to install the tapster demo. I followed
>> http://predictionio.incubator.apache.org/demo/tapster/. It took me more
>> than a week to install PredictionIO.
>> Now that PIO is installed, it throws an error when I run the
>> pio build command. The error goes like this:
>> darshan@darshu:~/PredictionIO/tapster-episode-similar$ pio build
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/darshan/PredictionIO/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/darshan/PredictionIO/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> [INFO] [Engine$] Using command '/home/darshan/PredictionIO/sbt/sbt' at /home/darshan/PredictionIO/tapster-episode-similar to build.
>> [INFO] [Engine$] If the path above is incorrect, this process will fail.
>> [INFO] [Engine$] Uber JAR disabled. Making sure lib/pio-assembly-0.11.0-incubating.jar is absent.
>> [INFO] [Engine$] Going to run: /home/darshan/PredictionIO/sbt/sbt package assemblyPackageDependency in /home/darshan/PredictionIO/tapster-episode-similar
>> [ERROR] [Engine$] [error] /home/darshan/PredictionIO/tapster-episode-similar/src/main/scala/DataSource.scala:63: not found: value eventsDb
>> [ERROR] [Engine$] [error]     val viewEventsRDD: RDD[ViewEvent] = eventsDb.find(
>> [ERROR] [Engine$] [error]                                         ^
>> [ERROR] [Engine$] [error] one error found
>> [ERROR] [Engine$] [error] (compile:compileIncremental) Compilation failed
>> [ERROR] [Engine$] [error] Total time: 5 s, completed 26 Jul, 2017 1:56:50 AM
>> [ERROR] [Engine$] Return code of build command: /home/darshan/PredictionIO/sbt/sbt package assemblyPackageDependency is 1. Aborting.
>> [INFO] [Engine$] Looking for an engine...
>> [INFO] [Engine$] Found template-scala-parallel-similarproduct_2.10-0.1-SNAPSHOT.jar
>> [INFO] [Engine$] Found template-scala-parallel-similarproduct-assembly-0.1-SNAPSHOT-deps.jar
>> [INFO] [Engine$] Build finished successfully.
>> [INFO] [Pio$] Your engine is ready for training.
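
[Note: the compile error quoted above, "not found: value eventsDb", is the crux. eventsDb belongs to the older pre-0.10 event store API, while the 0.11.0-incubating template is written against the PEventStore already imported in the DataSource.scala quoted below. A sketch of how that block is typically ported, keyed on appName rather than appId; it assumes the "like" event name from the listing and elides the listing's try/catch and match error handling for brevity:

    // get all "user" "like" "item" events via PEventStore (0.11 API)
    val viewEventsRDD: RDD[ViewEvent] = PEventStore.find(
      appName = dsp.appName,               // keyed on appName, not appId
      entityType = Some("user"),
      eventNames = Some(List("like")),
      // targetEntityType is an optional field of an event.
      targetEntityType = Some(Some("item")))(sc)
      // PEventStore.find() returns RDD[Event]; convert each to ViewEvent.
      .map { event =>
        ViewEvent(
          user = event.entityId,
          item = event.targetEntityId.get,
          t = event.eventTime.getMillis)
      }.cache()

With this change, engine.json only needs the "appName" key under the datasource params, which also avoids the "Invalid app name null" crash seen later in the thread.]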
>>
>> and the file DataSource.scala:
>>
>> package org.example.similarproduct
>>
>> import org.apache.predictionio.controller.PDataSource
>> import org.apache.predictionio.controller.EmptyEvaluationInfo
>> import org.apache.predictionio.controller.EmptyActualResult
>> import org.apache.predictionio.controller.Params
>> import org.apache.predictionio.data.storage.Event
>> import org.apache.predictionio.data.store.PEventStore
>>
>> import org.apache.spark.SparkContext
>> import org.apache.spark.SparkContext._
>> import org.apache.spark.rdd.RDD
>>
>> import grizzled.slf4j.Logger
>>
>> case class DataSourceParams(appName: String) extends Params
>>
>> class DataSource(val dsp: DataSourceParams)
>>   extends PDataSource[TrainingData,
>>       EmptyEvaluationInfo, Query, EmptyActualResult] {
>>
>>   @transient lazy val logger = Logger[this.type]
>>
>>   override
>>   def readTraining(sc: SparkContext): TrainingData = {
>>
>>     // create a RDD of (entityID, User)
>>     val usersRDD: RDD[(String, User)] = PEventStore.aggregateProperties(
>>       appName = dsp.appName,
>>       entityType = "user"
>>     )(sc).map { case (entityId, properties) =>
>>       val user = try {
>>         User()
>>       } catch {
>>         case e: Exception => {
>>           logger.error(s"Failed to get properties ${properties} of" +
>>             s" user ${entityId}. Exception: ${e}.")
>>           throw e
>>         }
>>       }
>>       (entityId, user)
>>     }.cache()
>>
>>     // create a RDD of (entityID, Item)
>>     val itemsRDD: RDD[(String, Item)] = PEventStore.aggregateProperties(
>>       appName = dsp.appName,
>>       entityType = "item"
>>     )(sc).map { case (entityId, properties) =>
>>       val item = try {
>>         // Assume categories is optional property of item.
>>         Item(categories = properties.getOpt[List[String]]("categories"))
>>       } catch {
>>         case e: Exception => {
>>           logger.error(s"Failed to get properties ${properties} of" +
>>             s" item ${entityId}. Exception: ${e}.")
>>           throw e
>>         }
>>       }
>>       (entityId, item)
>>     }.cache()
>>
>>     // get all "user" "view" "item" events
>>     val viewEventsRDD: RDD[ViewEvent] = eventsDb.find(
>>       appId = dsp.appId,
>>       entityType = Some("user"),
>>       eventNames = Some(List("like")),
>>       // targetEntityType is optional field of an event.
>>       targetEntityType = Some(Some("item")))(sc)
>>       // eventsDb.find() returns RDD[Event]
>>       .map { event =>
>>         val viewEvent = try {
>>           event.event match {
>>             case "like" => ViewEvent(
>>               user = event.entityId,
>>               item = event.targetEntityId.get,
>>               t = event.eventTime.getMillis)
>>             case _ => throw new Exception(s"Unexpected event ${event} is read.")
>>           }
>>         } catch {
>>           case e: Exception => {
>>             logger.error(s"Cannot convert ${event} to ViewEvent." +
>>               s" Exception: ${e}.")
>>             throw e
>>           }
>>         }
>>         viewEvent
>>       }.cache()
>>
>>     new TrainingData(
>>       users = usersRDD,
>>       items = itemsRDD,
>>       viewEvents = viewEventsRDD
>>     )
>>   }
>> }
>>
>> case class User()
>>
>> case class Item(categories: Option[List[String]])
>>
>> case class ViewEvent(user: String, item: String, t: Long)
>>
>> class TrainingData(
>>   val users: RDD[(String, User)],
>>   val items: RDD[(String, Item)],
>>   val viewEvents: RDD[ViewEvent]
>> ) extends Serializable {
>>   override def toString = {
>>     s"users: [${users.count()} (${users.take(2).toList}...)]" +
>>     s"items: [${items.count()} (${items.take(2).toList}...)]" +
>>     s"viewEvents: [${viewEvents.count()}] (${viewEvents.take(2).toList}...)"
>>   }
>> }
>>
>> While training, it gives me the following error:
>>
>> darshan@darshu:~/PredictionIO/tapster-episode-similar/src/main/scala$ pio train
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in [jar:file:/home/darshan/PredictionIO/lib/spark/pio-data-hdfs-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in [jar:file:/home/darshan/PredictionIO/lib/pio-assembly-0.11.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>> [WARN] [Template$] /home/darshan/PredictionIO/tapster-episode-similar/src/main/scala/template.json does not exist. Template metadata will not be available. (This is safe to ignore if you are not working on a template.)
>> Exception in thread "main" java.io.FileNotFoundException: /home/darshan/PredictionIO/tapster-episode-similar/src/main/scala/engine.json (No such file or directory)
>>     at java.io.FileInputStream.open0(Native Method)
>>     at java.io.FileInputStream.open(FileInputStream.java:195)
>>     at java.io.FileInputStream.<init>(FileInputStream.java:138)
>>     at scala.io.Source$.fromFile(Source.scala:90)
>>     at scala.io.Source$.fromFile(Source.scala:75)
>>     at org.apache.predictionio.tools.console.Console$.getEngineInfo(Console.scala:724)
>>     at org.apache.predictionio.tools.RunWorkflow$.runWorkflow(RunWorkflow.scala:54)
>>     at org.apache.predictionio.tools.commands.Engine$.train(Engine.scala:186)
>>     at org.apache.predictionio.tools.console.Pio$.train(Pio.scala:85)
>>     at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:626)
>>     at org.apache.predictionio.tools.console.Console$$anonfun$main$1.apply(Console.scala:611)
>>     at scala.Option.map(Option.scala:145)
>>     at org.apache.predictionio.tools.console.Console$.main(Console.scala:611)
>>     at org.apache.predictionio.tools.console.Console.main(Console.scala)
>>
>> I have tried almost everything, but could not find a proper solution. Please help me.
>> I know this mail may feel a bit odd, but I am in need of your help.
>>
>> thanks,
>>
>> -DAN
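
[Note: the FileNotFoundException at the end of this quoted log points the same way as Tom's suggestion. Judging from the paths in the trace, pio resolves engine.json (and template.json) relative to the current working directory, so pio build and pio train need to be run from the engine's top-level directory, not from src/main/scala. Roughly, with the paths from the logs above:

    cd /home/darshan/PredictionIO/tapster-episode-similar   # this directory holds engine.json
    pio build
    pio train
]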