predictionio-user mailing list archives

From Hossein Bahrami <h.bahr...@live.com>
Subject RE: Customizing Recommender engine
Date Wed, 14 Dec 2016 08:36:00 GMT
Hi Pat,
Thanks for your reply. The reason I switched back to the Recommendation Engine is that I was getting an error during the build.

This is my pio version: 0.10.0-incubating

I’m getting the error below during pio build of the UR template you shared; you can find the engine.json below as well.
##########################################################################
{
  "comment": "This config file uses default settings for all but the required values, see README.md for docs",
  "id": "default",
  "description": "Default settings",
  "engineFactory": "org.template.RecommendationEngine",
  "datasource": {
    "params" : {
      "name": "sample-handmade-data.txt",
      "appName": "ur2",
      "eventNames": ["rate", "$set"]
    }
  },
  "sparkConf": {
    "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
    "spark.kryo.registrator": "org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator",
    "spark.kryo.referenceTracking": "false",
    "spark.kryoserializer.buffer": "300m",
    "es.index.auto.create": "true"
  },
  "algorithms": [
    {
      "comment": "simplest setup where all values are default, popularity based backfill, must add eventsNames",
      "name": "ur",
      "params": {
        "appName": "ur2",
        "indexName": "urindex",
        "typeName": "items",
        "comment": "must have data for the first event or the model will not build, other events are optional",
        "indicators": [
          {
            "name": "rate"
          },{
            "name": "$set",
            "maxCorrelatorsPerItem": 50
          }
        ],
        "availableDateName": "available",
        "expireDateName": "expires",
        "dateName": "date",
        "num": 4
      }
    }
  ]
}
###############################################################################
ERRORS

[INFO] [Console$] [warn]        ::::::::::::::::::::::::::::::::::::::::::::::
[INFO] [Console$] [warn]        ::          UNRESOLVED DEPENDENCIES         ::
[INFO] [Console$] [warn]        ::::::::::::::::::::::::::::::::::::::::::::::
[INFO] [Console$] [warn]        :: org.apache.mahout#mahout-math-scala_2.10;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [warn]        :: org.apache.mahout#mahout-spark_2.10;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [warn]        :: org.apache.mahout#mahout-math;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [warn]        :: org.apache.mahout#mahout-hdfs;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [warn]        ::::::::::::::::::::::::::::::::::::::::::::::
[INFO] [Console$] [warn]
[INFO] [Console$] [warn]        Note: Unresolved dependencies path:
[INFO] [Console$] [warn]                org.apache.mahout:mahout-math-scala_2.10:0.13.0-SNAPSHOT (/root/workspace/hossein.bahrami/ur2/build.sbt#L17-38)
[INFO] [Console$] [warn]                  +- com.actionml:template-scala-parallel-universal-recommendation_2.10:0.5.0
[INFO] [Console$] [warn]                org.apache.mahout:mahout-spark_2.10:0.13.0-SNAPSHOT (/root/workspace/hossein.bahrami/ur2/build.sbt#L17-38)
[INFO] [Console$] [warn]                  +- com.actionml:template-scala-parallel-universal-recommendation_2.10:0.5.0
[INFO] [Console$] [warn]                org.apache.mahout:mahout-math:0.13.0-SNAPSHOT (/root/workspace/hossein.bahrami/ur2/build.sbt#L17-38)
[INFO] [Console$] [warn]                  +- com.actionml:template-scala-parallel-universal-recommendation_2.10:0.5.0
[INFO] [Console$] [warn]                org.apache.mahout:mahout-hdfs:0.13.0-SNAPSHOT (/root/workspace/hossein.bahrami/ur2/build.sbt#L17-38)
[INFO] [Console$] [warn]                  +- com.actionml:template-scala-parallel-universal-recommendation_2.10:0.5.0
[INFO] [Console$] sbt.ResolveException: unresolved dependency: org.apache.mahout#mahout-math-scala_2.10;0.13.0-SNAPSHOT: not found
[INFO] [Console$] unresolved dependency: org.apache.mahout#mahout-spark_2.10;0.13.0-SNAPSHOT: not found
[INFO] [Console$] unresolved dependency: org.apache.mahout#mahout-math;0.13.0-SNAPSHOT: not found
[INFO] [Console$] unresolved dependency: org.apache.mahout#mahout-hdfs;0.13.0-SNAPSHOT: not found
[INFO] [Console$]       at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:278)
[INFO] [Console$]       at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:175)
[INFO] [Console$]       at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:157)
[INFO] [Console$]       at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
[INFO] [Console$]       at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:151)
[INFO] [Console$]       at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:128)
[INFO] [Console$]       at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:56)
[INFO] [Console$]       at sbt.IvySbt$$anon$4.call(Ivy.scala:64)
[INFO] [Console$]       at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
[INFO] [Console$]       at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
[INFO] [Console$]       at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
[INFO] [Console$]       at xsbt.boot.Using$.withResource(Using.scala:10)
[INFO] [Console$]       at xsbt.boot.Using$.apply(Using.scala:9)
[INFO] [Console$]       at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
[INFO] [Console$]       at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
[INFO] [Console$]       at xsbt.boot.Locks$.apply0(Locks.scala:31)
[INFO] [Console$]       at xsbt.boot.Locks$.apply(Locks.scala:28)
[INFO] [Console$]       at sbt.IvySbt.withDefaultLogger(Ivy.scala:64)
[INFO] [Console$]       at sbt.IvySbt.withIvy(Ivy.scala:123)
[INFO] [Console$]       at sbt.IvySbt.withIvy(Ivy.scala:120)
[INFO] [Console$]       at sbt.IvySbt$Module.withModule(Ivy.scala:151)
[INFO] [Console$]       at sbt.IvyActions$.updateEither(IvyActions.scala:157)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1318)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1315)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1345)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$85.apply(Defaults.scala:1343)
[INFO] [Console$]       at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1348)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1342)
[INFO] [Console$]       at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
[INFO] [Console$]       at sbt.Classpaths$.cachedUpdate(Defaults.scala:1360)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1300)
[INFO] [Console$]       at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1275)
[INFO] [Console$]       at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[INFO] [Console$]       at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
[INFO] [Console$]       at sbt.std.Transform$$anon$4.work(System.scala:63)
[INFO] [Console$]       at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
[INFO] [Console$]       at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
[INFO] [Console$]       at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
[INFO] [Console$]       at sbt.Execute.work(Execute.scala:235)
[INFO] [Console$]       at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
[INFO] [Console$]       at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
[INFO] [Console$]       at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
[INFO] [Console$]       at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
[INFO] [Console$]       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[INFO] [Console$]       at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[INFO] [Console$]       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[INFO] [Console$]       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[INFO] [Console$]       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[INFO] [Console$]       at java.lang.Thread.run(Thread.java:745)
[INFO] [Console$] [error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.mahout#mahout-math-scala_2.10;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [error] unresolved dependency: org.apache.mahout#mahout-spark_2.10;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [error] unresolved dependency: org.apache.mahout#mahout-math;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [error] unresolved dependency: org.apache.mahout#mahout-hdfs;0.13.0-SNAPSHOT: not found
[INFO] [Console$] [error] Total time: 4 s, completed Dec 14, 2016 9:16:15 AM
[ERROR] [Console$] Return code of previous step is 1. Aborting.
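
For reference: SNAPSHOT artifacts such as 0.13.0-SNAPSHOT are not published to Maven Central, so sbt cannot resolve them out of the box. A common workaround (an assumption on my side, not something stated in this thread) is either to build Mahout from source and publish it locally with sbt publishLocal, or to point the template's build.sbt at a snapshot repository:

```scala
// build.sbt sketch (an assumption, not from the thread):
// tell sbt where SNAPSHOT artifacts may be found, since Maven Central
// does not host them.
resolvers += "Apache Snapshots" at "https://repository.apache.org/content/repositories/snapshots"
```

If the snapshots have never been published to that repository, only the local-build route will work.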

Regards
Hossein
………………………………

From: Pat Ferrel <pat@occamsmachete.com>
Sent: Tuesday, December 13, 2016 10:08 PM
To: user@predictionio.incubator.apache.org
Cc: Magnus Kragelund <mak@ida.dk>
Subject: Re: Customizing Recommender engine

The UR has a new Apache PIO compatible repo here: https://github.com/actionml/universal-recommender.git

    git clone https://github.com/actionml/universal-recommender.git ur

and proceed. The UR allows boosts or filters by properties. You are using a filter (bias -1), which does not work with the template you are using. Each template defines its own queries, config (engine.json), and input formats. There are a few common ideas, but each template also differs considerably and has very different features.
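
To make the boost/filter distinction concrete, here is a sketch of a UR-style query (the user id and categories are made up; per the UR docs, a negative bias filters results to items carrying the field value, while a positive bias boosts them without excluding the rest):

```json
{
  "user": "674296",
  "num": 10,
  "fields": [
    {
      "name": "categories",
      "values": ["CAT1", "CAT2"],
      "bias": -1
    }
  ]
}
```

Changing "bias": -1 to a positive value such as 2.0 turns the same clause from a hard filter into a soft boost.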



On Dec 13, 2016, at 6:20 AM, Hossein Bahrami <h.bahrami@live.com> wrote:

Hi, thanks for your reply. I don’t use the Universal Recommender; actually I use the Recommendation Engine (http://predictionio.incubator.apache.org/templates/recommendation/quickstart/).

First I tried the Universal Recommender, but there were difficulties building the project, so I switched to the Recommendation Engine (RE). The RE works fine, but I don’t know how to customize it to take other events (items’ properties, as I described) into account.

The RE seems to return recommendations with very high scores on my data, so I guess it’s the right choice for me; I just want to customize it to predict recommendations per category (i.e. property) of items.

Regards
Hossein

From: Magnus Kragelund <mak@ida.dk>
Sent: Tuesday, December 13, 2016 12:17 PM
To: user@predictionio.incubator.apache.org
Subject: Re: Customizing Recommender engine

Hi,
Assuming that you are using the Universal Recommender Engine, you should have a look at the
"Queries" section here: https://github.com/PredictionIO/template-scala-parallel-universal-recommendation#queries

Try this request instead, where the "fields" property is used to filter by category:


{
  "user": "674296",
  "num": 10,
  "fields": [
    {
      "name": "categories",
      "values": ["CAT1", "CAT2"],
      "bias": -1
    }
  ]
}

/magnus


________________________________
From: Hossein Bahrami <h.bahrami@live.com>
Sent: Tuesday, December 13, 2016 10:55:00 AM
To: user@predictionio.incubator.apache.org
Subject: Customizing Recommender engine

Dear all,

I’m new to PredictionIO. I’m currently using it, and I managed to import (rate, buy) events and am getting pretty good results when querying it. But now I want to limit the results to items in specific categories. I’ve created events for the items’ properties (categories) as well.

I am posting this query to the engine, but it seems it doesn’t care about the categories and returns the same results every time.

{ "user": "674296", "num": 10, "categories" : ["CAT2", "CAT1"] }

I’ve imported the events below:

client.create_event(
    event="rate",
    entity_type="user",
    entity_id=int(row['userid']),
    target_entity_type="item",
    target_entity_id=int(row['itemid']),
    properties={"rating": float(row['rating'])})

client.create_event(
    event="buy",
    entity_type="user",
    entity_id=int(row['userid']),
    target_entity_type="item",
    target_entity_id=int(row['itemid']))

client.create_event(
    event="$set",
    entity_type="item",
    entity_id=itemid,
    properties={"categories": itemcats})
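
One possible stopgap that avoids modifying the engine at all: post-filter the results on the client side, using the same item-to-categories data that feeds the $set events. This sketch assumes the engine's response carries an "itemScores" list of {"item", "score"} entries, as in the Recommendation template quickstart; the function name and sample data are hypothetical:

```python
def filter_by_categories(recommendations, item_categories, wanted):
    """Keep only recommended items whose categories intersect `wanted`.

    recommendations: list of {"item": id, "score": float} dicts, as found in
                     the engine response's "itemScores" field.
    item_categories: dict mapping item id -> list of category names, built
                     from the same data used for the $set events.
    wanted: iterable of category names to allow.
    """
    wanted = set(wanted)
    return [
        r for r in recommendations
        if wanted & set(item_categories.get(r["item"], []))
    ]

# Example with made-up data:
recs = [
    {"item": "i1", "score": 4.9},
    {"item": "i2", "score": 4.1},
    {"item": "i3", "score": 3.7},
]
cats = {"i1": ["CAT1"], "i2": ["CAT3"], "i3": ["CAT2", "CAT3"]}
print(filter_by_categories(recs, cats, ["CAT1", "CAT2"]))
# -> [{'item': 'i1', 'score': 4.9}, {'item': 'i3', 'score': 3.7}]
```

The drawback is that you may need to request more than `num` results from the engine so that enough survive the filter.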

Could someone give me a solution, or a hint on how to customize this recommender engine to take the categories into account?

Thanks in advance
Hossein

