Developing Engines with IntelliJ IDEA

Prerequisites

This documentation assumes that you have a fully functional PredictionIO setup. If you have not installed PredictionIO yet, please follow these instructions.

Preparing IntelliJ for Engine Development

Installing IntelliJ Scala Plugin

First of all, you will need to install the Scala plugin if you have not already done so.

Go to the Preferences menu item, and look for Plugins. You should see the following screen.

IntelliJ Plugins

Click Install JetBrains plugin..., then search for Scala. You should arrive at something similar to the following.

Scala Plugin

Click the green Install plugin button to install the plugin. Restart IntelliJ IDEA if asked to do so.

Setting Up the Engine Directory

It is very important to run pio build at least once in your engine directory so that the project correctly recognizes the version of PredictionIO that you are using. If you upgrade your PredictionIO installation later, you will need to run pio build again for the engine to pick up the latest version of PredictionIO.

Create an engine directory from a template. This requires installing a template that you wish to start from or modify. Follow the template install and deploy instructions, or go through the Quick Start if you are planning to modify a recommender. Make sure to build, train, and deploy the engine to verify that everything is configured properly.
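
For reference, the standard cycle, run from inside the engine directory, is:

$ pio build
$ pio train
$ pio deploy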

From IntelliJ IDEA, choose File > New > Project from Existing Sources.... When asked to select a directory to import, browse to the engine directory that you downloaded to and proceed. Make sure you pick Import project from external model > SBT, then proceed to finish.

You should be able to build the project at this point. To run and debug PredictionIO server, continue on to the rest of the steps.

If you are running on OS X, you will need to do the following due to this known issue.

Edit build.sbt and add the following under libraryDependencies

"org.xerial.snappy" % "snappy-java" % "1.1.1.7"

Updating build.sbt

When you are done editing, IntelliJ should either refresh the project automatically or prompt you to refresh.

Dependencies

IntelliJ has a tendency to drop some dependencies when you refresh build.sbt after changes. To avoid this, put any JARs that must be available at runtime into a separate empty module in the project, then make the main engine project depend on this dummy module for runtime classes.

Right click on the project and click Open Module Settings. In the second (modules) column, hit + and create a new Scala module. Name it pio-runtime-jars, add these assemblies under the module's Dependencies tab, and remember to change the scope of the JARs to runtime:

  • pio-assembly-0.10.0-incubating.jar

    This JAR can be found inside the assembly or lib directory of your PredictionIO installation directory.

  • spark-assembly-1.5.2-hadoop2.4.0.jar

    This JAR can be found inside the assembly or lib directory of your Apache Spark installation directory.

Create empty module and add dependencies

Now make your engine module dependent on the pio-runtime-jars module for scope = runtime.

Create empty module and add dependencies

Running and Debugging in IntelliJ IDEA

Simulating pio train

Create a new Run/Debug Configuration by going to Run > Edit Configurations.... Click on the + button and select Application. Name it pio train and fill in the following.

Main class: org.apache.predictionio.workflow.CreateWorkflow
VM options: -Dspark.master=local -Dlog4j.configuration=file:/**replace_with_your_PredictionIO_path**/conf/log4j.properties
Program arguments: --engine-id dummy --engine-version dummy --engine-variant engine.json

Click the ... button to the right of Environment variables, and paste the following.

SPARK_HOME=/**replace_w_your_spark_binary_path**
PIO_FS_BASEDIR=/**replace_w_your_path_to**/.pio_store
PIO_FS_ENGINESDIR=/**replace_w_your_path_to**/.pio_store/engines
PIO_FS_TMPDIR=/**replace_w_your_path_to**/.pio_store/tmp
PIO_STORAGE_REPOSITORIES_METADATA_NAME=predictionio_metadata
PIO_STORAGE_REPOSITORIES_METADATA_SOURCE=ELASTICSEARCH
PIO_STORAGE_REPOSITORIES_MODELDATA_NAME=pio_
PIO_STORAGE_REPOSITORIES_MODELDATA_SOURCE=LOCALFS
PIO_STORAGE_REPOSITORIES_APPDATA_NAME=predictionio_appdata
PIO_STORAGE_REPOSITORIES_APPDATA_SOURCE=ELASTICSEARCH
PIO_STORAGE_REPOSITORIES_EVENTDATA_NAME=predictionio_eventdata
PIO_STORAGE_REPOSITORIES_EVENTDATA_SOURCE=HBASE
PIO_STORAGE_SOURCES_ELASTICSEARCH_TYPE=elasticsearch
PIO_STORAGE_SOURCES_ELASTICSEARCH_HOSTS=localhost
PIO_STORAGE_SOURCES_ELASTICSEARCH_PORTS=9300
PIO_STORAGE_SOURCES_LOCALFS_TYPE=localfs
PIO_STORAGE_SOURCES_LOCALFS_HOSTS=/**replace_w_your_path_to**/.pio_store/models
PIO_STORAGE_SOURCES_LOCALFS_PORTS=0
PIO_STORAGE_SOURCES_HBASE_TYPE=hbase
PIO_STORAGE_SOURCES_HBASE_HOSTS=0
PIO_STORAGE_SOURCES_HBASE_PORTS=0

Remember to replace all paths that start with **replace with actual values. The .pio_store directory is typically located inside your home directory.

The end result should look something similar to this.

Run Configuration

Save and you can run or debug pio train with the new configuration!

Simulating pio deploy

For pio deploy, simply duplicate the previous configuration and replace the following fields.

Main class: org.apache.predictionio.workflow.CreateServer
Program arguments: --engineInstanceId **replace_with_the_id_from_pio_train**

Executing a Query

You can execute a query with the appropriate SDK. For a recommender trained with the sample MovieLens dataset, perhaps the easiest query is a curl one. Start by running or debugging your deploy configuration so the service is waiting for the query, then go to the "Terminal" tab at the very bottom of the IDEA window and enter the curl request:

$ curl -H "Content-Type: application/json" -d '{ "user": "1", "num": 4 }' http://localhost:8000/queries.json

This should return something like:

{"itemScores":[
  {"item":"52","score":9.582509402541834},
  {"item":"95","score":8.017236650368387},
  {"item":"89","score":6.975951244053634},
  {"item":"34","score":6.857457277981334}
]}

If you hit a breakpoint, the client is likely to get a connection timeout. To see the data that would have been returned, place a breakpoint where the response is created, or run the query with no breakpoints.
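
If the timeout is on the client side, you can give curl more time with its --max-time flag (a quick sketch; server-side timeouts may still apply independently):

$ curl --max-time 300 -H "Content-Type: application/json" -d '{ "user": "1", "num": 4 }' http://localhost:8000/queries.json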

Loading a Template Into IntelliJ IDEA

To customize an existing template using IntelliJ IDEA, first pull it from the template gallery:

$ pio template get <Template Source> <New Engine Directory>

Now, before opening the template with IntelliJ, run the following command in the new engine template directory:

$ pio build

This updates the pioVersion key in SBT to the version of PredictionIO you have installed, so that IntelliJ loads the correct JARs via its Auto-Import feature. Now you can go ahead and open the build.sbt file with IntelliJ IDEA. You are ready to customize your new engine template.

Upgrade Instructions

This page highlights major changes in each version and upgrade tools.

How to upgrade

To upgrade to a new version of PredictionIO, do the following:

  • Download and unzip the new PredictionIO binary (the download path can be found in the Download PredictionIO section).
  • Carry over the settings from your current PredictionIO/conf/pio-env.sh to the new PredictionIO/conf/pio-env.sh.
  • If you have added PredictionIO/bin to your PATH environment variable, update it to point to the new PredictionIO/bin as well.
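
A minimal sketch of these steps, assuming a tarball distribution and illustrative paths:

$ tar zxvf PredictionIO-<new version>.tar.gz
$ cp PredictionIO-<old version>/conf/pio-env.sh PredictionIO-<new version>/conf/pio-env.sh
$ export PATH=/path/to/PredictionIO-<new version>/bin:$PATH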

Additional Notes for Specific Version Upgrades

In addition, please take note of the following when upgrading to specific versions.

Upgrade to 0.9.2

The Spark dependency has been upgraded to version 1.3.0. All engines must be rebuilt against it in order to work.

Open and edit build.sbt of your engine, and look for these two lines:

"org.apache.spark" %% "spark-core"    % "1.2.0" % "provided"

"org.apache.spark" %% "spark-mllib"   % "1.2.0" % "provided"

Change 1.2.0 to 1.3.0, and do a clean rebuild with pio build --clean. Your engine should now work with the latest Apache Spark.
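
After the edit, the two lines and the rebuild command look like this:

"org.apache.spark" %% "spark-core"    % "1.3.0" % "provided"

"org.apache.spark" %% "spark-mllib"   % "1.3.0" % "provided"

$ pio build --clean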

New PEventStore and LEventStore API

In addition, new PEventStore and LEventStore APIs are introduced so that appName can be used as a parameter in engine.json to access the Event Store.

The following changes are not required for using 0.9.2, but it is recommended to upgrade your engine code as described below because the old API will be deprecated.

1. In DataSource.scala:

  • remove this line of code:

    import org.apache.predictionio.data.storage.Storage

    and replace it with

    import org.apache.predictionio.data.store.PEventStore
  • Change appId: Int to appName: String in DataSourceParams

    For example,

    case class DataSourceParams(appName: String) extends Params
  • remove this line of code: val eventsDb = Storage.getPEvents()

  • locate where eventsDb.aggregateProperties() is used, change it to PEventStore.aggregateProperties():

    For example,

    val usersRDD: RDD[(String, User)] = PEventStore.aggregateProperties( // CHANGED
      appName = dsp.appName, // CHANGED: use appName
      entityType = "user"
    )(sc).map { ... }
  • locate where eventsDb.find() is used, change it to PEventStore.find()

    For example,

    val viewEventsRDD: RDD[ViewEvent] = PEventStore.find( // CHANGED
      appName = dsp.appName, // CHANGED: use appName
      entityType = Some("user"),
      ...

2. In XXXAlgorithm.scala:

If Storage.getLEvents() is also used in Algorithm (such as ALSAlgorithm of the E-Commerce Recommendation template), you also need to do the following:

If org.apache.predictionio.data.storage.Storage is not used at all (such as the Recommendation, Similar Product, Classification, Lead Scoring, and Product Ranking templates), there is no need to change the Algorithm and you can go straight to the engine.json section below.

  • remove import org.apache.predictionio.data.storage.Storage and replace it with import org.apache.predictionio.data.store.LEventStore
  • change appId to appName in the XXXAlgorithmParams class.
  • remove this line of code: @transient lazy val lEventsDb = Storage.getLEvents()
  • locate where lEventsDb.findSingleEntity() is used, change it to LEventStore.findByEntity():

    For example, change the following code

      ...
      val seenEvents: Iterator[Event] = lEventsDb.findSingleEntity(
        appId = ap.appId,
        entityType = "user",
        entityId = query.user,
        eventNames = Some(ap.seenEvents),
        targetEntityType = Some(Some("item")),
        // set time limit to avoid super long DB access
        timeout = Duration(200, "millis")
      ) match {
        case Right(x) => x
        case Left(e) => {
          logger.error(s"Error when read seen events: ${e}")
          Iterator[Event]()
        }
      }

    to

      val seenEvents: Iterator[Event] = try { // CHANGED: try catch block is used
        LEventStore.findByEntity( // CHANGED: new API
          appName = ap.appName, // CHANGED: use appName
          entityType = "user",
          entityId = query.user,
          eventNames = Some(ap.seenEvents),
          targetEntityType = Some(Some("item")),
          // set time limit to avoid super long DB access
          timeout = Duration(200, "millis")
        )
      } catch { // CHANGED: try catch block is used
        case e: scala.concurrent.TimeoutException =>
          logger.error(s"Timeout when read seen events." +
            s" Empty list is used. ${e}")
          Iterator[Event]()
        case e: Exception =>
          logger.error(s"Error when read seen events: ${e}")
          throw e
      }

    If you are using the E-Commerce Recommendation template, please refer to the latest version for other updates related to LEventStore.findByEntity().

3. In engine.json:

Locate where appId is used, change it to appName, and specify the name of the app instead.

For example:

  ...

  "datasource": {
    "params" : {
      "appName": "MyAppName"
    }
  },

Note that other components such as algorithms may also have an appId param (e.g. the E-Commerce Recommendation template). Remember to change it to appName as well.
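
For example, an algorithm section would change along these lines (the algorithm name and surrounding params here are illustrative, not from a specific template):

  "algorithms": [
    {
      "name": "ecomm",
      "params": {
        "appName": "MyAppName",
        ...
      }
    }
  ]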

That's it! You can rebuild your engine to try it out!

Upgrade to 0.9.0

0.9.0 has the following new changes:

  • The signature of P2LAlgorithm and PAlgorithm's train() method is changed from

      def train(pd: PD): M

    to

      def train(sc: SparkContext, pd: PD): M

    which allows you to access SparkContext inside train() with this new parameter sc (see the sketch right after this list).

  • A new SBT build plugin (pio-build) is added for engine templates
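
Below is a minimal, self-contained sketch of what the new sc parameter enables inside train(); the data and model types here are illustrative stand-ins, not from any template:

import org.apache.spark.SparkContext

case class PreparedData(ratings: Seq[(String, String, Double)]) // hypothetical stand-in
case class Model(meanRating: Double)                            // hypothetical stand-in

class MyAlgorithm {
  def train(sc: SparkContext, data: PreparedData): Model = {
    // the sc parameter lets train() distribute work across the cluster
    val scores = sc.parallelize(data.ratings.map(_._3))
    Model(scores.reduce(_ + _) / scores.count())
  }
}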

If you have existing engine templates running with a previous version of PredictionIO, you need to either download the latest templates, which are compatible with 0.9.0, or follow the instructions below to modify them.

Follow the instructions below to modify existing engine templates to be compatible with PredictionIO 0.9.0:

  1. Add a new parameter sc: SparkContext to the signature of the train() method of the algorithm in the templates.

    For example, in the Recommendation engine template, you will find the following train() function in ALSAlgorithm.scala:

    class ALSAlgorithm(val ap: ALSAlgorithmParams)
      extends P2LAlgorithm[PreparedData, ALSModel, Query, PredictedResult] {

      ...

      def train(data: PreparedData): ALSModel = ...

      ...
    }

    Simply add the new parameter sc: SparkContext to the train() function signature:

    class ALSAlgorithm(val ap: ALSAlgorithmParams)
      extends P2LAlgorithm[PreparedData, ALSModel, Query, PredictedResult] {

      ...

      def train(sc: SparkContext, data: PreparedData): ALSModel = ...

      ...
    }

    You also need to add the following import for your algorithm if it is not already there:

    import org.apache.spark.SparkContext
  2. Modify the file build.sbt in your template directory to use pioVersion.value as the version of org.apache.predictionio.core dependency:

    Under your template's root directory, you should see a file build.sbt which has the following content:

    libraryDependencies ++= Seq(
      "org.apache.predictionio"    %% "core"          % "0.8.6" % "provided",
      "org.apache.spark" %% "spark-core"    % "1.2.0" % "provided",
      "org.apache.spark" %% "spark-mllib"   % "1.2.0" % "provided")

    Change the version of "org.apache.predictionio" %% "core" to pioVersion.value:

    libraryDependencies ++= Seq(
      "org.apache.predictionio"    %% "core"          % pioVersion.value % "provided",
      "org.apache.spark" %% "spark-core"    % "1.2.0" % "provided",
      "org.apache.spark" %% "spark-mllib"   % "1.2.0" % "provided")
  3. Create a new file pio-build.sbt in the template's project/ directory with the following content:

    addSbtPlugin("org.apache.predictionio" % "pio-build" % "0.9.0")

    Then, you should see the following two files in the project/ directory:

    your_template_directory$ ls project/
    assembly.sbt  pio-build.sbt
  4. Create a new file template.json in the engine template's root directory with the following content:

    {"pio": {"version": { "min": "0.9.0" }}}

    This specifies the minimum PredictionIO version with which the engine can run.

  5. Lastly, add /pio.sbt to your engine template's .gitignore. pio.sbt is automatically generated by pio build.
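
    For example:

    $ echo '/pio.sbt' >> .gitignore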

That's it! Now you can run pio build, pio train and pio deploy with PredictionIO 0.9.0 in the same way as before!

Upgrade to 0.8.4

engine.json has slightly changed its format in 0.8.4 in order to make engines more flexible. If you are upgrading to 0.8.4, engine.json needs to have the params field for datasource, preparator, and serving. Here is the sample engine.json from templates/scala-parallel-recommendation-custom-preparator that demonstrates the change for datasource (note the new params wrapper).

In 0.8.3:

{
  "id": "default",
  "description": "Default settings",
  "engineFactory": "org.template.recommendation.RecommendationEngine",
  "datasource": {
    "appId": 1
  },
  "algorithms": [
    {
      "name": "als",
      "params": {
        "rank": 10,
        "numIterations": 20,
        "lambda": 0.01
      }
    }
  ]
}
In 0.8.4:

{
  "id": "default",
  "description": "Default settings",
  "engineFactory": "org.template.recommendation.RecommendationEngine",
  "datasource": {
    "params" : {
      "appId": 1
    }
  },
  "algorithms": [
    {
      "name": "als",
      "params": {
        "rank": 10,
        "numIterations": 20,
        "lambda": 0.01
      }
    }
  ]
}

Upgrade from 0.8.2 to 0.8.3

0.8.3 disallows the entity types pio_user and pio_item. These types were used by default by most SDKs. They are deprecated in 0.8.3, and SDK helper functions have been updated to use user and item instead.
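
For instance, an event sent to the Event Server now uses the plain entity types (the access key and field values here are illustrative):

$ curl -i -X POST "http://localhost:7070/events.json?accessKey=<your access key>" -H "Content-Type: application/json" -d '{ "event": "rate", "entityType": "user", "entityId": "1", "targetEntityType": "item", "targetEntityId": "52", "properties": { "rating": 4 } }'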

If you are upgrading to 0.8.3, you can follow these steps to migrate your data.

1. Create a new app
$ pio app new <my app name>

Please take note of the generated app ID for the new app.

2. Run the upgrade command
$ pio upgrade 0.8.2 0.8.3 <old app id> <new app id>

It will run a script that creates a new app with the new app ID and migrates the data to the new app.

3. Update engine.json to use the new app ID. engine.json is located under your engine project directory.
  "datasource": {
    "appId": <new app id>
  },

Schema Changes in 0.8.2

0.8.2 contains HBase and Elasticsearch schema changes from previous versions. If you are upgrading from a pre-0.8.2 version, you need to first clear HBase and Elasticsearch. The following steps will clear out all data in Elasticsearch and HBase, so please be extra cautious.

ALL EXISTING DATA WILL BE LOST!

Clearing Elasticsearch

With Elasticsearch running, do

$ curl -X DELETE http://localhost:9200/_all

For details see http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-delete-index.html.
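
To confirm the indices are gone (assuming a default local Elasticsearch setup), list what remains:

$ curl "http://localhost:9200/_cat/indices?v"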

Clearing HBase

$ $HBASE_HOME/bin/hbase shell
...
> disable_all 'predictionio.*'
...
> drop_all 'predictionio.*'
...

For details see http://wiki.apache.org/hadoop/Hbase/Shell.
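
Inside the same HBase shell, list should then show no matching tables:

> list 'predictionio.*'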

Experimental upgrade tool (Upgrade HBase schema from 0.8.0/0.8.1 to 0.8.2)

Create an app to store the data

1
$ bin/pio app new <my app>
+

Replace <to app ID> with the returned app ID (<from app ID> is the original app ID used in 0.8.0/0.8.1):

$ set -a
$ source conf/pio-env.sh
$ set +a
$ sbt/sbt "data/run-main org.apache.predictionio.data.storage.hbase.upgrade.Upgrade <from app ID>" "<to app ID>"