predictionio-commits mailing list archives

From git-site-r...@apache.org
Subject [11/12] predictionio-site git commit: Documentation based on apache/predictionio#37c17935475cf9f753f6be4d9ab22dbbd7d3cd4a
Date Fri, 29 Jun 2018 03:12:40 GMT
http://git-wip-us.apache.org/repos/asf/predictionio-site/blob/8ce20376/evaluation/paramtuning/index.html
----------------------------------------------------------------------
diff --git a/evaluation/paramtuning/index.html b/evaluation/paramtuning/index.html
index acc21b1..229b890 100644
--- a/evaluation/paramtuning/index.html
+++ b/evaluation/paramtuning/index.html
@@ -11,11 +11,9 @@
 <span class="o">}</span>
 </pre></td></tr></tbody></table> </div> <h3 id='build-and-run-the-evaluation'
class='header-anchors'>Build and run the evaluation</h3><p>To run an evaluation,
the command <code>pio eval</code> is used. It takes two mandatory parameters: 1.
the <code>Evaluation</code> object, which tells PredictionIO the engine and metric
we use for the evaluation; and 2. the <code>EngineParamsGenerator</code>, which
contains a list of engine params to test against. The following command kickstarts the evaluation
workflow for the classification template.</p><div class="highlight shell"><table
style="border-spacing: 0"><tbody><tr><td class="gutter gl" style="text-align:
right"><pre class="lineno">1
 2
-3
-4</pre></td><td class="code"><pre><span class="gp">$ </span>pio
build
+3</pre></td><td class="code"><pre><span class="gp">$ </span>pio
build
 ...
-<span class="gp">$ </span>pio <span class="nb">eval </span>org.template.classification.AccuracyEvaluation
<span class="se">\</span>
-    org.template.classification.EngineParamsList
+<span class="gp">$ </span>pio <span class="nb">eval </span>org.example.classification.AccuracyEvaluation
org.example.classification.EngineParamsList
 </pre></td></tr></tbody></table> </div> <p>You
will see the following output:</p><div class="highlight shell"><table style="border-spacing:
0"><tbody><tr><td class="gutter gl" style="text-align: right"><pre
class="lineno">1
 2
 3
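For context, the AccuracyEvaluation and EngineParamsList objects passed to pio eval above live in the engine's Scala sources, which this diff does not show. A minimal sketch of the evaluation object, assuming the classification template's layout (the ClassificationEngine factory and the controller import path are assumptions; Accuracy is the metric class named in the output below):

    package org.example.classification

    import org.apache.predictionio.controller.Evaluation

    object AccuracyEvaluation extends Evaluation {
      // Pair the engine factory with the metric used to score each set of
      // engine params during evaluation. ClassificationEngine and Accuracy
      // are assumed to be defined elsewhere in the template.
      engineMetric = (ClassificationEngine(), new Accuracy())
    }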
@@ -102,7 +100,7 @@ Optimal Engine Params:
   <span class="o">}</span>
 <span class="o">}</span>
 Metrics:
-  org.template.classification.Accuracy: 0.9281045751633987
+  org.example.classification.Accuracy: 0.9281045751633987
 The best variant params can be found <span class="k">in </span>best.json
 <span class="o">[</span>INFO] <span class="o">[</span>CoreWorkflow<span
class="nv">$]</span> runEvaluation completed
 </pre></td></tr></tbody></table> </div> <p>The
console prints out the evaluation metric score for each set of engine params, and finally pretty-prints
the optimal engine params. Amongst the 3 engine params we evaluate, <em>lambda = 10.0</em>
yields the highest accuracy score of ~0.9281.</p><h3 id='deploy-the-best-engine-parameter'
class='header-anchors'>Deploy the best engine parameter</h3><p>The evaluation
module also writes out the best engine parameter to disk at <code>best.json</code>.
We can train and deploy this specific engine variant using the extra parameter <code>-v</code>.
For example:</p><div class="highlight shell"><table style="border-spacing:
0"><tbody><tr><td class="gutter gl" style="text-align: right"><pre
class="lineno">1
@@ -291,11 +289,9 @@ The best variant params can be found <span class="k">in </span>best.json
 <span class="o">}</span>
 </pre></td></tr></tbody></table> </div> <p>A good
practice is to first define a base engine params object that contains the common parameters used in
all evaluations (lines 7 to 8). With the base params, we construct the list of engine params
we want to evaluate by adding or replacing the controller parameter. Lines 13 to 16 generate
3 engine parameters, each with a different smoothing parameter.</p><h2 id='running-the-evaluation'
class='header-anchors'>Running the Evaluation</h2><p>It remains to run the
evaluation. Let&#39;s recap the quick start section above. The <code>pio eval</code>
command kickstarts the evaluation, and the result can be seen in the console.</p><div
class="highlight shell"><table style="border-spacing: 0"><tbody><tr><td
class="gutter gl" style="text-align: right"><pre class="lineno">1
 2
-3
-4</pre></td><td class="code"><pre><span class="gp">$ </span>pio
build
+3</pre></td><td class="code"><pre><span class="gp">$ </span>pio
build
 ...
-<span class="gp">$ </span>pio <span class="nb">eval </span>org.template.classification.AccuracyEvaluation
<span class="se">\</span>
-    org.template.classification.EngineParamsList
+<span class="gp">$ </span>pio <span class="nb">eval </span>org.example.classification.AccuracyEvaluation
org.example.classification.EngineParamsList
 </pre></td></tr></tbody></table> </div> <p>You
will see the following output:</p><div class="highlight shell"><table style="border-spacing:
0"><tbody><tr><td class="gutter gl" style="text-align: right"><pre
class="lineno">1
 2
 3
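The base-params pattern described in the paragraph above might look roughly like the following, a hedged sketch rather than the actual listing in this commit: EngineParamsList is the object named in the pio eval command and lambda = 10.0 is the value reported optimal, but DataSourceParams, AlgorithmParams, the "naive" algorithm key, the appName value, and the other two smoothing values are all assumptions.

    package org.example.classification

    import org.apache.predictionio.controller.{EngineParams, EngineParamsGenerator}

    object EngineParamsList extends EngineParamsGenerator {
      // Base engine params: the parameters common to every evaluated variant,
      // e.g. the data source configuration (field names are assumptions).
      private[this] val baseEP = EngineParams(
        dataSourceParams = DataSourceParams(appName = "MyApp", evalK = Some(5)))

      // Three variants, each replacing only the algorithm params with a
      // different smoothing value; 10.0 is the value the run above found optimal.
      engineParamsList = Seq(
        baseEP.copy(algorithmParamsList = Seq(("naive", AlgorithmParams(10.0)))),
        baseEP.copy(algorithmParamsList = Seq(("naive", AlgorithmParams(100.0)))),
        baseEP.copy(algorithmParamsList = Seq(("naive", AlgorithmParams(1000.0)))))
    }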
@@ -390,4 +386,4 @@ The best variant params can be found <span class="k">in </span>best.json
 e=d.getElementsByTagName(t)[0];s.async=1;s.src=u;e.parentNode.insertBefore(s,e);
 })(window,document,'script','//s.swiftypecdn.com/install/v1/st.js','_st');
 
-_st('install','HaUfpXXV87xoB_zzCQ45');</script><script src="/javascripts/application-1a70e440.js"></script></body></html>
\ No newline at end of file
+_st('install','HaUfpXXV87xoB_zzCQ45');</script><script src="/javascripts/application-6d62b164.js"></script></body></html>
\ No newline at end of file

