predictionio-user mailing list archives

From Pat Ferrel <>
Subject Re: Validate the built model
Date Wed, 06 Sep 2017 13:39:17 GMT
We do cross-validation tests to see how well the model predicts actual behavior. As to the
best data mix, cross-validation works with any engine tuning or data input. Typically this
requires re-training between test runs, so make sure you use exactly the same training/test split.
If you want to examine the usefulness of different events, you can compare event type 1 to
event type 1 + event type 2, etc. This is made easier by inputting all events, then using a
test trick in the UR to mask out any combination of events for the cross-validation, using
the single existing model, so there is no need to re-train for this type of analysis. We have an unsupported
script that does this, but I warn you that you are on your own using it. <>
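A minimal sketch of the cross-validation loop described above, in Python. Everything here is illustrative, not the UR's actual test script: `recommend` stands in for a query to the deployed engine (e.g. an HTTP call to its queries endpoint), and the per-user chronological split and precision@k metric are one common way to do it, not the only one.

```python
from collections import defaultdict

def split_events(events, test_fraction=0.2):
    """Split each user's (user, item, timestamp) events chronologically:
    the most recent `test_fraction` of each user's events are held out
    for testing, the rest go into the training set."""
    by_user = defaultdict(list)
    for user, item, timestamp in sorted(events, key=lambda e: e[2]):
        by_user[user].append(item)
    train, test = [], {}
    for user, items in by_user.items():
        cut = max(1, int(len(items) * (1 - test_fraction)))
        train.extend((user, item) for item in items[:cut])
        test[user] = set(items[cut:])
    return train, test

def precision_at_k(recommend, test, k=10):
    """Mean precision@k: for each user, the fraction of the top-k
    recommendations that appear in that user's held-out events.
    `recommend(user, k)` is assumed to query the already-deployed model."""
    scores = []
    for user, held_out in test.items():
        if not held_out:
            continue
        recs = recommend(user, k)
        hits = sum(1 for item in recs if item in held_out)
        scores.append(hits / k)
    return sum(scores) / len(scores) if scores else 0.0
```

To compare event-type mixes without re-training, you would filter `events` by type before calling `split_events` (e.g. type 1 only vs. types 1+2) and compare the resulting scores, which mirrors the masking trick mentioned above.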

On Sep 6, 2017, at 6:15 AM, Saarthak Chandra <> wrote:


With the Universal Recommender,

1. How can we validate the model after we train and deploy it?

2. How can we find an appropriate method of mixing data?

Saarthak Chandra,
Masters in Computer Science,
Cornell University.

