This notebook shows how to create a Kaggle kernel (an R notebook) and submit your entry.
Yes, that's right: you are entering a microprediction contest from within Kaggle.
Refer to the notebook. The key step, using the jsonlite package, is
lagged <- fromJSON("https://api.microprediction.org/lagged/c5_iota.json")
which you can mimic to retrieve any stream's historical data.
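As a concrete illustration, here is a minimal sketch of unpacking the response. It assumes the lagged endpoint returns a JSON array of (epoch time, value) pairs, which jsonlite simplifies into a two-column matrix; a small inline sample stands in for the live response so nothing is fetched over the network:

```r
library(jsonlite)

# A stand-in for the body returned by /lagged/<stream>.json:
# an array of (epoch time, value) pairs, most recent first (assumed shape)
sample_json <- '[[1600000070, 1.2], [1600000000, 0.7]]'

# jsonlite simplifies the array of pairs to a two-column numeric matrix
lagged <- fromJSON(sample_json)

times <- lagged[, 1]   # epoch times of the observations
y     <- lagged[, 2]   # the lagged values themselves
```

With the real URL in place of sample_json, y is the historical data you would feed into the quantile calculation below.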
Just as an example, we compute empirical quantiles. First, generate n evenly spaced probabilities strictly between 0 and 1 (here n = 225):
probs <- seq(1/(2*n), 1 - 1/(2*n), (1 - 1/n)/(n - 1))
Then compute the quantiles of the lagged values y at those probabilities:
q <- quantile(y, probs = probs, names = FALSE)
This provides us with a vector of length n = 225. The hope is that some of these values will be close to the truth, which will be revealed as the next update to the time series arriving after a delay of 70 seconds. We use 70 seconds as an example here; you can find the list of possible delays at config.json or in the mw.DELAYS property.
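Putting those two steps together on synthetic data (with n = 225, matching the vector length mentioned above; rnorm here simply stands in for the stream's lagged values):

```r
set.seed(1)
y <- rnorm(1000)   # stand-in for the stream's lagged values
n <- 225           # number of quantiles to submit

# n evenly spaced probabilities strictly inside (0, 1)
probs <- seq(1/(2*n), 1 - 1/(2*n), (1 - 1/n)/(n - 1))

# empirical quantiles of the lagged data at those probabilities
q <- quantile(y, probs = probs, names = FALSE)
```

The step (1 - 1/n)/(n - 1) simplifies to 1/n, so the sequence runs from 1/(2n) to 1 - 1/(2n) in steps of 1/n, giving exactly n probabilities.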
Until there is an R client, you can also open up this Python notebook in Colab and run it; you don't need any Python familiarity.

(b) Your first submission
We don't have an R package for this (yet), but you can use the boilerplate provided in the script to submit directly with a PUT request. First, the vector of values must be converted into strings
q_str <- sapply(q, toString)
and joined into a comma-separated list
values <- paste(q_str,collapse=",")
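As a quick sanity check of the conversion and joining, here is the same pattern on a tiny vector in place of q (base R's sapply is used for the element-wise conversion):

```r
q <- c(-0.5, 0, 0.5)   # tiny stand-in for the 225 quantiles

# convert each numeric value to its string form...
q_str <- sapply(q, toString)

# ...and join into the comma-separated payload the API expects
values <- paste(q_str, collapse = ",")
```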
so we can submit (see the notebook)
print( PUT(url = paste0("https://api.microprediction.org/submit/", name),
           body = list(write_key=write_key, delay=70, values=values)
))
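The whole submission can be wrapped up as below. This is a sketch using httr; name and write_key are placeholders you obtain from the microprediction site, and no request is sent until you actually call the function. Note the use of paste0, since paste with its default separator would insert a stray space into the URL:

```r
library(httr)

# Build and send a submission; nothing happens until this is called.
# name (e.g. "c5_iota.json") and write_key are placeholders.
submit_quantiles <- function(name, write_key, values, delay = 70) {
  # paste0 avoids the stray space that paste() would insert into the URL
  url <- paste0("https://api.microprediction.org/submit/", name)
  PUT(url = url,
      body = list(write_key = write_key, delay = delay, values = values))
}

# Example (not run):
# submit_quantiles("c5_iota.json", "YOUR_WRITE_KEY", values)
```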
and that's all! Race over to your dashboard at https://www.microprediction.org/ and paste in your key. You'll see that you are entered.