What if you could publish a live data value once every fifteen minutes and, with no further action on your part, data and algorithms would find their way to it and provide you with state-of-the-art predictions?
There are millions of business uses.
So start now...
Here are the steps shown in the video.
As with Modules 1 and 2, we instantiate a MicroWriter.
pip install microprediction
Then we import:
from microprediction import MicroWriter, new_key
However, this time the key must be much rarer (and thus more powerful). At the time of writing, you need a key with difficulty at least 12.
write_key = new_key(difficulty=12)
writer = MicroWriter(write_key=write_key)
Don't hold your breath waiting for that code to run: mining a key of that difficulty can take a very long time. As an alternative, reach out to us for a key that is strong enough to create data streams, and tell us your idea in case we can help.
writer.set(name='my_own_stream.json',value=3.14157)
Repeat this call as often as your quantity is measured.
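The publish-and-repeat pattern can be sketched as a simple loop. This is a minimal illustration, assuming `writer` is the MicroWriter instantiated above with a sufficiently rare key; the stream name and the `measure()` function are placeholders for your own quantity.

```python
import math
import time

STREAM = 'my_own_stream.json'

def measure():
    # Stand-in for however your quantity is actually measured
    # (a sensor reading, a database query, a scraped price, ...)
    return math.sin(time.time() / 900.0)

def publish_forever(writer, period_seconds=15 * 60):
    # Publish the latest measurement, then wait until the next one.
    while True:
        writer.set(name=STREAM, value=measure())
        time.sleep(period_seconds)
```

A cron job or scheduled task that calls `writer.set` once per period works just as well as a long-running loop.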
As an alternative, you can use the API directly with a live/put call. There are other clients as well, such as the TypeScript client, but these are outside of the scope of these Python modules.
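For illustration, here is roughly what such a call looks like when assembled by hand with the standard library. The base URL below is the public API endpoint, but the exact path and form-field names are assumptions made for this sketch; consult the MicroWriter source or the API documentation for the authoritative request shape.

```python
import urllib.parse
import urllib.request

# Assumed public API base URL.
API_URL = 'https://api.microprediction.org'

def build_put(name, value, write_key):
    # Build (but do not send) a PUT request for the live/put call.
    # The field names 'write_key' and 'value' are assumptions.
    data = urllib.parse.urlencode(
        {'write_key': write_key, 'value': value}).encode()
    return urllib.request.Request(
        f'{API_URL}/live/{name}', data=data, method='PUT')

def put_value(name, value, write_key):
    # Actually sends the request; requires a sufficiently rare write key.
    with urllib.request.urlopen(build_put(name, value, write_key)) as resp:
        return resp.read()
```

In practice the Python client handles all of this for you, so the raw call is mainly useful from other languages or environments.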
Use MicroWriter.set(name=, value=) to publish a value. Repeat!
To create a stream, your write key must have difficulty at least 12, so call new_key(difficulty=12).
See the MicroWriter code if you are interested in how this interacts with the API.