
Visualizing Bitcoin prices moving averages using Dash

Victor Akinwande (@aknvictor)

Everyone seems ecstatic about Bitcoin right now, and for a while I’ve been interested in writing about Dash by Plotly, a neat frontend tool for building analytical applications. The reason I like Dash is that it lets me stay in my comfort zone: Python. It’s also lightweight and easily customizable, you know, all those good things.

Moving averages (MAs) are a pretty useful indicator in stock trading. It’s still a grey area whether Bitcoin behaves like a stock, so I’m using the term loosely here. MAs help identify trends, provide trade signals, and smooth out price action by filtering the “noise” of random price fluctuations. There are several variants of moving averages, each calculated differently. The Simple Moving Average (SMA) is simply the average of the last N values, with N being the period to average over. The Weighted Moving Average (WMA) and Exponential Moving Average (EMA) are less simple: they introduce weights, and account for the fact that more recent values may be a better indication of the trend by applying some form of decay to those weights.
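To make the difference concrete, here is a minimal sketch in plain Python (independent of anything we build later) comparing an SMA and an EMA on a tiny, made-up price series:

```python
def sma(prices, n):
    """Simple moving average: the plain mean of each n-length window."""
    return [sum(prices[i:i + n]) / float(n) for i in range(len(prices) - n + 1)]

def ema(prices, n):
    """Exponential moving average: seed with the SMA of the first n values,
    then update with the standard smoothing factor 2 / (n + 1)."""
    k = 2.0 / (n + 1)
    out = [sum(prices[:n]) / float(n)]
    for price in prices[n:]:
        out.append((price - out[-1]) * k + out[-1])
    return out

prices = [10.0, 11.0, 12.0, 13.0, 20.0]
print(sma(prices, 3))  # [11.0, 12.0, 15.0]
print(ema(prices, 3))  # [11.0, 12.0, 16.0]
```

Notice how the EMA reacts more strongly to the jump to 20 than the SMA does, because the newest price carries more weight.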

Here, I’ll give a brief tutorial on how to visualize Bitcoin prices, the SMA and the EMA using Dash.

Our first challenge is getting real-time Bitcoin prices. Thankfully, Coindesk provides an API that does just that.

{
  "time": {
    "updated": "Dec 31, 2017 19:26:00 UTC",
    "updatedISO": "2017-12-31T19:26:00+00:00",
    "updateduk": "Dec 31, 2017 at 19:26 GMT"
  },
  "disclaimer": "This data was produced from the CoinDesk Bitcoin Price Index (USD). Non-USD currency data converted using hourly conversion rate from openexchangerates.org",
  "bpi": {
    "USD": {
      "code": "USD",
      "rate": "13,935.3500",
      "description": "United States Dollar",
      "rate_float": 13935.35
    }
  }
}

Nice. The prices are updated every minute so we need a service that calls this API minute by minute. What we would like is a process that schedules the REST call every minute which is then executed in the background.
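The extraction itself is straightforward. Here is a minimal sketch that pulls the two fields we care about out of a trimmed, hard-coded copy of the response shown above (no live call):

```python
import json

# A trimmed copy of the Coindesk payload shown above.
sample = '''{"time": {"updatedISO": "2017-12-31T19:26:00+00:00"},
             "bpi": {"USD": {"code": "USD", "rate_float": 13935.35}}}'''

data = json.loads(sample)
updated_time = data['time']['updatedISO']          # ISO-8601 timestamp of the quote
current_price = data['bpi']['USD']['rate_float']   # USD price as a float
print(updated_time, current_price)
```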

According to Heroku:

Scheduling a job and executing a job are two related but independent tasks. Separating a job’s execution from its scheduling ensures the responsibilities of each component are clearly defined and results in a more structured and manageable system.

As you would expect, Python is up to the task. The Advanced Python Scheduler (APScheduler) is a Python library that lets you schedule your Python code to be executed later, either just once or periodically.
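Under the hood, a blocking interval scheduler boils down to “wait, fire, repeat”. As a rough stdlib-only sketch of the idea (APScheduler does far more; the tiny interval here is just to keep the example fast):

```python
import threading
import time

def run_every(interval, job, stop_event):
    """Call job() every `interval` seconds until stop_event is set.
    Event.wait doubles as the sleep and the shutdown check."""
    while not stop_event.wait(interval):
        job()

calls = []
stop = threading.Event()
t = threading.Thread(target=run_every, args=(0.01, lambda: calls.append(1), stop))
t.start()

time.sleep(0.2)  # let a few intervals elapse
stop.set()       # signal the loop to exit
t.join()
print(len(calls))  # several ticks fired before we stopped
```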

We also need a queuing system. We’ll use RQ (Redis Queue) simply because it makes it easy to add background tasks to Python code on Heroku — my favorite PAAS.

$ pip install apscheduler rq redis

We also need to install Redis itself. There is a comprehensive guide here. All I had to do on my Mac was brew install redis, and on Ubuntu, sudo apt install redis-server

We can then go on to create a process that listens to, and processes queued jobs.

import os
import redis
from rq import Worker, Queue, Connection

listen = ['high', 'default', 'low']
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(map(Queue, listen))
        worker.work()

This is simply a worker process that reads jobs from the given queues in an endless loop, waiting for new work to arrive when all jobs are done. Name the file worker.py.
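Conceptually, the worker is just a loop popping callables off a queue and running them. A stdlib analogue of that enqueue/worker handshake (RQ adds persistence via Redis, multiple named queues, retries, and more):

```python
import queue
import threading

jobs = queue.Queue()
results = []

def worker():
    """Pull (func, args) pairs off the queue until the None sentinel arrives."""
    while True:
        item = jobs.get()
        if item is None:
            break  # sentinel: shut down
        func, args = item
        results.append(func(*args))
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()

jobs.put((lambda x: x * 2, (21,)))
jobs.put((str.upper, ("btc",)))
jobs.put(None)  # no more work
t.join()
print(results)  # [42, 'BTC']
```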

Now we can go ahead and author a file named scheduler.py that will serve as our scheduler. We’ll use the BlockingScheduler class, which is quite simple, and obtain a connection to our queue.

from apscheduler.schedulers.blocking import BlockingScheduler
from rq import Queue
from worker import conn

scheduler = BlockingScheduler()
q = Queue(connection=conn)

Having created and obtained a connection to the queue in our worker process, we can now schedule our job, i.e. making a REST call to Coindesk’s API.

We’ll go ahead and author a file named utils.py. You may want to install the following packages first:

$ pip install iso8601 requests

And then

import requests, calendar, time
from iso8601 import parse_date

def retrieve_current_price(endpoint):
    response = requests.get(url=endpoint)
    data = response.json()
    updatedTime = data['time']['updatedISO']
    currentPrice = data['bpi']['USD']['rate_float']

    parsed = parse_date(updatedTime)
    timetuple = parsed.timetuple()
    parsedTime = calendar.timegm(timetuple)
    return

Nothing groundbreaking here. We are simply making an HTTP GET request to the endpoint and extracting the needed information from the JSON response formatted as shown earlier. What we’d like to do next is save these values to a datastore. We’ll use TinyDB, a straightforward document-oriented DB.

$ pip install tinydb

We’ll then add the following snippet of code to utils.py. Notice that the prices are stored in a file named bpi.db.

from tinydb import TinyDB, Query

db = TinyDB('bpi.db')
BPI = Query()
db.insert({'updatedTime': int(parsedTime), 'QueryTime': int(time.time()), 'currentPrice': currentPrice})

So that utils.py looks like this

import requests, calendar, time
from iso8601 import parse_date
from tinydb import TinyDB, Query

def retrieve_current_price(endpoint):
    response = requests.get(url=endpoint)
    data = response.json()
    updatedTime = data['time']['updatedISO']
    currentPrice = data['bpi']['USD']['rate_float']

    parsed = parse_date(updatedTime)
    timetuple = parsed.timetuple()
    parsedTime = calendar.timegm(timetuple)

    db = TinyDB('bpi.db')
    BPI = Query()
    db.insert({'updatedTime': int(parsedTime), 'QueryTime': int(time.time()), 'currentPrice': currentPrice})
    return

Now that we’ve defined our job, we can schedule it in our scheduler.py file. Let’s dig up the file again and add the following snippet just underneath everything else in it.

from utils import retrieve_current_price

endpoint = "https://api.coindesk.com/v1/bpi/currentprice/USD.json"

# To attach the needed metadata we use a decorator.
# Coindesk publishes prices every minute (60 seconds).
@scheduler.scheduled_job('interval', minutes=1)
def retrieve_current_price_job():
    q.enqueue(retrieve_current_price, endpoint)

scheduler.start()

Sweet. Notice that we first import the method that executes our REST call. This method is then added to the Redis queue with q.enqueue(retrieve_current_price, endpoint), and another such job is added every minute.

Running scheduler.py and worker.py should begin to populate the bpi.db file every minute with the Bitcoin price indexes.

We now need to move over to visualizations using Dash. As a teaser, here is a look at the expected final output from this tutorial. It’s dynamic, so this just gives a peek. Dash is written on top of Flask, Plotly.js, and React.js.

Snip of the Dash app

As you can see, it uses a bit of styling here and there, and most of the styling code is borrowed from https://github.com/plotly/dash-wind-streaming. That said, we won’t go over the styling. Go ahead and download the stylesheet.

Stylesheet for the Dash app

It’s a bit of a chore, but we also need to download the requirements.txt file that lists the application’s packages here. You can install them using

$ pip install -r requirements.txt

Once that is done, we can go ahead to author a file named app.py that would contain the code for our Dash application.

As usual, import a few modules. We’ll describe the functionality of some later on.

import os
import datetime as dt
import time
import dash
import dash_core_components as dcc
import dash_html_components as html
from dash.dependencies import Input, Output, State, Event
from plotly.graph_objs import *
from tinydb import TinyDB, Query
from flask import Flask, send_from_directory

We’ll go ahead and create our Dash object, which takes a Flask instance as an argument.

server = Flask('BitcoinMAvgs')
app = dash.Dash('BitcoinMAvgs-App', server=server, url_base_pathname='/dash/', csrf_protect=False)

Voilà! We have a Dash application. Pretty useless, I know, but all we need to do now is define our layouts and callbacks. For a comprehensive dive into Dash, head here.

The Dash layout describes what your app would look like and is composed of a set of declarative Dash components. On the other hand, callbacks make Dash apps interactive using Python functions that are automatically called whenever an input component’s property changes. There are core components and html components.

We’ll go ahead and start defining our layout. First, we define the root div:

app.layout = html.Div([...], className='main-div')

We can then add sub-components to this div

If you look at the final layout, you’ll notice the banner. We’ll go ahead and add that. Replace the ellipsis in the snippet above with the code below:

html.Div([html.H2("Bitcoin prices with moving averages")], className='banner'),

This creates a div, sets its styling class, and adds the header text.

The next thing we need to do is create the container for our graph, its title, and the corresponding components. Go ahead and add the snippet beneath the banner component.

html.Div([
    html.Div([
        html.H3("Price(USD) " + str(dt.datetime.now().date()))
    ], className='Title'),
    html.Div([
        dcc.Graph(id='bpi'),
    ], className='twelve columns bpi'),
    dcc.Interval(id='bpi-update', interval=12000)
], className='row bpi-row'),

A few explanations. As we’ve seen already, the divs and header are simply HTML components as we know them. But now we have two core components: dcc.Graph and dcc.Interval. The Graph component shares the same syntax as the open-source plotly.py library. The Interval component lets you update components on a predefined interval; in our case every 12 seconds (12000 milliseconds). Also note that these components have IDs. We’ll use these IDs when writing our callbacks.

Beneath the div that contains our graph, we’ll add three more components, all Dropdowns, each wrapped in its own div with a corresponding label.

html.Div([
    html.P("Period"),
    dcc.Dropdown(id='period',
        options=[{'label': i, 'value': i}
                 for i in ['5', '10', '15']], value='10')],
    style={'width': '31%', 'display': 'inline-block'}),

html.Div([
    html.P("Box plot Period"),
    dcc.Dropdown(id='box_plot_period',
        options=[{'label': i, 'value': i}
                 for i in ['5', '10', '15']], value='10')],
    style={'width': '31%', 'margin-left': '3%', 'display': 'inline-block'}),

html.Div([
    html.P("Time frame"),
    dcc.Dropdown(id='timeframe',
        options=[{'label': i, 'value': i}
                 for i in ['1', '2', '3', '4']], value='2')],
    style={'width': '31%', 'float': 'right', 'display': 'inline-block'})

These dropdowns will dynamically control the period for our averages, the grouping for our box plots, and the timeframe in hours for which to show the averages, respectively.

Having defined our layout, we can write what happens when we interact with it: callbacks.

@app.callback(
    Output(component_id='bpi', component_property='figure'),
    [Input(component_id='period', component_property='value'),
     Input(component_id='box_plot_period', component_property='value'),
     Input(component_id='timeframe', component_property='value')],
    [],
    [Event('bpi-update', 'interval')])

What we have here is simply the decorator for our interface, which we will define shortly, describing its inputs and output declaratively. The inputs are properties of particular components. Here, our inputs are the value properties of the components with the IDs period, box_plot_period, and timeframe (our dropdowns), and our output is the figure property of the component with the ID bpi (our graph).

Now, we can code up the method that loads the data from our datastore and returns the graph figure based on our arguments. Beneath our decorator, we’ll add the code below:

def get_bpi(N, bN, T):
    # The dropdowns hand us strings, so cast each argument to int first.
    N = int(N)
    bN = int(bN)
    T = int(T)
    curr_time = time.time()
    begin_time = curr_time - T * 60 * 60

    db = TinyDB('bpi.db')
    BPI = Query()
    data = db.search(BPI.updatedTime > begin_time)

First we need to determine what slice of the data to process, and that is controlled by the timeframe argument, T. We simply query the datastore for the last T hours of data. Now that we have the raw data, we want to extract the prices and build a friendlier form of each timestamp; we’ll show the hour and minute for each data point in our graph.
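With plain Python lists, the same time-window filter looks like this (a sketch with made-up records; TinyDB’s search does the equivalent against bpi.db):

```python
import time

curr_time = time.time()
# Hypothetical stored records at different ages.
records = [
    {'updatedTime': curr_time - 3 * 3600, 'currentPrice': 13800.0},  # 3 hours old
    {'updatedTime': curr_time - 30 * 60,  'currentPrice': 13900.0},  # 30 minutes old
    {'updatedTime': curr_time - 60,       'currentPrice': 13935.0},  # 1 minute old
]

T = 2  # timeframe in hours
begin_time = curr_time - T * 60 * 60
window = [r for r in records if r['updatedTime'] > begin_time]
print([r['currentPrice'] for r in window])  # only the records younger than 2 hours
```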

prices = []
dtime = []
for bpi in data:
    prices.append(bpi["currentPrice"])
    a = dt.datetime.fromtimestamp(int(bpi["updatedTime"]))
    dtime.append("%02d:%02d" % (a.hour, a.minute))
plen = len(prices)

We would like to build box plots for our prices, but remember that we have parameterized their grouping so we can chunk the prices into various slices: 10-minute, 15-minute, or as you wish.

boxtraces = []
names = []
for i in xrange(0, plen, bN):
    y = prices[i:i + bN]
    ind = i + bN - 1
    if (i + bN) > len(dtime):
        ind = len(dtime) - 1
    name = dtime[ind]
    names.append(name)
    trace = Box(y=y, name=name, showlegend=False,
                x=[i for j in range(len(y))])
    boxtraces.append(trace)

Some explanations here. We loop through our prices list in steps of bN, the box plot period parameter, and associate each slice with a label: the timestamp of the last datapoint in that slice. We then build a box plot graph object with trace = Box(…) for each slice and append all those objects to a list.
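To see what that slicing produces, here is the same logic on a toy list (hypothetical values, with bN = 3):

```python
prices = [10, 11, 12, 13, 14, 15, 16]
dtime = ['9:00', '9:01', '9:02', '9:03', '9:04', '9:05', '9:06']
bN = 3

chunks, names = [], []
for i in range(0, len(prices), bN):
    chunks.append(prices[i:i + bN])
    # Label each chunk with the timestamp of its last datapoint,
    # clamping the index for a short final chunk.
    ind = min(i + bN - 1, len(dtime) - 1)
    names.append(dtime[ind])

print(chunks)  # [[10, 11, 12], [13, 14, 15], [16]]
print(names)   # ['9:02', '9:05', '9:06']
```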

Now we can calculate and plot our first moving average: the SMA. The SMA is just as it’s called, simple. We take an average over the prices as we move across the data in N-length windows, where N is our period parameter.

SMA = []
if plen > N:
    # Pad with None so the SMA line starts at the Nth datapoint.
    SMA = [None for i in range(N - 1)]
    for i in xrange(0, plen - N + 1, 1):
        y = prices[i:i + N]
        sma = reduce(lambda x, y__: x + y__, y) / len(y)
        SMA.append(sma)

The Exponential Moving Average (EMA) is slightly different and has the following formula:

new EMA = (closing price - previous EMA) x (2 / (N + 1)) + previous EMA
where N = period
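Plugging numbers in (hypothetical values, with N = 10 so the multiplier is 2/11): a previous EMA of 13,900 and a closing price of 14,000 give (14000 - 13900) x 2/11 + 13900, roughly 13918.18; the new price pulls the average toward itself by the smoothing factor. As a one-step sketch:

```python
def ema_step(price, prev_ema, n):
    """One EMA update: move toward the new price with factor 2 / (n + 1)."""
    k = 2.0 / (n + 1)
    return (price - prev_ema) * k + prev_ema

print(round(ema_step(14000.0, 13900.0, 10), 2))  # 13918.18
```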

With that, we can go ahead and code up its implementation

EMA = []
if plen > N:
    # Pad with None, then seed the EMA with the SMA of the first N prices.
    EMA = [None for i in range(N - 1)]
    y = prices[0:N]
    avg = reduce(lambda x, y_: x + y_, y) / len(y)
    EMA.append(avg)
    for i in xrange(N, plen, 1):
        new_ema = ((prices[i] - EMA[-1]) * 2.0 / (N + 1)) + EMA[-1]
        EMA.append(new_ema)

Now to some plotly things.

Just as we did for our box-plot traces, we also create line plots using our SMA and EMA data.

trace = Scatter(
    y=SMA, x=[i for i in xrange(0, plen, 1)],
    line=Line(color='#42C4F7'), mode='lines', name='SMA'
)
trace2 = Scatter(
    y=EMA, x=[i for i in xrange(0, plen, 1)],
    line=Line(color='#32DD32'), mode='lines', name=str(N) + '-period-EMA'
)

Nothing spectacular here, we’re simply creating a Scatter object and passing data and styling properties.

With that done, we can create our plotly layout

layout = Layout(
    xaxis=dict(tickmode="array", ticktext=names,
               tickvals=[i for i in xrange(0, plen, bN)],
               showticklabels=True),
    height=450, margin=Margin(t=45, l=50, r=50)
)
traces = []
traces.append(trace)
traces.append(trace2)
boxtraces.extend(traces)
return Figure(data=boxtraces, layout=layout)

The names variable here is simply the list of our human-friendly timestamps, and since our box plots are grouped by bN, we use that to set our tickvals as well. After that, we create a Figure object instantiated with the traces and the layout.

And that’s it! What’s left is to reference our stylesheet the Flask way. Just copy the code below and place it beneath all the other code in app.py.

external_css = [
"https://cdnjs.cloudflare.com/ajax/libs/skeleton/2.0.4/skeleton.min.css",
"https://fonts.googleapis.com/css?family=Raleway:400,400i,700,700i",
"https://fonts.googleapis.com/css?family=Product+Sans:400,400i,700,700i"]

for css in external_css:
    app.css.append_css({"external_url": css})

css_directory = os.getcwd()
stylesheets = ['BitcoinMAvgs.css']
static_css_route = '/static/'

@app.server.route('{}<stylesheet>'.format(static_css_route))
def serve_stylesheet(stylesheet):
    if stylesheet not in stylesheets:
        raise Exception('"{}" is excluded from the allowed static files'.format(stylesheet))
    return send_from_directory(css_directory, stylesheet)

for stylesheet in stylesheets:
    app.css.append_css({"external_url": "/static/{}".format(stylesheet)})

This snippet of code assumes the local stylesheet BitcoinMAvgs.css is located in the same directory as app.py

Finally! We can run our application.

if __name__ == '__main__':
    app.run_server()

Start your Redis server if it’s not already running

$ redis-server

and execute these in separate shells

$ python worker.py
$ python scheduler.py
$ python app.py

Your application should be accessible via http://localhost:8050/dash/

There we have it. Dynamic moving averages of Bitcoin prices without writing a single line of Javascript. The complete code can be found on GitHub https://github.com/Viktour19/BitcoinMAvgs

Drop a comment if there are any thoughts on how to make this better.

Cheers!
