Open Source Analytics In A Beautiful Dashboard? - Part 2
Picking up where we left off in the last post, we’re going to do some housekeeping. Namely, I’m going to frame this in the context of a Docker container and ditch the virtual environment… If you don’t care about dockerizing this application, that’s fine, just skip ahead a bit. This will make the Postgres addition a cakewalk later on, and remove the dependency of having Python installed on your machine… though it adds a Docker requirement instead (can’t win all your battles). So let’s add a few files to the root level, namely the Docker files.
|-- Dockerfile
|-- README.md
|-- docker-compose.override.yml
|-- docker-compose.prod.yml
|-- docker-compose.yml
|-- requirements.txt
|-- run.py
|-- venv
`-- webapp
The Dockerfile simply lets us define how to build the Dash app, so we just copy over the installation instructions from before. I’m going to include a production-grade server as well, Green Unicorn (gunicorn), but you can use something else if you like. To generate the requirements.txt, you can use pip freeze > requirements.txt from your old virtual env (or just use the one I’ve provided in the repo). With the requirements in hand, our Dockerfile should look something like the sketch below.
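The base image, exposed port, and the run:server entry point here are assumptions on my part, so adjust them to your setup:

```dockerfile
# A minimal sketch -- base image, port and entry point are assumptions
FROM python:3.7-slim

WORKDIR /app

# Install the pinned dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application code
COPY . .

EXPOSE 8050

# Production-grade server (gunicorn); this gets overridden for local development
CMD ["gunicorn", "--bind", "0.0.0.0:8050", "run:server"]
```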
I won’t go into too much detail about how the above works, just accept that it does for now. Now let’s add the compose files. The base docker-compose.yml has the template for our application. It looks something like the sketch below.
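The service names and image tags are assumptions rather than exactly what’s in the repo:

```yaml
# Base compose template -- service names and image tags are assumptions
version: "3"

services:
  dash:
    build: .
    image: dash-app
    ports:
      - "8050:8050"
    depends_on:
      - postgres
      - redis

  postgres:
    image: postgres:11

  redis:
    image: redis:5
```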
which you can see has a reference to the name of our Dash image (which we will build), postgres (the relational db we’ll be using) and our caching server, redis. For local development (what we’re doing right now) I’ve included an override to add our local dev settings. So in docker-compose.override.yml we see our local settings, sketched below.
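The mount path inside the container is an assumption here:

```yaml
# Local development override -- the container mount path is an assumption
version: "3"

services:
  dash:
    # Use the plain dev server instead of gunicorn
    command: python run.py
    volumes:
      # Bind the code on our machine to the code in the container
      - .:/app
```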
Here we’re overriding the command in the Dockerfile with the old python run.py we’ve been using to start the server, and binding the code on our machine to the code in the container. This means that when we change our code, the code in the container changes as well (even while it’s running). Now we can safely remove our venv and .env (you can keep them around if you’re more comfortable with them, just don’t forget to add them to your .dockerignore), and start our local dev server with docker-compose up (docker-compose is bundled with Docker for Windows; if you’re on a Unix system, I’m fairly sure you need to get it separately). The last thing you’ll need to update is run.py, to change the bind.
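Something like this should do, assuming the module layout from the project tree (webapp exposing app) and port 8050:

```python
# run.py -- a sketch; the webapp import and port are assumptions
from webapp import app

# Expose the underlying Flask server so gunicorn can target run:server
server = app.server

if __name__ == '__main__':
    # Bind to 0.0.0.0 so the dev server is reachable from outside the container
    app.run_server(host='0.0.0.0', port=8050, debug=True)
```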
Let’s get back to the app. The next short goal is to implement a selector for the vendor, and have it change what spending we show for that vendor. I’ve updated my load_data method to load all of the monthly csv files I got from the bank, since we’ll add the db connection piece later. I’ve also added a simple method to place the vendor data into a format Plotly likes (which follows {'label':'This is shown as an option','value':'this is value of that option'}).
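A sketch of what those two pieces of data.py might look like, where the file paths and column names are assumptions:

```python
# data.py -- a sketch; file paths and column names are assumptions
import glob
import pandas as pd


def load_data():
    """Load every monthly csv export from the bank into one DataFrame."""
    frames = [pd.read_csv(path) for path in glob.glob('data/*.csv')]
    df = pd.concat(frames, ignore_index=True)
    df['date'] = pd.to_datetime(df['date'])
    return df


def get_vendor_options(df):
    """Return the vendors in the {'label': ..., 'value': ...} format the
    Dropdown component expects; the value is the vendor's index."""
    vendors = sorted(df['vendor'].unique())
    return [{'label': vendor, 'value': i} for i, vendor in enumerate(vendors)]
```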
This vendor method simply gives us all the selections that will work with the loaded data, so to add the options to the dashboard we go to index.py.
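A sketch of the relevant layout, where the imports and module paths are assumptions (VENDOR_SELECTOR_ID and OUTPUT_GRAPH_ID are the IDs the callback will reference below):

```python
# index.py -- a sketch of the layout; module paths and IDs are assumptions
import dash_core_components as dcc
import dash_html_components as html

from webapp import data

VENDOR_SELECTOR_ID = 'vendor-selector'
OUTPUT_GRAPH_ID = 'output-graph'

df = data.load_data()

layout = html.Div([
    dcc.Dropdown(
        id=VENDOR_SELECTOR_ID,
        options=data.get_vendor_options(df),
        multi=True,
    ),
    dcc.Graph(id=OUTPUT_GRAPH_ID),
])
```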
This adds the options to the dashboard, but it currently doesn’t do anything to the page. So let’s add a callback for some action. First we’ll update our data manager with a method to filter for our selection. You’ll see that the values of the dropdown element will always come back as a list of ints ([int]), so we’ll plan for an index match to see what’s included. Also, I realized we aren’t filling null dates at this point, and I think the graph looks better with the 0’s included. So let’s make the following edits to data.py.
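Roughly along these lines; the column names and the daily frequency are assumptions:

```python
# data.py addition -- a sketch; column names and daily frequency are assumptions
def get_spending(df, vendor_indices=None):
    """Sum spending per day for the vendors picked in the dropdown (by index),
    reindexing over the full date range so null dates come back as 0."""
    vendors = sorted(df['vendor'].unique())
    if vendor_indices:
        selected = [vendors[i] for i in vendor_indices]
        df = df[df['vendor'].isin(selected)]

    daily = df.groupby('date')['amount'].sum()

    # Fill the missing dates with 0 so the graph has no gaps
    full_range = pd.date_range(daily.index.min(), daily.index.max(), freq='D')
    return daily.reindex(full_range, fill_value=0)
```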
Now our callback is simple to create; we just have to update the figure we’re showing. So let’s create a file called callback.py and include the one callback.
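A sketch, where the module imports and helper names are assumptions (the decorator pattern itself is standard Dash):

```python
# callback.py -- a sketch; module imports and helper names are assumptions
import plotly.graph_objs as go
from dash.dependencies import Input, Output

from webapp import data
from webapp.app import app
from webapp.index import OUTPUT_GRAPH_ID, VENDOR_SELECTOR_ID

df = data.load_data()


@app.callback(
    Output(OUTPUT_GRAPH_ID, 'figure'),
    [Input(VENDOR_SELECTOR_ID, 'value')],
)
def update_graph(vendor_indices):
    # Re-filter the data for the selected vendors and rebuild the figure
    spending = data.get_spending(df, vendor_indices)
    return {
        'data': [go.Scatter(x=spending.index, y=spending.values)],
        'layout': go.Layout(title='Spending over time'),
    }
```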
The syntax here might be a little weird for you, but we’re adding a callback to the Dash app (app) which will trigger when there’s a change to the value property of the HTML element with id=VENDOR_SELECTOR_ID. The name of the function isn’t important, since whatever function sits directly below the decorator gets called. In our case we’re updating the figure property of the HTML element with id=OUTPUT_GRAPH_ID. Lastly we just have to include this callback in the webapp by importing it in the __init__.py file.
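That import can be as small as this (the module names are assumptions):

```python
# webapp/__init__.py -- a sketch; module names are assumptions
from webapp.app import app

# Importing the module is enough to register the callback with the app
from webapp import callback  # noqa: F401
```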
Now let’s have a look at what we have so far. My dashboard is looking something like:
The last thing we’ll add in this part of the tutorial is another selector, for the date, so we can easily change the bounds of the graph. Using the re-indexing trick from earlier, this is a cakewalk: we simply need to add the method arguments and the data will bound itself.
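The updated filter method might look something like this (the parameter names are assumptions):

```python
# data.py -- get_spending with date bounds added (a sketch; parameter names are assumptions)
def get_spending(df, vendor_indices=None, start_date=None, end_date=None):
    vendors = sorted(df['vendor'].unique())
    if vendor_indices:
        selected = [vendors[i] for i in vendor_indices]
        df = df[df['vendor'].isin(selected)]

    daily = df.groupby('date')['amount'].sum()

    # Reindexing over the requested range both fills missing days with 0
    # and bounds the data to the selected dates
    start = start_date or daily.index.min()
    end = end_date or daily.index.max()
    full_range = pd.date_range(start, end, freq='D')
    return daily.reindex(full_range, fill_value=0)
```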
Next up, let’s throw the HTML into index.py with the handy Dash component.
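Dash ships a DatePickerRange component that fits here; a sketch of the updated layout (the DATE_SELECTOR_ID constant is an assumption):

```python
# index.py -- layout with the date selector added; DATE_SELECTOR_ID is an assumption
DATE_SELECTOR_ID = 'date-selector'

layout = html.Div([
    dcc.Dropdown(
        id=VENDOR_SELECTOR_ID,
        options=data.get_vendor_options(df),
        multi=True,
    ),
    dcc.DatePickerRange(
        id=DATE_SELECTOR_ID,
        start_date=df['date'].min(),
        end_date=df['date'].max(),
    ),
    dcc.Graph(id=OUTPUT_GRAPH_ID),
])
```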
Now for the callback. Since we’ve already updated the data method, all that’s left is to add the new inputs to the callback.
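The updated callback might look like this (same assumptions as before):

```python
# callback.py -- updated with the date inputs added (a sketch)
from webapp.index import DATE_SELECTOR_ID


@app.callback(
    Output(OUTPUT_GRAPH_ID, 'figure'),
    [
        Input(VENDOR_SELECTOR_ID, 'value'),
        Input(DATE_SELECTOR_ID, 'start_date'),
        Input(DATE_SELECTOR_ID, 'end_date'),
    ],
)
def update_graph(vendor_indices, start_date, end_date):
    spending = data.get_spending(df, vendor_indices, start_date, end_date)
    return {
        'data': [go.Scatter(x=spending.index, y=spending.values)],
        'layout': go.Layout(title='Spending over time'),
    }
```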
That ends this part of the tutorial; next time we’ll add more views of the data and give ourselves the ability to drill down.