Ryan Sleeper
Ryan shares the Decision-Ready Dashboard Framework, Playfair Data’s strategic approach to building world-class data visualization tools. By the end of this video, you’ll be familiar with the four phases – Discovery, Data Prep, Dashboard Development, and Distribution – and be able to use those steps to build better dashboards.
Hi, this is Ryan with Playfair Data TV, and in this video, we’re going to be talking about a strategic framework that we use on the consulting side of our business at Playfair Data called the Decision-Ready Dashboard Framework. This is really the bread and butter of Playfair Data. When we do a new client project, this is what we follow. I presented this to somebody as recently as Monday afternoon. This is literally what we do on a day-in, day-out basis.
This is how we stay in business. It’s this Decision-Ready Dashboard Framework. That being said, as meaningful and important as this framework is to our company, I’m going to share this with you. And we’re going to get to the end of it, and you’re going to think to yourself, that’s it? That’s all they’re doing over there? And that’s great.
There’s a principle that I’m really a fan of called Occam’s Razor. Perhaps you’ve heard of it. It states that the simplest solution is usually the best one. And this is kind of our Occam’s Razor. There’s nothing complicated about this. Anybody can do it. But these are what I have observed to be the steps required to create an effective visualization as efficiently as possible.
It has four phases: Discovery, Data Prep, Dashboard Development, and Distribution. And each of those four phases has four steps within it. Starting with Discovery, here’s vital question number one: who is the audience? Who are we building this for? The difference this time is, once I know who the audience is, we do something really crazy: we talk to them.
We do stakeholder interviews. We’re trying to make a connection. We’re trying to figure out, what problem are they trying to solve? Are there any ways of solving that problem that they particularly like? Are there ways that they don’t like?
But the main thing we’re trying to derive from those conversations is the set of objectives for the project and within the business. Once I know what those objectives are (we already walked through this with the example of improving customer satisfaction by 25%), I have some tangible data to work with and seek out.
Once I know what the objective is, now you’re giving me something to work with. I can go figure out which fields, which measures and dimensions, I’m going to need in order to see whether we’re tracking well toward meeting those objectives in the business.
The second phase is Data. Once I know what the fields are, I make a little data schema that just tells me where those data fields reside. This can look very different depending on the project: it could be a single Excel file, all the way up to one field spread across 10 different data sources that we have to figure out how to consolidate.
But we need to figure out, does that data even exist? And that can be an interesting exercise, actually. Sometimes we find out it doesn’t. Let’s say our objective really was to increase customer satisfaction by 25%. If we don’t know how many customers are satisfied, we’ve got to do something different. We might have to install a survey on our website or run some focus groups. We need to create that data; it doesn’t exist yet.
Once we list where the data sources reside, assuming that we’ve got the data, we do what’s called identifying the keys. That’s a data engineering term for columns the sources have in common. If data sources share key fields, you’re able to consolidate them by either joining them together or doing unions.
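To make that concrete, here’s a minimal sketch in Python with pandas; the field names and values are hypothetical, not from any real project, and they show both ways of consolidating sources that share a key:

```python
import pandas as pd

# Hypothetical sources: a satisfaction score lives in a survey export,
# a segment field in a CRM export. "customer_id" is the key they share.
surveys = pd.DataFrame({"customer_id": [101, 102, 103],
                        "csat_score": [4, 5, 3]})
crm = pd.DataFrame({"customer_id": [101, 102, 103],
                    "segment": ["Enterprise", "SMB", "SMB"]})

# Join: line the sources up side by side on the shared key.
joined = surveys.merge(crm, on="customer_id", how="left")

# Union: stack sources with the same columns on top of each other,
# e.g. this quarter's survey file and last quarter's.
last_quarter = pd.DataFrame({"customer_id": [98, 99],
                             "csat_score": [5, 2]})
unioned = pd.concat([surveys, last_quarter], ignore_index=True)
```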
And that leads me to the third and most important step within Data, which is to shape that data. We’ll talk about this a little bit more in the next framework, and we actually touched on it Monday morning, but there’s a specific way to shape that data that’s going to work best with Tableau.
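The shape itself is covered in the next framework, but the general rule is that Tableau works best with “tall” data: one row per observation, with a measure in its own column rather than spread across a cross-tab. A minimal sketch of that unpivoting step in pandas, with hypothetical data:

```python
import pandas as pd

# Hypothetical cross-tab export: one column per month.
wide = pd.DataFrame({"region": ["East", "West"],
                     "2024-01": [100, 80],
                     "2024-02": [120, 90]})

# Unpivot to the tall shape Tableau handles best:
# one row per region per month, with a single sales measure.
tall = wide.melt(id_vars="region", var_name="month", value_name="sales")
```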
Once we’ve engineered that data (that’s what this process is called: data engineering), we do our first of two quality assurance checks. At this stage, we’re validating the data, making sure we have the right grain, or level of detail, so that the data goes down to the detail we want in our analysis, and making sure we’re not double counting things. And we’re doing this so that we build some trust in the data source before we even open it in Tableau.
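As a sketch of what that first QA check can look like in code, again with hypothetical data, here are a couple of quick grain and double-counting tests in pandas:

```python
import pandas as pd

# Hypothetical data: order A-1 has two line items, so the grain is
# one row per line item, NOT one row per order.
orders = pd.DataFrame({"order_id": ["A-1", "A-1", "A-2"],
                       "product": ["Chair", "Desk", "Lamp"],
                       "sales": [50, 200, 30]})

# No exact duplicate rows sneaked in during the joins.
assert not orders.duplicated().any(), "exact duplicate rows found"

# Confirm the grain before analysis: is order_id unique per row?
print(orders["order_id"].is_unique)   # False -> line-item grain

# Guard against double counting: counting rows would overstate orders.
print(len(orders))                    # 3 line items
print(orders["order_id"].nunique())   # 2 distinct orders
```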
And just a quick aside: I don’t want to downplay this step, because it can look very different depending on your project. You may not need to do any data engineering at all. It might be an export from Google Analytics that comes out perfectly, and you can connect to it and start building something.
But at a lot of companies, this is somebody’s entire job, and it really does get very complex. You’re trying to look across different data sources and figure out whether there are opportunities to consolidate them so that everything runs more efficiently. So it’s a case-by-case basis. We don’t go into too much detail here because we could easily fill a 12-hour course on data engineering alone.
Notice that we are now halfway through our most commonly used framework, and we haven’t even opened Tableau yet. I’ve heard a common complaint that data engineering is one of those things in analytics that follows the 80-20 rule, where 80% of the time is spent doing the data prep.
That only leaves 20% of the time to actually do what’s just as important, if not more important, which is to analyze the data and create some kind of action from that data. Our goal going into a project is to flip that ratio. We always like to say we try to measure twice, cut once.
We feel like if we do enough proper planning, we can get the data engineering down to 20% of the time, which leaves 80% of the time for the actual analytics: analyzing the data, coming up with insights, making recommendations, and making it look good and engaging. That’s what we try to spend 80% of the time on. But in the real world, it plays out closer to 50-50 for us most of the time, split between the data engineering and the dashboard engineering.
But once we are ready to move into the actual data visualization and creating dashboards, we create an initial concept. And this will be one of my biggest tips for you throughout the course. I have learned the hard way to not invest a lot of time in the initial concept.
I always love to share this anecdote. I’ve built hundreds of Tableau dashboards. I wrote three books on the topic. It doesn’t matter. I have never to this day created a single dashboard that everyone thought was perfect. It’s not going to happen for you. Just embrace that. Do not let that get to you. Do not take it personally.
I actually kind of like that I have a little bit of an expressive or artsy side. And I do think of data visualization as an art form. And as they say, beauty is in the eye of the beholder. People are just going to have a different opinion about what you’re building.
So create an initial concept, but don’t spend a lot of time on it. In fact, we’ve evolved to using tools completely outside of Tableau for this initial concept, everything from literally sketching on a piece of paper to software tools we like, such as Figma and Adobe XD, or the rest of the Adobe Creative Suite. But we’ll build a concept before building it out with real data in Tableau, because the second step is to gather feedback. And there will be feedback.
If you’ve got more than one stakeholder (as in, anyone besides yourself), they’re going to have an idea of how to make it better, which is great. That’ll help sharpen the tool. So embrace it. Build it into the process. Get their feedback. Then we finalize the concept.
This is also the one step in this framework where there can be a little bit of an iterative loop. What I mean by that is we don’t really expect to build an initial concept, get their feedback once, go build it in Tableau, and then it’s just perfect, and everybody says, yep, you nailed it. There’s just going to be something different when it gets translated to Tableau.
So usually there’s one more iteration: once we build it out in Tableau and let the client review it again, they’ll give us some more feedback, but it really is usually just one more round, and very rarely more than two. So it’s still very efficient. My theory on why that works so well is that it’s a human nature thing.
When you include the stakeholders in the process, they have a little more skin in the game. They’ll remember the types of feedback they provided, and as long as that feedback is incorporated, they’ll be more likely to receive whatever you built. They’ll say, oh yeah, I told you to change that to a line graph. That was a great idea. It looks awesome. Now I’m going to use this. It just seems to be how it plays out.
The last step on the Dashboard side is our second and final quality assurance check. This is an opportunity to validate the data one more time, similar to the quality assurance step at the end of the Data phase. But we’re also checking a couple of additional things at this point. We’re trying to make sure the filters and the user experience work how we expected.
When I click West in the Region filter, did it actually filter to West? Things like that. We’re also double-checking whether any of our calculated fields are off. Maybe we have an issue like the profit ratio written the wrong way, as I showed you yesterday. If something is super inflated, we need to go back to the drawing board, rewrite that calculated field, and make sure it’s correct.
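The transcript is referring back to a demo from the previous session, but a common version of this mistake in Tableau is writing the calculation as SUM([Profit] / [Sales]), summing row-level ratios, instead of SUM([Profit]) / SUM([Sales]), aggregating first and then dividing. A quick sketch of that check with hypothetical numbers:

```python
import pandas as pd

rows = pd.DataFrame({"sales": [100.0, 10.0],
                     "profit": [20.0, 8.0]})

# Wrong way: sum the row-level ratios. 0.20 + 0.80 = 1.00, a super
# inflated 100% profit ratio -- the kind of thing this QA step catches.
wrong = (rows["profit"] / rows["sales"]).sum()

# Right way: aggregate first, then divide. 28 / 110 is about 25.5%.
right = rows["profit"].sum() / rows["sales"].sum()

assert right < wrong  # the inflated version fails a sanity check
```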
Once we’ve done the second quality assurance step, we’re ready to distribute the dashboard. The options go roughly in order from most sophisticated to least sophisticated. Starting with the enterprise versions of Tableau, we’ve got Tableau Server and Tableau Online. If your company has one of those and you have a Creator license, you’ll be able to publish to it.
I think I mentioned this on day one, but just a reminder: Server and Online have almost the exact same capabilities, but Server is hosted by your own company, while Tableau Online is hosted by Tableau. And to get more specific than that, I happen to know it’s hosted on Amazon Web Services, so that’s where all your dashboards would live.
And then there’s Tableau Public, which is often called the world’s largest installation of Tableau Server. Again, it’s technically the same functionality, but the difference is that everything is saved to the public web where anyone can see it. You can also distribute packaged workbooks.
There is a free Tableau product called Tableau Reader. It acts very much like a PDF reader: just as you can save a Word document as a PDF and send it to somebody who doesn’t have a Microsoft license, you can send a packaged workbook to somebody who can open it in Tableau Reader. It’s actually a very capable tool, too. Any of the interaction capabilities that you build in, like filters, will carry over into that free version of the software.
It was extremely popular 10 years ago, and I think most companies have evolved past it, but it is still around, and it’s still a good tool. If your company is relatively new to Tableau, this is a good way to make sure people are going to adopt it before you sign up for bigger licenses.
You could have one Creator license, build everything, package it, and send it to people. They could open it in the Reader product and start adopting it, and then you move on to something better. Then we’ve got PDFs. Any dashboard can be exported or printed to PDF, and I actually find that this option helps drive adoption. So there’s no shame in using this one.
Tableau Server and Tableau Online, if you’ve used them before, can be a little burdensome for a new user. And depending on my audience, especially if it’s a C-level executive, I might not want to bother them with trying to locate where my workbook is on the Tableau Server, because they have to follow a link.
They might have to search for a certain project. They have to remember their credentials. There are just all these barriers to adoption. Versus a PDF: I can print that out, maybe even circle a couple of things, write a couple of notes (we saw this, we should do this), and put it on the CEO’s desk. It removes all the barriers to adoption, and they’re going to have the best possible chance of seeing your work.
And along those same lines, we’ve also got images, still no shame in this one. You can actually export any sheet or dashboard as an image in Tableau. You can also just take a screenshot of it and put it into a PowerPoint deck or put it into an email, so it’s another way to drive adoption.
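If you ever need to automate these PDF and image exports rather than doing them by hand, one option is Tableau’s official Python library, tableauserverclient. Here’s a minimal sketch; the server URL, credentials, site, and view name are hypothetical placeholders:

```python
import tableauserverclient as TSC

# Hypothetical server, site, and credentials.
auth = TSC.TableauAuth("username", "password", site_id="mysite")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    views, _ = server.views.get()
    view = next(v for v in views if v.name == "Executive Summary")

    # Export the view as a PDF for printing and marking up...
    server.views.populate_pdf(view)
    with open("dashboard.pdf", "wb") as f:
        f.write(view.pdf)

    # ...and as a PNG image for decks or emails.
    server.views.populate_image(view)
    with open("dashboard.png", "wb") as f:
        f.write(view.image)
```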
Obviously, I’d be remiss if I didn’t mention security: a picture can just be emailed around, so you’ll want to be sensitive to that. But I think all of these are good options for distributing whatever you’ve built. This has been Ryan with Playfair Data TV. Thanks for watching.