
Do You Trust Your Reports?

I want to spend a little bit of time talking about trusting your reports. There’s really nothing more important than being able to trust your reports.

To give you an example, I recently ordered a couple of items that were being shipped to me through UPS. I used the UPS mobile app, which has a report that shows the status of your items: whether each item has shipped, is in transit, and so on. It shows each of the steps along the way. So I go into this app and pull up the item. I can see the item is going to be delivered on the 31st of January, between 1:15 PM and 4:15 PM. I’m impressed by how fast that is, because this is the day before; I just ordered this item, and it’s going to be delivered the next day. Then I scroll down, and the shipment progress information says, “Shipper created a label. UPS has not yet received a package.” I already knew this was going to happen because it has happened before. I have used this vendor before, and this happens when this vendor uses UPS. For some reason, that combination causes this problem. Maybe it’s across the board, I’m not sure.

But when I look at this report, it’s telling me two different pieces of information. It’s saying that UPS doesn’t have the package yet, and it’s saying the package is going to be delivered tomorrow. So I’m immediately questioning which of those two pieces of information is correct, or whether both are. I simply lose faith in the report. I can’t trust it. And on top of that, I know from using this app historically that it is untrustworthy. Not 100% of the time; I would say 95% of the time it’s accurate. It could be a little delayed, and I’m able to deal with that and understand there’s a delay in the reported progress of these shipments. So it’s generally accurate, but I know that small percentage of the time, it’s not.

But the problem is that I don’t know, when I look at this app, whether I’m looking at that small percentage of the time that it’s wrong. I don’t know when to trust the app, which across the board reduces my faith in the overall UPS tracking system. This isn’t a knock on UPS; I had a shipment a couple of weeks ago via FedEx, and it was a very similar situation. The report I was getting within the app just didn’t make any sense given the progress of the shipment, in that case one I was sending out. So the point here is that it is so important that the people consuming reports, you, the business users, the process managers, are able to trust the reports that are helping you capture these value opportunities.

We’re going to break this down into two sections, and it’s all about why we do not trust reports. First, we’re going to figure out why we don’t have trust in a report to begin with. Second, if we did trust a report at one point, why do we lose faith in a report that we’re using?

So let’s start with: how do we gain trust in a report? We’re talking about this from a business user’s standpoint, not from the standpoint of a developer who understands all the little details that go into making these things happen. Business users are like me using the UPS app. I don’t really know all the details about how that tracking information is collected. It’s possible the report is fine and, simply because of my ignorance, I can’t tell.

My son and I like to race remote-control cars, and we’ve recently started back up again. Every time we do this, we end up breaking something, so we’re familiar with this parts-ordering situation. We’re ordering some parts from Traxxas. I don’t know if it’s Traxxas that reports the estimated delivery date to UPS; I suspect that’s not the case. I do know that Traxxas would have to submit the label to UPS, saying there’s a package ready for pickup, but I don’t know if it’s Traxxas estimating the delivery in an unreasonably quick amount of time or if it’s just a flaw in the UPS system, perhaps one that assumes that if it doesn’t know the delivery date, the delivery will be tomorrow. I don’t know, and there may be many more things about this that I don’t know. The point here is that from a business user’s standpoint, we don’t have the benefit of understanding all the details that go into pulling a report together. So how do we get to the point where we trust this report?

There are a few things that happen whenever we create reports. First, there’s often a business user or a subject matter expert of some kind who explains the value opportunity and helps define the requirements that go into the development of the report. These requirements could be extremely simple, or they could be extremely complicated. The person I’m talking about has the benefit of having been involved in the development of the report from day one. They understand the challenges that the source data presents. They understand what information is being pulled from which data source, and they understand the nuances that go along with that. They know when the report is going to be refreshed. If constant values or goals were provided to measure the source data against, they know where those are coming from. This person is really involved in the development process, and they know the level of validity of the report.

This is often the same person who will sign off on the report being accurate at some point. They might have received a development version of the report, entered data into a source system or a business application, then pulled data directly from that business application to validate the report. You can see that this person has a lot more experience with and understanding of what’s going on with the report itself, and that’s one way a person can gain trust in a report. They are part of the development process, part of the validation process, and part of the ongoing regression testing process. So they have faith in the report. They trust it. That’s one way that can happen.

If you’re not the person involved with the development of the report, how do you get to the point where you trust what you’re looking at? As an aside, why is this so important? It probably goes without saying, but there’s just nothing worse than looking at a report that is screaming a very important, very valuable piece of information at you, only for it to be dismissed because it’s not trusted. That is a catastrophe. That’s what we’re talking about. Even after you go through the work of building a report that helps someone capture value, it’s important that we don’t stop there. We’ve got to make sure they trust the information they’re looking at.

Back to this person who isn’t the subject matter expert on the report and wasn’t involved with developing it. How are they going to get to the point where they trust the report? I break this into two parts that I call advocacy and education. In an ideal scenario, somebody who is trusted in the organization will simply educate the others in the organization about the report.

This could be a very simple process or a very elaborate one; it just depends on what we’re building. Maybe this person calls a meeting where the report is explained so that everybody in the room, or everybody on the call, understands how the report is derived. They understand how the business application inputs, possibly entered by the very people in the room, end up manifesting themselves in this report. Basically, it gives everybody an understanding of how the report is created: what business rules are implemented, and what actions we expect people to take as a result of this report. It may not be a report that drives direct action; it could be a report that’s informative. But what do we expect to happen when someone views this report?

Taking the time to do that type of education is critical. Otherwise, people are going to wonder, is this just another unvalidated report that I don’t know if I can trust? That’s some of the baggage that comes along with people who have experience in this area: if they already know there’s a chance these reports may not be correct, they’re going to question all the reports going forward. So education is important. Advocacy goes along with it, and that’s just someone who’s trusted in the organization vouching for the report. For example, in our organization, LeapFrogBI, we have reports that help us monitor the business. One of the reports that I advocate is our load monitoring report. Every day of the week, every day of the year, we monitor hundreds of processes that we have built and put into production for our clients.

We have reports that are responsible for determining whether the bits of evidence we need, the evidence that tells us a process succeeded, were actually received, and whether we received that evidence within a certain time window. Daily, I advocate this report to our small team. What do I mean? Well, I go into Teams, our main communication platform, and I post a little screenshot of this report in the whole-team chat. It tells us our success rate. It tells us whether we have any failures, where those failures are, and who’s responsible for them.
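If you wanted to automate that kind of daily advocacy post, it doesn’t take much. Here’s a minimal sketch, assuming you have a Teams incoming-webhook URL for the channel; the process names, owners, and figures below are purely illustrative, and in practice they would come from your own monitoring system:

```python
import requests  # third-party HTTP library, assumed installed

# Placeholder URL; a real one comes from the channel's
# "Incoming Webhook" connector configuration in Teams.
TEAMS_WEBHOOK_URL = "https://example.webhook.office.com/webhookb2/..."

def post_load_summary(successes: int, failures: dict[str, str]) -> None:
    """Post a plain-text load-monitoring summary to a Teams channel.

    `failures` maps a failed process name to the person responsible.
    """
    total = successes + len(failures)
    rate = 100.0 * successes / total if total else 100.0
    lines = [f"Load monitoring: {rate:.1f}% success ({successes}/{total})."]
    for process, owner in failures.items():
        lines.append(f"FAILED: {process} (owner: {owner})")
    # Teams incoming webhooks accept a simple JSON payload with a "text" field.
    requests.post(TEAMS_WEBHOOK_URL, json={"text": "\n".join(lines)}, timeout=10)

# Illustrative call with made-up figures.
post_load_summary(successes=198, failures={"client_a_nightly_load": "Pat"})
```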

I believe that if your team sees someone trusted in your organization advocating for a report, others will begin to understand. They will believe this is an important report, a report this person trusts, and conclude, I should be able to trust this report too. I don’t want to say trust it blindly, because we all make mistakes. We all need to question things when they look abnormal. If we have any reason to question them, we need to question them, and we need to follow up on those questions seriously to make sure we can continue to trust something, because things do go wrong at times. I’m going to talk about why later.

Back to the business user who wasn’t part of building the report. Let’s say this person didn’t receive any education on the report, and there’s no advocacy they’ve been exposed to either. Basically, the report was just thrown out there in whatever reporting system is being used, and they stumbled across it. They’re thinking, here’s a report, and it looks like exactly the information I need. But they don’t know how it was created or what business rules were involved. On day one, they don’t have faith in the report.


Even with that lack of education, lack of advocacy, lack of involvement in the report, or lack of any understanding at all, through time that person might actually begin to trust the report. This is a much longer path than education and advocacy, but over time that person may begin to trust the report simply because they take it upon themselves to monitor it and figure out whether it’s giving them valid information, just through trial and error. They look at the report one day, it gives them some feedback, and they begin to question whether it is true.

They go and figure out whether it’s the right information. If it comes back right, now they have a data point: they can see the report was correct that day. The next day, they might go in and look at the report again and say, “Okay, well, this is interesting.” But they’re not going to fully trust the report after looking at it one time. They have no idea, again, how the report was created, and there are all sorts of ways a report could be right one day and not the next.

This process can go on for a long time. It’s not the way we want this to happen, because that person is basically wasting their time. With a little bit of information up front, a little bit of training, even a little bit of documentation, that person could begin using and trusting the report a lot more quickly. That’s better for everybody. They’re not wasting their time trying to figure out whether something is right or wrong for an extended period, and while they’re doing that, they’re probably losing out on some of the value opportunity we’re trying to capture. Nevertheless, it is possible for someone to drive themselves to trusting a report.

Let’s shift gears a little. We’ve talked about how someone might get to the point where they trust a report: the person is a subject matter expert involved in the report’s development, the person has some type of trusted advocate, the person has received some type of education on the report, or the person has simply validated the report over time. All of these paths can get someone to the point where they trust a report and are willing to use it. That’s the key here. They’re willing to use that report to influence whatever decisions are being made and actions are being taken. But it’s so easy to lose that trust. I guess it’s not unlike life in general: if you trust someone or something and it turns out that trust was misguided, it’s hard to get the trust back.

It’s the same with reporting. Not only is trust hard to get back, it’s easy to lose. One little thing that seems immaterial to one person can make another person completely lose faith in the reporting process. I’ll go back to the UPS app. I think that app gives me accurate information that I can understand and trust 95% of the time, maybe more. The problem is that because it has proven unreliable that small percentage of the time, I can’t know on any given day whether I’m looking at the 5% that’s wrong. I constantly have this overarching mistrust of the report I’m viewing. Trust is lost in a number of different ways, but the first reason for lost trust, I think, is the decision organizations make not to put in place the processes that actually monitor these important reporting solutions.

Consider a reporting process that’s collecting information from business applications, maybe collecting goals that executives have recorded, and tying together things that have other influences on them. When we pull data out of an application, that application has configurations, and those configurations can be altered. A configuration change can impact a report downstream, ending up putting inaccurate information in front of our business users. That’s one form of monitoring that needs to be in place: regression testing. We must have something that is a known good value, and after each process we run, we need to compare the results of that process to that known good value before we can say the process is complete. We can also put checks and balances in place that alert us if things are way out of whack compared to the prior day, however we want to measure that, so that someone can actively go figure out what’s going on.
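To make that concrete, here’s a minimal sketch of such a check in Python. The drift threshold and the idea of comparing today’s row count to a known good count from the prior run are illustrative assumptions; a real implementation would read both values from your load metadata:

```python
def check_against_known_good(table: str, today_rows: int, known_good_rows: int,
                             max_drift: float = 0.25) -> list[str]:
    """Compare a fresh load against a known good value from the prior run.

    Returns human-readable alerts; an empty list means the check passed.
    """
    alerts = []
    if today_rows == 0:
        alerts.append(f"{table}: loaded zero rows")
    elif known_good_rows and abs(today_rows - known_good_rows) / known_good_rows > max_drift:
        alerts.append(
            f"{table}: {today_rows} rows today vs {known_good_rows} known good "
            f"(more than {max_drift:.0%} drift)"
        )
    return alerts

# Illustrative use: a sudden 60% drop should trigger an alert.
print(check_against_known_good("fact_sales", today_rows=4_000, known_good_rows=10_000))
```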

Before the users find out there’s a problem, you can proactively tell them, “We’ve got something unexpected here. Hold off on using this report today.” Just doing that, telling users they shouldn’t use the report right now, adds trust. Going back to how trust is earned in reports, this is one way it’s earned: users know someone is making sure the report is accurate for them, and they’ve learned to trust that that person is going to do that job. That’s very important. In addition to that type of monitoring, where we’re watching for data anomalies and such, the other type of monitoring is even more basic: did the process succeed, or did the process fail?

Having done this for a long time, I can tell you that a lot of companies, I would even go as far as saying most companies, do not properly monitor these reporting processes. And there’s a lot that goes into this; don’t get me wrong, I’m not pointing fingers. But if you have a data warehouse in place, post-warehouse semantic processes being run, a reporting system being loaded with information, and reports being refreshed, there’s a lot of opportunity for things to go wrong at any point. We must have monitoring in place that tells us the things we expect to complete did complete, and that they completed within the time window we expect.
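A sketch of that basic completion check might look like the following. The process names, deadlines, and the completion-timestamp lookup are all hypothetical placeholders for whatever your scheduler or load log actually records:

```python
from datetime import datetime, timedelta

# Hypothetical expectations: each process must finish within this many
# hours after the start of the load day.
EXPECTED_DEADLINES = {
    "stage_source_extracts": timedelta(hours=4),
    "load_data_warehouse": timedelta(hours=6),
    "refresh_semantic_layer": timedelta(hours=7),
}

def check_completions(completed_at: dict[str, datetime],
                      day_start: datetime) -> list[str]:
    """Flag processes that never completed or missed their time window."""
    alerts = []
    for process, offset in EXPECTED_DEADLINES.items():
        deadline = day_start + offset
        finished = completed_at.get(process)
        if finished is None:
            alerts.append(f"{process}: no completion evidence received")
        elif finished > deadline:
            alerts.append(f"{process}: finished {finished:%H:%M}, "
                          f"after the {deadline:%H:%M} deadline")
    return alerts
```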

That type of monitoring is crucial. Otherwise, people are going to start looking at reports that are stale, from the prior day or the prior week, and wondering what they’re actually looking at.

We use Power BI a lot for our front-end reporting. In certain scenarios, the reporting tool will tell you that the data set underlying a report was refreshed today. It says right there in the report, if the user takes the time to read it, that the data set was refreshed as of this date and time.

So now you may think, the data set was refreshed; this report is current. Not so fast, because that data set may have been refreshed on top of a data warehouse that wasn’t refreshed. Now we’re in a really bad situation, right? The user is thinking, the date on this report tells me it’s refreshed, but I can look at this report and tell this is not new information. There’s no way this is current. So tomorrow, when that person goes in, looks at the report, and sees the refresh date say it was just refreshed, even if this time everything upstream really did refresh correctly, they’re wondering, “Well, yeah, I know it says that, but yesterday it said that too.” They’ve now got a hard-to-overcome mistrust of the report. That can be easily taken care of with monitoring.
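One way to guard against this, sketched below under the assumption that your warehouse load writes a watermark timestamp somewhere you can read, is to treat a data set refresh as meaningful only if the layer beneath it is also fresh:

```python
from datetime import datetime, timedelta

def refresh_is_meaningful(dataset_refreshed_at: datetime,
                          warehouse_loaded_at: datetime,
                          max_warehouse_lag: timedelta = timedelta(hours=24)) -> bool:
    """True only if the data set refreshed on top of a freshly loaded warehouse.

    A data set can "refresh" successfully over stale warehouse data, which is
    exactly the trap described above, so check both layers.
    """
    warehouse_is_fresh = (datetime.now() - warehouse_loaded_at) <= max_warehouse_lag
    return warehouse_is_fresh and dataset_refreshed_at >= warehouse_loaded_at
```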

At LeapFrogBI, one thing we do is monitor every production process we put in place, every single day of the year, sometimes multiple times a day, across the board. That’s so important. Whether you do it yourself or have someone else do it for you, it is critical that monitoring be put in place.

Another reason people lose faith in these reports, and this one is probably well known, is conflicting information. This can happen in so many ways. Going back to the UPS example: I bought some parts from Traxxas, the remote-control car manufacturer, and I go to their website and look at my order. It says, “Shipped. Here’s the tracking number.” I’m like, “Okay. Great, it’s shipped. Got the tracking number. Wonderful.” But then, when I go to the UPS app, it says they haven’t received the shipment yet.

I somewhat understand what’s going on there, but the point is that’s conflicting information. In other cases, you might be looking at two different reports. Let’s talk about sales, since that’s easily described. Say you’re looking at an executive report, and it tells you that month-to-date sales are $20,000. That’s the high-level number. Then you go in and start looking at the details, because you want to know which segments of your business make up that $20,000. You add up all the different segments that should make up that $20,000, and it only comes to $15,000.

So we have two pieces of information, and they conflict. How can the executive report say month-to-date sales are $20,000 when my detailed report is telling me there’s only $15,000 in sales? Not much can damage trust more than conflicting information in these reports, and it is so easy to let this happen. It could happen because one report is out of date. It could happen because business rules changed and one report was updated but another was not. It could happen because we’ve got a bunch of point solutions out there rather than an integrated data solution such as a data warehouse, or even an ODS for that matter, just disparate systems with no way of staying in sync. There are a lot of ways this can happen, but it is critical that it doesn’t. We’ve got to keep the information being placed in front of business users consistent.
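A simple reconciliation check can catch this kind of mismatch before a user does. Here’s a sketch using the figures from the example above; in practice, the summary number and the detail rows would come from the two reports’ underlying queries:

```python
def reconcile(summary_total: float, detail_amounts: list[float],
              tolerance: float = 0.01) -> str | None:
    """Compare an executive summary figure to the sum of its detail rows."""
    detail_total = sum(detail_amounts)
    if abs(summary_total - detail_total) > tolerance:
        return (f"Mismatch: summary reports {summary_total:,.2f}, "
                f"but detail rows add up to {detail_total:,.2f}")
    return None  # the two reports agree

# The example from the text: a $20,000 summary vs $15,000 of detail.
print(reconcile(20_000.00, [6_000.00, 5_000.00, 4_000.00]))
```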

Another way trust can be lost is complexity. Sometimes you look at a report and you’re dazzled by all the colors, the cool-looking visualizations, and the acronyms. It can be a work of art combined with a circus in one report. I’ve seen them; frankly, I’ve developed them in the past too, thinking, “I can put all this information in one report, and it’s all right here.” It’s all right there, but it’s so darn complicated that no one is ever going to take the time to understand what they’re looking at. That complexity causes trust to be lost.

It’s much better to have a report with one word and one number on it that someone can trust than an all-in-one report that no one is ever going to read or understand. One measure and one number, that’s it; anyone can understand that. It doesn’t have to be that simple, I know I’m going to the extreme here, but that’s something people can understand. You can put that on a graph, right? You can graph it through time and show how sales have looked over time. People understand that. When you keep going and going, trying to show not only sales but where the sales came from, and you give people the ability to change the scope of those sales with a bunch of different filters that interact with the visuals in different ways, it gets really complicated. So keeping the report simple from a design perspective is critical.

Complexity from a business rules perspective is a little more challenging. These reports can get complicated. One of the reasons we build reports is to spare business users from the complicated process of data integration, rule application, and transformation. They can pull the report up and trust it; they don’t have to do all that work. But all that work is complicated, and when you look at a report you don’t understand, even a well-designed one, you’re going to question it. Your trust in the numbers you’re viewing is going to be diminished to some extent. So how do you overcome this? It’s hard, and it takes time.

It’s the same things we talked about before: advocacy, education, possibly involvement in the report’s development, and then simply gaining experience with the report over time. All of those help, but another thing I do is put a glossary directly in the report. I try to explain in text what the report is doing, not from a technical standpoint, because no one cares about that. The people using the report don’t care about the method used to perform the transformations. What they care about, from a business perspective, is what the report tells them: How was this thing created? What’s excluded from this report? Who should use it, and how should it be used? Those things go a long way toward helping the consumer of a report overcome a lot of complexity.

One more area where I think we lose trust in reports is failing to maintain them. I touched on this before, but a report is not something you put out there for people to consume and then treat like the old rotisserie oven: set it and forget it. That is not what you do. In my view, and in our practice, reporting is a business function. Just like HR, just like finance, just like operations, reporting and business intelligence make up a business function that needs to be maintained like all other business functions. It needs to grow over time. It needs to be adjusted to meet your changing business needs. It is a business function.

And if you treat a report like that old rotisserie oven, set it and forget it, believe me, people are going to forget it. People will forget the report because it is just a matter of time before it becomes untrusted, and rightly so. Your source systems are going to change. Your configurations are going to change. Your business rules are going to change. The value opportunity is going to change. The people accessing the report are going to change. If you just let the report sit there and go stale, assuming people are accessing it without checking whether it’s still valid, it won’t be long before that unmaintained report is not trusted, and it should not be trusted.

It’s super critical, in addition to monitoring the report, in addition to providing that glossary so people can understand it, in addition to making sure there aren’t two versions of the truth out there with this report not matching another one it should match, in addition to all of those things, that the report is maintained. Do not set it and forget it, or the report will do nothing for capturing value opportunities. In fact, it could diminish the value opportunity that can be captured.

I am passionate about this one. It’s so important, as we build Reportopia, that we don’t just throw reports out there and expect people to trust them, start consuming the information, and make better decisions overnight. That’s not going to happen. We build the reports that can meet the need, and then we help people trust and use those reports. Super critical.


 
