Reporting tools have become quite capable, and companies buy them expecting the purchase to solve their reporting challenges. But is that really the case? Are these tools truly up to the job? This is a conversation we have frequently with clients, and we hope it may benefit you. You can learn more about […]
Design patterns greatly speed development. The DP3000 & DP3100 design patterns create a data flow that tracks the current version of records in PSA based on a selected key field. The result is a new field in PSA, called isCurrent by default, which identifies the current PSA record with a value of 1. At times […]
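The idea behind the isCurrent flag can be sketched in a few lines. This is a minimal illustration, not the DP3000/DP3100 implementation itself: the sample rows, the `customer_id` key, and the `loaded_at` timestamp field are all assumptions made for the example.

```python
from datetime import datetime

# Hypothetical PSA rows: each has a business key, a load timestamp, and a payload.
psa_rows = [
    {"customer_id": 1, "loaded_at": datetime(2023, 1, 1), "name": "Acme"},
    {"customer_id": 1, "loaded_at": datetime(2023, 6, 1), "name": "Acme Corp"},
    {"customer_id": 2, "loaded_at": datetime(2023, 3, 1), "name": "Globex"},
]

def flag_current(rows, key_field, ts_field="loaded_at"):
    """Set isCurrent = 1 on the most recently loaded row per key, 0 on all others."""
    latest = {}
    for row in rows:
        key = row[key_field]
        if key not in latest or row[ts_field] > latest[key][ts_field]:
            latest[key] = row
    for row in rows:
        row["isCurrent"] = 1 if latest[row[key_field]] is row else 0
    return rows

flag_current(psa_rows, "customer_id")
```

After the pass, only the latest row for each `customer_id` carries isCurrent = 1, which is the property a downstream dimension load relies on when it reads "current" records out of PSA.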
Many organizations have built data warehouses successfully, and some have failed. There’s no reason for you to learn the hard way. In this video I’m going to tell you what I believe are the top three reasons for data warehouse failure. Stick around. Reason number one for data warehouse failure: not implementing a […]
A data warehouse offers the benefits of fact-based decision making, and these days nearly everyone agrees on its value. But data warehouse projects have an alarmingly high failure rate. In this video we explain why and offer a way you can succeed where others have failed. You can find our article on data lakes here: […]
Our data is in PSA, and we have everything we need to build out a dimensional model. In this video we will define a conceptual model and use the LeapFrogBI Platform to automate the development of all required ETL. Finally, we will deploy our data to Power BI.
We have a persistent staging area loaded with our YouTube Analytics data. Now we need to do a bit of data discovery to better understand how our requirements can best be met. With this information in hand we will be able to create a suitable dimensional model.
Our YouTube data has been downloaded from the Reporting API, and we created a simple LeapFrogBI Platform project which will parse the downloaded flat files and load a persistent staging area. Now it’s time to deploy our project and load the target schema.
We are on the road to getting our YouTube Analytics data into Power BI. So far we have created a simple console application that collects reports from the YouTube Reporting API. Now it is time to load the downloaded data into a persistent staging area. The LeapFrogBI Platform will be used to automate all ETL […]
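The core of a persistent staging load is simple: append every row from the downloaded report, stamped with a load timestamp, and never update or delete. The sketch below shows that pattern only; the sample CSV, its column names, and the in-memory "table" are assumptions for illustration, not what the LeapFrogBI Platform generates.

```python
import csv
import io
from datetime import datetime, timezone

# Hypothetical sample of a downloaded YouTube Analytics report (CSV text).
report_csv = """date,video_id,views
2023-01-01,abc123,150
2023-01-02,abc123,97
"""

def load_into_psa(csv_text, psa_table):
    """Append every report row to the persistent staging area with a
    load timestamp. PSA is append-only: nothing is updated or deleted."""
    loaded_at = datetime.now(timezone.utc)
    for row in csv.DictReader(io.StringIO(csv_text)):
        row["loaded_at"] = loaded_at
        psa_table.append(row)
    return psa_table

psa = load_into_psa(report_csv, [])
```

Because the load is append-only, re-running it after each download preserves full history, which is exactly what makes techniques like the isCurrent flag possible later.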
Now that we know a little bit about the YouTube Reporting API and OAuth 2.0, it is time to build a .NET console application that can interact with the Reporting API and automate the ongoing requirement to download our YouTube Analytics data.
YouTube’s Reporting API uses OAuth 2.0. In this video, we will use the Google Developer Console to create a project enabling our upcoming application to authenticate and gain authorization. We will also review the Reporting API quotas.
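For a long-running data collector, the part of OAuth 2.0 you exercise most is the refresh_token grant: exchanging a stored refresh token for a fresh access token at Google's token endpoint. The sketch below only builds the request body for that grant (RFC 6749, section 6); it does not make a network call, and the client ID, client secret, and refresh token values are placeholder assumptions you would replace with credentials from the Google Developer Console.

```python
from urllib.parse import urlencode

# Placeholder OAuth 2.0 client credentials from the Google Developer Console.
CLIENT_ID = "your-client-id.apps.googleusercontent.com"
CLIENT_SECRET = "your-client-secret"
TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

def build_refresh_request(refresh_token):
    """Build the form-encoded body for the OAuth 2.0 refresh_token grant,
    to be POSTed to TOKEN_ENDPOINT."""
    return urlencode({
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })

body = build_refresh_request("stored-refresh-token")
```

An unattended application stores the refresh token after the user's one-time consent and repeats this exchange whenever the short-lived access token expires.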
Would you like to use Power BI to analyze your YouTube Analytics data? Same here. In this first video, we briefly review the YouTube Reporting API which will be used to collect YouTube data. In subsequent videos, we will download YouTube data, create a dimensional model, and ultimately use Power BI to visualize our YouTube […]
Your project has a single dimension data flow defined. Now it’s time to complete an iterative deployment and load the target dimension. Success is only minutes away!