Your choice of data solution helps define your overall solution architecture as well as the specific needs that must be filled by data wrangling and visualization tools. Analytics has always been difficult, and as data sources expand and proliferate, the challenge grows. Despite advancements in technology, including a broad range of available databases, data preparation […]
Business intelligence is a funny thing. The vast majority of companies engage in business intelligence in one form or another, yet if you ask 100 people what it is, 99 can’t tell you. As a provider of cloud business intelligence and BI consulting, I figure LeapFrogBI has some responsibility to educate folks. So here […]
Reporting tools have become pretty good, and companies buy them with the expectation that doing so will solve their reporting challenges. But is this really the case? Are they really up to the job? This is a conversation we have frequently with clients and we hope it may benefit you. You can learn more about […]
Design patterns greatly speed development. The DP3000 and DP3100 design patterns create a data flow that tracks the current version of records in PSA based on a selected key field. The result is a new field in PSA, called isCurrent by default, which identifies the current PSA record with a value of 1. At times […]
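The idea behind the isCurrent flag can be illustrated with a minimal sketch. This is not the DP3000/DP3100 implementation itself, just a hypothetical Python illustration of the logic: for each value of the chosen key field, the most recently loaded PSA row gets isCurrent = 1 and all earlier versions get 0. The field names (customer_id, load_date) are assumptions for the example.

```python
# Hypothetical PSA rows: two versions of customer 1, one of customer 2.
records = [
    {"customer_id": 1, "load_date": "2023-01-01", "city": "Austin"},
    {"customer_id": 1, "load_date": "2023-06-01", "city": "Dallas"},
    {"customer_id": 2, "load_date": "2023-03-15", "city": "Houston"},
]

# Pass 1: find the latest load_date per key (ISO date strings sort correctly).
latest = {}
for r in records:
    key = r["customer_id"]
    if key not in latest or r["load_date"] > latest[key]:
        latest[key] = r["load_date"]

# Pass 2: flag the latest row per key as current, all others as not current.
for r in records:
    r["isCurrent"] = 1 if r["load_date"] == latest[r["customer_id"]] else 0
```

In a real warehouse this same comparison would typically be expressed in the ETL layer over the full PSA table, which is exactly the repetitive work the design patterns generate for you.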
Many organizations have built data warehouses successfully, and some have failed. There’s no reason at all for you to learn the hard way. In this video I’m going to tell you what I believe are the top three reasons for data warehouse failure. Stick around. Reason number one for data warehouse failure: not implementing a […]
A data warehouse offers the benefits of fact-based decision making, and these days nearly everyone agrees on their value. But data warehouse projects have an alarmingly high failure rate. In this video we explain why and offer a way you can succeed where others have failed. You can find our article on data lakes here: […]
Our data is in PSA, and we have everything we need to build out a dimensional model. In this video we will define a conceptual model and use the LeapFrogBI Platform to automate the development of all required ETL. Finally, we will deploy our data to Power BI.
We have a persistent staging area loaded with our YouTube Analytics data. Now we need to do a bit of data discovery to better understand how our requirements will best be met. With this information in hand we will be able to create a suitable dimensional model.
Our YouTube data has been downloaded from the Reporting API, and we created a simple LeapFrogBI Platform project which will parse the downloaded flat files and load a persistent staging area. Now it’s time to deploy our project and load the target schema.
We are on the road to getting our YouTube Analytics data into Power BI. So far we have created a simple console application that collects reports from the YouTube Reporting API. Now it is time to load the downloaded data into a persistent staging area. The LeapFrogBI Platform will be used to automate all ETL […]
Now that we know a little bit about the YouTube Reporting API and OAuth 2.0, it is time to build a .NET console application that can interact with the Reporting API and automate the ongoing download of our YouTube Analytics data.
YouTube’s Reporting API uses OAuth 2.0. In this video, we will use the Google Developer Console to create a project enabling our upcoming application to authenticate and gain authorization. We will also review the Reporting API quotas.
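As a taste of what that authorization looks like in practice, here is a minimal Python sketch of the OAuth 2.0 refresh-token exchange an application performs before each Reporting API call. The series itself builds this in .NET; the flow is language-agnostic. CLIENT_ID, CLIENT_SECRET, and REFRESH_TOKEN are placeholders for the credentials you obtain through the Google Developer Console steps above.

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 token endpoint.
TOKEN_URL = "https://oauth2.googleapis.com/token"

def build_token_request(client_id, client_secret, refresh_token):
    """Return the form-encoded body for a refresh-token grant."""
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    })

# Placeholders; substitute the values from your Developer Console project.
body = build_token_request("CLIENT_ID", "CLIENT_SECRET", "REFRESH_TOKEN")
# POST this body to TOKEN_URL with Content-Type
# application/x-www-form-urlencoded; the JSON response contains a
# short-lived access_token to send as a Bearer token to the Reporting API.
```

Because the refresh token is long-lived, the console application only needs the interactive consent flow once; after that it can exchange the stored refresh token for fresh access tokens unattended.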