Simple as that. LeapFrogBI Data Warehouse Automation now supports SQL Server 2017. Existing clients can upgrade their projects without writing a single line of code using the “Set Target Project” option in the admin panel. As always, LeapFrogBI projects are both forward and backward compatible. Enjoy!
Posts by Paul B. Felix:
Does race or gender impact sentencing lengths? To what extent? The United States Sentencing Commission (USSC) published a report which explores the correlation between demographic factors and federal sentence lengths. In this post, we review the findings in depth, look for bias in articles that reference the report, and draw our own conclusions. Data-driven decision making in action.
Design patterns greatly speed development. The DP3000 & DP3100 design patterns create a data flow which tracks the current version of records in PSA based on a selected key field. The result is a new PSA field (named isCurrent by default) which identifies the current PSA record with a value equal to 1. At times […]
Many organizations have built data warehouses successfully, and some have failed. There’s no reason at all for you to learn the hard way. In this video I’m going to tell you what I believe are the top three reasons for data warehouse failure. Stick around. Reason number one for data warehouse failure: not implementing a […]
Our data is in PSA, and we have everything we need to build out a dimensional model. In this video we will define a conceptual model and use the LeapFrogBI Platform to automate the development of all required ETL. Finally, we will deploy our data to Power BI.
We have a persistent staging area loaded with our YouTube Analytics data. Now we need to do a bit of data discovery to better understand how our requirements will best be met. With this information in hand we will be able to create a suitable dimensional model.
Our YouTube data has been downloaded from the Reporting API, and we created a simple LeapFrogBI Platform project which will parse the downloaded flat files and load a persistent staging area. Now it’s time to deploy our project and load the target schema.
We are on the road to getting our YouTube Analytics data into Power BI. So far we have created a simple console application that collects reports from the YouTube Reporting API. Now it is time to load the downloaded data into a persistent staging area. The LeapFrogBI Platform will be used to automate all ETL […]
Now that we know a little bit about the YouTube Reporting API and OAuth 2.0, it is time to build a .NET console application that can interact with the Reporting API and automate the ongoing requirement to download our YouTube Analytics data.
YouTube’s Reporting API uses OAuth 2.0. In this video, we will use the Google Developer Console to create a project enabling our upcoming application to authenticate and gain authorization. We will also review the Reporting API quotas.
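OAuth 2.0 access tokens are short-lived, so an application that polls the Reporting API on a schedule typically exchanges a long-lived refresh token for a fresh access token before each run. The series itself builds a .NET console app; as an illustrative sketch only, here is what that refresh-token exchange looks like in Python against Google's standard token endpoint (the client ID, secret, and refresh token shown are placeholders — real values come from your own Google Developer Console project):

```python
import json
import urllib.parse
import urllib.request

# Google's standard OAuth 2.0 token endpoint.
TOKEN_URL = "https://oauth2.googleapis.com/token"


def build_refresh_request(client_id: str, client_secret: str,
                          refresh_token: str) -> dict:
    """Build the form parameters for exchanging a long-lived refresh
    token for a short-lived access token."""
    return {
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }


def fetch_access_token(client_id: str, client_secret: str,
                       refresh_token: str) -> str:
    """POST the refresh request and return the new access token."""
    body = urllib.parse.urlencode(
        build_refresh_request(client_id, client_secret, refresh_token)
    ).encode()
    with urllib.request.urlopen(
            urllib.request.Request(TOKEN_URL, data=body)) as resp:
        return json.load(resp)["access_token"]


# Placeholder credentials for illustration only:
params = build_refresh_request("my-client-id", "my-secret",
                               "my-refresh-token")
print(params["grant_type"])
```

The returned access token is then sent as a Bearer header on each Reporting API call; the quotas reviewed in the video apply per project, not per token.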
Would you like to use Power BI to analyze your YouTube Analytics data? Same here. In this first video, we briefly review the YouTube Reporting API which will be used to collect YouTube data. In subsequent videos, we will download YouTube data, create a dimensional model, and ultimately use Power BI to visualize our YouTube […]
Introduction: Our project has a single dimension data flow defined. Now it’s time to complete an iterative deployment and load the target dimension. Success is only minutes away!