If you reside in Japan, chances are you've encountered our advertisements on platforms such as Facebook and Instagram (Meta Ads), Google Ads, or LinkedIn Ads. Clicking one of these ads takes you to our website, our app store page, or the app itself. With so many platforms in play, monitoring their performance side by side becomes increasingly challenging.
The question arises: how can we compare the real-time performance of these platforms on a centralized dashboard without having someone rebuild a report every day? Fortunately, by leveraging rapidly evolving technology, our marketing team has been developing a system we call Marketing Analytics Engineering.
In this article, we will outline a solution that you can potentially integrate into your company or organization, particularly if you are a small startup looking to build everything from scratch.
How it Functions
We employ the Extract, Load, and Transform (ELT) concept from the data engineering field. Let's break down the process at a high level.
Extract
We extract data from the various platforms using the APIs they provide. Fortunately, most advertising platforms nowadays offer an API, which lets a program we build ourselves pull their data directly. For example, Meta offers a Marketing API that exposes ad results, and our team builds a Python program to extract the data from it, as sketched below.
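To make this concrete, here is a minimal sketch of what such an extraction program can look like, using the requests library against the Meta Marketing API insights endpoint. The API version, account ID, environment variable name, and field list are illustrative placeholders, not our production configuration.

```python
import os
import requests

# Hypothetical account ID and token; in practice these come from a secret manager.
AD_ACCOUNT_ID = "act_1234567890"
ACCESS_TOKEN = os.environ["META_ACCESS_TOKEN"]


def extract_meta_insights(date_preset: str = "yesterday") -> list[dict]:
    """Pull ad-level insights (impressions, clicks, spend) from the Meta Marketing API."""
    url = f"https://graph.facebook.com/v19.0/{AD_ACCOUNT_ID}/insights"
    params = {
        "access_token": ACCESS_TOKEN,
        "level": "ad",
        "fields": "ad_name,date_start,impressions,clicks,spend",
        "date_preset": date_preset,
    }
    rows = []
    while url:
        response = requests.get(url, params=params, timeout=30)
        response.raise_for_status()
        payload = response.json()
        rows.extend(payload.get("data", []))
        # The API paginates results; follow the "next" link until it disappears.
        url = payload.get("paging", {}).get("next")
        params = {}  # the "next" URL already carries the query parameters
    return rows
```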
Load
The extracted data is loaded into a centralized data warehouse built on Google BigQuery. BigQuery makes it easy to transform and analyze data with familiar SQL syntax. Furthermore, because BigQuery is designed as a data warehouse for analytical workloads, analyses and mathematical operations over the data run quickly.
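A sketch of the load step, using the official google-cloud-bigquery client library. The project, dataset, and table names are placeholders for whatever layout you choose.

```python
from google.cloud import bigquery


def load_to_bigquery(rows: list[dict], table_id: str = "my-project.raw_ads.meta_insights") -> None:
    """Append the extracted rows to a raw table in BigQuery."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,  # let BigQuery infer the schema from the JSON rows
    )
    job = client.load_table_from_json(rows, table_id, job_config=job_config)
    job.result()  # wait for the load job to finish
```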
Transform
The raw data is transformed into smaller, purpose-built tables referred to as data marts. These are organized around the shape of the data and the interests of each team or individual. For instance, raw data from Meta is broken down into one data mart for impressions only and another for daily cost data.
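As an illustration, a data mart like the daily-cost one can be built with a single SQL statement. The sketch below runs it through the same Python client; the source and destination table names are assumptions matching the load example above, not our actual schema.

```python
from google.cloud import bigquery

# Hypothetical source and destination tables; adjust to your own dataset layout.
DAILY_COST_MART_SQL = """
CREATE OR REPLACE TABLE `my-project.marts.meta_daily_cost` AS
SELECT
  CAST(date_start AS DATE) AS report_date,
  SUM(CAST(spend AS FLOAT64)) AS total_spend
FROM `my-project.raw_ads.meta_insights`
GROUP BY report_date
"""


def build_daily_cost_mart() -> None:
    """Rebuild the daily cost data mart from the raw Meta insights table."""
    client = bigquery.Client()
    client.query(DAILY_COST_MART_SQL).result()  # block until the query completes
```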
These data marts are then visualized on a real-time dashboard, making them easy for the business side to understand. At GIG-A, we use Looker Studio because of its seamless integration with other Google products.
The Missing Piece
While this process seems straightforward, automating it poses challenges. Transformation is the easy part: BigQuery can schedule SQL queries natively. Extraction, however, requires a different approach. In our Google environment, we run the containerized extraction programs on Cloud Run and schedule every job with Cloud Scheduler.
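To give a concrete picture of that wiring, the sketch below wraps the extract-and-load steps in a small Flask app that Cloud Run can serve and Cloud Scheduler can call on a cron schedule. The module name, route, and response shape are illustrative assumptions, not our exact service.

```python
import os

from flask import Flask

# extract_meta_insights and load_to_bigquery are the sketches from the sections above,
# assumed here to live in a hypothetical local module called "pipeline".
from pipeline import extract_meta_insights, load_to_bigquery

app = Flask(__name__)


@app.route("/run", methods=["POST"])
def run_pipeline():
    """Endpoint invoked by Cloud Scheduler: extract yesterday's ad data and load it."""
    rows = extract_meta_insights(date_preset="yesterday")
    if rows:
        load_to_bigquery(rows)
    return {"loaded_rows": len(rows)}, 200


if __name__ == "__main__":
    # Cloud Run injects the PORT environment variable into the container.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```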
Does This Solution Make Sense to You?
As Larry Wall, the creator of the Perl programming language, once said, "There's more than one way to do it!" The solution we present suits smaller-scale data and relatively simple flows from extraction to transformation. At larger scales, other tools may be more appropriate, such as Cloud Composer (managed Airflow) or dbt for orchestrating a complex ELT process, or Vertex AI when incorporating machine learning solutions. Go check them out if you're interested in them.
Ultimately, the choice should align with your requirements, your budget, and the solution's impact on your business. Keep in mind that the more tools and complexity you involve, the higher the cost of the solution. Choose wisely based on your specific needs and circumstances.