How to archive your Universal Analytics historical data

Time is running out to save your Universal Analytics data. Follow these steps to archive historical analytics before the July 1 deadline.

Another Google Analytics 4 migration deadline is fast approaching, and this one is firm. On July 1, Google will delete all historical data from Universal Analytics properties. The deadline also applies to Analytics 360 customers.

With little more than a month until the deadline, if you have not already archived your historical data, your organization needs to make it a priority. I recommend approaching the project in three main phases.

Phase 1: Make a plan

Before archiving data, it’s important to decide:

What specific data is important to you? 

  • Prioritize downloading data that you regularly refer to, such as conversion and sales data.
  • Make a full list of the data you need to archive.

How many years of data do you want to keep?

  • Many of us have been using Google Analytics since the mid-2000s – does your organization need to archive data from nearly 20 years ago? 
  • Decide how far back you want to archive data. At minimum, I recommend archiving back to about 2018 so you retain pre-pandemic data, since the pandemic introduced data anomalies for many companies.

At what cadence do you review data?

  • Consider how often you typically report on your data. Is it weekly? Monthly? 
  • Depending on the archiving method you choose in Phase 2, you may need to organize the data into specific time increments.

Dig deeper: Why the shift from ‘conversions’ to ‘key events’ in GA4 is a game-changer

Phase 2: Choose an archiving method

There are four main options available for archiving your Universal Analytics data. Each has its own pros and cons, so choose a method based on your team’s resources and skills.

Option 1: Manual file downloads

  • Pros: Easy for almost all users to do, free.
  • Cons: Time-consuming, cumbersome, difficult to access data for reporting later, limited to 5,000 rows per export.

While this is the easiest process to understand, it is also time-consuming.

Following your plan for years, cadence and data points, you’ll need to go into each report in the Universal Analytics interface and set the date, dimension and metric settings as needed.

Also, remember to change the number of rows from the default of 10 to the maximum of 5,000 rows to ensure you capture as much data as possible.

Click the export button and export data to a Google Sheet, Excel or CSV. Repeat this process until you have downloaded all of the data identified in your archive plan.

Option 2: Download data to Google Sheets using the Google Analytics add-on (best option for tech novices)

  • Pros: Fairly simple to implement for most users with spreadsheet experience, free, fast to download.
  • Cons: Restrictive to a set timeframe (e.g., monthly), each sheet has total data limitations, often encounters sampling issues.

This option is fairly simple for most users to perform. Create a new Google Sheet and add the Google Analytics spreadsheet add-on.

The add-on essentially uses the Google Analytics API to download data to Google Sheets but doesn’t require API programming knowledge to operate. Google has compiled a basic overview of this approach in this help document.

The first time you use the add-on, you’ll build a report using the add-on’s interface. But after the first report is run, you can also simply update the Report Configuration tab and create additional reports directly in columns of that sheet.

You can also conveniently use formulas in the Report Configuration sheet. Use the Dimensions and Metrics Explorer to find the proper API code (for example, ga:sessions for the Sessions metric) to enter into each field.

One drawback of the Google Sheets method is that you may encounter sampling if you pull too much data at once (e.g., your entire 20-year dataset for sessions) or your report is too detailed (too many dimensions pulled together for a high level of granularity). 

When you run a report, you’ll see the sampling level on the report’s data tab in cell B6. If your report contains sampled data, consider reducing the amount of data in that pull; for example, split it into two shorter time frames (a scripted way to generate those ranges is sketched below).
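
If you’d rather script the date ranges than type them by hand, here is a minimal Python sketch that splits an archive window into monthly start/end pairs you can paste into the Report Configuration columns. The 2018–2023 window is illustrative; substitute the range from your Phase 1 plan.

```python
from datetime import date, timedelta

def monthly_ranges(start: date, end: date):
    """Yield (first_day, last_day) pairs for each month between start and end."""
    current = start
    while current <= end:
        # Compute the first day of the following month.
        if current.month == 12:
            next_month = date(current.year + 1, 1, 1)
        else:
            next_month = date(current.year, current.month + 1, 1)
        last_day = min(next_month - timedelta(days=1), end)
        yield current, last_day
        current = next_month

# Illustrative archive window; replace with the range from your plan.
for first, last in monthly_ranges(date(2018, 1, 1), date(2023, 12, 31)):
    print(first.isoformat(), last.isoformat())
```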

However, if you just can’t avoid sampling, check the data sample percentage on the report. Then, on the Report Configuration tab, unhide rows 14-17 and set the sampling size on row 15 to that level so your data remains consistent across pulls.

Tip: The add-on defaults to 1,000 rows of data in a report. Simply delete the 1,000 under the line labeled “Limit” (typically row 11) so the report returns all available rows.

Another drawback of the Google Sheets option is that each file is limited to 10,000,000 cells. Typically, each sheet starts out with 26 columns (A to Z) and 1,000 default rows (or 26,000 cells). At 26 columns, the file-wide cap works out to roughly 385,000 rows.

If your downloaded data exceeds the 10,000,000-cell limit (which can easily happen with years of granular data), you may need multiple Google Sheets files to hold all of the data.

Option 3: Download data using the Google Analytics API

  • Pros: Pulls data quickly once set up.
  • Cons: Requires web development knowledge and resources, doesn’t solve the data sampling issue, API quota limitations.

If you have web development resources that can work on the archiving project, they can pull the data detailed in your plan using the Google Analytics API directly. 

This works similarly to the aforementioned Google Sheets add-on option, but it requires programming the API calls yourself.

To learn about how to use the API for this project, visit Google’s archiving information page and review the second bullet, which details several resources and considerations for using the API for this data export project.
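
For reference, here is a minimal Python sketch of one such pull using the Analytics Reporting API v4. The view ID, key file and date range are placeholders you’d replace with values from your plan; you’d also loop this call over every report and time increment you identified in Phase 1.

```python
# pip install google-api-python-client google-auth
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: swap in your own key file and Universal Analytics view ID.
KEY_FILE = "service-account-key.json"
VIEW_ID = "XXXXXXXX"

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"]
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

# One report request: daily sessions and users for a single month.
response = analytics.reports().batchGet(
    body={
        "reportRequests": [
            {
                "viewId": VIEW_ID,
                "dateRanges": [{"startDate": "2018-01-01", "endDate": "2018-01-31"}],
                "dimensions": [{"name": "ga:date"}],
                "metrics": [{"expression": "ga:sessions"}, {"expression": "ga:users"}],
                "pageSize": 10000,
            }
        ]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"])
```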

Option 4: Download data to BigQuery (best option overall)

  • Pros: Simple to access data later for reporting, increased data insights, most flexible for data.
  • Cons: Complicated for novices to set up initially, can involve fees for BigQuery, may require technical resources to set up, requires an additional tool.

The main benefit of archiving your Universal Analytics data to BigQuery is that BigQuery is a data warehouse: you can ask questions of the data set through SQL queries and get answers very quickly. This is especially useful when accessing the data for reporting later.
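
For example, once the data is in BigQuery, a query like the following returns monthly session totals in seconds. This is a sketch using the google-cloud-bigquery Python client; the project, dataset and table names are hypothetical and depend on how your export is organized.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Hypothetical dataset/table names; yours depend on the export you set up.
query = """
    SELECT
      FORMAT_DATE('%Y-%m', PARSE_DATE('%Y%m%d', date)) AS month,
      SUM(sessions) AS sessions
    FROM `my-project.ua_archive.sessions_by_date`
    GROUP BY month
    ORDER BY month
"""

for row in client.query(query).result():
    print(row.month, row.sessions)
```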

Analytics 360 users

If you are an Analytics 360 user, Google provides a native export to BigQuery. I recommend this method. See these instructions from Google.

Everyone else

If you’re not an Analytics 360 user, then you’ll need to approach the BigQuery backup differently because Google does not provide innate BigQuery backup options in Universal Analytics for non-360 users. 

Here are the steps you’ll want to follow:

  • Step 1: Create a Google API Console project and enable BigQuery.
    • Log in to the Google APIs Console.
    • Create a Google APIs Console project.
    • Navigate to the APIs table.
    • Activate BigQuery.
  • Step 2: Prepare your project for BigQuery export.
    • Ensure Billing is enabled for your project. You may not need to pay anything, but costs vary depending on your usage and data volume.
    • If prompted, create a billing account.
    • Accept the free trial if it’s available.
    • Validate that billing is enabled. Open your project at https://console.cloud.google.com/bigquery and try to create a data set in the project. Click the blue arrow next to the project name, then click Create data set. If you can create the data set, billing is set up correctly. If there are any errors, make sure billing is enabled.
    • Add the service account to your project. Add analytics-processing-dev@system.gserviceaccount.com as a member of the project, and ensure that permission at the project level is set to Editor (as opposed to BigQuery Data Editor). The Editor role is required in order to export data from Analytics to BigQuery.
    • If you are in the EU, please also review additional requirements.
  • Step 3: Set up a free trial of Supermetrics. Similar to the Google Sheets add-on in option 2 above, Supermetrics is a tool that helps non-technical users interface with and use APIs. It offers a free 14-day trial, which is likely all you’ll need for this project, since you’re only downloading the Universal Analytics data once (not regularly).
    • Connect the BigQuery data source in the Supermetrics dashboard. 
  • Step 4: In BigQuery, establish the connection to Supermetrics.
    • Navigate to BigQuery, then to Data transfers.
    • Click + Create transfer.
    • Select Google Analytics by Supermetrics as your source and click Enroll.
    • Fill in the transfer details. See detailed instructions on how to set up a transfer.
    • Under Third-party connection, click Connect source.
    • Accept the agreement.
    • Click Authorize with your Google data source.
    • Click Sign in with Google.
    • Sign in with the Google Account you use with this data source. This doesn’t have to be the same as the Google Account you use with Supermetrics.
    • Click Allow.
    • Select the accounts you’d like to include in your reporting and define the transfer settings.
    • Click Submit.
    • Click Save.

Because you only need to transfer the Universal Analytics data one time, you can also change the schedule on the transfer to On demand and then run the transfer now. 

Dig deeper: 3 ways GA4 is way better than UA

Phase 3: Ensure you’ve captured it all

Before you consider the project complete, be sure to double-check your archived data to ensure you’ve captured everything you planned to archive.
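
If you archived to BigQuery, a quick coverage check can catch gaps before the data disappears for good. Below is a sketch, again with hypothetical project and table names and an illustrative date range, that flags any month in your plan with no archived rows at all.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Count archived rows per month (hypothetical dataset/table names).
query = """
    SELECT FORMAT_DATE('%Y-%m', PARSE_DATE('%Y%m%d', date)) AS month,
           COUNT(*) AS row_count
    FROM `my-project.ua_archive.sessions_by_date`
    GROUP BY month
"""
archived = {row.month: row.row_count for row in client.query(query).result()}

# Months your Phase 1 plan said to keep (illustrative range).
expected = [f"{y}-{m:02d}" for y in range(2018, 2024) for m in range(1, 13)]

for month in expected:
    if month not in archived:
        print(f"Missing data for {month}")
```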

On July 1, you will no longer be able to access Universal Analytics data, either by API or through the interface. 



About the author

Janet Driscoll Miller
Contributor
Janet Driscoll Miller is the President and CEO of Marketing Mojo and has been working in digital marketing for over twenty-five years. She is the author of "Data-First Marketing: How to Compete and Win in the Age of Analytics" and is a frequent speaker on digital marketing and data analytics. She specializes in technical SEO, digital marketing analysis and management, and accurate attribution and data analytics.
