
Automating a personalized recommendations marketing campaign


In today’s article I’m going to show you how I’d build an email like the one below: a recurring campaign that goes out to users of a fictional flight search provider who have recently searched for a flight.

Example recent flight search email

When talking to potential Vero customers, a very common use case that comes up is a recurring email, push or SMS campaign that recommends or reminds users of their preferences, recent interactions or results.

A couple of examples:

  • Reminder with recent searches for hotels, flights, etc.
  • List of products in a customer’s wishlist that have come back in stock.
  • Products currently sitting in the customer’s cart.
  • Recommended products based on recent purchases or searches.
  • Service or work orders awaiting confirmation.

…and so on. Every industry, vertical and business model has examples of this sort of campaign.

In 2026, you’d think this sort of campaign would be easy to set up, but potential customers still share their war stories with me: it’s not as easy as it seems.

In this post, I’ll break down how you can access this detailed, personalized data using an audience loaded directly from a data warehouse or other similar data source.

The benefit of this approach is that a data warehouse typically has a rich set of tables containing information both about the user and about the objects they interact with, such as a flight search, a cart, or work orders.

So, let’s pretend we’re the Head of CRM at a travel search engine and get to it.

How the data looks

A travel search engine might have a database with tables such as:

  • Users. Contains data about users who have signed up for your platform.
  • Searches. Represents searches that users make.
  • Flights. Represents possible flights in your inventory that can be returned in a search. This would include details like flight prices, flight length, number of stops, etc. Imagine this data is updated every 15 minutes to remain accurate. Flights are given a unique ID and archived so there’s a history of movements in price.

Here’s a diagram of these data tables (using some database notation) so you can build a mental image:

Data tables for fake flight search email
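As a rough sketch, here’s how those three tables might be defined. The column names here are assumptions based on the descriptions above, not an exact schema, and I’m using Postgres-style DDL:

create table users (
  id          bigint primary key,
  email       text not null,
  first_name  text,
  created_at  timestamptz default now()
);

create table searches (
  id                      bigint primary key,
  user_id                 bigint references users (id),
  origin_airport_id       bigint,
  destination_airport_id  bigint,
  created_at              timestamptz default now()
);

create table flights (
  id                      bigint primary key,
  airline_id              bigint,
  origin_airport_id       bigint,
  destination_airport_id  bigint,
  price                   numeric(10, 2),
  stops                   integer,
  -- refreshed every ~15 minutes; older rows are archived to keep a price history
  updated_at              timestamptz default now()
);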

When we’re sending our campaign we are of course going to target a list of users. But, in order to personalize the campaign each user receives, we want to:

  • Only send to users who have searched for a flight recently.
  • Include details about the latest flights for the routes they’ve searched for.

This "related data" comes from the flights and searches tables in our data warehouse.

To build this campaign I need to know:

  • The email address of each recipient.
  • The first name of each recipient.
  • Which users have searched in the last 24 hours (only they should receive the email).
  • Details of the three cheapest flights currently available matching each user’s search origin and destination.

One of the cool things about loading the data directly from the data warehouse is that we don’t have to spend time working with our engineers or data engineers to “remodel” this data and sync a second copy to our customer engagement platform via our API.

Even if the customer engagement platform makes it possible to sync this data, keeping this many different data models in sync in another platform can very quickly multiply the two-way data sync problem. There’s something really nice and clean about loading the data directly from the warehouse.

Ultimately, what we want to end up with is an audience that looks like this.

Audience example

As you can see, we’ve got id, email and first_name columns for each recipient, plus a flights column: an array containing information about flights related to the latest search each user conducted.

We’ll be able to use this to dynamically insert the data in our campaign.
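To make the shape concrete, a single audience row might look something like this (the values are illustrative, and the field names follow the columns described above; in a real campaign you’d likely also join in airport codes and airline names):

{
  "id": 42,
  "email": "ada@example.com",
  "first_name": "Ada",
  "flights": [
    { "origin_airport_id": 101, "destination_airport_id": 207, "price": 1290.00, "airline_id": 7 },
    { "origin_airport_id": 101, "destination_airport_id": 207, "price": 1340.00, "airline_id": 12 },
    { "origin_airport_id": 101, "destination_airport_id": 207, "price": 1385.00, "airline_id": 3 }
  ]
}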

Note that this is based on fake data and the data in this screenshot doesn’t align exactly with the data we’d need for the campaign I illustrated at the top of this article (I am short on time) but it’s indicative: in the real world you’d want to pull in a few extra details from your data warehouse as well.

Further, I’ll include an example of the sort of SQL used for this hypothetical audience at the bottom of this article, just as a guideline in case it’s useful to you!

Building the campaign

Now we’ve got our personalized data attached to our audience, we want to put that data to good use. The following screenshot is an example of how I might build this campaign in my drag-and-drop campaign editor. This is a screenshot from Vero, which uses the Liquid templating language.

As you can see here, for each entry in the flights array part of our audience, we’re inserting:

  • The airline name
  • Number of stops
  • The origin and destination airport codes
  • The price

We could insert any data we wanted here. We could handle images by using an image CDN and building out dynamic URLs. I’ve left that out to keep this example fairly simple, but in terms of the personalization we could achieve, it really comes down to what is available in our warehouse. We could include as many columns as we need in the audience to build out exactly the message we want.

Dynamic Liquid for flight email
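As a rough sketch, the Liquid loop behind a template like this might look as follows. Note that field names like airline_name and origin_airport_code are assumptions: they’d need to exist as columns in your audience (or inside the flights array), and the exact markup in your editor will differ:

{% for flight in flights %}
  <div class="flight">
    {{ flight.airline_name }} &middot; {{ flight.stops }} stops<br/>
    {{ flight.origin_airport_code }} &rarr; {{ flight.destination_airport_code }}<br/>
    {{ flight.price | prepend: "$" }}
  </div>
{% endfor %}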

Scheduling the campaign

Having built out the audience query and defined the dynamic content, I can now safely schedule my campaign. I’m using Vero for this demo, of course, and here’s how it would look in Vero to schedule this to send automatically every Wednesday at 9:10am.

Recurring schedule

From here, Vero would handle the rest!

The audience SQL query

with 

searches_last_day as (
  select 
    id,
    user_id,
    origin_airport_id,
    destination_airport_id,
    row_number() over (partition by user_id order by created_at desc) as search_rank
  from 
    searches
  where
    created_at > now() - interval '24 hours'
),

latest_search_per_user_with_all_flights as (
  select
    latest_search_per_user.id   as search_id,
    users.id                    as user_id,
    users.email,
    users.first_name,
    flights.origin_airport_id,
    flights.destination_airport_id,
    flights.price,
    flights.airline_id
  from 
    -- Here we want just the latest search for each user, in case they made several
    (
      select * 
      from searches_last_day 
      where search_rank = 1
    ) as latest_search_per_user
  left outer join 
    users
    on latest_search_per_user.user_id = users.id
  left outer join 
    flights
    on flights.origin_airport_id = latest_search_per_user.origin_airport_id
    and flights.destination_airport_id = latest_search_per_user.destination_airport_id
)

-- We aggregate all the flights into a single JSON column of our audience, 
-- ordered cheapest first, so we can use it with our email/message builder
select
  search_id,
  user_id,
  email,
  first_name,
  coalesce(
    json_agg(
      json_build_object(
        'origin_airport_id', origin_airport_id,
        'destination_airport_id', destination_airport_id,
        'price', price,
        'airline_id', airline_id
      )
      order by price asc
    ) filter (where price is not null),  -- users whose search matched no flights get '[]'
    '[]'::json
  ) as flights
from latest_search_per_user_with_all_flights
group by 1, 2, 3, 4

A word on event tracking

A quick word on event tracking. In this example, we’re using a data warehouse as our data source. This can work really nicely alongside event tracking, or as a replacement for it.

Over the last decade, the most cutting-edge customer engagement platforms have championed users and events as the primary “data types” in their system. Events are really powerful: they represent something happening in a system. For example, “Cart item added”, “Cart item removed”, “Cart completed checkout”, etc. Events are meant to represent the lifecycle of an object. In this example, the lifecycle of a “cart”: the cart is empty, items are added, items are (perhaps) removed and, once the cart is checked out, it is ultimately cleared or reset to be used again.
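For instance, cart lifecycle events like those might be tracked as payloads along these lines (the property names here are purely illustrative, not a specific tracking spec):

{
  "event": "Cart item added",
  "user_id": 42,
  "timestamp": "2026-01-07T09:10:00Z",
  "properties": {
    "cart_id": "cart_789",
    "product_id": "sku_123",
    "price": 129.00,
    "quantity": 1
  }
}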

However, for an email like the one we’re building in this article, we need not only a record of what the user did, when they did it and what they interacted with (i.e., the cart), but also details about those interactions, such as:

  • What item they added and meta-data about that item.
  • What item they removed.
  • How they checked out and data about the transaction.

This can all be captured with events, but it can sometimes be fiddly to access that data for two main reasons:

  1. Data changes since the event was tracked. If someone removed Nike shoes from their cart 30 days ago, what is the price of those Nikes today?
  2. Syncing a copy of the data creates two-way data sync issues. One way to combat the above can be to sync “custom objects” to your engagement platform, but this can often mean syncing 1,000s or 10,000s or 100,000s of records across multiple data objects and keeping them in sync.

Event tracking can absolutely be a part of solving this problem but in this post I wanted to show a different way of solving things using a direct connection to a data warehouse, using a feature in Vero we call “Data Sources”.

Bringing it all together

So that’s it, that’s how we build out a recurring message using rich data from our data warehouse to send a dynamic flight search email to our users. Hopefully this can serve as some inspiration and get you thinking about different ways to solve this sort of problem.

If you’ve ever got any questions and want us to help, I’d love to get into the weeds of these sorts of campaigns with you.

Please feel free to email or otherwise contact our team at Vero and we’ll get in touch.

Keep growing,
Chris

Want to send more personalized mobile and email messages to your users?

Check out Vero, customer engagement software designed for product marketers. Message your users based on what they do (or don't do).

Get started

Consider signing up for a free trial. No credit card required.
