What My Safelist Experiment Might Actually Be Showing

In the last two posts I shared the setup for a simple safelist experiment and then walked through the results.

The idea was straightforward.

Instead of promoting an offer or trying to build my list, I asked visitors to do something very simple.

Click a button.

No opt-in form.
No sales pitch.
No reward.

Just curiosity.

Over the course of about a week, the page received 4,047 visits and 357 button clicks across 40 safelists.

The numbers themselves were interesting, but after spending some time looking through the data I started thinking about something a little deeper.

What exactly was this experiment measuring?

What This Experiment Was Really Measuring

This wasn’t a conversion test.

It wasn’t a landing page test.

And it definitely wasn’t measuring sales or signups.

What it really measured was voluntary interaction.

Someone had to:

– land on the page
– read what was written
– understand the experiment
– decide to click the button

That’s a higher bar than normal safelist browsing, where a page usually only needs to stay open long enough to earn the credits.

Which means the clicks in this experiment probably say more about attention than anything else.

Different Safelists Encourage Different Behavior

One thing became very clear while looking at the results.

Safelists don’t all behave the same way.

The same splash page ran on 40 different platforms during the same time period.

Yet the engagement levels were dramatically different.

There are probably two main reasons for that.

Community Activity

Some safelists simply have more active communities.

More members are opening mailings and browsing the ads.

That naturally leads to more interaction.

Platform Design

The structure of the mailer itself can also influence how people behave.

Some platforms make it very easy to earn credits quickly. When that happens, people tend to move through pages faster.

Other systems encourage members to spend a little more time looking at what’s on the screen.

Both factors affect how ads are experienced.

The Big Standout

One result from the experiment stood out immediately.

My Daily Mailer produced a participation rate that was dramatically higher than anything else in the test.

There are probably a few reasons for that.

The community is very active, and the platform includes Lucky Letters, which are messages that sometimes contain prizes. That naturally encourages members to actually look at the pages they land on.

It’s also possible that my ad simply stood out more there since members are already familiar with me on that platform.

Whatever the reason, the difference was noticeable.

A Couple of Interesting Outliers

While reviewing the data, a couple of smaller mailers also caught my attention.

Send Circle and Viral URL didn’t send a lot of traffic during the experiment.

But the visitors they did send interacted with the page at a surprisingly high rate.

That’s always interesting to see.

Sometimes smaller communities can produce more attentive traffic simply because there are fewer members and people spend a little more time browsing the ads.

With a larger dataset it would be interesting to see if that pattern continues.

Traffic Volume vs Engagement

Another thing that stood out in the data was the relationship between traffic volume and engagement.

Some safelists sent a lot of visitors but relatively few clicks.

Others sent fewer visitors but a much higher percentage of interaction.

That doesn’t necessarily mean one type of traffic is better than the other.

It just shows that different communities interact with ads in different ways.

Why I Enjoy Running Experiments Like This

One of the reasons I enjoyed this experiment so much is that it reminded me of something I used to do regularly.

Years ago I used to publish monthly safelist statistics showing which platforms were producing the most list signups.

Those posts were always fun because they showed real data from actual campaigns.

This experiment felt a little like returning to that idea, but from a different angle.

Instead of measuring signups, I was simply measuring interaction.

And sometimes that tells us just as much.

What I Might Experiment With Next

This experiment answered some questions, but it also created a few new ones.

For example:

– Would the results change if the experiment ran longer?
– Would traffic exchanges show similar behavior?
– What would happen if the page offered an incentive instead of pure curiosity?

I may explore some of those ideas in the future.

For now, I’m glad I ran this experiment.

It turned out to be a fun way to look at safelist traffic from a slightly different perspective.

And as always, I appreciate everyone who took a moment to participate.

Safelist Experiment Results: What the Data Showed

In my last post, I explained the idea behind a small safelist experiment I decided to run.

Instead of promoting an offer or trying to build my list, I asked visitors to do something very simple.

Click a button.

There was no reward for clicking it.
No redirect.
No opt-in form.

Just a simple invitation to participate in the experiment.

The goal was to see how many visitors arriving from safelists would actually interact with the page.

Now that the experiment is finished, we can look at the numbers.

Overall Experiment Results

Over the course of about a week, I promoted the splash page on 40 different safelists.

Here are the totals.

Metric                       Result
Safelists Tested             40
Total Visits                 4,047
Participation Clicks         357
Overall Participation Rate   8.82%

Considering there was no incentive to click the button, I thought this was pretty interesting.

The button was clicked 357 times during the experiment.

Since I didn’t track IP addresses, that number represents total clicks rather than unique participants.

Visitors were allowed to click the button once per safelist per day, so some people may have participated more than once if they saw the experiment on multiple sites.
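
I didn’t publish the tracking code, but a once-per-day rule like that doesn’t require logging IP addresses. Here is a minimal client-side sketch of how such a limit could work; the ?source= identifier on each safelist’s link and the storage key format are hypothetical, not my actual implementation:

```typescript
// Minimal sketch of a once-per-safelist-per-day gate, kept entirely in the
// visitor's own browser so no IP addresses need to be stored anywhere.
// The "source" value is assumed to come from a hypothetical ?source=
// parameter on the tracking link each safelist received.
function shouldCountClick(source: string): boolean {
  const today = new Date().toISOString().slice(0, 10); // e.g. "2025-01-15"
  const key = `experiment-click:${source}:${today}`;
  if (localStorage.getItem(key) !== null) {
    return false; // this browser already clicked for this safelist today
  }
  localStorage.setItem(key, "1");
  return true;
}
```

Because the flag lives in the visitor’s browser rather than on a server, the same person could still be counted again from a different device or browser, which is consistent with the “total clicks rather than unique participants” caveat above.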

Top Safelist Traffic Sources

First, let’s look at which safelists delivered the most visitors.

Safelist                  Visits   Clicks   CTR %
My Daily Mailer              471      134   28.45
Mister Safelist              304       30    9.87
I Love Traffic               280       28   10.00
State of the Art Mailer      226       10    4.42
List Impact                  205       12    5.85
European Safelist            202       11    5.45
List Avail                   172       11    6.40
Website Traffic Rewards      152        5    3.29
List Mailer Plus             146        6    4.11
Instant Ad Power             139       11    7.91

This shows which safelists produced the most traffic during the test.

But traffic volume is only part of the story.

Top Engagement Rates

The more interesting metric is participation rate — the percentage of visitors who actually clicked the button.

Here are some of the highest engagement rates from the experiment.

Safelist                  CTR %    Visits
My Daily Mailer           28.45%      471
I Love Traffic            10.00%      280
Mister Safelist            9.87%      304
Instant Ad Power           7.91%      139
List Avail                 6.40%      172
List Impact                5.85%      205
European Safelist          5.45%      202
State of the Art Mailer    4.42%      226

Most safelists landed somewhere between 4% and 7% participation.

That seems to be a fairly typical range for this type of interaction.
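
For anyone who wants to double-check the math, participation rate is simply clicks divided by visits, expressed as a percentage. Here is a quick sketch of that calculation; the SourceStats shape is just an illustration, not my actual tracking code:

```typescript
// Participation rate (CTR %) is just clicks / visits, scaled to a percentage.
interface SourceStats {
  safelist: string;
  visits: number;
  clicks: number;
}

function withParticipationRate(stats: SourceStats[]) {
  return stats
    .map((s) => ({ ...s, ctr: s.visits > 0 ? (100 * s.clicks) / s.visits : 0 }))
    .sort((a, b) => b.ctr - a.ctr); // most engaged sources first
}

// Sanity check against two rows from the table above:
console.log(
  withParticipationRate([
    { safelist: "My Daily Mailer", visits: 471, clicks: 134 },
    { safelist: "Mister Safelist", visits: 304, clicks: 30 },
  ])
);
// ctr ≈ 28.45 for My Daily Mailer and ≈ 9.87 for Mister Safelist
```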

Safelists Producing the Most Participation

Another way to look at the results is by total participation clicks.

These are the safelists that generated the most actual interaction with the experiment page.

Safelist                  Clicks
My Daily Mailer              134
Mister Safelist               30
I Love Traffic                28
List Impact                   12
European Safelist             11
List Avail                    11
Instant Ad Power              11
State of the Art Mailer       10

Just the top three safelists generated more than half of all participation clicks in the experiment (134 + 30 + 28 = 192 of the 357 total, or roughly 54%).

The Big Outlier

One result stood out immediately when I started looking through the numbers.

My Daily Mailer

Visits   Clicks   CTR
   471      134   28.45%

That participation rate was dramatically higher than anything else in the experiment.

There are probably a few reasons for this.

First, My Daily Mailer has a very active community.

Second, the platform includes “Lucky Letters” — messages that look like normal ads but sometimes contain prizes. That tends to encourage members to actually look at the pages they land on instead of just clicking through them.

And finally, it’s possible my ad simply stood out more on that platform since my picture appears on the page and members are already familiar with me there.

Whatever the reason, the difference was significant.

What the Data Suggests

One thing this experiment reinforces is something I’ve believed for a long time.

Safelists are not all the same.

Some communities are more active than others.

And the design of the platform itself can influence how members interact with ads.

Same splash page.
Same message.
Same time period.

Yet the engagement levels varied dramatically depending on the safelist.

That’s part of what makes experiments like this interesting.

Final Thoughts

I originally ran this experiment because I missed publishing safelist statistics like I used to.

It turned out to be a fun way to look at safelist traffic from a slightly different perspective.

Instead of measuring signups or conversions, this experiment simply measured curiosity.

And based on the results, there are clearly a lot of curious safelist users out there.

Thanks again to everyone who took a moment to participate.

A Very Simple Safelist Experiment

Recently I ran a small safelist experiment that turned out to be pretty interesting.

It actually started because I missed something.

For a long time I used to publish monthly safelist statistics showing where my list signups were coming from. Those posts were always fun to write because they showed real results from actual safelist traffic.

Over time though, those reports became harder to produce.

It wasn’t that safelists stopped working.

It was more that the way I was using them changed.

These days I mostly use safelists to promote things like My Daily Mailer. When you’re promoting programs instead of building a list directly, it becomes much harder to collect clean data for reports like that.

So I started thinking about a different way to measure activity.

The Idea

Instead of tracking opt-ins, I wondered what would happen if I measured something much simpler.

Just a click.

No offer.
No signup form.
No funnel.

Just a page asking visitors to click a button.

If someone clicked the button, it would simply record that they participated in the experiment.

Nothing else happened.

No email collected.
No redirect.
No sales pitch waiting on the next page.

Just curiosity.

The goal was simply to see how many people arriving from safelists were actually looking at the pages they landed on.
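
For readers curious about the mechanics, the whole interaction fits in a few lines of browser code. To be clear, this is a sketch rather than the actual page: the #participate button, #message element, /api/participate endpoint, and ?source= parameter are all hypothetical names I’m using for illustration.

```typescript
// Hypothetical sketch of the entire visitor experience: one anonymous ping.
// No email collected, no redirect, no sales pitch on the next page.
const source =
  new URLSearchParams(window.location.search).get("source") ?? "unknown";

document
  .querySelector<HTMLButtonElement>("#participate")
  ?.addEventListener("click", async () => {
    // Record the click against whichever safelist sent the visitor.
    await fetch(`/api/participate?source=${encodeURIComponent(source)}`, {
      method: "POST",
    });
    // Show the short confirmation message, and nothing else happens.
    const message = document.querySelector("#message");
    if (message) message.textContent = "Your participation has been recorded.";
  });
```

The only thing the hypothetical endpoint would need to do is increment a counter for the given source.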

The Splash Page

Here is the splash page I used for the experiment.

[Image: the splash page used for the experiment]

The page was intentionally very simple.

It explained that I was running a public safelist experiment and invited people to participate by clicking the button.

When someone clicked it, they saw a short message saying their participation had been recorded.

That was the entire experience.

The Email I Sent

This is the exact email I used.

🧪 A Very Simple Safelist Experiment

Hi, I’m Jerry.

I’m running a very simple public safelist experiment.

No offer. No sales pitch. Just a button.

Clicking it simply records anonymous participation. Nothing is being sold and nothing is being collected.

If you’d like to take part, just click the button.

That’s it.

Thanks for indulging my curiosity 🙂

Jerry

Running the Test

I promoted that splash page on 40 different safelists over the course of about a week.

The response was actually better than I expected.

The page received thousands of visits and hundreds of voluntary clicks from people choosing to participate in the experiment.

Which is exactly what I was hoping for.

Unlike opt-ins, this kind of interaction generates a lot of data very quickly, which makes it much easier to see patterns.

One Result I Didn’t Expect

As I started looking through the results, one platform immediately stood out.

The difference wasn’t small.

It was big enough that I double-checked the numbers just to make sure I wasn’t reading something wrong.

Everything checked out.

The numbers were real.

I’ll share the full breakdown in the next post, but that particular result gave me a lot to think about regarding how different safelist communities interact with ads.

What I’ll Share Next

In the next post I’ll go through the results of the experiment in more detail, including:

– total visits
– participation clicks
– which platforms showed the strongest engagement
– a few patterns that stood out to me while looking through the data

Some of the results were exactly what I expected.

Others were not.

And one result in particular surprised me quite a bit.

More on that soon.