How can marketers adapt to the recent Facebook algorithm changes?

Following Mark Zuckerberg’s personal post on Facebook’s recent newsfeed changes, the social network’s share price fell as much as 6.1 percent.

With Facebook's key advertising space for brands reverting to its original purpose (actual, personal news from friends and family), the brand marketers who have relied on in-feed advertising to drive traffic and revenue must now reconsider how to engage with audiences through this pillar channel.

But more than anything, organisations must recognise this as an opportunity to take back control of their audiences, audience data, and engagement strategy. With GDPR on the horizon, there has never been a more opportune moment to build a proprietary customer database, find innovative ways to connect directly with consumers, and truly own your customer experience.

This is the beginning of the next era of owned media versus rented audiences. Marketers can build the right foundation through the following approaches:

Personalisation

A one-to-one relationship creates mutual trust and respect, and is proven to drive engagement that is more meaningful to consumers and to brands (read: revenue). Organisations that utilise the full depth and breadth of their data understand each individual user better, and can tailor content for a more relevant, contextual experience.

For us, this means moving beyond ‘engagement bait’, which can erode relationships, and instead building and activating relationships that matter.

Better websites

As organic reach decreases on Facebook, the importance of having a central hub of content increases – but not just the traditional homepage, section, and article pages.

Publisher websites must be designed to convert traffic into known audiences, and this means learning from retail, ecommerce, and other industries. Quality content, SEO, overlays, and connectivity to email and mobile messaging must be in the mix.

Ads over articles

Instant Articles will no longer be… so instant. With a shift to engaging a known audience, publishers will still be engaging and spending with Facebook, but in a different way.

Targeting and converting Facebook users into email subscribers is where we will see money being spent, all in a way designed to give brands the opportunity to own the data – and the experience.

Omnichannel approach

An integrated approach is more important than ever. Content delivered across mobile, app and email must be coordinated for every individual in real time, as audiences shift between sessions and channels. Creating a seamless experience for accessing breaking news, daily media, and evergreen content will solidify your relationships and your revenue streams.

Facebook has thrown curveball after curveball, but this change has the potential to be truly significant. With it, brands have a real opportunity to take audience development into their own hands, rather than continuing to rely on social media platforms for support.

- by Marielle Habbel

What will programmatic look like in 2027?

2017 saw the tenth anniversary of programmatic advertising. From its humble beginnings, the technology has grown to the point where over half of all non-search digital ad spend now runs through it. The days of slow deals, subject to human errors and inefficiencies, are increasingly behind us.

But adtech’s story is far from over.

The next ten years will see new technologies that will fundamentally change the way advertising is experienced in our day-to-day lives, whether as media buyers or as consumers. It is our responsibility as technology providers to make the most of these changes, delivering what is becoming a genuinely helpful service by connecting consumers to the purchases they want to make.

By 2027, programmatic will become ubiquitous in the marketing sector. It will no longer be a line item on a media plan, but will be taken as a given by media buyers. Programmatic will extend its depth, moving into more channels such as DOOH, TV, radio, and VR. It will also extend its breadth to more consumers, yielding more data across more localities to properly optimise successful campaigns.

Ditching siloed channels

Understanding the omnichannel user experience is already a must for any effective programmatic marketer. How a consumer engages with an ad across, say, desktop and mobile is crucial for advertisers to understand in order to optimise their campaigns. The ‘single view of the consumer’ across multiple devices, much spoken of today, will by then be a well-established state of affairs, with marketers having an accurate view of cross-device behaviour.

The way consumers engage with multiple channels will change too. A search for a product or service on a smartphone will instantaneously update the ads recommended to them on their laptop or tablet. What's more, consumers will quickly raise their own expectations in light of these developments, demanding convenient, real-time offers and messages, and scrutinising ads that are irrelevant and unnecessary. The consumer will even start to see advertising as a core component of shopping, with smart refrigerators reminding them to buy milk through an ad, having detected that they don't have any.

By 2027, omnichannel marketing will have reached new heights as we increasingly inhabit a computerised world. Developments in VR are but one example of the direction of travel, not to mention the growing market in tech-enabled wearables.

The adoption of programmatic trading by out-of-home platforms such as billboards is another sign of things to come. This computerised environment will be all the more reason for marketers to ditch increasingly outmoded siloed channel strategies, with specific channel budgets giving way to unified ones.

Just 6% of television ads will be traded programmatically in 2018, according to eMarketer.

But even that stalwart of traditional media buying will itself look radically different in 2027. As video-on-demand and streaming services blur the lines between TV and online video, traditional broadcasters will adopt the media buying techniques of their peers in the video universe.

Most TV ads, by 2027, will therefore be executed programmatically.

Audience-first

Current trends towards transparency in the media buying process will continue to accelerate. Marketers will exercise their right to know where exactly their spending is going, asking for guarantees that their content is running on quality sites alongside brand-safe content. Marketers will take a fully audience-first approach to media buying, matching high-value audiences to the best media in brand-safe contexts. Cast-iron guarantees that ads won’t run on brand-unsafe sites will become the norm, with the legitimacy of sites verified by the likes of Ads.txt.

The media agency will take on a new, more strategic role in this changing ecosystem, partly as a result of machine learning, which will automate several existing tasks.

Brand CMOs will increasingly look to their agencies to share the best approaches to media buying, data management and measurement – seeking their advice, for example, on how to best utilise their tech stacks, or in developing new joint products.

As the world of technology, data, and devices expands, the agency is set to take a consultative approach, delivering holistic strategy and bringing the latest innovations and approaches in marketing to their clients for consideration. Agencies will need to be adept at distilling the story from a complex trove of data into something that can be easily relayed to the relevant CMO and to a client's board.

2027 is, of course, still a long way off, and making predictions of this nature is never a precise science. However, by looking at the fundamentals of where the industry is going, and at how far we've come in such a short space of time, we can form an idea of where marketing is headed. It's the job of any serious marketer to take these trends seriously and make the best use of them for their customers.

- by Emma Williams

How to be smarter with customer data audits

Each year, the data that marketing teams hold on their customers degrades by around 10–20%. This is simply because approximately 1% of the population will die, 10% will move house, and email addresses and phone numbers will inevitably change.

Not all data takes the same amount of time to degrade. Details on customer segments, products or customer type, for example, will all deteriorate at different rates. The age of the data plays a part too: if it is over three years old, around 30% of customers will have moved, so the entire database essentially changes address every ten years.
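As a rough sketch of how these rates compound over time (a Python illustration using the approximate figures above; real decay rates vary by field and segment):

```python
# Compound effect of annual data decay on address records.
# Assumes ~1% of customers die and ~10% move house each year (figures above).
ANNUAL_DECAY = 0.01 + 0.10  # ~11% of address records degrade per year

def still_accurate(years, decay=ANNUAL_DECAY):
    """Estimated fraction of address records still accurate after `years` years."""
    return (1 - decay) ** years

for years in (1, 3, 10):
    print(f"{years:>2} years: ~{still_accurate(years):.0%} of addresses still accurate")
#  1 years: ~89%
#  3 years: ~70%  (the '30% will have moved' figure quoted above)
# 10 years: ~31%  (the database has essentially changed address)
```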

It’s also not entirely equal. Data provided by existing customers may be more valuable to a business than that held against prospective customers, and this may vary between marketing campaigns.

With the General Data Protection Regulation (GDPR) coming into effect this May, it has never been more critical to keep data clean. Article 5 of the regulation states “every reasonable step…” must be taken to fix inaccurate data, but how can you know what to fix if you haven’t properly audited?

Here is how you can be smarter with customer data audits:

1.) Focus

Narrowing a data audit to specific products or customer types (such as active, lapsed, or prospects) gives a basis for comparing results. It also gives you grounds to justify investment from the appropriate teams if needed.

Imagine letting a product manager know the address quality of the total customer base is 95%. They would probably consider that a good score, so no action needed. However, if you can use tangible proof to show that the specific product’s customer address score is just 80%, despite the 95% overall quality, you are in a stronger position to get the support and resources needed to look into why the product’s score is lower.
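A toy illustration of that gap, with invented records and field names rather than any particular CRM schema, might look like this:

```python
# Overall address quality looks healthy (~95%) while one product lags at 80%.
records = (
    [{"product": "A", "address_valid": i < 396} for i in range(400)]   # 99% valid
    + [{"product": "B", "address_valid": i < 80} for i in range(100)]  # 80% valid
)

def address_quality(rows):
    """Share of records whose address passed validation."""
    return sum(r["address_valid"] for r in rows) / len(rows)

print(f"Overall: {address_quality(records):.0%}")  # Overall: 95%
for product in ("A", "B"):
    subset = [r for r in records if r["product"] == product]
    print(f"Product {product}: {address_quality(subset):.0%}")
# Product A: 99%  -- fine in isolation
# Product B: 80%  -- the case worth investigating
```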

Focusing the audits enables marketing teams to benchmark internally across products and business areas, as well as to spot problems. If one set of individuals has purchased two different products, but there is a difference in data quality between the two, you are well placed to dig deeper into the issue.

Analysing other variables, such as contact data validation, opt-ins, field populations and distributions, will provide a full perspective on the data. If the product manager sees there are email addresses against 50% of customers, for example, they may not realise that only, say, 60% of those are verified. In that instance, only 30 in every 100 customers can actually be reached, not 50 in 100 as originally assumed.
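The compounding here is easy to miss, so it is worth making explicit (figures as in the example above):

```python
email_coverage = 0.50  # 50% of customers have an email address on file
verified_share = 0.60  # only 60% of those addresses are verified
usable = email_coverage * verified_share
print(f"Emailable customers: {usable:.0%}")  # 30%, not the 50% assumed
```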

When comparing datasets across products, if some show markedly higher opt-in rates and a higher percentage of valid email addresses than others, it could be a sign that something is wrong. There could be issues with the system that passes the data across, or perhaps the onboarding or data capture processes aren't working properly.

2.) Regularity is key

As a rule, the longer the gap between data cleanses, the higher the cost will be, as more data will have degraded and will need to be fixed. It's much better to be proactive than reactive.

You can use regular audits in the same way you would an exception report. If they are set up to run automatically, they can trigger fixes when data quality drops below a certain level: for example, if the proportion of invalid email addresses reaches 10%, or the telephone number match rate falls below 90%.
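A minimal sketch of such an automated check, assuming each record carries simple boolean validation flags (the field names and thresholds here are illustrative, not a prescribed schema):

```python
# Exception-report-style audit: flag a cleanse when thresholds are breached.
INVALID_EMAIL_MAX = 0.10  # trigger if >= 10% of email addresses are invalid
PHONE_MATCH_MIN = 0.90    # trigger if the telephone match rate falls below 90%

def audit(records):
    """Compute headline quality metrics for a batch of customer records."""
    total = len(records)
    return {
        "invalid_email_rate": sum(not r["email_valid"] for r in records) / total,
        "phone_match_rate": sum(r["phone_matched"] for r in records) / total,
    }

def needs_cleanse(metrics):
    return (metrics["invalid_email_rate"] >= INVALID_EMAIL_MAX
            or metrics["phone_match_rate"] < PHONE_MATCH_MIN)

# Run on a schedule (e.g. nightly via cron) and kick off a cleanse job when True.
```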

This approach means the investment is frequent and regulated, so a high level of data quality is self-maintained. When data quality is taken care of, teams are free to focus on upcoming marketing campaigns, or on other areas of the business that need attention.

Frequent data audits are a vital way to monitor and repair any inaccurate data; neglect them and, from May, you could face a GDPR fine of €20m or 4% of the business's global revenue, whichever is greater. Regular data audits mean you are in control of your data.

3.) Maximise the value of the data

Auditing your data for its quality is a vital step, but it is just the start of understanding it fully. Other work needs to be done to get a holistic view, and the audit lays the groundwork for this.

A data quality audit will tell you that, say, 10% of telephone numbers are invalid, but a next step would be to investigate that variable further, to understand the distribution of values and the formats used (e.g. +44, (01423), 1423).

It may be that the field is inconsistent in how the telephone number is captured and stored, and this can limit how much of the data can be actioned. Correcting the data at source, to prevent more dirty data entering your systems, is extremely valuable. Equally, standardising data so it can be transferred efficiently across systems reduces the time analysts spend on manual fixes.
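As an illustration of that kind of at-source correction, here is a simplified normaliser for the UK formats listed above (a sketch only; production cleansing would normally lean on a dedicated library such as phonenumbers):

```python
import re

def normalise_uk_number(raw):
    """Normalise a UK telephone number to +44... form, or None if invalid."""
    digits = re.sub(r"\D", "", raw)   # drop spaces, brackets, '+' and so on
    if digits.startswith("44"):       # already carries the country code
        national = digits[2:]
    elif digits.startswith("0"):      # national format, e.g. (01423) ...
        national = digits[1:]
    else:                             # assume the trunk '0' was dropped, e.g. 1423 ...
        national = digits
    if not 9 <= len(national) <= 10:  # UK numbers have 9-10 digits after +44
        return None                   # flag as invalid rather than guess
    return "+44" + national

for raw in ("+44 1423 567890", "(01423) 567890", "1423 567890"):
    print(raw, "->", normalise_uk_number(raw))  # all -> +441423567890
```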

Ultimately, a data audit is a great place to start. Coupled with a more comprehensive view of data variables, you can move towards a position of maximising the usage of your data, and maximising the returns on it, as well as ensuring you meet best practice and comply with GDPR.

- by Rob Frost

Have A Question?
Ready For Answers?
Call Us 1-949-954-7769
eMail us at: wantmore@teamdebello.com
