There’s so much content being created on the Internet every day that people’s attention is more fractured than it has ever been. People are bombarded with marketing messages from all sides – and that’s before you even get into the content that people actually want to experience, like their favorite films and television shows.
Gone are the days when a sleek-looking website is enough to get people to believe in you, your vision and the business you built around it. If you really want to create a competitive advantage for yourself, you need people to start to see you for what you actually are – an authority in your industry and in your marketplace.
You need to position yourself as a leader – someone to be trusted, respected and followed. Your content – not only what you create but how you choose to create it – is one of the most powerful weapons available to you to that end. Make no mistake: the right type of marketing at precisely the right time can turn your customers into an army of loyal followers before you know it. You just have to keep a few key things in mind to make sure you’re headed in the right direction.
Always Be Willing to Explore Your Options
Far too often, online marketers tend to fall into the pattern of creating the same types of content over and over again. They’re under the mistaken impression that people will only ever take them seriously if they crank out countless 1,000+ word blog posts. While this format certainly does give you the breathing room you need to slowly establish your credibility, it’s hardly the only option out there – or the best.
Technically, you can publish anything you want on your website. Some types of content, however, stand a better chance of getting noticed and ranked by search engines. Let’s have a look at some content examples that are best for SEO.
Authoritative Blog Posts
In a world where content is a top ranking factor, having a blog is a fundamental rule to follow. You can use it to publish authoritative posts that help you build a reputation as an industry leader. Apart from covering news, case studies, guides and everything related to your industry, it is also a good idea to engage your readers with other topics your audience would find interesting.
Videos and visual content
Videos make up most of media consumption and web traffic today: in 2018 we witnessed the value of visual content being emphasized by changes across almost every major social network. Videos and visual content have grown into powerful tools for brands looking to communicate more easily with their target audience, while virtual reality (VR) is also finding its place as a marketing tool in various businesses. Visual content marketing strategies can easily show their impact on reach, engagement, and sales.
Many people love to read a well-written article, but some would rather receive information that can be processed more easily. Infographics provide that opportunity, and they are also more shareable, giving your content the chance to become popular and serve as a good source of backlinks.
So you really should think about the different content formats that your audience might like. Instead of writing that 1,000-word blog post, consider cutting the length in half and expressing the other half by way of a visual survey results report instead. Likewise, instead of sending out a plain email to announce the next big event you’re participating in, consider using a flyer maker to really spruce things up in a visual, interesting way that people might not expect from you.
None of this is to say that you should suddenly stop writing blog posts and other types of long-form content like white papers. Far from it. The point is that you should be willing to show the many different sides of yourself, showing people exactly what you can do in ways that make them really sit up and start to pay attention.
People always search for instructions on how to do things, which is why guides and how-to content will always be a good investment of your time. You can offer your audience insights and tips in an article, a video or an infographic. Your own industry – and everything related to your area of expertise – is, of course, one of the best places to start writing a guide.
How to Create Interactive Content
Interactive content helps create a dialogue with your audience, which means it can help you trigger a direct action from the user. Creating it is easy if you know where to start. Here’s how:
Identify Your Audience
The most important first step is identifying your target audience. Create well-defined user personas in order to pinpoint your best opportunities. A persona is essentially a profile of your ideal customer, so you need to collect information about demographics, intentions, habits, and likes and dislikes to complete the profile.
Create Specific Content For Every Stage of The Funnel
Customers usually take their time to buy anything, so it’s important to create different types of content for every stage of the buyer journey. Remember that having a precise schedule is crucial while preparing your own editorial plan.
The image shows the buyer journey and how to use interactive content at every stage of it to achieve the best possible outcome.
Create Content For Other People, Too
Another one of the best ways to get your prospects to finally see you as an authority in your industry involves creating content not just for your own website, but for others as well. Remember that the Internet is a big place – your customers care about more than just your brand. There are a lot of other people and voices operating within your industry right now who also have their attention. Therefore, one of the keys to establishing yourself as someone who knows what they’re talking about involves teaming up with these people as much as you can.
You might consider creating video content with other industry leaders or thought leaders on topics that are both relevant and important to your mutual target audiences. Not only will you likely get a bit of attention from their audience, but the reverse is also true – creating a mutually beneficial situation for all parties. You could even then upload that content to a service like Uscreen and create new revenue streams for both of you, thus killing two proverbial birds with one stone.
But maybe the most important thing for you to understand is that Rome wasn’t built in a day – and your authority won’t be, either. People aren’t going to start to trust you or your brand out of the kindness of their hearts. They need to want to. They need to be convinced. Content is how you do it and how you express your thoughts and values through that content is how you get people to believe in what you have to say and, slowly but surely, how you get them to actively want more.
Payman is the founder of Visme, an easy-to-use online tool to create engaging presentations, infographics, and other forms of visual content. He is also the founder of HindSite Interactive, an award-winning Maryland digital agency specializing in website design, user experience and web app development.
Making sense of data using AI is becoming crucial to our daily lives and has significantly shaped my professional career in the last 5 years.
When I began working on the Web it was in the mid-nineties and Amazon was still a bookseller with a primitive website.
At that time it became extremely clear that the world was about to change and every single aspect of our society in both cultural and economic terms was going to be radically transformed by the information society. I was in my twenties, eager to make a revolution and the Internet became my natural playground. I dropped out of school and worked day and night contributing to the Web of today.
Twenty years later, I am again witnessing a similar – if not even more radical – transformation of our society as we race toward the so-called AI transformation. This basically means applying machine learning, ontologies and knowledge graphs to optimize every process of our daily lives.
At the personal level I am back in my twenties (sort of), and I wake up at night to train a new model, to read the latest research paper on recurrent neural networks or to test how deep learning can be used to perform tasks on knowledge graphs.
The beauty of it is that I have the same feeling of building the plane as we’re flying it that I had in the mid-nineties when I started with TCP/IP, HTML and websites!
Wevolver: an image I took at SXSW
AI transformation for search engine optimization
In practical terms, the AI transformation here at WordLift (our SEO startup) works this way: we look at how we help companies improve traffic coming from search engines, analyze complex tasks and break them down into small chunks of work, and try to automate them using narrow AI techniques (in some cases we simply tap the top of the AI pyramid and use ready-made APIs; in other cases we develop and train our own models). We tend to focus (in this phase at least) on trivial, repetitive tasks that can have a concrete and measurable impact on the SEO of a website (i.e. more visits from Google, more engaged users, …) such as:
We test these approaches on a select number of terrific clients who fuel this process, and we keep iterating and improving the tooling until we feel ready to fold it back into our product and make it available to hundreds of other users.
All along the journey, I’ve learned the following lessons:
1. The AI stack is constantly evolving
AI introduces a completely new paradigm: from teaching computers what to do, to providing the data required for computers to learn what to do.
In this pivotal change, we still lack the infrastructure required to address fundamental problems (How do I debug a model? How can I prevent or detect bias in the system? How can I predict an event in a context where the future is not a mere projection of the past?). This basically means that new programming languages will emerge and new stacks will be designed to address these issues right from the beginning. In this continually evolving scenario, libraries like TensorFlow Hub represent a concrete and valuable example of how the consumption of reusable parts in AI and machine learning can be achieved. This approach also greatly improves the accessibility of these technologies for a growing number of people outside the AI community.
2. Semantic data is king
AI depends on data, and any business that wants to implement AI inevitably ends up revamping and/or building a data pipeline: the way in which the data is sourced, collected, cleaned, processed, stored, secured and managed. In machine learning, we no longer use if-then-else rules to instruct the computer; instead, we let the computer learn the rules by providing a training set of data. This approach, while extremely effective, poses several issues, as there is no way to explain why a computer has learned a specific behavior from the training data. In Semantic AI, knowledge graphs are used to collect and manage the training data, which allows us to check the consistency of this data and to understand, more easily, how the network is behaving and where we might have a margin for improvement. Real-world entities and the relationships between them are becoming essential building blocks in the third era of computing. Knowledge graphs are also great at “translating” insights and wisdom from domain experts into a computable form that machines can understand.
3. You need the help of subject-matter experts
Knowledge becomes a business asset when it is properly collected, encoded, enriched and managed. Any AI project you might have in mind always starts with a domain expert providing the right keys to address the problem. In a way, AI is the most human-dependent technology of all time. For example, let’s say that you want to improve the SEO for images on your website. You will start by looking at the best practices and direct experiences of professional SEOs who have been dealing with this issue for years. It is only through the analysis of the methods this expert community uses that you can tackle the problem and implement your AI strategy. Domain experts know, well in advance, what can be automated and what results to expect from that automation. A data analyst or an ML developer might think we can train an LSTM network to write all the meta descriptions of a website on the fly. A domain expert would tell you that Google only uses meta descriptions as search snippets 33% of the time and that, if these texts are not revised by a qualified human, they will produce little or no results in terms of actual clicks (we can produce a decent summary with NLP and automatic text summarization, but enticing a click is a different challenge).
4. Always link data with other data
External data linked with internal data helps you improve how the computer will learn about the world you live in. Rarely does an organization control all the data that an ML algorithm needs to become useful and have a concrete business impact. By building on top of the Semantic Web and Linked Data, and by connecting internal with external data, we can help machines get smarter. When we started designing the new WordLift dashboard, whose goal is to help editors improve their editorial plan by looking at how their content ranks on Google, it immediately became clear that our entity-centric world would benefit from the query and ranking data gathered by our partner WooRank. The combination of these two pieces of information helped us create the basis for training an agent that recommends to editors what to write and tells them whether they are connecting with the right audience over organic search.
To shape your AI strategy and improve both technical and organizational measures, study the business requirements carefully with the support of a domain expert, and remember that narrow AI helps us build agentive systems that do things for end users (like, say, tagging images automatically or building a knowledge graph from your blog posts) as long as we always keep the user at the center of the process.
The way we communicate and interact online is constantly changing. Users have come to expect a much more personal and tailored experience, the type that can’t be provided using traditional ways of interaction.
Some people, on first hearing the words conversational marketing, might wonder what exactly it is. Simply put, it is a strategy that gives customers the personalized value they are looking for and allows businesses to scale while saving time and resources. Through conversational marketing – live chats, chatbots, and social monitoring – it’s possible to promote genuine conversations and real relationships. The goal, of course, is to enhance the user’s experience while minimizing friction.
Long gone are the days when consumers were passive recipients of marketing messages who had to be bombarded with a blatantly pushy sales pitch in order to be convinced to make a purchase. New, interactive technologies enabled them to break the fourth wall and have their say about how they feel about brands and what they expect from them. This means that the time has come for brands to learn how to listen actively while their customers do the talking. Marketing is a two-way street, and that’s the essence of conversational marketing.
What’s Conversational Marketing?
Unlike traditional marketing which heavily relied on TV commercials, billboards, newspaper ads, direct mail, and similar methods which customers learned to ignore successfully, conversational marketing enables brands to have relevant, meaningful, one-on-one conversations with their audiences across different channels of communication.
Live chat and chatbots are the first things that come to mind when it comes to conversational marketing. However, this strategy is much more than these two tools, and it can be extended to social media, phone calls, SMS, and IMs – pretty much any channel that your customers prefer.
Some of the benefits of such an approach include:
Being available 24/7. This is something your customers will appreciate, as you’re putting their needs first and overriding your regular working hours, which are somewhat limiting. AI-powered bots can answer customers’ questions in real time, be it 7 a.m. or midnight. No wonder it has been predicted that, by 2020, more than 85% of customer support interactions will be handled by chatbots.
Getting to know your audience on a more profound level. These chats and conversations are a gold mine of customer information, and they can help you understand your audience better and start using their language in your messaging.
Humanizing your brand. By combining live chat, bots, and social media, your outreach will be much more natural, and you’ll avoid using generic request forms which your customers don’t consider particularly promising in terms of providing them with timely responses.
1. Sephora’s Virtual Artist
The upscale beauty retailer stepped up its marketing game by introducing the Sephora Virtual Artist feature in their Facebook Messenger bot.
This innovative AR functionality allows the brand’s customers to “try on” makeup by uploading their selfies and applying different lipstick shades, eyeshadows, and false lashes.
Besides being fun and making it easy for customers to share their makeover photos with friends in order to get valuable feedback or add them to Facebook Stories, Virtual Artist offers something much more important – a try-before-you-buy experience without having to visit a physical store.
What’s even better, once a prospective shopper makes their purchasing decision, they can order the products they want directly from the thread, which additionally streamlines and improves the customer journey. The brand reports that Sephora Assistant, a similar Messenger bot for booking makeovers in one of its stores, accounts for an 11% conversion rate increase.
2. eBay’s Google Assistant App
eBay’s Google Assistant App tremendously facilitates browsing the company’s vast online shopping inventory. Customers start their search by saying “Ok, Google, ask eBay to find me…”, and the smart app asks additional questions in an attempt to narrow down the search and provide the most relevant results. Once it finds the best deal, the chatbot asks whether to send the results to your smartphone so that you can complete your purchase.
Given that Siri, Alexa, Amazon Echo, and other voice-based assistants are increasingly popular, it’s clear that implementing such a tool can significantly boost customer engagement.
This widget comes after the online retailer’s Facebook Messenger ShopBot, which uses AI and Machine Learning in order to personalize the shopping experience based on a deeper understanding of customer intent.
Planning and executing such an effective conversational marketing strategy can be a complex endeavor, which is why it’s a good idea to consult experts from digital marketing agencies and see what the best approach will be for your company and how to make it work within your budget.
3. Domino’s AnyWare
Domino’s wants to make the process of ordering pizza as easy as pie.
Back in 2015, the company encouraged its customers to tweet or text a pizza emoji and have a pizza sent their way.
This concept evolved further, so that now, with Domino’s AnyWare, it’s possible to order your favorite items from their menu through a number of available options – Google Home, Alexa, Slack, Facebook Messenger, Twitter, or even a Smart TV. This versatility and abundance of different channels of communication is something of vital importance to today’s picky customers, and Domino’s does everything it can to meet its patrons’ preferences.
Again, personalization and an in-depth understanding of customers’ needs is exactly what helps Domino’s build loyalty, making sure that its clients will come back knowing that they can easily reorder their favorite item from the menu with a single click, tweet, or word, as well as track their order and see when it will be delivered.
4. General Motors and Social Media
Although conversational marketing is mostly related to innovative chatbots powered by the latest tech, social media is another tool that can make this strategy work.
One of the best examples of this approach is General Motors and the way it dealt with the 2014 ignition switch recalls, a crisis which threatened to ruin not only the company’s finances but also its reputation.
Over the course of several months, more than 30 million cars worldwide were recalled, while the ignition switch flaw resulted in the deaths of 124 people. G.M. was transparent about the issue and owned it, raising the bar on customer support and experience along the way.
Customers flooded the company’s social media channels with distressed comments and negative feedback, and the auto giant had its customer support reps address each and every individual complaint and offer to help on the spot.
Some customers got loaner cars until their problems were solved while others were given a refund for the travel expenses caused by the malfunction of their vehicles. Instead of trying to hush things up and switching to traditional tactics such as emails, phone calls, and other more private communication channels General Motors chose to listen to their customers, hear their objections, and proactively handle this huge blunder in the public eye.
It’s time to jump on the conversational marketing bandwagon, if you haven’t already, and take a cue from these companies that mastered the art of customer experience and satisfaction with the help of this powerful strategy.
Nina is a technical researcher & writer at DesignRush, a B2B marketplace connecting brands with agencies. She loves to share her experiences and meaningful content that educates and inspires people. Her main interests are web design and marketing. In her free time, when she's away from the computer, she likes to do yoga and ride a bike. You can also find her on Twitter.
In this article, we explore how to evaluate the correspondence between title tags and the keywords that people use on Google to reach the content they need. We will share the results of the analysis (and the code behind) using a TensorFlow model for encoding sentences into embedding vectors. The result is a list of titles that can be improved on your website.
“A title tag is an HTML element that defines the title of the page. Titles are one of the most important on-page factors for SEO. […]
They are used, combined with meta descriptions, by search engines to create the search snippet displayed in search results.”
Every search engine’s most fundamental goal is to match the intent of the searcher by analyzing the query to find the best content on the web on that specific topic. In the quest for relevancy, a good title influences search engines only partially (it takes a lot more than matching the title with the keyword to rank on Google), but it does have an impact, especially on top ranking positions (1st and 2nd, according to a study conducted a few years ago by Cognitive SEO). This is also due to the fact that a searcher is more inclined to click when they find a good semantic correspondence between the keyword used on Google and the title (along with the meta description) displayed in the search snippet of the SERP.
What is semantic similarity in text mining?
Semantic similarity defines the distance between terms (or documents) by analyzing their semantic meanings as opposed to looking at their syntactic form.
“Apple” and “apple” are the same word, and if I compute the difference syntactically using an algorithm like Levenshtein (ignoring case), they look identical; on the other hand, by analyzing the context of the phrase where the word apple is used, I can “read” the true semantic meaning and find out whether the word references the world-famous tech company headquartered in Cupertino or the sweet forbidden fruit of Adam and Eve.
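To make the syntactic-versus-semantic point concrete, here is a minimal pure-Python sketch of the Levenshtein edit distance: it sees no difference at all between the two senses of “apple”, because it only compares characters.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    if len(a) < len(b):
        a, b = b, a
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            current.append(min(
                previous[j] + 1,                  # deletion
                current[j - 1] + 1,               # insertion
                previous[j - 1] + (ca != cb),     # substitution
            ))
        previous = current
    return previous[-1]

# Syntactically, the company and the fruit are the very same string:
print(levenshtein("apple", "apple"))   # 0 - identical, meaning ignored
print(levenshtein("apple", "apples"))  # 1 - one edit apart
```

The algorithm happily reports a distance of zero between the tech company and the fruit; only the semantic approaches described below can tell them apart.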
A search engine like Google uses NLP and machine learning to find the right semantic match between the intent and the content. This means search engines are no longer looking at keywords as strings of text; they are reading the true meaning that each keyword has for the searcher. As SEOs and marketers, we can now also use AI-powered tools to create the most authoritative content for a given query.
There are two main ways to compute the semantic similarity using NLP:
we can compute the distance between two terms using semantic graphs and ontologies by looking at the distance between the nodes (this is how our tool WordLift is capable of discerning whether apple – in a given sentence – is the company founded by Steve Jobs or the sweet fruit). A trivial but interesting example is to build a “semantic tree” (or, better, a directed graph) using the Wikidata P279 property (subclass of).
we can alternatively use a statistical approach and train a deep neural network to build – from a text corpus (a collection of documents) – a vector space model that will help us transform terms into numbers, analyze their semantic similarity and run other NLP tasks (i.e. classification).
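As a toy illustration of the first, graph-based approach, we can hand-craft a tiny subclass-of hierarchy (a made-up stand-in for Wikidata’s P279 data) and measure how far apart two terms sit with a breadth-first search:

```python
from collections import deque
from typing import Optional

# A tiny hand-made "subclass of" graph (hypothetical stand-in for Wikidata P279).
subclass_of = {
    "Granny Smith": ["apple (fruit)"],
    "apple (fruit)": ["fruit"],
    "fruit": ["food"],
    "Apple Inc.": ["technology company"],
    "technology company": ["company"],
    "company": ["organization"],
}

def hops(start: str, goal: str) -> Optional[int]:
    """Number of subclass-of hops from start to goal (None if unreachable)."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for parent in subclass_of.get(node, []):
            if parent not in seen:
                seen.add(parent)
                queue.append((parent, dist + 1))
    return None

print(hops("Granny Smith", "food"))  # 3 hops: apple (fruit) -> fruit -> food
print(hops("Apple Inc.", "food"))    # None: the company never reaches "food"
```

The fruit sense of apple is only a few hops away from “food”, while the company sense never reaches it at all – exactly the kind of disambiguation signal a graph gives you.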
There is a crucial and essential debate behind these two approaches. The essential question: is there a path by which our machines can possess any true understanding? Our best AI efforts, after all, only create an illusion of understanding. Both rule-based ontologies and statistical models are far from producing real thought as it is understood in cognitive studies of the human brain. I am not going to expand here but, if you are in the mood, read this blog post on the Noam Chomsky / Peter Norvig debate.
Text embeddings in SEO
Word embeddings (or text embeddings) are a type of algebraic representation of words that allows words with similar meaning to have similar mathematical representation. A vector is an array of numbers of a particular dimension. We calculate how close or distant two words are by measuring the distance between these vectors.
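A minimal NumPy sketch makes this concrete – the vectors here are made up and only 4-dimensional, while real embeddings have hundreds of dimensions:

```python
import numpy as np

def cosine_similarity(u, v) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 4-dimensional embeddings (real models use 512+ dimensions).
king  = [0.9, 0.8, 0.1, 0.2]
queen = [0.8, 0.9, 0.2, 0.1]
pizza = [0.1, 0.0, 0.9, 0.8]

print(round(cosine_similarity(king, queen), 3))  # close to 1: similar meaning
print(round(cosine_similarity(king, pizza), 3))  # much lower: unrelated terms
```

Words with similar meanings end up pointing in similar directions, so their cosine similarity is high; unrelated words point elsewhere and score low.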
In this article, we’re going to extract embeddings using the tf.Hub Universal Sentence Encoder, a pre-trained deep neural network designed to convert text into high-dimensional vectors for natural language tasks. We want to analyze the semantic similarity between hundreds of combinations of Titles and Keywords from one of the clients of our SEO management services. We are going to focus our attention on only one keyword per URL, the keyword with the highest ranking (of course we can also analyze multiple combinations). While a page might attract traffic on hundreds of keywords, we typically expect to see most of the traffic coming from the keyword with the highest position on Google.
We are going to start from the original code developed by the TensorFlow Hub team and we are going to use Google Colab (a free cloud service with GPU support to work with machine learning). You can copy the code I worked on and run it on your own instance.
Our starting point is a CSV file containing Keyword, Position (the actual ranking on Google) and Title. You can generate this CSV from Google Search Console or use any keyword tracking tool like WooRank, MOZ or SEMrush. You will need to upload the file to the session storage of Colab (there is an option you can click in the left tray), and you will need to update the file name on the line that starts with:
df = pd.read_csv( … )
Here is the output.
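Assuming the CSV uses the three column names above, the loading step might look like the sketch below (an inline sample stands in for the file you would upload to Colab; the file name in the comment is a placeholder):

```python
import io
import pandas as pd

# Inline sample standing in for the CSV exported from GSC or a rank tracker.
sample_csv = io.StringIO(
    "Keyword,Position,Title\n"
    "villas in barbados,2,Luxury Villas | Example Rentals\n"
    "costa rica villa rental,5,Costa Rica Villas | Example Rentals\n"
)

# In Colab you would instead point read_csv at the uploaded file, e.g.:
# df = pd.read_csv("your-rankings-export.csv")
df = pd.read_csv(sample_csv)

print(df.head())
print(df.dtypes)  # Position should be numeric for the filtering done later
```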
Let’s get into action. The pre-trained model comes in two flavors: one trained with a Transformer encoder and another trained with a Deep Averaging Network (DAN). The first is more accurate but has higher computational resource requirements. I used the Transformer, considering that I only worked with a few hundred combinations.
In the code below we initiate the module, open the session (it takes some time so the same session will be used for all the extractions), get the embeddings, compute the semantic distance and store the results. I did some tests in which I removed the site name, this helped me see things differently but in the end, I preferred to keep whatever a search engine would see.
The semantic similarity – the degree to which the title and the keyword carry the same meaning – is calculated as the inner product of the two vectors.
An interesting aspect of using word embeddings from this model is that – for English content – I can easily calculate the semantic similarity of both short and long text. This is particularly helpful when looking at a dataset that might contain very short keywords and very long titles.
The result is a table of combinations, ranking between positions 1 and 5, that have the least semantic similarity (Corr).
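Producing such a table takes only a couple of pandas operations; in this sketch the rows and the Corr column name are illustrative:

```python
import pandas as pd

# Illustrative results: one row per URL with its best-ranking keyword.
df = pd.DataFrame({
    "Keyword": ["villas in barbados", "anguilla beach villa", "luxury rentals"],
    "Title": ["Luxury Villas", "Beachfront Homes", "Luxury Rentals | Example"],
    "Position": [2, 4, 9],
    "Corr": [0.31, 0.22, 0.85],  # semantic similarity of title vs. keyword
})

# Keep top-5 rankings only, then surface the weakest title/keyword matches.
to_review = (
    df[df["Position"].between(1, 5)]
    .sort_values("Corr")          # lowest similarity first
    .reset_index(drop=True)
)
print(to_review[["Keyword", "Title", "Corr"]])
```

The rows at the top of `to_review` are the titles most worth rewriting: they already rank well but say the least, semantically, about the query that brings visitors in.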
It is interesting to see that it can help, for this specific website, to add to the title the location (i.e. Costa Rica, Anguilla, Barbados, …).
With a well-structured data markup we are already helping the search engine disambiguate these terms by specifying the geographical location, but for the user making the search, it might be beneficial to see at a glance the name of the location he/she is searching for in the search snippet. We can achieve this by revising the title or by bringing more structure in the search snippets using schema:breadcrumbs to present the hierarchy of the places (i.e. Italy > Lake Como > …).
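For reference, a minimal sketch of what that BreadcrumbList markup could look like in JSON-LD (the place names and URLs are invented for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Italy",
      "item": "https://example.com/italy/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Lake Como",
      "item": "https://example.com/italy/lake-como/"
    }
  ]
}
```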
In this scatter plot we can also see that, for this specific website, higher semantic similarity between titles and keywords is associated with higher rankings.
Semantic Similarity between keywords and titles visualized
Start running your semantic content audit
Crawling your website using natural language processing and machine learning to extract and analyze the main entities greatly helps you improve the findability of your content. Adding semantically rich structured data to your web pages helps search engines match your content with the right audience. Thanks to NLP and deep learning, I could see that to reduce the gap – between what people search for and the existing titles – it was important, for this website, to add Breadcrumbs markup with the geographical location of the villas. Once again AI, while still incapable of true understanding, helps us become more relevant for our audience (and it does so at web scale, on hundreds of web pages).
Solutions like the TF-Hub Universal Encoder bring, in the hands of SEO professionals and marketers, the same AI-machinery that modern search engines like Google use to compute the relevancy of content. Unfortunately, this specific model is limited to English only.
Are you ready to run your first semantic content audit?