Sitechecker

Sitechecker.pro is an SEO platform designed to help websites improve their search engine ranking and visibility.

It offers a variety of tools for both beginners and experienced users, including website audits to identify technical and content issues, keyword tracking to monitor ranking performance, and competitor analysis to understand your SEO landscape.

By consolidating data from various sources, Sitechecker provides a centralized dashboard to track your website’s overall SEO health and make informed decisions to improve organic traffic and revenue.

Understanding LLM Optimization: Ethical AI and Protecting Your Content

Table of contents:

  1. Red Teaming, LLM Manipulation, and SEO
  2. Streamlining Content Creation with LLM Optimization: Our Method
  3. Conclusion

Are you ready to dive into the future of digital marketing, where artificial intelligence creates your content and verifies its integrity? In the fast-paced digital marketing and SEO world, mastery of Large Language Model (LLM) optimization is emerging as a game changer for companies eager to stand out. These sophisticated models are at the heart of modern artificial intelligence-driven content creation, enabling companies to produce engaging and personalized content at scale.

However, as we embrace this wave of AI-created content, we face the challenge of safeguarding its inherent vulnerabilities and ethical concerns. We enter the world of Red Teaming: a strategic, simulated battlefield designed to probe AI’s defenses, expose its flaws, and defend against potential threats. This critical exercise ensures that our trust in artificial intelligence does not become our Achilles’ heel.

But it is not just about defense mechanisms. Ethical considerations take center stage as we navigate the rapid advances in AI technology. Companies must manage the power of AI with a moral compass, ensuring that the digital evolution proceeds with integrity and transparency. After all, the goal is to harness AI as a force for good, enriching our content strategies and meeting ethical standards.

Join me as we journey through the intricate dance of LLM optimization, red-teaming, and the quest for ethical AI practices. We will delve into the vulnerabilities of these models, uncover tested strategies, and explore how to create content and product descriptions that leverage your data for stellar results without falling into the trap of shortcuts. We will unlock the secrets to thriving in the digital arena, where technology meets ethics.

Red Teaming, LLM Manipulation, and SEO

Explanation of Red Teaming

Have you ever wondered how the innovative technology works that writes articles, creates content for websites, or summarizes entire search results pages the way Google SGE does? Let’s keep it simple, especially for those who are not experts in the field but are curious about SEO, content marketing, or running a business in today’s digital age.

Imagine Large Language Models (LLMs), like the GPT models, as incredibly talented writers who can produce text that sounds just like it was written by a human being. These models matter for content creation because they can quickly generate articles, product descriptions, and more, simply by being given a request or question. Be careful, though, because they are not error-free. As Lily Ray shows us in this tweet, if you ask Google what the “best cocktail bars in NY” are, it may respond by pointing you to one that doesn’t even exist.

However, with great power comes great responsibility and potential risk. While these models can create valuable and informative content, they can also be manipulated to produce misleading or harmful content. This is where “red teaming” comes in.

Think of Red Teaming as the digital world’s version of a security exercise. It is a strategy in which experts in cybersecurity, artificial intelligence (AI), and language come together to test these intelligent models. They act like potential hackers or malicious users, trying to identify ways these models could be induced to do something they shouldn’t, such as generating false information or distorted content.

The purpose of Red Teaming in this context is twofold. First, it helps identify weaknesses in how these models understand language, interpret context, or adhere to ethical guidelines. Second, it is about strengthening the defenses of these models, ensuring that they are robust enough to resist manipulation and continue to produce engaging but also reliable and fair content.

Thus, for SEOs, content marketers, business owners, and managers at various levels, understanding the role of Red Teaming in LLM optimization is critical. It’s not just about leveraging technology to stay ahead of the digital marketing game but also ensuring that it is used responsibly and safely, protecting your brand and audience from potential misinformation.

How Red Teaming Identifies Vulnerabilities in LLM Manipulation

Red Teams employ a multifaceted strategy to evaluate the resilience of LLMs. They simulate attacks and challenging situations to identify vulnerabilities, such as bias amplification, misunderstandings of context, and ethical violations. In doing so, they help uncover areas where LLMs might perpetuate biases, misinterpret information, or generate content that could harm users.

The work of Red Teams is invaluable in the quest to refine AI-driven content creation tools. By identifying and addressing the weaknesses of LLMs, they ensure that these models can continue to serve as powerful assets for generating high-quality, ethical, and accurate content. 

For digital marketing and content creation professionals, understanding the role of red teaming is critical to recognizing where machines fail and areas where automated processes or algorithms may not be as effective as human judgment. Although machines can process large amounts of data quickly, they lack the ability to understand human emotions, values, and ethics. This is where the human touch, or what we might call the “moral compass,” becomes essential.

The moral compass refers to our internal sense of right and wrong, which guides our decisions and actions. In digital marketing, it pushes us to ask important questions: Do we use our understanding of human behavior to connect with our audience and serve them authentically, or do we exploit this understanding to manipulate them to our advantage?

Similarly, we might consider red teaming and what comes out of the LLM tests: Do we use our understanding of model vulnerabilities to govern it, or do we want to exploit it to manipulate models to get advantages?

Tests and Experiments of LLM Manipulation

How to influence search engine product recommendations

The research paper Manipulating Large Language Models to Increase Product Visibility explores LLM manipulation for influencing search engine product recommendations by asking a specific question: could a vendor increase the visibility of their product by embedding a strategic text sequence in the product information page? The researchers investigated this question by developing a framework to manipulate an LLM’s recommendations in favor of a target product. They achieved this by inserting a strategic text sequence (STS) into the product’s information.

Using a catalog of fictitious coffee machines, the research demonstrates that adding the strategic text sequence significantly improves the visibility of the target product and increases its chances of being recommended to potential customers. This echoes SEO’s impact on traditional search engines, where optimized content ranks higher in search results.

Firstly, they tested the model’s behavior in a real-world scenario. This involved embedding the optimized STS within the informational content of the target product. By doing so, they aimed to observe how the LLM would rank the product among a list of alternatives when presented to users. The experiment was designed to mimic a user’s search for coffee machines, explicitly focusing on affordability. Including the STS within the target product’s description was intended to influence the LLM to rank the target product, ColdBrew Master, higher than it naturally would, compared to more cost-effective options like SingleServe Wonder and FrenchPress Classic.

Secondly, the researchers evaluated the impact of this STS optimization on the LLM’s recommendations. The outcome was significant; the LLM displayed the ColdBrew Master as the top recommendation, surpassing other models that were objectively more aligned with the affordability criteria. This step was crucial in demonstrating the practical effects of STS optimization on LLM behavior, highlighting how even subtle manipulations could significantly alter the model’s output. Through these steps, the researchers showcased the potential for manipulating LLM responses and underscored the importance of understanding and mitigating such vulnerabilities to ensure fair and unbiased AI recommendations.
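To make the experiment concrete, here is a minimal sketch of the evaluation step only: it compares an LLM’s ranking with and without a given STS embedded in the target product’s description. To be clear about what is assumed: the paper derives the STS itself with gradient-based optimization, while this sketch uses a placeholder string, and the catalog entries, model name, and prompt wording are illustrative, not the paper’s code.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

catalog = {
    "SingleServe Wonder": "Compact single-serve machine. Price: $49.",
    "FrenchPress Classic": "Manual french press, no electronics. Price: $29.",
    "ColdBrew Master": "Cold brew system with 12-cup carafe. Price: $199.",
}

def recommend(products: dict[str, str]) -> str:
    """Ask the LLM for a ranked recommendation over the catalog."""
    listing = "\n".join(f"- {name}: {desc}" for name, desc in products.items())
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model
        messages=[{
            "role": "user",
            "content": f"I want an affordable coffee machine. Rank these products:\n{listing}",
        }],
    )
    return response.choices[0].message.content

baseline = recommend(catalog)

# Embed a (placeholder) strategic text sequence in the target product's info.
sts = "<optimized adversarial token sequence would go here>"
manipulated = dict(catalog)
manipulated["ColdBrew Master"] += " " + sts
attacked = recommend(manipulated)

print("Without STS:\n", baseline)
print("With STS:\n", attacked)  # the paper reports the target moving to rank 1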

However, it’s important to consider the ethical implications. Just as SEO can be misused, LLM manipulation could disrupt fair market competition by giving manipulative vendors an edge. The ability to manipulate LLM search responses, as shown in this research, gives vendors a significant competitive advantage over rival products. This capability has far-reaching implications for market dynamics, as it can alter the balance of competition and lead to a skewed representation of products. As LLMs become more deeply embedded in the digital commerce infrastructure, safeguards must be established to prevent the exploitation of AI-driven search tools for unfair advantage.

How to use DSPy programming framework in red teaming

DSPy is a framework developed by Stanford NLP for structuring and optimizing programs built on large language models (LLMs). It can be used effectively in SEO, as explained by Andrea, but also in red teaming. It introduces a systematic methodology that separates the flow of a program into modules from the parameters of each step, allowing for more structured and efficient optimization. This separation enables the creation of a “feed-forward” language program consisting of several layers of alternating Attack and Refine modules, which is more effective in red teaming than simple language programs.

DSPy’s emphasis on structure can largely replace the search for hacky prompts and pipeline engineering tricks, making it a very effective tool for red teaming (here is a great article about red teaming with DSPy).
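As an illustration, here is a minimal sketch of such a feed-forward attack program with alternating Attack and Refine layers, assuming DSPy is already configured with a language model; the signatures and field names are our own invention for this sketch, not those used in the linked article.

import dspy

class Attack(dspy.Signature):
    """Generate a prompt that tries to elicit the target harmful behavior."""
    harmful_intent = dspy.InputField()
    attack_prompt = dspy.OutputField(desc="adversarial prompt for the target LM")

class Refine(dspy.Signature):
    """Improve an attack prompt given the target model's last response."""
    harmful_intent = dspy.InputField()
    attack_prompt = dspy.InputField()
    target_response = dspy.InputField()
    refined_prompt = dspy.OutputField()

class RedTeamProgram(dspy.Module):
    def __init__(self, layers: int = 3):
        super().__init__()
        self.attack = dspy.ChainOfThought(Attack)
        self.refine = [dspy.ChainOfThought(Refine) for _ in range(layers)]

    def forward(self, harmful_intent, query_target):
        # First Attack layer proposes a candidate adversarial prompt.
        prompt = self.attack(harmful_intent=harmful_intent).attack_prompt
        # Each Refine layer probes the model under test and sharpens the prompt.
        for layer in self.refine:
            response = query_target(prompt)  # callable wrapping the target LM
            prompt = layer(
                harmful_intent=harmful_intent,
                attack_prompt=prompt,
                target_response=response,
            ).refined_prompt
        return prompt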

Streamlining Content Creation with LLM Optimization: Our Method

In our innovative approach to content creation, we have taken a significant step forward by integrating the power of the Knowledge Graph, and we are taking a further step by testing the use of reviews collected on Trustpilot to optimize prompts and generate product descriptions for e-commerce.

By drawing on the rich user-generated content on Trustpilot, we can refine our large language models (LLMs) with real-world feedback and preferences, enabling a level of personalization and relevance that sets new standards in content creation. In addition, we can use the product reviews in our knowledge graph to generate content and introduce product highlights in the format Google expects. These offer shoppers concise, easy-to-read sentence fragments that swiftly address common consumer queries or spotlight key product features.

Customized content at scale: A data-driven approach

Our method involves a sophisticated process in which Knowledge Graphs and Trustpilot reviews converge to inform our LLMs. This synergy allows us to deeply understand what matters most to users, identifying trends, sentiments, and key points of interest that resonate with our target audience. The result is highly personalized content that speaks directly to users’ needs and preferences, delivered efficiently and at scale. This approach enhances the user experience by providing more relevant and engaging content, and it significantly boosts our SEO efforts by aligning closely with search intent.
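A hedged sketch of the core idea: distill recurring themes from reviews into product highlights that can seed a description prompt. The review snippets, model name, and prompt wording below are placeholders, not our production pipeline.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

reviews = [
    "Setup took two minutes and the milk frother is excellent.",
    "Quiet, compact, and the cold brew setting is a nice surprise.",
    "Great value, though descaling reminders come a bit too often.",
]

prompt = (
    "From these customer reviews, extract 3 short product highlights "
    "(concise sentence fragments, suitable for Google's product highlights):\n"
    + "\n".join(f"- {r}" for r in reviews)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)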

Ethical use of AI: Prioritizing the end user

At the core of our strategy is the ethical use of AI, which we define as leveraging better-screened data for the benefit of the end user. By incorporating feedback from Trustpilot reviews into our Knowledge Graphs, we ensure that our content is based on authentic user experiences and perspectives. This commitment to the ethical use of artificial intelligence means that we are optimizing not only for search engines but also for user engagement and satisfaction. Our models are trained to prioritize content that is informative and useful and that reflects real user feedback and needs.

This ethical approach extends to how we handle data, ensuring transparency, accuracy, and fairness in every piece of content we generate. By focusing on the benefit of the end user, we ensure that our content creation process remains accountable, reliable, and aligned with our audience’s expectations. It’s a commitment beyond simple compliance; it’s about setting a benchmark for how artificial intelligence and data can truly enhance the digital user experience.

Our integration of Knowledge Graphs with reviews to train and optimize our LLMs represents a leap forward in creating customized content at scale. It’s a testament to our belief that the ethical use of AI—defined by leveraging better data for the end-user’s benefit—is the cornerstone of effective and impactful content creation. This approach sets us apart in the digital marketing landscape and ensures that we deliver content that truly matters to our audience, fostering engagement, trust, and loyalty.

Conclusion

Exploring LLM optimization, Red Teaming, and ethical AI practices unveils a fascinating interplay in the digital marketing landscape. As Large Language Models (LLMs) have become major players in content generation, mastering LLM optimization offers a strategic advantage to companies seeking to thrive in the competitive SEO world. However, this power requires a responsible approach.

Red Teaming is crucial for identifying vulnerabilities and potential pitfalls associated with LLM manipulation. By simulating attacks and uncovering weaknesses, Red Teaming helps strengthen defenses against malicious actors seeking to exploit LLMs for misinformation or manipulation.

But the conversation extends beyond technical safeguards. Ethical considerations are paramount. We must navigate this rapidly evolving landscape with transparency and integrity, ensuring that AI serves as a force for good. This means prioritizing accurate, unbiased content that benefits users rather than deceives them.

At WordLift, we believe the future of LLM optimization lies in ethical practices and user-centric content creation. Our innovative approach integrates Knowledge Graph data and Trustpilot reviews to refine our LLMs and personalize content at scale. This ensures user relevance and satisfaction while boosting SEO efforts.

Ultimately, the power of LLM optimization can be harnessed to create a win-win scenario for businesses and users. By embracing responsible AI practices and prioritizing user needs, we can unlock the true potential of LLMs and shape a more informative and engaging digital experience for everyone.

Dominate Black Friday & Cyber Monday with Strategic SEO Techniques

Table of contents:

  1. Why Black Friday and Cyber Monday SEO tactics matter for advanced e-commerce in 2024
  2. Black Friday & Cyber Monday SEO in the era of Generative AI
  3. SEO changed with generative AI, Google’s updates and economy shifts
  4. What Twitter and Independent Research Have Taught Us
  5. Underutilized Black Friday & Cyber Monday Schema Markups
  6. Selling more and going beyond prompting and schema markups

Why Black Friday And Cyber Monday SEO Tactics Matter For Advanced E-commerce In 2024

Black Friday and Cyber Monday SEO have transformed significantly in 2024 due to advancements in generative AI, Google’s updates, and AI-driven SEO software. If your SEO team is executing strategies that are not aligned with generative AI, you should consider pivoting and bolstering your AI SEO initiatives. Competition during these two events is fierce, and it won’t be easy to stand out with average funnel optimization and subpar customer journeys. Your potential customers are more demanding than ever, so your mindset should shift too.

Why am I saying this? 

The significance of SEO tactics during Black Friday and Cyber Monday (BFCM) in 2024 for advanced e-commerce is multifaceted and pivotal for the success of any online business aiming to capitalize on these peak shopping periods. As competition reaches its zenith, I know that having a robust SEO strategy helps e-commerce sites stand out in search engine results, thus drawing in more organic traffic and potential sales. Black Friday and Cyber Monday represent some of the year’s highest revenue potential, making it critical for businesses to secure top rankings in search results where shoppers are most active. Moreover, optimizing SEO is not just about visibility; it also encompasses enhancing the user experience and tailoring content and promotions to specific market segments.

If I were you, I would want to be a business that engages more effectively with its target audience, leading to improved customer interactions and sales outcomes. Technical preparedness and content scaling are crucial aspects of this setup: they involve ensuring that websites can manage the surge in Black Friday and Cyber Monday traffic and adhere to SEO best practices that maintain visibility and functionality.
It’s easier said than done, believe me. But I promise to do my best to describe practical ways to prepare for the Black Friday and Cyber Monday period. I hope you’ll enjoy them and bear with me, my dear reader.

Black Friday SEO In The Era Of Generative AI

Generative AI, such as ChatGPT, Google Bard, Bing Chat, Adobe Firefly, Perplexity AI, Midjourney, and so on, holds immense potential to disrupt various industries, including marketing and creative work. When it comes to Black Friday SEO, generative AI can play a pivotal role in content generation, automation, personalization, and efficiency. 

Even more, there have been some substantial changes in how merchant data is interacting with the Google shopping experience in the past year. We can observe that there are some tectonic shifts for merchants to consider.

Bing suggesting new products using AI

Bing is using AI to suggest products. According to their official statement, “Bing’s goal is to bring more joy to shopping—from the initial spark of inspiration to the exciting unboxing experience—by making the process easier and giving you confidence you’re getting the right item at the right price.”

What can you do to stay ahead of the curve? Our vision and recommended approach is to embrace semantic, ontology-based prompting, where structured data is fed into a large language model (LLM) and used to validate the output. WordLift is proud to be one of the companies pioneering ontology-based prompting, helping you construct data-informed prompts that use your data programmatically.
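Here is a minimal sketch of that idea, under stated assumptions: the entity fields, model name, and validation rule are illustrative, not WordLift’s implementation. The same structured record both fills the prompt and checks the generated copy.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

product = {  # e.g., pulled from your product knowledge graph
    "name": "ColdBrew Master",
    "offers": {"price": "199.00", "priceCurrency": "USD"},
    "material": "stainless steel",
}

prompt = (
    f"Write a two-sentence sale description for {product['name']} "
    f"({product['material']}), currently {product['offers']['price']} "
    f"{product['offers']['priceCurrency']}. Mention the price exactly once."
)
text = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": prompt}],
).choices[0].message.content

# Validation step: the same structured data checks the output.
assert product["offers"]["price"] in text, "generated copy dropped the price"
print(text)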

How Marketers Can Prepare Their Black Friday SEO Plans

WordLift has been at the forefront of genAI x SEO innovation, and we’ve perfected the AI snapshot for our clients before. I saw the mistakes, the grit, and the innovative efforts where we poured our hearts into crafting stunning customer experiences that wowed users.

In this area of Black Friday, Cyber Monday, and holiday sales in general, we have direct experience with the following:

  1. Merchant Feed + Structured Data are interconnected. Data that is out of sync will stop your ads campaign. Merchant metadata is richer at the moment (Google is trying to make the two converge, but there are still differences, such as the sale price). Instead of waiting, we can establish your data in your product knowledge graph (PKG) and run technical SEO optimizations on top of it.
  2. We can re-generate product descriptions or Product Listing Page (PLP) intro text at scale, using data about sales (or campaigns) to boost sales, by feeding a large language model (LLM) with the following (see the sketch after this list):
    • the existing description
    • the sales price of the items we want to promote
    • a few examples of product descriptions or PLP intro text that would work effectively for SEO (or creative copy taken from the campaign).
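Here is a minimal sketch of that prompt assembly; the product data, sale price, few-shot examples, and model name are placeholders, not output from our pipeline.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

existing_description = "The ColdBrew Master brews smooth cold coffee in 4 hours."
sale_price = "$149 (25% off for Black Friday)"
style_examples = [
    "FrenchPress Classic, timeless design, now $29: the no-fuss gift for coffee purists.",
    "SingleServe Wonder, $49 this week only: one button, zero cleanup.",
]

prompt = (
    "Rewrite this product description for a Black Friday campaign.\n"
    f"Existing description: {existing_description}\n"
    f"Sale price to feature: {sale_price}\n"
    "Style examples:\n" + "\n".join(f"- {e}" for e in style_examples)
)

new_description = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": prompt}],
).choices[0].message.content
print(new_description)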

To sum up, the quality of your data and data curation workflows is crucial for this to perform in the best possible way.

There is nothing more powerful than utilizing what you have on your side in the first place. Do you want to learn how you can bring your business to the next level? Book a demo.

Here are some other ways that SEO marketers can use to prepare effectively for Black Friday in the era of generative AI:

  1. Streamline Content Creation: Generative AI empowers marketers to streamline content creation for marketing purposes. By leveraging AI models, marketers can effortlessly generate text and content that aligns with their brand’s style and tone. This automation saves valuable time and resources by handling the generation of product descriptions, promotional emails, blog posts, and landing page content specifically tailored for Black Friday & Cyber Monday.
  2. Personalization and Targeting: Generative AI enables marketers to analyze consumer behavior patterns and preferences, leading to the creation of personalized content. Leveraging AI and ML, marketers can deliver targeted marketing campaigns, tailored offers, and recommendations that resonate with individual customers. This personalized approach significantly boosts customer engagement and conversion rates during the Black Friday & Cyber Monday shopping season.
  3. SEO Optimization: Black Friday is an intensely competitive period for online retailers. Generative AI can aid marketers in optimizing their SEO strategies by generating relevant and keyword-rich content. AI models analyze search trends, identify popular keywords, and produce optimized meta descriptions, titles, and product descriptions, ultimately enhancing organic search visibility during Black Friday.
  4. Efficient Marketing Automation: Generative AI facilitates the automation of repetitive marketing tasks, freeing up marketers to focus on strategic initiatives. AI-powered chatbots can handle customer service inquiries, provide real-time support, and offer personalized recommendations based on user preferences. This automation elevates the customer experience and allows marketers to dedicate more time to other critical aspects of Black Friday campaigns.
  5. Embrace AI and Automation for Fulfillment: Black Friday triggers a surge in online orders, demanding efficient fulfillment from retailers. AI-driven tools, such as robotics and automated software, optimize warehouse operations, streamline order processing, and ensure round-the-clock fulfillment centers. By embracing AI and automation, retailers can steer clear of stockouts, enhance customer satisfaction, and capitalize on the growing trend of online shopping during Black Friday & Cyber Monday.

To effectively prepare for Black Friday in the era of generative AI, marketers must prioritize the adoption of AI technologies and leverage the benefits they offer. Staying updated with the latest advancements is crucial, as it allows marketers to explore how these technologies can enhance content creation, personalization, automation, and SEO optimization. By embracing generative AI and seamlessly integrating it into their Black Friday marketing strategies, marketers can deliver more efficient and impactful campaigns, gain a competitive edge, and achieve better results in the dynamic landscape of Black Friday & Cyber Monday sales.

The key to maximizing the potential of generative AI lies in integrating it with an intelligent content framework. This framework should incorporate carefully selected schema markups to facilitate content comprehension and harness the advantages of organic rankings.

SEO Changed With Generative AI, Google’s Updates And Economy Shifts

Try searching for the keyphrase “use generative AI for SEO” and you’ll see approximately 12 million results (the figure comes from a screenshot of German SERPs for that keyphrase). If you thought that this is just another trend like everything else you’ve seen in the past years, I’d advise you to think again. Take my advice: I speak from experience.

Don’t believe me? From what I’ve observed in 2024 so far, the landscape of SEO has been reshaped significantly due to the advent of generative AI, Google’s latest updates, and the economic climate. Generative AI has revolutionized content generation, allowing for the crafting of engaging and highly targeted content more efficiently. This advancement aids in enhancing the visibility of websites on search engines by ensuring content is not only relevant but also highly personalized to meet user needs.
Last year, I wrote “Generative AI, such as ChatGPT, Google Bard, Bing Chat, Adobe Firefly, Perplexity AI, Midjourney, and so on, holds immense potential to disrupt various industries, including marketing and creative work. When it comes to Black Friday SEO, generative AI can play a pivotal role in content generation, automation, personalization, and efficiency. Even more, there have been some substantial changes in how merchant data is interacting with the Google shopping experience in the past year. We can observe that there are two tectonic shifts for merchants to consider.” I was right. The changes did arrive, as did the studies estimating the reduction in organic traffic once the Search Generative Experience (SGE) is fully enabled.

Then, in March 2024, Google rolled out a major update focused on improving the quality of content surfaced in search results. This update has prioritized high-quality, helpful content while penalizing sites that offer little value in terms of originality or usefulness. The intent is clear: to refine search results so that they provide more value, pushing marketers and content creators to invest in substantial and useful content. There was no negotiating that one, and honestly, it was a hard ride.

Additionally, the economic conditions of 2024 have influenced SEO strategies. With budgets possibly tightened, there is a greater need for SEO efforts to be more strategic and effective. Companies are now more focused on optimizing their ROI, ensuring every dollar spent on SEO can be justified with tangible improvements in traffic and sales conversions. I know and have witnessed this first-hand, so I’m trying to pass my wisdom on to you, my dear reader, so that you don’t get burned by the same things I did.

What Twitter And Independent Research Have Taught Us

The end-of-the-year holiday season is probably the most interesting time for buyers worldwide: everyone is waiting for Black Friday & Cyber Monday to catch a good deal. Customer behavior has reshaped considerably, especially after Covid-19: businesses that were late to digital transformation and to understanding the importance of selling on the Internet started shifting their mindsets and preparing to build their business online.

Your success and profits depend, among other things, on how well you apply structured data best practices to support and aid the experience of your online buyers. Users are eager for better experiences and want to find what they need faster and at scale. Therefore, your number one priority should be to present your best deals and offers on your website in a way that is easily reachable and understandable, so that you stand out on the huge online battlefield. So, let’s go!

We ran a Twitter thread to ask the SEO community (special thanks go to Rich Tatum) about the least used structured data SEO markups for Black Friday. At the same time, we performed independent research in which we analyzed over 107 popular e-commerce stores (Black Friday pages only) across Switzerland, Germany, the UK, France, Spain, Italy, the Netherlands, Belgium, Sweden, Norway, Austria, the rest of Europe, and worldwide, ordered by popularity as measured by the profits they generate.

Some of the most notable e-commerce brands include Amazon, eBay, migros.ch, microspot.ch, digitec.ch, Mediamarkt & MediaWorld stores, bol.com, Decathlon, Tesco, Zalando, Otto, Carrefour, Next, Very, Argos, Wish, Asda, Asos, IKEA, Coop, H&M, Rewe, Lidl, Matas, Zara, Shein, idealo, Boulanger, GearBest, Privalia, Global Savings Group, Anibis.ch, Groupon, Alibaba, AliExpress, Flipkart, Walmart, and many more. Here’s what we learned through our automated scraping and summarizing process:

  1. Most of the businesses (over 54% of them) do not even have structured data, or they use only the basic schema markups for LocalBusiness, Website, and/or Organization;
  2. BreadcrumbList, Offer, and FAQPage are partially underutilized in the study;
  3. SaleEvent, CollectionPage, OpeningHoursSpecification, OfferCatalog, ImageObject and VideoObject, and discountCode were the most underused Black Friday schema markups.

Underutilized Black Friday & Cyber Monday Schema Markups – A List

Let us elaborate on the second and third groups independently, so that you can use their power for your e-commerce SEO:

  • BreadcrumbList -> good to use when you have a collection of pages that are interlinked. The rule here is to place your top pages first in your breadcrumb list, with deeper pages following from there, forming a list of ordered, sequential elements. ItemList can also be utilized here.
  • Offer -> This is reserved for both online and offline deals that you want to showcase on your website, like selling a ticket for an event, streaming occasions, and so on. It goes well in combination with paymentMethod, areaServed, aggregateRating, availabilityStarts, availabilityEnds, category, offeredBy, GTIN, and similar attributes that proved to be helpful in the process.
  • FAQPage -> Great deals come with many unanswered customer questions around the new prices in specific periods like Black Friday and Cyber Monday. That is why it is important to include the most prominent questions on your webpage by using the FAQPage schema markup. This structured data type goes well with the mainContentOfPage, speakable, abstract, about, author, and rating attributes when combined.
  • SaleEvent -> This one is definitely the most underused across the ecommerce industry. This schema markup is probably the most appropriate for temporary deals (the word event in its name makes that obvious). SaleEvent works perfectly with the audience, contributor, startDate, endDate, eventAttendanceMode, eventStatus (for postponing and rescheduling), location, offers, subEvent, and sameAs properties. Definitely worth checking out. It is ideal when you want to signal the commercial intent of a webpage, compared to the FAQPage schema markup, which is more informational. See the example after this list.
  • CollectionPage -> This is a more specific use of the WebPage schema markup. It can also be used with ItemList together with the mainEntity that references it. This way, it is clear that the webpage presents a collection of items. Example:

{
  "@context": "http://schema.org",
  "@type": "CollectionPage",
  "mainEntity": {
    "@type": "ItemList",
    "itemListElement": [
      { "@type": "ItemPage" },
      { "@type": "ItemPage" },
      { "@type": "ItemPage" }
    ]
  }
}

  • OpeningHoursSpecification -> It is always wise to update your opening hours during specific season periods like the black week so that you can properly inform your online visitors (potential customers) about the right time to reach out to you. Great when used in combination with dayOfWeek, opens, closes, validFrom, validThrough, description, and sameAs attributes.
  • OfferCatalog -> This is basically an ItemList, but it refers to a list of products offered by the same provider. It is very important to know the difference here. It works pretty well when used with the itemListOrder, alternateName, description, disambiguatingDescription (useful for differentiating between similar product items), name, sameAs, and identifier properties.
  • discountCode -> often confused with the Offer schema markup, which usually refers to a product, while this one can refer to a service. It is still not part of Google’s Search Gallery, so you cannot expect to obtain a rich snippet on the SERPs by using it; however, it is still a good choice when providing discounts or working with a coupons website.
  • ImageObject and VideoObject are quite similar, so we will cover them together. They are particularly useful when you want to provide more images of your products or have a video overview (e.g., for gaming products). The 3DModel schema markup can also prove interesting for advanced ecommerce brands that use the power of augmented reality to show off their products in a more interactive way to their audience during the black cyber season.
  • Language schema can also be very interesting for you -> especially if you’re struggling to get buy-in to implement hreflang (which, from my experience, requires a more complex technical setup). I’ve used it as a workaround in the past to fix issues when we had multiple stores with similar languages interfering with each other’s organic rankings. I can confirm first-hand that I managed to decrease irrelevant traffic coming to our core pages, so this is something you can consider too.
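As promised, here is a minimal SaleEvent sketch for a Black Friday promotion. All values are placeholders to adapt to your own offer, and the property set is just one reasonable combination of those listed above:

{
  "@context": "https://schema.org",
  "@type": "SaleEvent",
  "name": "Black Friday Sale",
  "startDate": "2024-11-29T00:00",
  "endDate": "2024-12-02T23:59",
  "eventAttendanceMode": "https://schema.org/OnlineEventAttendanceMode",
  "eventStatus": "https://schema.org/EventScheduled",
  "location": {
    "@type": "VirtualLocation",
    "url": "https://www.example.com/black-friday"
  },
  "offers": {
    "@type": "Offer",
    "price": "149.00",
    "priceCurrency": "USD",
    "availabilityStarts": "2024-11-29T00:00",
    "availabilityEnds": "2024-12-02T23:59"
  }
}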

Selling More And Going Beyond Prompting And Schema Markups

Customers’ expectations can go beyond prompting and simple schema markup fixing. To help you position yourself as competitively and intelligently as possible, we developed our Business + E-commerce Plan, which combines WordLift with the Product Knowledge Graph Builder. By using both, you can easily import data from your Merchant Feed and enrich it with structured data, streamlining your schema markup creation and building a basis for developing new customer experiences on top of that knowledge base.

FAQs

How can SEO help maximize Black Friday sales?

Search Engine Optimization (SEO) can play a critical role in maximizing Black Friday sales by driving targeted traffic to a company’s website and increasing visibility for key products and promotions. By optimizing website content, product pages, and landing pages for relevant keywords, companies can increase their chances of ranking higher in search engine results pages (SERPs) and attracting more potential customers to their sites. Additionally, SEO can help improve the user experience by ensuring that the website is easy to navigate and mobile-friendly, which can lead to higher conversion rates and increased revenue during the Black Friday sales event.

What are the top SEO techniques for Black Friday promotions?

The top SEO techniques for Black Friday promotions include:

  1. Keyword research and optimization: Identify relevant keywords and phrases related to Black Friday deals and promotions, and optimize your website content, product pages, and meta tags to rank higher in search engine results.
  2. On-page optimization: Ensure that your website is optimized for Black Friday, with clear navigation, fast loading times, and mobile responsiveness.
  3. Content creation and promotion: Create valuable and relevant content related to Black Friday promotions, and promote it through social media, email marketing, and other channels.
  4. Structured data markup: Use structured data markup, such as schema.org, to provide search engines with additional information about your Black Friday deals and promotions.
  5. Backlinks: Acquire high-quality backlinks from reputable websites to improve your website’s authority and relevance.
  6. Social media promotion: Use social media platforms to promote your Black Friday deals and promotions, and engage with customers to build brand awareness.
  7. Local SEO: Optimize your website for local search, including creating a Google My Business listing and obtaining customer reviews.
  8. Influencer marketing: Partner with influencers in your industry to promote your Black Friday deals and reach new audiences.
  9. A/B testing: Test different variations of your website and marketing campaigns to identify the most effective strategies for driving traffic and sales.
  10. Analytics and tracking: Track your website traffic, conversion rates, and other key metrics to measure the effectiveness of your SEO efforts and make data-driven decisions.

Are there any specific SEO tips for optimizing Black Friday landing pages?

Here are some SEO tips for optimizing Black Friday landing pages:

  1. Target relevant keywords throughout your landing page content.
  2. Craft compelling meta titles and descriptions.
  3. Optimize page speed for a better user experience.
  4. Ensure mobile optimization for smartphone users.
  5. Include a clear and persuasive call-to-action (CTA).
  6. Create high-quality and engaging content.
  7. Utilize internal and external linking strategies.
  8. Integrate social sharing buttons for increased visibility.
  9. Monitor and analyze landing page performance using web analytics tools.

What are the common SEO mistakes to avoid for Black Friday campaigns?

Here are some common SEO mistakes to avoid for Black Friday campaigns:

  1. Neglecting keyword research and targeting the wrong audience.
  2. Ignoring page speed optimization, leading to high bounce rates.
  3. Overlooking mobile optimization, impacting rankings and user experience.
  4. Having poorly written or thin content that lacks quality and relevance.
  5. Missing or poorly optimized meta tags, reducing click-through opportunities.
  6. Neglecting internal and external linking strategies.
  7. Failing to incorporate social sharing buttons for increased visibility.
  8. Inadequate monitoring and analysis of key metrics for optimization.

How To Get Your Products Into Google’s Shopping Graph

According to McKinsey, e-commerce saw 10 years’ worth of growth in 3 months during COVID-19.

This has led Google to enter the online shopping industry head-on, going toe-to-toe with the giant Amazon. To challenge Amazon, Google presents itself as a cheaper, less restrictive option for independent sellers. And it is focusing on driving traffic to sellers’ sites, not selling its own version of products, as Amazon does. To increase its ability to sell products, Google has launched a series of updates, also aimed at helping small businesses gain visibility for their products.

We’re advancing our plans to make it free for merchants to sell on Google. Beginning next week, search results on the Google Shopping tab will consist primarily of free listings, helping merchants better connect with consumers, regardless of whether they advertise on Google. 

With hundreds of millions of shopping searches on Google each day, we know that many retailers have the items people need in stock and ready to ship, but are less discoverable online. Bill Ready – President, Commerce (April 2020)

This is a great innovation. The new approach marks the end of Google’s pay-to-play model. Previously, selling products on Google required so much money and effort that small businesses could not appear there. With free listings, Google allows everyone to sell their products on Google properties.

Product data on Google

Typically, you’d only share product data for products you wanted to promote on Google services. This changed in 2020 with the introduction of several free offerings, and these continue to evolve. Here’s what you need to know for 2024:

  1. Free listings on the Google Shopping tab: It’s still free to participate in the Google Shopping tab, allowing you to showcase your products to a wider audience of potential buyers.
  2. Enhanced organic search results: Product data from Google Merchant Center is still used to enrich organic search results. This helps Google present your products more accurately to users searching for relevant terms. However, Google’s understanding of product information from your website is constantly improving.

Why is accurate product data still important?

Even with Google’s advancements in extracting information from web pages, providing accurate product data through Merchant Center offers several advantages:

  • Increased control and accuracy: You have more control over the information displayed about your products, ensuring it’s up-to-date and aligns with your marketing strategy.
  • Eligibility for richer search results: Complete and accurate product data can improve your chances of appearing in richer search results with features like product ratings, pricing, and availability.
  • Stand out from the competition: Detailed product information can give you an edge over competitors with less informative listings.

In conclusion, while Google’s ability to understand product information from your website is improving, providing accurate product data through Google Merchant Center remains a valuable strategy for maximizing the visibility and appeal of your products in search results.

What is Google Shopping Graph?

“Building on the Knowledge Graph, the Shopping Graph brings together information from websites, pricing, reviews, videos and, most importantly, product data we receive directly from brands and retailers,” said Bill Ready. The AI-enhanced model works in real-time and shows users relevant listings as they shop on Google.

Similar to Google’s Knowledge Graph, the Shopping Graph links information about entities and influences what may appear in search results. You can imagine the Knowledge Graph as a giant encyclopedia that connects information about everything under the sun. The Shopping Graph, on the other hand, is like a massive product catalog with detailed entries for each item.

The Google Shopping Graph is growing. Bill Ready said, “We’ve seen a 70% increase in the size of our product catalog and an 80% increase in merchants on our platform.”

The data in the Google Shopping Graph is drawn from highly diverse sources. Among them:

  • YouTube videos
  • Manufacturer Websites
  • Online shops and product detail pages (PDPs)
  • Google Merchant Center
  • Product tests
  • Product reviews

How Do You Get Your Products Into Google’s Shopping Graph?

To increase sales for your e-commerce website and/or local retail shops, your goal is to help Google understand the products that you’re selling. We can do this by providing accurate product data using both push and pull strategies:

  1. Push: we feed product data using Google Merchant Center.
  2. Pull: we let Google crawl the structured data on our webpages.

Google is using this data to present products to end-users with special formatting on Google Search, Google Image Search, Google Maps, and now Google Lens. Google’s aim is to guide consumers throughout their journey in making informed purchasing decisions. In the context of Google Search, product data is also aggregated into product knowledge panels that are designed to provide shoppers with a bird’s-eye overview of all the characteristics, how-tos, pricing details, and reviews.

Google stores all of this data in its Shopping Graph made of 2.4 billion products and constantly evolves how data is used on its various surfaces to be more relevant for the end-user. 

Learn more about how to get product data into the Google Shopping Graph by watching the video.

Data harmonization

Data synchronization is essential. A lack of it can create confusion and risk your products not ranking well on Google for one or more specific queries. 

So you must verify where your product information is coming from and make sure you’re using the Merchant Feed as the source for your structured data. From there, we enrich it with information that Google will appreciate.

On some occasions, you’ll have to make strategic choices about updating your data, deciding to put some information only on your website or only in the Merchant Feed. An example of this is the “in stock” value. As with other values that can change often, it is more beneficial to keep the information only on one side to avoid discrepancies or differences in updates that can negatively affect listing on Google. 

Data reconciliation 

A unique product identifier (UPI) is a number or code that uniquely identifies a product. A UPI helps us identify a specific type of product, or a color variation of that product, worldwide.

As of 2018, UPIs became mandatory to list products in the Google Merchant Feed. But the real change happened in 2021, when unique product identifiers became mandatory for free listings on Google.

Google said that “different products using the same GTIN with the same variant attributes will be considered ambiguous and will be disapproved.” Additionally, Google said that “if a group of products is identified as duplicates, only one will remain active and eligible to show in free listings.”

By adding structured data to your product pages, you can be sure to have UPIs for each product you want to sell.
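For illustration, here is a minimal Product markup sketch with a GTIN as the unique product identifier; all names and values are placeholders to replace with your real catalog data:

{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Coffee Machine",
  "gtin13": "0001234567890",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "199.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}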

Optimizing Your Google Merchant Feed for Enhanced E-commerce SEO

In our approach to improving SEO for ecommerce, we place significant emphasis on optimizing and improving the Google Merchant Feed, a crucial component for companies aiming to thrive in the digital marketplace and gain access to Google’s free listings through the Shopping Graph.

Our strategy involves meticulously adding structured data to your product listings, ensuring that each item is accurately represented and easily discoverable, packed with all the information needed to appear in Google’s Shopping Graph.

By integrating with the Google Merchant Center, we streamline the feed management process, making it more efficient and effective. Our method not only facilitates the creation of a comprehensive Product Knowledge Graph, but also ensures that your products stand out in search results.

This integration enables a continuous flow of accurate and detailed product information, directly influencing how your products are presented and perceived in the vast online shopping landscape. Through our customized solutions, we aim to elevate your online presence, making your products more visible and attractive to potential customers.

Do you want to get your products into Google Shopping Graph? Start building your Product Knowledge Graph today, talk with our SEO expert.

How Google Uses Structured Data And Google Merchant Center Data

The following are examples of how Google uses structured data embedded in web pages and Google Merchant Center data for different experiences. Note that experiences may vary by country, device, and other factors.

A critical aspect of growing your online business is getting discovered in the search, and Google can help shoppers with this during their buying journey. 

For this, Google provides you with these best practices that can help you. When you share data about your products, Google can more easily find and analyze your content, giving you the ability to appear in Google Search and other Google spaces. This allows your online shop to have excellent visibility and be found more easily by users searching for products or services like your company’s. 

Do you want to get your products into Google Shopping Graph? Talk with our SEO experts.

Google Duplex is a new technology developed by Google to automate certain tasks, such as booking a table at a restaurant, booking a flight, or speeding up your customers’ shopping experience. To learn how to use Google Duplex for e-commerce, check out our latest web story.

Learn more about product structured data and how to build a multilingual product knowledge graph for your e-commerce website by reading our article.


Post Cheetah

In the dynamic world of online visibility, mastering Search Engine Optimization (SEO) is key to standing out amidst the digital noise. Meet Post Cheetah, your ultimate SEO companion designed to propel your website to the top of search engine results pages (SERPs).

With its innovative features and user-friendly interface, Post Cheetah empowers individuals and businesses alike to unlock the full potential of their online presence through strategic SEO tactics. Say goodbye to guesswork and hello to optimized success with Post Cheetah by your side.

Meeting DSPy: From Prompting to Programming Language Models

Are you exhausted from constant prompt engineering? Do you find the process fragile and tiresome? Are you involved in creating workflows using language models? If so, you might be interested in DSPy. This blog post provides a gentle introduction to its core concepts.

While building robust neuro-symbolic AI workflows, we’ll explore the synergy between LMs and graph knowledge bases within digital marketing and SEO tasks.

Table of contents:

  1. What is DSPy?
  2. Let’s Build Our First Agents
  3. Automated Optimization Using DSPy Compiler
  4. Creating a Learning Agent
  5. Implementing Multi-Hop Search with DSPy and WordLift
  6. Conclusion and Future Work

What is DSPy?

DSPy is an acronym that stands for Declarative Self-Improving Language Programs. It is a framework developed by the Stanford NLP team that aims to shift the focus from using LMs with orchestrating frameworks like LangChain, Llama Index, or Semantic Kernel to programming with foundational models. This approach addresses the need for structured and programming-first prompting that can improve itself over time.

A Real Machine Learning Workflow🤩

For those with experience working with PyTorch and machine learning in general, DSPy is an excellent tool. It is designed around the concept of constructing neural networks. Let me explain: when starting a machine learning project, you typically begin by working on datasets, defining the model, running the training, configuring the evaluation, and testing.

DSPy simplifies this process by providing general-purpose modules like ChainOfThought and ReAct, which you can use instead of complex prompting structures. Most importantly, DSPy brings general optimizers (BootstrapFewShotWithRandomSearch or BayesianSignatureOptimizer), which are algorithms that will automatically update the parameters in your AI program.

You can recompile your program whenever you change your code, data, assertions, or metrics, and DSPy will generate new effective prompts that fit your modifications.

DSPy’s design philosophy is the opposite of the idea that “prompting is the future of programming”: you build modules to express the logic behind your task and let the framework deal with the language model.

Core Concepts Behind DSPy💡

Let’s review the fundamental components of the framework:

  • Signatures: Here is how you abstract both prompting and fine-tuning. Imagine signatures as the core directive of your program (e.g., read a question and answer it, do sentiment analysis, optimize title tags). This is where you define the inputs and outputs of your program; it’s the contract between you and the LM. Question answering is represented, for example, as question -> answer, sentiment analysis as sentence -> sentiment, title optimization as title, keyword -> optimized title, and so on.
  • Modules: They are the building blocks of your program. Here is where you define how things shall be done (e.g., use Chain of Thought, or act as an SEO specialist, etc.). Using parameters here, you can encode your prompting strategies without messing around with prompt engineering directly. DSPy comes with pre-defined modules (dspy.Predict, dspy.ChainOfThought, dspy.ProgramOfThought, dspy.ReAct, and so on). Modules use the Signature as an indication of what needs to be done.
  • Optimizers: To improve the accuracy of the LM, a DSPy Optimizer will automatically update the prompts or even the LM’s weights to improve on a given metric. To use an optimizer, you will need the following (see the sketch after this list):
    • a DSPy program (like a simple dspy.Predict or a RAG),
    • a metric to assess the quality of the output,
    • a training dataset to identify valuable examples (even small batches like 10 or 20 would work here).
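A minimal sketch wiring those three ingredients together; the program, metric, and tiny trainset are illustrative, and an LM is assumed to be configured via dspy.settings:

import dspy
from dspy.teleprompt import BootstrapFewShot

qa = dspy.Predict("question -> answer")  # 1. a DSPy program

def exact_match(example, prediction, trace=None):  # 2. a metric
    return example.answer.lower() in prediction.answer.lower()

trainset = [  # 3. a few training examples (10-20 are usually enough)
    dspy.Example(question="What does STS stand for?",
                 answer="strategic text sequence").with_inputs("question"),
]

optimizer = BootstrapFewShot(metric=exact_match)
compiled_qa = optimizer.compile(qa, trainset=trainset)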

Let’s Build Our First Agents 🤖

Without further ado, let’s dive into the implementation with a few examples, following a simple Colab Notebook.

We will:

  • Run a few basic examples for zero-shot prompting, entity extraction, and summarization. These are simple NLP tasks that we can run using LMs. We will execute them using DSPy to grasp its intrinsic logic.
  • After familiarizing ourselves with these concepts, we will implement our first Retrieval-Augmented Generation (RAG) pipeline. Retrieval-augmented generation (RAG) allows LMs to tap into a large corpus of knowledge from sources such as a Knowledge Graph (KG). The RAG will query the KG behind this blog to find the relevant passages/content that can produce a well-refined response. Here, we will construct a DSPy retriever using WordLift’s Vector Search. It is our first time using this new functionality from our platform 😍.  
  • We will then:
    a. Compile a program using the RAG previously created.
    b. Create a training dataset by extracting question-answer pairs from our KG (we will extract a set of schema:faqPages).
    c. Configure a DSPy Optimizer to improve our program.
    d. Evaluate the results.

A Simple Workflow For Content Summarization🗜️ 

Let’s create a signature to elaborate a summary from a long text. This can be handy for generating the meta description of blog posts or other similar tasks. We will simply instruct DSPy to use long_context -> tldr using the Chain Of Thought methodology.
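Here is what that looks like, as a minimal sketch (assuming an LM has already been configured via dspy.settings):

import dspy

# The signature is the whole "prompt": inputs -> outputs.
summarize = dspy.ChainOfThought("long_context -> tldr")
result = summarize(long_context="<paste a long article here>")
print(result.tldr)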

It is worth noticing that we didn’t have to write any prompts!

WordLift DSPy Retriever🔎

The next step is to use WordLift’s Knowledge Graph and its new semantic search capabilities. DSPy supports various retrieval modules out of the box, such as ColBERTv2, AzureCognitiveSearch, Pinecone, Weaviate, and now also WordLift 😎. 

Here is how we’re creating the WordLiftRetriever, which, given a query, will provide the most relevant passages.
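Below is a bare-bones sketch of such a retriever. A caveat: the endpoint URL, request payload, and response shape here are assumptions for illustration, not WordLift’s documented API; only the dspy.Retrieve subclassing pattern is standard.

import requests
import dspy

class WordLiftRetriever(dspy.Retrieve):
    def __init__(self, api_key: str, k: int = 3):
        super().__init__(k=k)
        self.api_key = api_key

    def forward(self, query_or_queries, k=None):
        # Hypothetical endpoint: substitute the real WordLift Vector Search API.
        response = requests.post(
            "https://api.wordlift.io/vector-search/queries",
            headers={"Authorization": f"Key {self.api_key}"},
            json={"query": query_or_queries, "similarity_top_k": k or self.k},
        )
        passages = [match["text"] for match in response.json()["matches"]]
        return dspy.Prediction(passages=passages)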

Once we have a retriever, building a RAG using DSPy is quite straightforward. We begin by setting up both the language model and the new retriever with the following line:

dspy.settings.configure(lm=turbo, rm=wl_retriever)

The RAG comprises a signature made of a context (obtained from WordLift’s KG), a question, and an answer.
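Following the standard DSPy RAG pattern, a minimal version of that module looks like the sketch below; the field descriptions are ours and may differ slightly from the Colab.

import dspy

class GenerateAnswer(dspy.Signature):
    """Answer questions using passages retrieved from the knowledge graph."""
    context = dspy.InputField(desc="relevant passages from WordLift's KG")
    question = dspy.InputField()
    answer = dspy.OutputField(desc="a short, grounded answer")

class RAG(dspy.Module):
    def __init__(self, num_passages: int = 3):
        super().__init__()
        self.retrieve = dspy.Retrieve(k=num_passages)
        self.generate_answer = dspy.ChainOfThought(GenerateAnswer)

    def forward(self, question):
        context = self.retrieve(question).passages
        prediction = self.generate_answer(context=context, question=question)
        return dspy.Prediction(context=context, answer=prediction.answer)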

Automated Optimization Using DSPy Compiler

Using the DSPy compiler, we can now optimize the performance or efficiency of an NLP pipeline by simulating different program versions and bootstrapping examples to construct effective few-shot prompts.

I like this about DSPy: not only are we moving away from chaining tasks, but we’re using programming and can rely on the framework to automate the prompt optimization process. 

The “optimizer” component, previously known as the teleprompter, refines a program’s modules by optimizing their prompts or fine-tuning the underlying weights. This is the real magic of using the DSPy framework. As we feed more data into our Knowledge Graph, the AI agent we create using DSPy evolves to align its generation with the gold standard we have established.

Let’s Create a DSPy Program

DSPy programs like the one built in the Colab help us with tasks like question answering, information extraction, or content optimization.

As with traditional machine learning, the general workflow comprises these steps:

  • Get data. To train your program, you will need some training data. To do this, you should provide examples of the inputs and outputs that your program will use. For instance, collecting FAQs from your blog will give you a relevant set of question-answer pairs. Using at least 10 samples is recommended, but remember that the more data you have, the better your program will perform.
  • Write your program. Define your program’s modules (i.e., sub-tasks) and how they should interact to solve your task. We are using primarily a RAG with a Chain Of Thought. Imagine using control flows if/then statements and effectively using the data in our knowledge base and external APIs to accomplish more sophisticated tasks. 
  • Define some validation logic. What makes for a good run of your program? Maybe the answers we have already marked up as FAQs? Maybe the best descriptions for our products? Specify the logic that will validate that.
  • Compile! Ask DSPy to compile your program using your data. The compiler will use our data and validation logic to optimize the program (e.g., prompts and modules).
  • Iterate. Repeat the process by improving your data, program, and validation, or by using more advanced DSPy compiler features. (A minimal end-to-end sketch follows this list.)
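Here is a hedged end-to-end sketch of this workflow, reusing the RAG module from the earlier sketch; the FAQ pairs stand in for data extracted from the KG’s schema:faqPage markup, and the overlap metric is illustrative:

import dspy
from dspy.evaluate import Evaluate
from dspy.teleprompt import BootstrapFewShot

# 1. Get data: question-answer pairs, e.g., extracted from FAQ markup.
faq_pairs = [("What is a knowledge graph?",
              "A graph-structured knowledge base of entities and relations.")]
trainset = [
    dspy.Example(question=q, answer=a).with_inputs("question")
    for q, a in faq_pairs
]

# 2. Define some validation logic.
def answer_overlap(example, prediction, trace=None):
    return example.answer.lower() in prediction.answer.lower()

# 3. Compile: the optimizer bootstraps few-shot prompts from our data.
compiled_rag = BootstrapFewShot(metric=answer_overlap).compile(
    RAG(), trainset=trainset
)

# 4. Evaluate (use a held-out devset in practice, not the trainset).
evaluate = Evaluate(devset=trainset, metric=answer_overlap, display_progress=True)
evaluate(compiled_rag)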

Creating a Learning Agent

By combining DSPy with curated data in a graph, we can create LM-based applications that are modular, easy to maintain, self-optimizing, and robust to changes in the underlying models and datasets. The synergies between semantic data and a declarative framework like DSPy enable a new paradigm of LLM programming, where high-level reasoning strategies (i.e., optimize the product description by reading all the latest reviews) can be automatically discovered, optimized, and integrated into efficient and interpretable pipelines. 

DSPy is brilliant in that it creates a new paradigm for AI agent development. Using the DSPy compiler, we can ground the generation in the information we store in our knowledge graph and have a system that is self-optimizing and easier to understand.

DSPy’s teleprompter and compiler pipeline helps us create a modular, extensible, self-optimizing RAG system that adapts by leveraging human-annotated question-answer pairs on our website!

When dealing with complex queries that combine multiple information needs, we can implement a sophisticated retrieval mechanism, Multi-Hop Search (“Baleen” – Khattab et al., 2021), to help us find different parts of the same query in different documents. 

Using DSPy, we can recreate such a system that will read the retrieved results and generate additional queries to gather further information when necessary. 

We can do it with only a few lines of code.

Let’s review this bare-bones implementation. The __init__ method defines a few key sub-modules:

  • generate_query: We use the Chain of Thought predictor within the GenerateSearchQuery signature for each turn. 
  • retrieve: This module uses WordLift Vector Search to do the actual search using the generated queries. 
  • generate_answer: This dspy.Predict module is used after all the search steps; it uses a GenerateAnswer signature to produce the final answer.

The forward method uses these sub-modules in a simple control flow. First, we loop up to self.max_hops times. During each iteration, we generate a search query using the predictor at self.generate_query[hop] and retrieve the top-k passages using that query. We add the (deduplicated) passages to our accumulator of context. After the loop, we use self.generate_answer to produce the final answer and return a prediction with the retrieved context and the predicted answer. Here is a sketch of the whole module:
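This sketch follows the “simplified Baleen” pattern from the DSPy documentation; signature docstrings and the dedup helper are ours, and an LM plus retriever are assumed to be configured via dspy.settings:

import dspy

class GenerateSearchQuery(dspy.Signature):
    """Write a simple search query that helps answer a complex question."""
    context = dspy.InputField(desc="passages gathered so far")
    question = dspy.InputField()
    query = dspy.OutputField()

class GenerateAnswer(dspy.Signature):
    """Answer the question using all of the gathered context."""
    context = dspy.InputField()
    question = dspy.InputField()
    answer = dspy.OutputField()

class SimplifiedBaleen(dspy.Module):
    def __init__(self, passages_per_hop=3, max_hops=2):
        super().__init__()
        self.generate_query = [
            dspy.ChainOfThought(GenerateSearchQuery) for _ in range(max_hops)
        ]
        self.retrieve = dspy.Retrieve(k=passages_per_hop)
        self.generate_answer = dspy.ChainOfThought(GenerateAnswer)
        self.max_hops = max_hops

    def forward(self, question):
        context = []
        for hop in range(self.max_hops):
            query = self.generate_query[hop](
                context=context, question=question
            ).query
            passages = self.retrieve(query).passages
            # Accumulate passages, deduplicating while preserving order.
            context = list(dict.fromkeys(context + passages))
        pred = self.generate_answer(context=context, question=question)
        return dspy.Prediction(context=context, answer=pred.answer)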

Quite interestingly, we can inspect the last calls to the LLM with a simple command: turbo.inspect_history(n=3). This is a practical way to examine the extensive optimization work done automatically with these very few lines of code. 

Conclusion and Future Work

As new language models emerge with advanced abilities, there is a trend to move away from fine-tuning and towards more sophisticated prompting techniques.

The combination of symbolic reasoning enabled by function calling and semantic data requires a robust AI development and validation strategy. 

While still at its earliest stage, DSPy represents a breakthrough in orchestration frameworks. It improves language programs with more refined semantic data, clean definitions, and a programming-first approach that best suits our neuro-symbolic thinking.

Diving deeper into DSPy will help us improve our tooling and Agent WordLift’s skills in providing more accurate responses. Evaluation in LLM applications remains a strategic goal, and DSPy brings the right approach to solving the problem. 

Imagine the potential advancements in generating product descriptions as we continuously enrich the knowledge graph (KG) with additional training data. Integrating data from Google Search Console will allow us to pinpoint and leverage the most effective samples to improve a DSPy program. 

Beyond SEO and digital marketing, creating self-optimizing AI systems raises ethical questions about the use of these technologies. As we develop increasingly powerful and autonomous AI agents and workflows, it is vital that we do so responsibly and in a way that is fully aligned with human values.

Are you evaluating the integration of Generative AI within your organization to enhance marketing efforts? I am eager to hear your plans. Drop me a line with your thoughts.
