How Googlebot Crawls JavaScript [2025]

Ever wondered how Google turns your lovingly handcrafted website into a ranking somewhere below a Reddit thread from 2013? It’s not magic, it’s just a long queue of tiny robot librarians fetching HTML, executing JavaScript, and occasionally having nervous breakdowns when they hit your React app.

This is the life cycle of a webpage inside Google’s digestive system: crawl, render, index, panic. Let’s go step by step before your sitemap starts crying.

1. Crawling: getting the raw HTML

1.1 URL discovery & crawl queue

Googlebot first has to discover your URLs. That can happen via:

  • Links from other pages
  • XML sitemaps
  • Manual submit / “Inspect URL → Request indexing” in Search Console
  • Other Google systems (e.g. feeds, previous crawls)

Discovered URLs go into a crawl queue with priority based on things like page importance and your site’s crawl budget.

1.2 robots.txt and basic checks

Before requesting the URL, Googlebot:

  1. Fetches robots.txt
  2. Checks if the URL (and key resources like JS/CSS) are allowed
  3. Applies host load limits and crawl budget rules

If the page or important JS/CSS files are blocked in robots.txt, Google:

  • Won’t crawl them
  • Won’t be able to fully render your JS content later

Practical implication: Never block /js/, /static/, /assets/, etc. in robots.txt.
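As a quick sanity check, here’s a minimal Python sketch (standard library only) that asks your live robots.txt whether Googlebot may fetch a few key asset URLs. The site and paths are placeholders – swap in your own:

```python
from urllib.robotparser import RobotFileParser

# Placeholder site and asset paths - replace with your own.
SITE = "https://www.example.com"
ASSETS = ["/js/bundle.js", "/static/main.css", "/assets/app.js"]

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for path in ASSETS:
    allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
    status = "allowed" if allowed else "BLOCKED (Googlebot can't use this when rendering)"
    print(f"{path}: {status}")
```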

1.3 Fetching the HTML (“first wave”)

Googlebot makes a normal HTTP request (like a browser without UI):

  • Gets the initial HTML (without having run JS yet)
  • Parses head tags (title, meta description, canonical, meta robots, hreflang, etc.)
  • Extracts links from the HTML and adds them to the crawl queue
  • Notes references to resources (JS, CSS, images)

At this stage, only what’s in the raw HTML is visible. If your content is 100% client-side rendered (React, Vue, etc.), Google might see almost nothing yet.

Google can sometimes do basic indexing directly from the HTML (e.g. if content is already there), but JS-heavy pages need the next phase.
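A crude way to check what this first wave can see is to fetch the page without running any JavaScript and look for your key content in the response. A minimal sketch, assuming the requests library and a placeholder URL/phrase:

```python
import requests

URL = "https://www.example.com/some-page"   # placeholder
KEY_PHRASE = "buy football goals"           # text you expect the page to rank for

# Plain HTTP fetch: no JavaScript runs, much like Googlebot's first pass.
raw_html = requests.get(URL, timeout=10).text

if KEY_PHRASE.lower() in raw_html.lower():
    print("Key content is already in the raw HTML - indexable before rendering.")
else:
    print("Key content is NOT in the raw HTML - you're relying on the render queue.")
```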


2. Rendering: running your JavaScript

Google describes the JS pipeline as: Crawling → Rendering → Indexing. Rendering happens in a separate system, the Web Rendering Service (WRS), which uses an evergreen version of Chromium (a headless Chrome kept roughly in line with the current stable release).

2.1 The render queue (“second wave”)

After the initial crawl:

  1. Google adds the page to a render queue.
  2. When resources allow, that queue feeds URLs into the rendering system.
  3. Until rendering happens, Google only “knows” what was in the raw HTML.

This is why people talk about “two waves of indexing” for JavaScript:

  • Wave 1: Index from HTML (if possible)
  • Wave 2: Index updated content after JS has run

Modern research suggests the process is smoother and faster than it was a few years ago, but there is still a render queue and a potential delay for JS content.

2.2 How Google’s renderer behaves

When a page reaches the renderer:

  1. Google loads it in an evergreen Chromium environment (no UI).
  2. It fetches JS, CSS, and other resources (subject to robots.txt, CORS, etc.).
  3. It executes JavaScript for a limited amount of time (shorter than a user session).
  4. JS can:
    • Modify the DOM
    • Inject content
    • Fetch JSON/XHR/data and add it to the page
    • Add structured data (application/ld+json) dynamically

Important constraints (from Google’s docs & tests):

  • No user interactions: Google doesn’t click, type, or scroll like a user.
  • Time limits: Long chains of async calls may never complete before the renderer stops.
  • Resource limits: Heavily blocking scripts or endless network calls can break rendering.
  • “Noindex = no render” effect: If a page is noindex, Google generally won’t bother rendering it.

2.3 Post-render snapshot

Once JS finishes (or time runs out), Google:

  1. Takes the final DOM snapshot (what a user would see after JS).
  2. Extracts:
    • Visible text content
    • Links added by JS (e.g. SPA navigation)
    • Structured data present in the DOM
    • Meta tags if they are changed or added by JS

This rendered snapshot is what feeds into the real indexing stage.
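You can approximate the raw-HTML vs rendered-DOM gap yourself with a headless browser. This is only a rough stand-in for Google’s Web Rendering Service (it won’t reproduce Google’s exact time or resource limits); the sketch assumes requests and Playwright are installed (pip install playwright, then playwright install chromium), and the URL is a placeholder:

```python
import requests
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/"  # placeholder

# 1. Raw HTML: what exists before any JavaScript runs.
raw_html = requests.get(URL, timeout=10).text

# 2. Rendered DOM: what exists after headless Chromium has executed the page's JS.
with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print(f"Raw HTML length:     {len(raw_html):>8}")
print(f"Rendered DOM length: {len(rendered_html):>8}")
# A large gap usually means most of your content only exists after JS runs.
```

If the phrases you want to rank for only show up in rendered_html, you’re depending entirely on the render queue.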


3. Indexing: storing & scoring the rendered content

With the rendered HTML/DOM in hand, Google moves to indexing.

3.1 Understanding the content

From the rendered DOM, Google:

  • Tokenizes and stores text (words, phrases, headings).
  • Maps entities, topics, and relationships.
  • Reads links (anchor text + target URL) added by your JS navigation.
  • Parses structured data (schema.org, etc.) that JS may have injected.

This is the version of the page that can now rank for queries matching that content.

3.2 Canonicals, duplicates & signals

Indexing also handles:

  • Canonical selection (HTML tags, redirects, link signals).
  • Duplicate / near-duplicate detection, especially if JS rewrites similar pages.
  • Applying meta robots and HTTP headers from the final state after JS (for most cases).

If Google decides another URL is the canonical, your rendered JS content might be stored but not shown as the main result.

3.3 Final result: searchable document

After indexing, the document is:

  • Stored in Google’s index with:
    • Content (from rendered DOM)
    • Links
    • Structured data
    • Various quality & relevance signals
  • Ready to be retrieved and ranked when a user searches for related queries.

4. Where JavaScript sites usually break this flow

Because JS adds extra moving parts, a bunch of things can go wrong between crawl → render → index:

  1. Blocked JS/CSS in robots.txt
    Google can’t render layout or content if the files are disallowed.
  2. Content only after interaction
    If key text appears only after a click/scroll or in a modal that never opens, Google won’t see it.
  3. Too slow or broken rendering
    Heavy JS, long waterfalls, or failing XHR calls mean the DOM never contains the content when Google takes the snapshot.
  4. Infinite scroll / SPA routing without proper URLs
    If content is loaded endlessly on one URL without crawlable links or pagination (e.g. no ?page=2, no anchor links), Googlebot may only see the initial “page” (see the markup sketch after this list).
  5. Client-side only structured data that doesn’t materialise in time
    If JS injects JSON-LD but too slowly (or fails), rich results won’t trigger.
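To make point 4 concrete, here’s the difference between pagination Googlebot can follow and pagination it can’t (the URL and function name are placeholders):

```html
<!-- Crawlable: page 2 has its own real URL that Googlebot can queue and index -->
<a href="/football-goals?page=2">Next page</a>

<!-- Not crawlable: content only appears after a JS click, with no URL to discover -->
<button onclick="loadMoreProducts()">Load more</button>
```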

5. How to see what Google sees (JS-specific)

To understand how your JS is being processed:

  • Use URL Inspection → “View crawled page” & “Screenshot” in Search Console to see the rendered DOM.
  • Compare “HTML” vs “Rendered HTML” to spot what content only appears post-JS.
  • Use “Test live URL” if you suspect render-queue delay.
  • Check Coverage / Pages report for “Crawled – currently not indexed” patterns that often show render/index issues.

So there you have it — from lazy bots fetching half your HTML to a headless Chrome pretending to be a real user for 0.3 seconds. Somewhere in that chaos, your content might actually get indexed.

If your JavaScript site isn’t showing up, don’t blame Google just yet — try unblocking your own files and giving the crawler half a chance. Think of it as SEO mindfulness: eliminate obstacles, breathe deeply, and let the bots eat your content in peace.


Explained in simpler terms – How Googlebot Crawls JavaScript

Stage 1 – Discovery & Crawling: “Finding your page and grabbing the first copy”

1. Google finds your URL

Google finds pages from things you already know:

  • Links on other sites
  • Your internal links
  • Your XML sitemap
  • Stuff you submit in Search Console

It puts those URLs in a big to-do list (crawl queue).


2. robots.txt check

Before visiting a URL, Google checks your robots.txt file:

  • If the page or important files (JS/CSS) are blocked, Google is basically told: “Don’t look here.”
  • If they’re allowed, it moves on.

Simple rule for you:
Never block your JS/CSS folders in robots.txt.


3. Google downloads the HTML (Wave 1)

Google now requests the page, just like a browser:

  • It gets the basic HTML (before any JavaScript runs).
  • From that HTML it grabs:
    • Title, meta description, canonical, meta robots, etc.
    • Any plain text that’s already there
    • Links to other pages
    • Links to JS/CSS/images

At this point, Google has not run your JavaScript yet.

If your important content is already in the HTML (e.g. server-side rendered), Google can often index it right away from this “first wave”.


Stage 2 – Rendering: “Actually running your JavaScript”

Now Google needs to know what your page looks like after JS runs – like a real user would see it.

Because this is heavy work, Google doesn’t do it instantly for every URL.

4. Render queue (waiting line)

After the first crawl, JavaScript pages go into a render queue:

  • Think of it like: “We’ve saved the HTML. When we have time, we’ll come back and run the JS.”

So for a while, Google might only know the bare HTML version of your page.


5. Headless Chrome renders the page

When your page reaches the front of the queue, Google loads it in something like Chrome without a screen (headless browser).

This browser:

  • Downloads JS/CSS (if not blocked)
  • Executes the JS for a short amount of time
  • Lets JS:
    • Change the DOM (the page structure)
    • Insert more text
    • Add more links
    • Inject structured data (JSON-LD)

Then it takes a snapshot of the final page – the “after JS” version.

This is basically:

“What a user would see if they opened your page and waited a bit.”


6. Things that can go wrong here

This is where JS sites often break:

  • Blocked JS/CSS → Google can’t see the layout or content properly.
  • Very slow JS → content appears after Google stops waiting.
  • Content only after a click/scroll → Google doesn’t usually click buttons or scroll like a human.
  • Broken scripts / errors → content never appears at all.

Result: Google’s snapshot may miss your main content.


Stage 3 – Indexing: “Filing your page in the library”

Now Google has:

  • Version 1: HTML-only (first wave)
  • Version 2: Rendered DOM (after JS runs)

7. Understanding the rendered page

From the rendered snapshot Google:

  • Reads all the visible text
  • Sees headings and structure
  • Follows any extra links added by JS
  • Reads structured data (schema)
  • Applies canonical tags / meta robots, etc.

This updated information is used to update your page in the index (second wave).


8. Search results

When someone searches:

  1. Google looks through its index of pages (which contains that rendered version).
  2. It decides which pages are most relevant.
  3. It shows your URL in the results.
  4. When the user clicks it, they go to your live site, not to Google’s stored copy.

Quick “JS SEO” checklist for you

If you remember nothing else, remember these:

  1. Can I see my main content in the raw HTML?
    • If yes → you’re mostly safe (e.g. SSR or hybrid).
    • If no → you’re relying heavily on rendering; be extra careful.
  2. Are JS & CSS allowed in robots.txt?
    • They should be.
  3. Does important content require a click/scroll?
    • Try to have key text and links visible without interaction, or use proper URLs for loaded content.
  4. Is the page reasonably fast?
    • If it takes ages for content to appear, Google may bail before seeing it.
  5. Use Search Console’s URL Inspection → “View crawled page”
    • Compare:
      • HTML Google saw
      • Rendered HTML
    • If you don’t see your text in the rendered version → that’s a problem.

Scraping Reviews with Screaming Frog? [2025]

You can scrape reviews if they are:

– In schema markup
– Inside a <p> tag with their own specific class

e.g. class="review-text text-base font-secondary"

Can you scrape the reviews then?

Yes, you can scrape the reviews if they are formatted in schema markup (like Review, AggregateRating, etc.) or if they belong to a specific class. Screaming Frog’s Custom Extraction feature will allow you to do this.

✅ How to Scrape Reviews in Schema or Specific HTML Classes Using Screaming Frog

1️⃣ Enable Structured Data Extraction (for Schema Reviews)

If the reviews are in schema markup (JSON-LD, Microdata, or RDFa), Screaming Frog can directly extract them:

✔ Open Screaming Frog SEO Spider.

✔ Navigate to Configuration > Spider > Enable Structured Data.

✔ Start a crawl, and Screaming Frog will pull JSON-LD, Microdata, and RDFa formats.

✔ Once the crawl is finished, go to Reports > Structured Data > Review Schema to export the data.

2️⃣ Extract Reviews from a Specific Class (e.g., class="review-text text-base font-secondary")

If the reviews are located within a specific tag that has a class, you can use Custom Extraction to get them:

✔ Open Screaming Frog and go to Configuration > Custom Extraction.

✔ Select XPath or CSS Selector based on your data structure.

✔ Use one of these extraction methods:

📝 Option 1: Using CSS Selector (Ideal for Class-Based Extraction)

p.review-text.text-base.font-secondary

✔ This will retrieve all <p> tags that have the class review-text text-base font-secondary.

📝 Option 2: Using XPath (For More Complex Queries)

//p[contains(@class, 'review-text text-base font-secondary')]

✔ This will locate all <p> elements with a class attribute that includes “review-text text-base font-secondary”.

3️⃣ Run the Crawl and Export the Review Data

✔ Start the crawl and allow Screaming Frog to extract the reviews.

✔ After it’s done, go to Custom Extraction > Export Data.

✔ Open the resulting file in Excel or Google Sheets for further analysis.

🚀 Alternative Approach: Scraping JSON-LD Schema Reviews

If the reviews are included in JSON-LD schema markup, you can extract them with a Custom Extraction (XPath):

//script[@type='application/ld+json']/text()

✔ This extracts all JSON-LD data, which you can filter for reviews.
✔ Use JSON formatting tools (like jq or Python json.loads()) to clean and parse the data.
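Once you’ve exported that JSON-LD column, a short Python sketch like this can filter it down to just the review markup. The file name and column names are assumptions – adjust them to match your Screaming Frog export:

```python
import csv
import json

REVIEW_TYPES = {"Review", "AggregateRating"}

with open("custom_extraction_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        blob = row.get("JSON-LD 1", "")  # column name = whatever you called the extractor
        if not blob.strip():
            continue
        try:
            data = json.loads(blob)
        except json.JSONDecodeError:
            continue  # skip malformed or truncated JSON-LD
        items = data if isinstance(data, list) else [data]
        for item in items:
            if isinstance(item, dict) and item.get("@type") in REVIEW_TYPES:
                print(row.get("Address", ""), item)  # "Address" is Screaming Frog's URL column
```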

🔹 Summary: Can Screaming Frog Scrape Reviews?
✅ YES, if reviews are inside Schema Markup (JSON-LD, Microdata, RDFa).
✅ YES, if they have a specific class, using Custom Extraction.
✅ YES, if the reviews are static HTML, meaning they exist in the page source.
❌ NOT from the raw HTML if reviews are loaded dynamically via JavaScript – enable Screaming Frog’s JavaScript rendering mode, or use a headless browser such as Selenium instead.
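If you do go the headless-browser route, here’s a minimal Selenium sketch (assumes selenium 4+ and Chrome installed; the URL is a placeholder and the CSS class is the example from above):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")  # run Chrome without a visible window

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://www.example.com/product-with-reviews")  # placeholder URL
    # Grab every element matching the review text class; add an explicit wait
    # (WebDriverWait) if the reviews load slowly.
    reviews = driver.find_elements(By.CSS_SELECTOR, "p.review-text.text-base.font-secondary")
    for review in reviews:
        print(review.text)
finally:
    driver.quit()
```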

What’s the Point of Having a Business Blog? [2025]

Last Updated – A few days ago (probably)

Having a high-quality, high-traffic blog in the same niche can significantly enhance the organic rankings of an eCommerce site for commercial terms like “buy football goals.” Here’s how:

  1. Increased Topical Authority

Search engines such as Google prioritise specific knowledge and expertise. A blog that focuses on football-related topics showcases expertise and builds authority within the niche. This can enhance the credibility of the eCommerce site as a reliable source for football equipment.

If you have 30 excellent, well written, detailed posts about football, then for an array of reasons from topical authority to social shares and backlinks, the eCommerce ‘section’ of your site will tend to rank a lot higher for commercial terms.

Top tip – include your own research and data. People love to link back to statistics. A good example is the NOW Foods third-party lab testing of creatine gummies, which showed that half of them had hardly any creatine in them.

  2. Internal Linking Opportunities

A well-organized blog provides the chance to link to your product pages (e.g., “buy football goals”) using keyword-rich anchor text. This not only drives traffic to those pages but also indicates to search engines the relevance of your product pages for specific keywords.

  3. Improved Backlink Profile

Blogs tend to attract more backlinks than product pages because they offer valuable, non-commercial content. These backlinks to your blog can transfer authority to your eCommerce site if you effectively link the blog to your product pages.

  4. Keyword Coverage

Blogs enable you to target informational keywords like “how to set up a football goal” or “best football training drills,” which may not fit well on product pages. Once users visit these blog pages, you can direct them toward related products, creating a smooth transition from information to purchase.

  5. Increased Dwell Time

High-quality content keeps users engaged on your site for longer periods. This increased dwell time signals to search engines that your site offers valuable content, which can positively influence rankings across the board, including for commercial terms.

  6. Capture Users at Different Stages of the Sales Funnel

Blogs can attract users who are in the awareness or consideration stages of their buying journey. For instance:

A post titled “5 Things to Consider When Buying Football Goals” can inform users while subtly promoting your products.

If they choose to buy, they’re already on your site and more likely to make a purchase.

  • Use exit intent pop-ups to build an email list – incentivise sign ups with discounts
  • Have a sticky banner with a special offer
  • Make the brand stand out in images and banners

Brand Awareness!

Anybody can set up a website and sell stuff, and create some ads – but you won’t get direct visitors and people searching for your brand (a HUGE SEO ranking factor) if you haven’t built your brand.

Brand bias is also huge on products – take trainers, football boots and clothes for example. A top quality blog with good content can help build your brand awareness massively.

Establishes Authority & Expertise

Marketing bellends call this “thought leadership” and other contrived BS terms to make it sound impressive. But, at the end of the day, if you read a local PT’s blog about fitness and nutrition and it’s amazing, referencing meta-analyses and robust research, you’ll probably be inclined to contact him or her if you’re looking for a PT in the future. Especially if they display other E-A-T signals – Expertise, Authoritativeness and Trustworthiness – like a PhD in Exercise Physiology, 20 years’ experience as a Navy SEAL fitness instructor and 10 years as a Premier League physiotherapist, just to give you a realistic example.

Gives You an Idea of What Your Audience Wants More of

Use Search Console to see what your blog posts rank for. Take note of any quasi-relevant search terms.

For example, my MMA diet plan was ranking for “boxing diet plan”, so I created a new page for this second search term.

In addition to expanding your offerings in terms of content and products, see which posts are your most popular, and whether those posts can inspire more content or products. This is especially true if the posts relate to the pain points of your target audience.

eCommerce Product Page – SEO & UX Checklist

SEO Checklist:

  1. Product Title Optimization
    • Include the main keyword naturally.
    • Keep titles concise but descriptive.
  2. Meta Title and Meta Description
    • Write a compelling meta title (60-70 characters) with the main keyword.
    • Create a unique and engaging meta description (150-160 characters) with a call to action and relevant keywords.
  3. URL Structure
    • Use clean and descriptive URLs with relevant keywords.
    • Avoid unnecessary parameters and ensure URLs are short.
  4. Header Tags (H1, H2, H3, etc.)
    • Use one H1 tag for the product title.
    • Structure content with subheadings (H2, H3) for different sections.
  5. High-Quality Product Descriptions
    • Write unique, engaging, and detailed product descriptions (300+ words).
    • Naturally incorporate relevant keywords.
    • Highlight features, benefits, and use cases.
  6. Product Images and Alt Text
    • Use high-resolution, web-optimized images.
    • Add descriptive, keyword-rich alt text for images.
    • Include multiple angles or lifestyle images.
  7. Structured Data Markup (Schema)
    • Implement product schema for price, availability, reviews, etc.
    • Use JSON-LD for structured data (see the JSON-LD sketch after this checklist).
  8. Internal Linking
    • Link to related products or categories.
    • Include “frequently bought together” and “related products” sections.
  9. Mobile-Friendliness
    • Ensure the product page is fully responsive and optimized for mobile users.
    • Test on multiple devices.
  10. Page Load Speed
    • Optimize images and use compression.
    • Minimize CSS, JavaScript, and HTML.
    • Use a content delivery network (CDN).
  11. Canonical Tags
    • Implement canonical tags to prevent duplicate content issues.
  12. Keyword Usage
    • Use keywords naturally in the title, description, URL, headings, and body text.
    • Avoid keyword stuffing.
  13. User Reviews and Ratings
    • Display user-generated content like reviews and star ratings.
    • Use schema markup for review ratings.
  14. Social Sharing Buttons
    • Provide social sharing options for easy promotion.
  15. Breadcrumbs
    • Use breadcrumb navigation to enhance site structure and user experience.
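To make the structured data item concrete, here’s a minimal JSON-LD Product sketch with placeholder values; trim or extend the fields to match what’s actually shown on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Football Goal 12x6",
  "image": "https://www.example.com/images/goal-12x6.jpg",
  "description": "Placeholder product description.",
  "offers": {
    "@type": "Offer",
    "price": "149.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock",
    "url": "https://www.example.com/football-goals/12x6"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "132"
  }
}
</script>
```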
UX Checklist:

  1. Clear Call to Action (CTA)
    • Use prominent and persuasive “Add to Cart” or “Buy Now” buttons.
    • Ensure CTA buttons are highly visible.
  2. Product Images
    • Include multiple images and zoom functionality.
    • Offer a gallery of images showing different angles and context.
  3. Product Information
    • Display essential details (price, size, color, availability) prominently.
    • Offer accurate and up-to-date stock information.
  4. Shipping Information
    • Provide clear and transparent shipping details, including delivery times and costs.
  5. Product Variations
    • Offer a seamless selection process for colors, sizes, or other variations.
    • Use clear dropdown menus or buttons.
  6. Customer Reviews and Ratings
    • Allow customers to leave reviews.
    • Include user-generated content for social proof.
  7. Related Products
    • Display recommended, cross-sell, or upsell products.
    • Use “Customers Also Viewed” or “Frequently Bought Together” sections.
  8. Return Policy Information
    • Clearly outline the return policy on the product page.
  9. Trust Signals
    • Include security badges, accepted payment icons, and other trust symbols.
    • Show testimonials or guarantees if applicable.
  10. Easy Navigation
    • Ensure intuitive breadcrumb trails.
    • Use filters and sorting options for larger catalogs.
  11. User-Friendly Design
    • Use white space effectively to improve readability.
    • Ensure buttons and interactive elements are appropriately sized.
  12. FAQ Section
    • Provide answers to common questions about the product.
  13. Accessibility Compliance
    • Follow accessibility guidelines (e.g., alt text, ARIA labels).
    • Test for screen reader compatibility.
  14. Wishlist or Save for Later Feature
    • Offer options to save the product for future reference.
  15. Live Chat or Support Options
    • Provide real-time assistance through chat or customer support links.
  16. Easy Checkout Process
    • Minimize steps to purchase.
    • Offer guest checkout and multiple payment options.
  17. Stock Notifications
    • Include “Back in Stock” or “Low Stock” alerts.

SEO News & Tips – September 2024

Actionable Points:

  1. Impact Assessment:
    • Analyze Core Update Impacts: If your site was affected, investigate potential causes such as relevance adjustments, intent shifts, or quality issues.
    • Run Delta Reports: Isolate dropped queries and landing pages using delta reports to identify the reasons for traffic decline.*2
  2. Content Strategy:
    • Reassess Content Strategy: Move away from generic content aimed at search engines (SEO-first) and focus on user-first content that aligns with intent and provides real value.
    • Avoid Overloading Pages: Do not overload pages with excessive information. Make them clear and relevant to user intent.
  3. Quality Over Quantity:
    • Holistic Quality Evaluation: Improve overall site quality, including UX, ads, presentation, and sources. It’s not just about content but the experience as a whole.
    • Long-Term Quality Improvements: Use a “kitchen sink” approach—continuous, holistic improvements to demonstrate significant quality enhancement over time.
  4. Utilize User Engagement Data:
    • Measure User Engagement: Focus on engagement metrics like click-through rates (CTR), time on page, and interactions (Navboost signals) to improve rankings.
  5. Consider the Intent Shifts:
    • Study Intent Shifts: Understand changes in user intent. Ensure your content targets the appropriate search intent, whether transactional, informational, or navigational.
    • Use Natural Language Processing (NLP) Tools: Leverage NLP tools to analyze how well your content aligns with searcher intent.*1
  6. RankBrain and Vector Search:
    • Optimize for RankBrain: Vector search re-ranks results based on intent. Focus on improving content relevancy through quality rater feedback and actual user interactions.
  7. Monitor Industry-Specific Trends:
    • Industry Trends: Be aware of specific industry shifts (e.g., pet insurance, tech products) and adjust your SEO strategy accordingly. Specialized retailers and informational sites are gaining over commercial giants like Amazon.
  8. Consider Site Authority:
    • Balance Domain and Brand Authority: Tom Capper’s study suggests sites with higher brand authority (BA) performed better than those relying solely on domain authority (DA). Focus on building brand recognition alongside traditional SEO efforts.
  9. Core Update Response:
    • Future-Proofing: Continue producing high-quality, intent-driven content as Google aims to reward helpful content over SEO-first strategies. Sites that improve post-HCU or core updates will continue to see gains.
  10. Action for Content Creators:
  • Engage with the Community: Marie plans to collaborate with SB Pro members to develop a system for scoring intent alignment, potentially enlisting community support to study core update effects.

Marie also encourages community engagement, suggesting that producing authentic, user-focused content will be a key driver of success in the evolving search landscape.


*1. Use Natural Language Processing (NLP) Tools to Analyze Content Alignment with Searcher Intent

What is NLP?

Natural Language Processing (NLP) is a branch of artificial intelligence that focuses on the interaction between computers and humans through natural language. In the context of SEO and content creation, NLP tools help analyze and understand the nuances of language, enabling better alignment of your content with what users are searching for.

Why Align Content with Searcher Intent?

Searcher intent refers to the primary goal a user has when typing a query into a search engine. Understanding and aligning your content with this intent ensures that your pages meet the needs of your audience, leading to higher engagement, better rankings, and increased conversions.

How to Leverage NLP Tools:

  1. Identify Searcher Intent:
    • Types of Intent:
      • Informational: Users seeking information (e.g., “How to fix a leaky faucet”).
      • Navigational: Users looking to reach a specific website or page (e.g., “Facebook login”).
      • Transactional: Users intending to make a purchase or complete a transaction (e.g., “Buy iPhone 14”).
      • Commercial Investigation: Users researching products or services before making a decision (e.g., “Best DSLR cameras 2024”).
  2. Analyze Your Content:
    • Keyword Analysis:
      • Use NLP tools to analyze the keywords and phrases within your content to determine if they align with the identified searcher intent.
    • Content Structure:
      • Ensure your content is structured in a way that directly addresses the user’s query. For instance, use clear headings, bullet points, and concise paragraphs for better readability and relevance.
    • Semantic Analysis:
      • Assess the context and relationships between words to ensure your content comprehensively covers the topic.
  3. Optimize Content Based on Insights:
    • Content Gap Analysis:
      • Identify areas where your content may be lacking in addressing certain aspects of the searcher intent and enhance those sections.
    • Content Enrichment:
      • Add relevant information, multimedia elements, or interactive features to better satisfy user needs.
    • Language Refinement:
      • Adjust the tone, complexity, and style of your content to match the preferences and expectations of your target audience.

Recommended NLP Tools:

  1. Google Natural Language API:
    • Offers powerful text analysis capabilities, including sentiment analysis, entity recognition, and syntax analysis (a minimal usage sketch follows after this list).
    • Google Cloud Natural Language
  2. IBM Watson Natural Language Understanding:
    • Provides comprehensive text analysis features such as keyword extraction, concept tagging, and emotion analysis.
    • IBM Watson NLU
  3. Ahrefs Content Gap Tool:
    • While not a pure NLP tool, it helps identify content gaps by comparing your site with competitors, which can be complemented with NLP analysis.
    • Ahrefs Content Gap
  4. Surfer SEO:
    • Integrates NLP techniques to optimize content structure and keyword usage based on top-ranking pages.
    • Surfer SEO
  5. Clearscope:
    • Uses NLP to provide keyword suggestions and content optimization recommendations to improve relevancy.
    • Clearscope
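For the first tool on that list, here’s a minimal sketch of an entity-analysis call with the google-cloud-language Python client. It assumes the library is installed and Google Cloud credentials are configured; the sample text is made up:

```python
from google.cloud import language_v1

text = "Best football training drills for beginners, plus which football goals to buy."

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content=text,
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

# Entity analysis surfaces the entities/topics detected in the text and their
# salience, a rough proxy for what the content appears to be "about".
response = client.analyze_entities(document=document)

for entity in response.entities:
    entity_type = language_v1.Entity.Type(entity.type_).name
    print(f"{entity.name} ({entity_type}) - salience {entity.salience:.2f}")
```

If the most salient entities don’t match the intent you’re targeting, that’s a hint the content and the query are misaligned.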

Implementation Steps:

  1. Choose the Right NLP Tool:
    • Select a tool that fits your technical expertise and specific needs. For instance, Google Natural Language API is highly customizable but may require more technical setup compared to user-friendly platforms like Surfer SEO.
  2. Conduct a Content Audit:
    • Use the chosen NLP tool to analyze your existing content. Look for metrics like keyword density, semantic relevance, and sentiment alignment with user intent.
  3. Map Content to Search Intent:
    • Categorize your pages based on the primary search intent they fulfill. Ensure each page clearly aligns with its intended user goal.
  4. Optimize and Update Content:
    • Based on the analysis, make necessary adjustments to improve alignment. This could involve rewriting sections, adding new information, or restructuring content for better clarity and relevance.
  5. Monitor and Iterate:
    • After optimization, continuously monitor your content’s performance. Use analytics to assess improvements in rankings, engagement metrics, and conversions, and iterate as needed.

*2. Run Delta Reports: Isolate Dropped Queries and Landing Pages to Identify Reasons for Traffic Decline

What are Delta Reports?

Delta reports in SEO are comparative analyses that track changes in your website’s performance metrics over specific periods. By comparing data before and after a Google core update (or any significant change), you can identify which queries and landing pages experienced fluctuations in traffic.

Why Run Delta Reports?

Running Delta reports helps you pinpoint the exact areas of your website that are affected by algorithm changes. Understanding these changes allows you to take targeted actions to recover lost traffic and optimize your site for future updates.

How to Run Delta Reports:

  1. Gather Data:
    • Time Frames:
      • Pre-Update Period: Data from a period before the core update (e.g., two weeks before).
      • Post-Update Period: Data from a period after the core update has been fully rolled out (e.g., two weeks after).
    • Metrics to Collect:
      • Organic traffic
      • Impressions
      • Click-through rates (CTR)
      • Average position
  2. Use SEO Tools:
    • Google Search Console (GSC):
      • Provides comprehensive data on search performance, including queries and landing pages.
      • Google Search Console
    • Google Analytics:
      • Offers insights into overall traffic patterns and user behavior.
      • Google Analytics
    • Third-Party SEO Tools:
      • Tools like SEMrush, Ahrefs, and Moz can provide additional data and more sophisticated reporting features.
      • SEMrush
      • Ahrefs
      • Moz
  3. Generate Comparative Reports:
    • Google Search Console Steps:
      1. Access Performance Report:
        • Navigate to Performance > Search Results.
      2. Set Date Range:
        • Select a date range that covers both pre-update and post-update periods.
      3. Export Data:
        • Export the performance data for both periods to a spreadsheet for easier comparison.
      4. Calculate Delta:
        • Use formulas to calculate the percentage change in impressions, clicks, CTR, and average position for each query and landing page (see the pandas sketch after this list).
  4. Analyze the Data:
    • Identify Drops and Gains:
      • Highlight queries and pages that saw significant drops or gains in traffic.
    • Segment the Data:
      • Break down the data by categories such as content type, topic, or user intent to identify patterns.
    • Look for Correlations:
      • Determine if there are common factors among the affected queries/pages, such as content quality, relevance, or technical issues.
  5. Identify Potential Causes:
    • Relevancy Issues:
      • Content may no longer align with the updated search intent or relevancy criteria.
    • Technical Problems:
      • Issues like slow page speed, mobile usability problems, or broken links can impact rankings.
    • Content Quality:
      • Thin content, outdated information, or lack of depth can lead to declines.
    • Competitor Improvements:
      • Competitors may have enhanced their content, earning higher rankings.
    • User Experience (UX):
      • Poor UX, intrusive ads, or difficult navigation can negatively affect rankings.
  6. Develop an Action Plan:
    • Content Optimization:
      • Revise and enhance content to better match searcher intent and improve quality.
    • Technical SEO Fixes:
      • Address any identified technical issues to improve site performance.
    • Improve UX:
      • Enhance the overall user experience to keep visitors engaged and reduce bounce rates.
    • Monitor Competitors:
      • Analyze top-performing competitors to understand what changes they made that may have contributed to their gains.
    • Continuous Monitoring:
      • Regularly update and monitor Delta reports to track the effectiveness of your optimizations and stay ahead of future updates.
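As a concrete version of the “calculate delta” step above, here’s a small pandas sketch that compares two GSC query exports. The file names are placeholders, and the column headers may differ slightly depending on how you export:

```python
import numpy as np
import pandas as pd

# Two "Queries" exports from Performance > Search Results:
# one for the pre-update window, one for the post-update window.
pre = pd.read_csv("queries_pre_update.csv")
post = pd.read_csv("queries_post_update.csv")

merged = pre.merge(post, on="Top queries", suffixes=("_pre", "_post"), how="outer").fillna(0)

# Absolute and percentage change in clicks per query.
merged["clicks_delta"] = merged["Clicks_post"] - merged["Clicks_pre"]
merged["clicks_delta_pct"] = (
    merged["clicks_delta"] / merged["Clicks_pre"].replace(0, np.nan) * 100
)

# Biggest losers first: these are the queries to investigate.
report = merged.sort_values("clicks_delta")
print(report[["Top queries", "Clicks_pre", "Clicks_post", "clicks_delta", "clicks_delta_pct"]].head(25))

merged.to_csv("delta_report.csv", index=False)
```

The same pattern works for the landing-pages export – just merge on the page URL column instead.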

Tools and Techniques for Delta Reporting:

  1. Google Search Console (GSC):
    • Export and Compare:
      • Export performance data before and after the update and use spreadsheet tools to calculate differences.
    • Filtering Options:
      • Use filters to narrow down specific queries or pages for detailed analysis.
  2. Google Analytics:
    • Custom Reports:
      • Create custom reports to compare traffic sources, user behavior, and conversion metrics across different periods.
    • Annotations:
      • Use annotations to mark the date of the core update, helping correlate changes in data with the update rollout.
  3. SEO Platforms:
    • SEMrush’s Position Tracking:
      • Track keyword rankings over time and compare pre- and post-update positions.
    • Ahrefs’ Site Explorer:
      • Analyze backlink profiles and organic search traffic changes.
    • Moz’s Rank Tracker:
      • Monitor keyword performance and identify ranking fluctuations.
  4. Spreadsheet Analysis:
    • Pivot Tables:
      • Use pivot tables to summarize and analyze large datasets effectively.
    • Visualization:
      • Create charts and graphs to visualize traffic trends and identify significant changes.

Best Practices for Delta Reporting:

  • Consistent Time Frames:
    • Ensure the comparison periods are equivalent in length and cover similar days of the week to account for any variability.
  • Account for Seasonality:
    • Be aware of seasonal trends that might affect traffic independently of the core update.
  • Focus on Significant Changes:
    • Prioritize queries and pages with substantial traffic drops or gains to address the most impactful issues first.
  • Iterative Analysis:
    • Continuously refine your analysis by incorporating additional data points or adjusting your reporting criteria as needed.

Putting It All Together: Enhancing Your SEO Strategy

By integrating NLP-based content analysis and Delta reporting, you can create a robust SEO strategy that not only aligns your content with user intent but also allows you to proactively identify and address issues arising from algorithm updates. Here’s a step-by-step approach to combining these strategies:

  1. Initial Content Audit:
    • Use NLP tools to assess the alignment of your existing content with searcher intent.
    • Identify content areas that need enhancement or restructuring.
  2. Post-Update Performance Analysis:
    • After a core update, run Delta reports to identify affected queries and pages.
    • Cross-reference these findings with your NLP analysis to understand if misalignment with search intent contributed to the declines.
  3. Optimization and Implementation:
    • Optimize affected content based on insights from both NLP analysis and Delta reports.
    • Enhance content quality, adjust for better intent alignment, and fix any technical issues identified.
  4. Continuous Monitoring and Improvement:
    • Regularly use NLP tools to keep your content aligned with evolving search intents.
    • Continuously monitor your site’s performance with Delta reports to stay ahead of any negative trends.
  5. Community Engagement and Learning:
    • Engage with SEO communities (like the Search Bar Pro community mentioned earlier) to share insights and learn from others’ experiences.
    • Stay updated with the latest SEO trends and algorithm changes to adapt your strategy accordingly.

Additional Resources

  • Google’s Search Quality Evaluator Guidelines
  • Moz’s Beginner’s Guide to SEO
  • Ahrefs’ SEO Learning Center:
    • Offers tutorials and articles on various SEO topics, including content optimization and performance analysis.
    • Ahrefs Learning Center
  • Surfer SEO’s Guides:
    • Detailed guides on using their tools for content optimization and SEO analysis.
    • Surfer SEO Guides

Implementing these strategies effectively can lead to improved search rankings, better user engagement, and sustained organic growth. By continuously aligning your content with user intent and proactively monitoring your site’s performance, you can navigate algorithm updates with greater confidence and resilience.


Helpful Core Update – March 2024

Core Update and Helpful Content System:

  • As the name suggests, Google’s March 2024 core update aims to prioritise useful information over content that seems optimized for search engines. In theory, spammy shite should be dropped in the rankings. In theory. Hopefully the twats at hydragun who keep robbing my MMA blog’s content will get a spanking.
  • The helpful content system is revamped and integrated into the core ranking algorithm to better identify genuinely helpful content and reduce unoriginal content.
  • The guys in the know expect the rollout to take a month, with significant ranking fluctuations during this period.
  • Google aims to reduce low-quality, unoriginal content in search results by 40%. Not 100%, updates begin at 40.

Actionable To-Do List:

  1. Review Content Quality: Assess your website’s content for originality and usefulness to real users, not just for SEO.
  2. Write from experience, and in the first person. Unless you’re a big brand, in which case do what you like.
  3. Monitor Rankings: Prepare for fluctuations in search rankings as the update rolls out. Use this time to identify areas for improvement.
  4. Update Content Strategy: Focus on creating high-quality, original content that addresses your audience’s needs and questions. Use forums and social media groups to identify your customers’ pain points and common questions. Your CS and sales teams can also provide insights. You can also get ChatGPT to group keyword queries into themes.
  5. Avoid Spam Tactics: Steer clear of expired domain abuse, scaled content abuse, and site reputation abuse to avoid penalties.
  6. Build your brand. Branded searches and search volumes make a massive difference, in my experience (see what I did there?).

It generally helps, in theory, to write from experience rather than just giving an overview that anyone could scrape and rewrite from the internet. Include your own images, videos etc.

I don’t have any images of Google updates, so here’s a pic of my dog:


Big brands will still probably have a massive advantage regardless of what they do.

Spam Updates:

  • Significant changes to spam handling will start on May 5, 2024, affecting sites through algorithmic adjustments or manual actions.
  • New spam policies target expired domain abuse, scaled content abuse (especially AI-generated content), and site reputation abuse.

General Recommendations:

  • Recognize the shift towards rewarding authoritative and genuinely helpful content.
  • Anticipate a more significant impact from updates targeting spam and low-quality content.
  • Understand that recovery from these updates may require fundamental changes beyond SEO, focusing on building a reputable and sought-after brand for quality content.
