If the links don’t end in .html, first concatenate the URLs to add * at the end of them all.
If they do end in .HTML, then find and replace .HTML with .html*
Next, use Text to Columns with * as the delimiter
Removing Code from the Start of the URLs
If the links are relative:
Use Text to Columns again and use / as the delimiter
If the links contain the full domain name, then concatenate * at the start of the domain name and use Text to Columns again to remove the code from the start of the URLs
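If you’d rather do the concatenating with formulas, something like this works (a sketch – it assumes your URLs are in column A, starting at A2):
=A2&"*"
=SUBSTITUTE(A2,".HTML",".html*")
="*"&A2
The first appends the *, the second handles the .HTML case, and the third puts a * in front of the domain, ready for the Text to Columns split.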
CSS Grid allows you to lay out items in 2 dimensions. It is similar to Flexbox, but in 2D instead of 1D (apparently).
You have a grid container, like in Flexbox. It is like a square perimeter around all the items or elements.
Grid has lines or tracks between the elements.
Getting Started with CSS Grid
You’ll want a code editor thingy, like Sublime Text, Visual Studio or, if you’re old school, Notepad.
The easiest way to do this is to use CodePen, but if you have Visual Studio, that’s ideal as you can test your code in real time.
Unless you’re using inline styles, you’ll want to create an HTML doc and a separate CSS sheet.
Getting started
If you want to follow along, you’ll want to add the <html> tags to declare you’ve created a webpage with HTML, and add a <head>, a page title in <title> </title> tags and a <body> to add your content.
You should then save the HTML doc as something like “index.html” – just make sure it has .html at the end of the name.
In a separate doc/sheet, create your style sheet and save it as “styles.css”
You can then link/connect your HTML doc to your CSS sheet in the <head> of the HTML doc, with a line like the one below:
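A minimal version – the href just needs to match whatever you named your stylesheet:
<link rel="stylesheet" href="styles.css">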
The video should start at the grid-template-areas section:
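In case the video doesn’t load, grid-template-areas lets you name regions of the grid and assign elements to them – a minimal sketch (the class names are just examples):
.grid-container {
  display: grid;
  grid-template-columns: 1fr 3fr;
  grid-template-areas:
    "header header"
    "sidebar main";
}
.site-header { grid-area: header; }
.sidebar { grid-area: sidebar; }
.main-content { grid-area: main; }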
Grid Alignment
Works similarly to Flexbox when it comes to aligning items.
justify-content: center;
justify-content: center; will place all the items in the middle of the page
Centred Grid
To place the content to the left:
justify-content: start;
justify-content: space-evenly;
space-evenly will spread the grid items evenly across the page:
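Putting those together, a minimal sketch of the container CSS (three fixed-width columns are assumed, so there’s spare space to distribute):
.grid-container {
  display: grid;
  grid-template-columns: 100px 100px 100px; /* fixed tracks leave spare space to distribute */
  justify-content: space-evenly; /* or center, start, etc. */
}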
*Everyone tends to use divs, but according to the W3C (the internet police) you should be using divs as a last resort. More info here
Instead of <div class="classname">, you could, for example, use <article class="classname"> or <section class="classname">
***Remember to add container-specific styles to the .grid-container in the CSS sheet, and text-specific styles, such as text-align, in the .grid-item styles section***
E.g. put this in the .grid-item styles, not the .grid-container:
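Something like this (a sketch – it assumes you’ve called your items .grid-item):
.grid-item {
  text-align: center; /* text styles belong on the items, not the container */
}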
Let’s say you have made a grid with 6 items, but you want the first one to be different to the rest. Probably the easiest way to achieve this is to add a “.grid-item-1” styles section below the CSS you’ve created for the rest of the grid items.
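A sketch of what that could look like – the orange background is just an example of a style that differs. Because .grid-item and .grid-item-1 have equal specificity, the later rule wins, which is why it goes below the others:
.grid-item-1 {
  background-color: orange; /* whatever should be different about item 1 */
}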
If you use the W3Schools “Try it Yourself” tool, remember to click “Run” at the top left, as the code doesn’t update in real time.
CSS Grid Equal Height Rows
If you add grid-auto-rows: 1fr; to the .grid-container section of the CSS (assuming you’ve named the class of your container .grid-container), then all the rows should be equal height. Or you could just set the heights to 100px each.
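As a sketch, that first option looks like this (assuming the container class name above):
.grid-container {
  display: grid;
  grid-auto-rows: 1fr; /* every auto-created row gets an equal share of the height */
}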
I needed a way of combining a load of commerce product identifier numbers into one cell, separated by commas.
You can download the spreadsheet with the formula here.
TEXTJOIN Formula Example
=TEXTJOIN(",",TRUE,F6:F35)
The comma in speech marks adds the comma between the numbers. The “TRUE” is TEXTJOIN’s ignore_empty argument – it tells it to skip any blank cells. The F6:F35 is just the cells that the original list, aligned vertically in this case, was in.
Semantic search adds context and meaning to search results. For example, if someone is searching for “Lego” – do they want to buy Lego toys, or see a Lego movie or TV show (Ninjago is great). Another example might be “Tesla” – do people want to see the latest self-driving car, or learn more about Tesla the scientist and inventor?
How to Optimise for Semantic Search
Make sure you understand search intent and any ambiguous searches like Tesla (inventor or car?), Jaguar (car or animal?), etc.
Look for structured data opportunities – see the sketch after this list
Optimise internal links – especially if you are using a “Pillar Post” and “Cluster Page” structure
Follow traditional on-page SEO best practices with headers, meta titles, alt tags etc.
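On the structured data point, the sort of thing to look for is pages that could carry JSON-LD markup. A minimal sketch of Article schema (all the values are placeholders):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Example Author" }
}
</script>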
Tools for Semantic Search
SMA Marketing have done a cool YouTube video about Semantic Search and they recommend tools including:
Wordlift
Frase
Advanced Custom Fields for WordPress
Google Colab with SpaCy
Before you publish a post – look at the search results for the keyword(s) you are optimising the post for. Check in incognito in Chrome to remove most of the personalisation of the results.
For any answer boxes or snippets, you can click the “3 dots” to get information about the results:
As well as the snippets, you can click the 3 dots next to any organic result. Here’s another result for “MMA training program pdf” with some additional information:
With this in mind – if you are looking to rank for “MMA training program pdf” then you will want to include the search terms highlighted in the “About this result” box: mma, training, program, pdf and ideally LSI keywords “workout” and “plan”.
It’s also a good idea to scroll down to the bottom of the SERP and check out the “related searches”
Take a look too at any breadcrumb results that pull through below the organic listings. Combining all this information will give you a good idea as to what Google understands by your search query and what people are looking for too.
Hover over [1] and click the play icon that appears (highlighted yellow in screenshot below)
When that section has finished loading and refreshing, scroll down to the “Installation tensorflow + transformers + pipelines” section and click the play icon there.
When that’s finished doing its thing, scroll down again, and add your search query to the uQuery_1: section:
add your query and then press the “play” button on the left hand side opposite the uQuery_1 line
You should then see the top 10 organic results from Google on the left hand side – in the form of a list of URLs
Next, you can scrape all the results by scrolling down to the “Scraping results with Trafilatura” section and hover over the “[ ]” and press play again:
Next, when the scraping of results is done – scroll down to “Analyze terms from the corpus of results” section and click the play button that appears when you hover over “[ ]”
Next! When that’s done, click the play button on the section full of code starting with:
df_1['top_result'] = ['Top 3' if x <= 3 else 'Positions 4 - 10' for x in df_1['position']] # add top_result = True when position <=3
Finally – scroll down and click the play button on the left of the “Visualizing the Top Results” section.
On the right hand side where it says “Top Top 3” and lists a load of keywords/terms – these are frequent and meaningful (apparently) terms used in the top 3 results for your search term.
Below that, you can see the terms used in the results from 4-10
Terms at the top of the graph are used frequently in the top 3 results e.g. “Mini bands”
Terms on the right are used frequently by the results in positions 4-10
From the graph above, I can see that for the search term “resistance bands” the top 3 results are using some terms not used by positions 4-10 – including “Mini bands”, “superbands”, “pick bodylastics”
If you click on a term/keyword in the graph – a ton of information appears just below:
e.g. if I click “mini bands”
It’s interesting that “mini bands” is not featured at all in the results positioned 4-10
If you were currently ranking in position 7 for example, you’d probably want to look at adding “mini bands” into your post or product page
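If you’re curious what the notebook is roughly doing to build that comparison, it’s something like this (a sketch with made-up data – not the notebook’s exact code, although the top_result line is quoted above):

import pandas as pd

# assumed shape: one row per ranking URL, with its SERP position and scraped text
df_1 = pd.DataFrame({
    "position": [1, 2, 3, 4, 5],
    "text": ["mini bands guide", "superbands review", "mini bands workout", "tube bands", "loop bands"],
})

# label each result as Top 3 or Positions 4 - 10, as in the notebook
df_1["top_result"] = ["Top 3" if x <= 3 else "Positions 4 - 10" for x in df_1["position"]]

# crude term frequency per group - the notebook does something cleverer to pick "meaningful" terms
counts = (
    df_1.assign(term=df_1["text"].str.lower().str.split())
        .explode("term")
        .groupby(["top_result", "term"])
        .size()
)
print(counts.sort_values(ascending=False).head(10))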
You can now go to the left-side-bar and click “Top 25 Terms” and click the “play icon” to refresh the data:
Obviously – use your experience etc and take the results with a pinch of salt – some won’t be relevant.
Natural Language Processing
Next, click on “Natural Language Processing” in the side-menu
Click the “play” icons next to df_entity = df_1[df_1['position'] < 6] and the section below.
When they have finished running click the play icon next to “Extracting Entities”
Click “play” on the “remove duplicates” section and again on the “Visualising Data” section
This should present you with a colourful table, with more terms and keywords – although for me most of the terms weren’t relevant in this instance 😦
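For reference, the entity extraction step is doing something like this with SpaCy (a minimal sketch, not the notebook’s exact code – the model name is an assumption):

import spacy

# pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
doc = nlp("Bodylastics resistance bands are sold on Amazon.")

for ent in doc.ents:
    print(ent.text, ent.label_)  # e.g. Bodylastics ORG, Amazon ORG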
You can also copy the output from the “Extracting the content from Top 5” section:
Then paste it into the DEMO/API for NLP that Google have created here:
You can then click the different tabs/headings and get some cool insights
Remember to scroll right down to the bottom, as you’ll find some additional insights about important terms and their relevance
The Google NLP API is pretty interesting. You can also copy and paste your existing page copy into it, and see what Google categorises different terms as, and how “salient” (important/relevant) it thinks each term is. For some reason, it thinks “band” is an organisation in the above screenshot. You can look to improve the interpretations by adding relevant contextual copy around the term on the page, and by using schema and internal links.
Speed Up Data Studio Reports (Significantly) – Extract Data
To speed up your reports – you can “Extract Data” and cache it.
It can help to have 2 copies of the report up – so you can see which metrics and dimensions you need to select when adding the data to extract and cache (also a good idea to test the extract data method on a copy of the report in case you faff anything up)
Go to “Add Data” in the top menu-bar
Click on “Extract Data”
Choose the data you need – eg Google Analytics
Add the dimensions and metrics you need for the report
On the Right hand side – click to turn “Auto Update” on
Select “daily”
Click “Save and Extract”
Sometimes you have to faff around a bit with the dimensions – Google Analytics doesn’t seem to like caching a dimension, but still goes super-quick if you cache the metrics only.
Edit in Bulk
If you want to edit all of the charts or tables on the page, in “Edit” mode, right click – go to “Select” and then choose “Tables on page” or whatever type of chart, scorecard or table you’ve selected.
This works instead of CTRL clicking or SHIFT clicking – but you can only change charts or visualisations of the same type at the same time. You can change the style, add a comparison date range etc.
Brand Colour Theme in Data Studio
Click on “Theme and Layout” at the top of the screen and then, near the bottom right, click “Extract Theme from Image” – you can then upload your logo and choose a theme with your brand colours.
If you’re shite at presentation like me, this is helpful.
Copy & Paste Styles
In Data Studio – If you want to copy a style from a chart or table, right click it, then choose “copy”
Click another chart/table and then right click – Paste Special – Paste Style Only
Add Chart Filters to an Entire Report
If you want to add a filter to all the data in a report, then it can be a pain going through the charts individually.
Right click on a blank part of the page
Click “Current Page Settings”
On the right hand side – click “Create a Filter”
Choose or create a filter to apply to the whole page
To add a filter to multiple pages
Right click on a blank part of the page
click “Report Settings”
click “Add a filter” in the right side-menu
Add Elements to All Pages of a Report in Data Studio
If you want to add a header and date range selector, for example, to all the pages in the report – add the elements to a page, then right click on the element – and choose “Make report-level”
Quickly Align Elements in Data Studio
Click and drag to select all the elements
Right click – choose “align” – “middle” to get everything inline horizontally
To get an equal space between all the elements, so they’re spaced evenly:
– click and drag to select the elements
– right click – select “Distribute”
– “horizontally” to space evenly across the page, or “vertically” to space evenly down the page.
You can also tidy up individual tables to align the columns vertically – right click and select “Fit to data”
Just use SEMRush – Organic Research – Positions tab and download and pivot the pages data – no need for advanced filter
Once you’ve found the blog posts with the most traffic, you can analyse the “Exact URL” in SEMRush
This analysis should show you the keywords on the page that generate most of the search traffic
I personally like to go after KWs with a Keyword Difficulty score of less than 20 for my personal blog and under 30 for my employer’s blog
You can also use Reddit & Quora for Content Ideas
Unsolicited #SEO tip: You can get great ideas for specific content ahead of features like PAAs being generated by using Google site operators with specific sites. For instance, I can use the command:
site:reddit[dot]com/r/amateur_boxing “how do i”
To search just the amateur boxing subreddit for questions starting with “how do I?” You can apply this on any niche or on other sites like Quora to get up to the minute questions people are asking.
I only want the URLs that reside at the third level – i.e. /productpage/
Go to your XML sitemap – usually at Myshop.com/sitemap.xml
Right click and “save as” – save on your computer
Open Excel
Go to the Developer Tab (you might need to add this as it’s not there by default)
Click “Import”
Browse to find your sitemap.xml and import it into Excel
This usually pulls all your URLs into column 1 and other info like priority into separate columns
Delete all the columns except the first one with your URLs in it
Remove the https:// from the URLs with “find and replace” – On “Home” tab under “Find & Select” on the right
In cell B2, add the function below (change A2 to the cell you have put the first URL in):
=LEN(A2)-LEN(SUBSTITUTE(A2,"/",""))
Drag the formula down the rest of column B
You can now order column B by the number of “/” found in each URL
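For example, with the https:// removed, myshop.com/category/productpage/ contains three “/” characters, so the formula returns 3 for the third-level product pages I’m after.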
If different categories have different folder structures then you can conditionally format and use different colours for different categories and then do a multiple criteria sort – by colour, then folder depth (column B)
You can download an example spreadsheet with the formula in here
The idea of technical SEO is to minimise the work bots have to do when they come to your website to index it for Google and Bing. Look at the build, the crawl and the rendering of the site.
Tools Required:
SEO Crawler such as Screaming Frog or DeepCrawl
Log File Analyzer – Screaming Frog has this too
Developer Tools – such as the ones found in Google Chrome – View>Developer>Developer Tools
Web Developer Toolbar – giving you the ability to turn off JavaScript
Search Console
Bing Webmaster Tools – shows you geotargeting behaviour, gives you a second opinion on security etc.
*Great for tailoring copy and pages. Just turn it on and add a query parameter
Tech SEO 1 – The Website Build & Setup
The website setup – a neglected element of many SEO tech audits.
Storage – Do you have enough storage for your website now and in the near future? You can work this out by taking your average page size (times 1.5 to be safe), multiplied by the number of pages and posts, multiplied by 1 + growth rate/100.
For example, a site with an average page size of 1MB, 500 pages and an annual growth rate of 50%:
1MB × 1.5 × 500 × 1.5 = 1,125MB of storage required for the year.
You don’t want to be held to ransom by a webhost, because you have gone over your storage limit.
How is your site Logging Data? Before we think about web analytics, think about how your site is storing data. As a minimum, your site should be logging the date, the request, the referrer, the response and the User Agent – this is in line with the W3C Extended Log File Format.
When, what it was, where it came from, how the server responded and whether it was a browser or a bot that came to your site.
Blog Post Publishing – Can authors and copywriters add meta titles, descriptions and schema easily? Some websites require a ‘code release’ to allow authors to add a meta description.
Site Maintenance & Updates – Accessibility & Permissions – Along with the meta stuff, how much access does each user have to the code and backend of a website? How are permissions built in? This could, and probably should, be tailored to each team and their skillset.
For example, can an author of a blog post easily compress an image? Can the same author update a menu (often not a good idea)? Who can access the server to tune server performance?
Tech SEO 2 – The Crawl
Google Index
Carry out a site: search and check the number of pages compared to a crawl with Screaming Frog.
With a site: search (for example, search in Google for site:businessdaduk.com) – don’t trust the number of pages that Google tells you it has found; scrape the SERPs using Python or Linkclump:
Too many or too few URLs being indexed – both suggest there is a problem.
Correct Files in Place – e.g. robots.txt – Check these files carefully. Google says spaces are not an issue in robots.txt files, but many coders and SEOs suggest this isn’t the case.
XML sitemaps also need to be correct, in place and submitted to Search Console. Be careful with the <lastmod> directive – lots of websites have lastmod but don’t update it when they update a page or post.
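For reference, this is the sort of entry you’re checking (the URL and date are placeholders):
<url>
  <loc>https://www.example.com/product-page/</loc>
  <lastmod>2021-06-01</lastmod>
</url>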
Response Codes – Checking response codes with a browser plugin or Screaming Frog works 99% of the time, but to go next level, try using curl and the command line. Curl avoids JS and gives you the response header.
You need to download cURL which can be a ball ache if you need IT’s permission etc.
Anyway, if you do download it and run curl, your response should look like this:
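Something along these lines – an illustrative example using the -I flag to fetch just the response headers (example.com is a placeholder and your exact headers will differ):
curl -I https://www.example.com/
HTTP/2 200
content-type: text/html; charset=UTF-8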
Next enter an incorrect URL and make sure it results in a 404.
Canonical URLs – Each ‘resource’ should have a single canonical address.
Common causes of canonical issues include sharing/shortened URLs, tracking URLs and product option parameters.
The best way to check for any canonical issues is to check crawling behaviour and do this by checking log files.
You can check log files and analyse them with Screaming Frog’s Log File Analyser – the first 1,000 log events can be analysed with the free version (at the time of writing).
Most of the time, your host will have your logfiles in the cPanel section, named something like “Raw Access”. The files are normally zipped with gzip, so you might need a piece of software to unzip them or just allow you to open them – although often you can still just drag and drop the files into Screaming Frog.
Lighthouse – Use Lighthouse, but use it from the command line, or use it in a browser with no browser add-ons. If you are not into Linux, use Pingdom, GTmetrix and Lighthouse, ideally in a browser with no add-ons.
Look out for too much code, but also invalid code. This might include things such as image alt attributes which aren’t marked up properly – some plugins will display the code just as alt rather than alt="blah"
JavaScript – Despite what Google says, all the SEO professionals whose work I follow state that client-side JS is still a site speed problem and potential ranking factor. Only use JS if you need it, and use server-side JS where you can.
Use a browser add-on that lets you turn off JS and then check that your site is still fully functional.
Schema
Finally, possibly in the wrong place down here – but use Screaming Frog or Deepcrawl to check your schema markup is correct.
You can add schema using the Yoast or Rank Math SEO plugins
The Actual Tech SEO Checklist (Without Waffle)
Basic Setup
Google Analytics, Search Console and Tag Manager all set up
Site Indexation
Sitemap & Robots.txt set up
Check appropriate use of robots tags and x-robots
Check site: search URLs vs crawl
Check internal links pointing to important pages
Check important pages are only 1 or 2 clicks from homepage
For render-blocking JS and stuff, there are WordPress plugins like Autoptimize and W3 Total Cache.
Make sure there are no unnecessary redirects, broken links or other shenanigans going on with status codes. Use Search Console and Screaming Frog to check.
Site UX
Mobile Friendly Test, Site Speed, time to interactive, consistent UX across devices and browsers
Consider adding breadcrumbs with schema markup.
Clean URLs
Make sure URLs include a keyword, are short, and use a dash/hyphen to separate words
Secure Server HTTPS
Use a secure server, and make sure the insecure (HTTP) version redirects to it
Allow Google to Crawl Resources
Google wants to crawl your external CSS and JS files. Use “Fetch as Google” in Search Console to check what Googlebot sees.
Hreflang Attribute
Check that you are using and implementing hreflang properly.
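A typical implementation uses link elements in the <head> – the locales and URLs below are placeholders:
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />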
Tracking – Make Sure Tag Manager & Analytics are Working
Check tracking is working properly. You can check the tracking code is on each webpage with Screaming Frog.
Internal Linking
Make sure your ‘money pages’, or most profitable pages, get the most internal links
Content Audit
Redirect or unpublish thin content that gets zero traffic and has no links. **Note on this: I had decent content that had no visits; I updated the H1 with a celebrity’s name and now it’s one of my best performing pages – so it’s not always a good idea to delete zero-traffic pages**
Consider combining thin content into an in-depth guide or article.
Use Search Console to see what keywords your content ranks for, what new content you could create (based on those keywords) and where you should point internal links.
Use Google Analytics data regarding internal site searches for keyword and content ideas 💡
Update old content
Fix meta titles and meta description issues – including low CTR
Find & Fix KW cannibalization
Optimize images – compress, alt text, file name
Check proper use of H1 and H2
See what questions etc. are pulled through into the rich snippets and answer these within your content