Tom Anthony from Distilled @TomAnthonySEO
The first talk I attended was from Tom Anthony of Distilled. Tom's deck was incredible: full of trucks, serving as a parable for content that is otherwise difficult to grasp. The storytelling was epic, and it made the technical side of networking accessible to anyone interested in HTTP/2. He also explained different approaches to deploying HTTP/2.
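If you want to check whether a site already negotiates HTTP/2, a quick way is to request it explicitly and see what the server actually serves. Here is a minimal sketch in Python, assuming the httpx library is installed (the URL is just a placeholder):

```python
import httpx  # pip install "httpx[http2]"

# Ask the server to negotiate HTTP/2; httpx falls back to HTTP/1.1 if unsupported.
with httpx.Client(http2=True) as client:
    response = client.get("https://www.example.com/")
    # http_version reads "HTTP/2" when the negotiation succeeded.
    print(response.url, response.http_version)
```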
Over to Tom's slides for a lovely catch-up:
The next presentation came from Fili Wiese, a former Google engineer.
Fili spoke about search bots, efficient crawling, and mobile-first crawling and indexing. Here are the highlights:
How to avoid on-page SEO pitfalls
Fili advised checking 'blocked resources' in Google Search Console to establish what could be invisible to crawlers.
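Blocked resources often come down to robots.txt rules. As a rough complement to the Search Console report, you can test individual resource URLs against a site's robots.txt with Python's standard library (the domain and resource URLs here are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Load the live robots.txt (placeholder domain).
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Check whether Googlebot may fetch resources the page depends on.
for resource in ["https://www.example.com/assets/app.js",
                 "https://www.example.com/assets/styles.css"]:
    allowed = parser.can_fetch("Googlebot", resource)
    print("OK  " if allowed else "BLOCKED", resource)
```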
Is your site still on HTTP? Perfect: improve it before migrating to HTTPS, and use Lighthouse to spot areas where your site's performance could be greatly improved.
Don't lean too heavily on JavaScript: approximately 15% of users disable it, and Google is not yet as sophisticated at crawling and rendering it.
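One quick way to see what a non-JavaScript visitor (or a crawler that doesn't render) actually gets is to fetch the raw HTML and check whether your key content is in it. A minimal sketch using the requests library, with a placeholder URL and phrase:

```python
import requests

# Fetch the page without executing any JavaScript, as a non-rendering crawler would.
html = requests.get("https://www.example.com/", timeout=10).text

# If critical copy only appears after client-side rendering, it won't be here.
key_phrase = "Our flagship product"
print("present in raw HTML" if key_phrase in html else "missing without JavaScript")
```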
Focus on progressive enhancement: Google has been recommending it since 2007, and there must be a reason for that...
Utilise server logs to see what is actually crawled on your site. Check at least 12 months' worth of logs, ideally the last three years.
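A simple starting point for that log analysis is to pull out Googlebot requests and tally which URLs and status codes it hits, which also surfaces crawl budget wasted on 404s. A sketch assuming a common-format access log at a hypothetical path:

```python
import re
from collections import Counter

# Matches the request path and status code in a common/combined log format line.
LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

paths, statuses = Counter(), Counter()
with open("access.log") as log:          # hypothetical log file path
    for line in log:
        if "Googlebot" not in line:      # keep only Googlebot requests
            continue
        match = LINE.search(line)
        if match:
            paths[match["path"]] += 1
            statuses[match["status"]] += 1

print("Most-crawled URLs:", paths.most_common(10))
print("Status codes seen:", statuses)    # lots of 404s = wasted crawl budget
```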
Optimise your site for search bots
Google crawls from the top down (home page first, then other pages). At some point it stops and uses a URL scheduler to determine which URLs to crawl and when. It also crawls non-existent URLs (404 pages).
Google loves new content, and of course any external links help your site's performance. Fili pointed out that internal links help Google crawl sites better (highlight: avoid redirects where you can). Ensure your internal links are intact, especially breadcrumbs (I'd advise checking the cached version of a page to see if and how Google sees them).
Use your Google crawl budget efficiently: make sure your worthwhile pages have a USP, and if you have anything unwanted, take care of it, ideally by noindexing it.
Fili also explained why Google likes AMP so much. You may wonder why a second layer of code is so likeable when responsive sites are already good enough; apparently AMP is useful for voice-search-based queries...
*Be careful when using canonical tags: they are not the same as 301 redirects, and Google can ignore them, unlike redirects (for a naked domain, Fili advised using redirects rather than canonical tags).
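To illustrate the redirect approach for a naked domain, here is a minimal sketch using Flask, purely as a stand-in for whatever server or CDN rule you would actually use in production (the domains are placeholders):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def to_www(path):
    # Permanently redirect naked-domain requests to the canonical www host.
    if request.host == "example.com":
        return redirect(f"https://www.example.com/{path}", code=301)
    return f"Serving {path or 'home page'}"

if __name__ == "__main__":
    app.run()
```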
Optimise a website for (mobile) indexing and effective search bot crawling
Stick to web standards: do not change any symbols, as you may create duplicate content.
Design pages for printing using CSS
Audit your site frequently, a few times a year.
Alexis K Sanders from Merkle Inc @AlexisKSanders
Alexis' session was about understanding how to annotate information with Schema.org. She quickly guided us through the structured data learning process. Here are the highlights:
Schema.org is a vocabulary, but it is more comparable to a dictionary.
Small errors are easy to make, so always VALIDATE your structured data code.
Know the difference between Microdata and JSON-LD.
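JSON-LD is generally the easier of the two to template, because it lives in a single script block rather than being woven through your HTML. A minimal sketch generating a JSON-LD block with Python, where the product details are made up for illustration:

```python
import json

# Hypothetical product data, e.g. pulled from a CMS or database.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

# Embed this in the page <head>; validate it before shipping.
print(f'<script type="application/ld+json">{json.dumps(product, indent=2)}</script>')
```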
Why we love structured data:
- Enhanced SERPs (stars, currency symbols, etc.)
- CTR increase (in most cases)
- It helps Google understand the content
- It is used in search features, but not for rankings
- Google Home Mini pulls web answers from featured snippets
- ListItem is still the most popular type, in either Microdata or JSON-LD
If you are a beginner, structured data implementation can be done via this mark-up generator. Intermediate and advanced users will most likely benefit from Google's documentation, and will probably find some hidden gems there. Overachievers can reach for this :-)
Common Pitfalls
Not all syntax is the same, so watch out for symbols; use a text editor for consistency and ease of writing.
Pay attention to the vocabulary: what's allowed and what's not.
Do not violate the schema guidelines: don't add information that is not on your page, and check possible violations in Google's policy documentation.
Pay attention to potential nesting errors (curly brackets, etc.).
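Many of these pitfalls (stray symbols, unbalanced brackets) surface as parse errors, so simply trying to parse the JSON-LD before it ships is a cheap first line of defence. A minimal sketch, where the snippet being checked is deliberately broken:

```python
import json

# A deliberately broken JSON-LD snippet: note the missing closing brace.
snippet = '{"@context": "https://schema.org", "@type": "Article", "headline": "Hi"'

try:
    data = json.loads(snippet)
    print("Parses OK; @type is", data.get("@type"))
except json.JSONDecodeError as err:
    # Reports the exact line/column of the problem, e.g. an unclosed bracket.
    print(f"Invalid JSON-LD: {err}")
```

This only checks syntax; semantic validity against the vocabulary still needs a tool like Google's structured data testing tools.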
Future predictions
There is plenty of schema in use that Google's own documentation doesn't cover.
Schema already supports voice efforts; an illustrative sketch follows this list.
Google will most likely support action-related schema more widely in eCommerce.
And things users want (e.g. for health-related and reactive queries).
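One voice-oriented example is the speakable property, which flags the sections of a page suited to text-to-speech. A hedged sketch generating it in Python, where the headline and CSS selectors are invented for illustration:

```python
import json

# Hypothetical news article marking two sections as speakable for voice assistants.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline for a voice-friendly article",
    "speakable": {
        "@type": "SpeakableSpecification",
        # CSS selectors pointing at the parts worth reading aloud (placeholders).
        "cssSelector": [".article-summary", ".article-first-paragraph"],
    },
}

print(f'<script type="application/ld+json">{json.dumps(article, indent=2)}</script>')
```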
Sam Marsden from DeepCrawl @sam_marsden
Sam shared a new framework for conducting regular content audits: one that draws on many data sources, is time-efficient to implement, and puts you in the best position to decide how to deal with the content on your site.
Ahead of a CMS migration at DeepCrawl, the team had to audit their content and decide what to keep as is, improve, or kill (remove), since everything had to be moved manually. Aww... They carried out an efficient content audit that was:
- Comprehensive
- Time-saving
- & ultimately replicable

so that they could settle on the right approach ahead of the migration.
They used DeepCrawl to discover all on-site content, keeping page URLs and titles, word count, published and last-modified dates, links, duplicates, categories, author, and metrics such as backlinks, social shares, traffic, impressions, and engagement.
Create a set of criteria for judging content performance
Decide what stays and what gets killed by answering the vital questions:
What is and isn't performing well? (Unique pageviews, share count, backlinks, and page value, provided your goals are correctly set up.)
How to deal with underperforming content? Keep, cut, combine (thin yet valuable content), or convert for further improvement later?
If you're unsure what to do with certain pages, check their traffic and whether they appear in the SERPs: do they bring any value to the site, and are they written with E-A-T in mind?
How to get the most out of content which is performing well? Optimise it further: page titles, internal and external links, schema, page speed, etc.
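To make the keep/improve/kill decision repeatable, you can encode your criteria as thresholds over the crawl export. A minimal sketch with pandas, assuming a hypothetical CSV holding the metrics mentioned above; the column names and thresholds are invented:

```python
import pandas as pd

# Hypothetical export combining crawl data with analytics metrics.
pages = pd.read_csv("content_audit.csv")  # columns: url, pageviews, backlinks, word_count

def verdict(row):
    # Invented thresholds; tune these to your own performance criteria.
    if row.pageviews > 500 or row.backlinks > 10:
        return "keep"
    if row.word_count < 300 and row.pageviews < 50:
        return "kill"
    return "improve"

pages["verdict"] = pages.apply(verdict, axis=1)
print(pages["verdict"].value_counts())
pages.to_csv("content_audit_verdicts.csv", index=False)
```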
How to inform a content strategy?
- Use data to justify your strategy
- Use pivot tables to display different performance metrics: by channel, by content category (to check shares vs pageviews), and by content type (see the sketch after this list)
- Is content length correlated with engagement? If not, can content resources be used more efficiently? Create content-length guidelines for copywriters
- Check performance by author, for copywriter assessment
- Assess publishing times
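A pivot like the one Sam describes is a one-liner in pandas once the audit data is in a DataFrame. A sketch using the same hypothetical CSV as above, with invented column names:

```python
import pandas as pd

# Hypothetical columns: category, author, pageviews, shares, word_count.
pages = pd.read_csv("content_audit.csv")

# Average pageviews and shares per content category, as in the shares-vs-pageviews check.
by_category = pd.pivot_table(pages, index="category",
                             values=["pageviews", "shares"], aggfunc="mean")
print(by_category)

# Performance by author, for copywriter assessment.
print(pd.pivot_table(pages, index="author", values="pageviews", aggfunc="sum"))

# Is content length correlated with engagement?
print("length vs pageviews correlation:", pages["word_count"].corr(pages["pageviews"]))
```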
Automate the auditing process by crawling the content regularly, and create rules for traffic drops, broken pages, and noindexing user-generated content.