Tom Anthony from Distilled @TomAnthonySEO

The first talk I attended was from Tom Anthony of Distilled. Tom’s deck was incredible - full of trucks, an amazing parable for content that is otherwise difficult to understand - and the storytelling was epic. All in all, the technical networking material was made accessible to anyone interested in HTTP/2. He also explained different approaches to deploying HTTP/2.

Over to Tom’s slides for a lovely catch up:

The next presentation came from Fili Wiese, a former Google engineer.
Fili spoke about search bots, efficient crawling, and mobile-first crawling & indexing. Here are the highlights:

How to avoid SEO on-page pitfalls
Fili advised to check ‘blocked resources’ in Google Search Console to establish what could be invisible to crawlers.

Ensure there are no server errors as they may slow down Google crawlers.

Is your site still on HTTP? Perfect - improve it before migrating to HTTPS, and use Lighthouse to spot areas where your site’s performance could be greatly improved.

Ensure there is not too much JS on your page, as it is common for users to disable it - approximately 15% of users disable JavaScript - and Google is not yet as sophisticated at crawling & rendering it.

Focus on progressive enhancement - Google has been recommending it since 2007, and there must be a reason for that...

Utilise server logs to see what’s crawled on your site - check at least the last 12 months’ worth of logs, ideally the last 3 years.
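To make the log advice concrete, here is a minimal sketch of pulling Googlebot activity out of an access log. It assumes the standard Apache/nginx “combined” log format - adjust the regex to whatever your server actually writes.

```javascript
// Sketch: count Googlebot requests per URL from an Apache/nginx
// "combined" access log. The log format is an assumption - adapt
// the regex to your server's configuration.
function googlebotHitsPerUrl(logText) {
  const counts = {};
  for (const line of logText.split('\n')) {
    // combined log: IP - - [date] "METHOD /path HTTP/x" status size "referrer" "user-agent"
    const m = line.match(/"(?:GET|HEAD) (\S+) HTTP[^"]*" \d+ \S+ "[^"]*" "([^"]*)"/);
    if (m && /Googlebot/i.test(m[2])) {
      counts[m[1]] = (counts[m[1]] || 0) + 1;
    }
  }
  return counts;
}
```

Run this over 12+ months of logs and compare the crawled URLs against your sitemap to spot wasted crawl budget or important pages that are never visited.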

Optimise your site for search bots

Google crawls from top to bottom (home page first, then other pages). At some point it stops the crawl and uses a URL scheduler to determine which URLs to crawl and when. It also crawls non-existent URLs (404 pages).

Google loves new content, and of course any external links help your site’s performance. Fili pointed out that internal links help Google crawl sites better (highlight: avoid redirects where you can). Ensure your internal links are intact, especially breadcrumbs (I’d advise checking the cached version of a page to see if and how Google sees them).

Ensure you use Google’s crawl budget efficiently → make sure your worthwhile pages have a USP, and if you have anything unwanted, take care of it - ideally noindex it.

Fili also explained why Google likes AMP so much - you may wonder why a second layer of code is so likeable when responsive sites are already (good) enough - apparently AMP is useful for voice-search-based queries...

*Be careful when using canonical tags - they are not the same as 301 redirects, and Google can ignore them, unlike redirects (Fili therefore advised that for a naked domain it is better to use redirects than canonical tags).
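To make the distinction concrete: a canonical is just a `<link rel="canonical" href="…">` hint in the page’s HTML that Google may ignore, whereas a 301 is enforced at the server before any HTML is served. A sketch of the naked-domain redirect Fili recommended (the nginx syntax and domain are illustrative assumptions):

```nginx
# Enforced at the server: the naked domain 301s to the www host,
# so crawlers never see two versions of the same page
server {
    server_name example.com;
    return 301 https://www.example.com$request_uri;
}
```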

Optimize a website for (mobile) indexing and effective search bots crawl

Stick to web standards - do not change any symbols as you may create duplicate content

Design pages for printing using CSS

Audit your site frequently - a few times/year

Alexis K Sanders from Merkle Inc @AlexisKSanders

Alexis’ session was about understanding how to annotate information with schema.org. She quickly guided us through the structured data learning process. Here are the highlights: schema.org is a vocabulary, but it is more comparable to a dictionary.

Small errors are easy to make so always VALIDATE the structured data code

Difference between Microdata and Json-LD
Why we love structured data:
  • Enhanced SERPs (stars, currency symbols etc.)
  • CTR increase (in most cases)
  • It helps Google understand the content
  • Used in search features but not for rankings
  • Google Mini pulls web answers from featured snippets
  • ListItem - still the most popular in either microformats or Json-LD

If you are a beginner, structured data implementation can be done via this mark-up generator. Intermediate and advanced users will most likely benefit from Google’s documentation, and will most likely find some hidden gems there. Overachievers can reach for this :-)
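For illustration, a minimal JSON-LD block of the kind those generators produce (all values here are placeholders, not from the talk):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2018-06-14"
}
```

This goes inside a `<script type="application/ld+json">` tag in the page - and, as Alexis stressed, should always be run through a validator before shipping.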

Common Pitfalls
Not all syntax is the same - watch out for symbols, and use text editors for consistency and ease of writing

Pay attention to the vocabulary - what’s allowed and what’s not

Do not violate the schema guidelines - do not add information which is not on your page - check possible violations in Google’s policy

Pay attention to potential nesting errors (curly brackets etc.)
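Because JSON-LD is plain JSON, stray curly brackets and trailing commas can be caught before the snippet ever reaches Google’s validator. A minimal sketch (the `@context` check is my own assumption about a sensible sanity test, not from the talk):

```javascript
// Sketch: syntax-check a JSON-LD snippet before pasting it into a page.
// JSON.parse throws on unbalanced brackets, trailing commas,
// curly "smart" quotes and similar copy-paste damage.
function isValidJsonLd(snippet) {
  try {
    const data = JSON.parse(snippet);
    // A usable JSON-LD block is an object declaring its vocabulary
    return typeof data === 'object' && data !== null && '@context' in data;
  } catch (e) {
    return false; // malformed JSON: nesting or syntax error
  }
}
```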

Future predictions

There is a lot of schema in use that is not covered by Google’s documentation

They support voice efforts - see examples:
Google will most likely support action-related schema more widely in eCommerce

And things users want (e.g. for health-related and reactive queries)
Alexis encouraged us to check out her tech SEO blog.

Sam Marsden from DeepCrawl @sam_marsden

Sam shared a new framework for conducting regular content audits - one that makes use of many data sources, yet is time-efficient to implement and puts you in the best position to decide how to deal with the content on your site.

Ahead of a CMS migration at DeepCrawl, the team had to audit the content to decide what to keep as-is, improve, or kill (remove), since they had to move the content manually. They carried out an efficient content audit which was:
  • Comprehensive
  • Time saving
  • & ultimately replicable
in order to identify the right approach for DeepCrawl ahead of their CMS migration.

They used DeepCrawl to discover all on-site content (page URLs and page titles, word count, published & last-modified dates, links, duplicates, categories, author, and metrics such as backlinks, social shares, traffic, impressions and engagement).

Create a set of criteria for judging content performance

Decide what stays and what gets killed, answer vital questions:

What is and isn’t performing well (unique page views, share count, backlinks, page value - if your goals are correctly set up)

How to deal with underperforming content (keep, cut, combine thin-yet-valuable content, or convert it → further improvements later?)

If unsure about what to do with certain pages - check the traffic and whether the page appears in SERPs, whether it brings any value to the site, and finally whether it was written with E-A-T in mind.

How to get the most out of content which is performing well? → optimise it further: page titles, internal & external links, schema, page speed etc.
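The keep/improve/kill decisions above can be sketched as a simple rule set. The thresholds below are illustrative assumptions of mine, not DeepCrawl’s actual criteria - tune them to your own site’s baselines.

```javascript
// Sketch: classify a page from audit metrics into keep/improve/combine/kill.
// All thresholds are placeholders - set them from your own data.
function classifyPage(page) {
  const { pageviews, backlinks, wordCount } = page;
  if (pageviews >= 500 || backlinks >= 10) return 'keep';   // performing well: optimise further
  if (pageviews === 0 && backlinks === 0) return 'kill';    // no traffic, no links: remove/redirect
  if (wordCount < 300) return 'combine';                    // thin but has some value: merge it
  return 'improve';                                         // middling: rewrite and re-optimise
}
```

Running every audited URL through a function like this is also the first step towards the automation Sam described - the rules live in code, so they can be re-run on every crawl.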

How to inform a content strategy?
  • Use data to justify your strategy
  • Use pivot tables to display different performance metrics:
  • by channel
  • by content category (to check shares vs pageviews)
  • by content type
  • Is content length correlated with engagement? If not, can content resources be used more efficiently? Create guides for copywriters on content length
  • Check performance by author - for copywriter assessment
  • Assess publishing times

Automate the auditing process by crawling the content regularly; create rules for traffic drops, broken pages, and noindexing user-generated content.

Maria Camanes Fores from Builtvisible @mariacamanes

Maria pointed out that improving page speed is not a one-off task. She aimed to make non-technical marketers aware of site speed performance, and she covered both the technical solutions for improving site speed and how to benchmark it.

Every second counts - you only have 5 sec to engage visitors before they leave

In order to convince the business to implement the changes, focus on business metrics such as conversion rate, revenue and returning visitors - find out what’s important to the business and use existing case studies if you need inspiration. Or, if you own a site yourself, how about running experiments on it to share the proof with the business?

Use Speed Scorecard & Impact Calculator to compare against the competitors to influence the key stakeholders

Maria swore by WebPageTest - I like that tool too :-) - check it out! You can also compare your site’s performance against competitors when using it.

Understand and measure the metrics which really matter: run a ‘synthetic’ test (Lighthouse - here pay special attention to the ‘Critical Request Chains’ diagram - Chrome DevTools, WebPageTest) and a ‘RUM’ test → real user monitoring.

Focus on USER EXPERIENCE not just traditional metrics!

Benchmark your site performance and record progress against the
benchmark (define baseline -how slow is too slow, what’s your aim etc.)

Always compress & resize images &  how about using ?

Bastian Grimm from Peak Ace AG @basgr

Better Measurement
Bastian suggested better page speed measurement - Page Speed score is not enough!

Focus on performance timings (First Paint, First Contentful Paint, First Meaningful Paint)

Use Chrome DevTools for website performance checks → to profile the website’s rendering stages

Or use PerformanceObserver (added to the site’s code) to track paint timings in Google Analytics (as events) → note this only works in Chrome, not in other browsers

Behaviour → Events → Pages → performance metrics (First Contentful Paint) to check how fast your copy elements are

You can combine PerformanceObserver with Data Studio
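A sketch of the PerformanceObserver pattern Bastian described: observe the browser’s paint entries and push each one to analytics as an event. The `ga()` call shape here follows classic analytics.js and is an assumption - adapt it to your own tracking setup.

```javascript
// Sketch: turn a paint timing entry into an analytics event payload.
function paintEventPayload(entry) {
  return {
    hitType: 'event',
    eventCategory: 'Performance Metrics',
    eventAction: entry.name,                 // e.g. 'first-contentful-paint'
    eventValue: Math.round(entry.startTime), // ms since navigation start
    nonInteraction: true                     // don't affect bounce rate
  };
}

// In the browser: observe paint timings and send each one to GA.
// Paint entries are only reported by Chrome/Chromium, as noted above.
if (typeof window !== 'undefined' && typeof PerformanceObserver !== 'undefined') {
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (typeof ga === 'function') ga('send', paintEventPayload(entry));
    }
  }).observe({ type: 'paint', buffered: true });
}
```

Once the timings land in GA as events, they can be pulled into Data Studio like any other event dimension.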

Critical Rendering Path

Bastian explained what the CSSOM is

The CSSOM defines APIs for selectors, media queries and the CSS itself. That’s why it can be considered a ‘map’ for locating all stylesheets.

The browser builds the CSSOM to understand the page better - but it needs the CSS resource first, before it can display anything, so CSS stylesheets block rendering.

Google won’t make extra GET requests for CSS, because requesting external CSS gets expensive. That is why you need to host CSS internally!

How to speed up a page → load CSS in two steps: 1. inline the CSS needed for critical-path rendering, and 2. load the remaining CSS asynchronously, so the browser won’t have to wait for it to finish downloading.
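The two-step loading described above looks roughly like this in the page head - this uses the widely-known preload/onload pattern, and the file name is a placeholder:

```html
<head>
  <!-- 1. Critical-path CSS inlined: no extra request blocks first paint -->
  <style>
    /* above-the-fold rules only */
    body { margin: 0; font-family: sans-serif; }
  </style>

  <!-- 2. Full stylesheet loaded asynchronously -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```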

How do you check what is required for the critical rendering path? Use a testing tool for CSS rendering on desktop:

Image Formats
Bastian mentioned a few new formats - some never picked up, e.g. WebP.
He was fond of Cloudinary - an image CDN.

Web Custom Fonts
Custom font styles mean at least three additional HTTP requests.
You can load them asynchronously, but that can cause flickering in the browser; a font style matcher helps reduce the flickering.

Fix this by looking for @font-face and adding ‘font-display: optional’. This results in the custom font being downloaded in the background, with the font change happening once loading is done. This solution will soon be supported in AMP.
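In CSS terms, that fix looks like this (the font name and file path are placeholders):

```css
@font-face {
  font-family: "BrandFont";
  src: url("/fonts/brandfont.woff2") format("woff2");
  /* Render fallback text immediately; swap in the custom font
     only if it loads very quickly, avoiding a visible flash */
  font-display: optional;
}
```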