SMX Munich 

The conference took place in March 2021. Below are the sessions I managed to attend.

Day 1 – 17 March 2021

1. Core Web Vitals Part 1: CWV in Practice - Martin Splitt, Developer Advocate, Webmaster Trends Analyst Team, Google Switzerland

A very interesting session, with Martin reminding us about speed perception and quoting Jakob Nielsen on response-time limits:

  • 0.1 second is an instantaneous response
  • 1 second is an almost seamless response
  • 10 seconds is the limit of attention

Martin reminded us that performance metrics are a journey to quantify the perceived speed of a page.

Performance is a process, not a number! No single number fully captures what the user experiences.

Martin explained what CWV is (LCP - visual completeness, FID - interactivity, and CLS - visual stability) and performed a live optimisation of his demo site: a few changes in the code and the Lighthouse results were green!

Martin reminded us that Lighthouse runs locally, so it produces lab data tied to OUR specific machine, OUR Internet, etc. It's hard to estimate what influence our machine/Internet is having on the Lighthouse audit.

So Lighthouse isn't that good at estimating how users experience the page, as their experience varies depending on where they are, THEIR Internet and THEIR browser (cookies, plugins, extensions, etc.). What counts in ranking is field data – how real people experience the page. That's where PageSpeed Insights or other page speed tools out there (e.g. WebPageTest) come in.

Sites with little field data (new sites, low-traffic ones) should stick to testing locally with Lighthouse, supplemented by tools such as WebPageTest.
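Just to make the lab-vs-field distinction concrete (my sketch, not from Martin's demo): the PageSpeed Insights API returns both the Lighthouse lab result and the CrUX field data in a single response. A minimal example in Python, assuming you have an API key:

```python
import requests

# PageSpeed Insights API v5 returns Lighthouse lab data and CrUX field data together.
PSI_API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def lab_vs_field(url: str, api_key: str) -> None:
    resp = requests.get(PSI_API, params={"url": url, "key": api_key, "strategy": "mobile"})
    resp.raise_for_status()
    data = resp.json()

    # Lab data: the performance score from this one synthetic Lighthouse run.
    lab_score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Lab (Lighthouse) performance score: {lab_score:.2f}")

    # Field data: CrUX metrics from real Chrome users. Often missing for
    # new or low-traffic sites -- exactly Martin's point above.
    field = data.get("loadingExperience", {}).get("metrics", {})
    if not field:
        print("No field data available - stick to lab testing for now.")
    for name, values in field.items():
        print(f"Field {name}: p75 = {values.get('percentile')} ({values.get('category')})")

lab_vs_field("https://example.com/", api_key="YOUR_API_KEY")
```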

Martin recommended looking at these:

  • Move away from page-speed-heavy fonts - allow the browser to swap fonts (https://transfonter.org/ - I remember Bastian Grimm already mentioned that a few years ago at BrightonSEO 😉)
  • Check whether inlining fonts is a good idea – sometimes it doesn't speed up the page → best practices don't always work for everyone (like inlining the font) :) → get rid of 3rd-party fonts ;)
  • Avoid GTM if you can, or optimise it as much as possible

Page Speed checking Tools


Further resources:

 

2. 5 free SEO tools - by Mario Fischer

He shared his deck here.

1. Ahrefs Webmaster Tools, incl. a broken links checker

2. Screaming Frog (free up to 500 URLs)

3. Chrome DevTools, incl. CWV checks (slide 17/40 of his deck)

4. Status and redirects checker - httpstatus.io

5. Openlinkprofiler.org – free backlink profile checker

He discussed anchor text analysis using an R script (more about that script here, although in German; apparently tools such as RYTE and Sistrix include it in their basic membership versions).

I may play with this script at some point. If you want to as well, it's saved on Mario's Drive.
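Mario's version is an R script; purely as an illustration of the idea (my rough Python equivalent, assuming a hypothetical backlinks.csv export with an anchor column), an anchor-text distribution check could look like this:

```python
import pandas as pd

# Hypothetical input: a backlink export (e.g. from Ahrefs) with an "anchor" column.
links = pd.read_csv("backlinks.csv")

# Normalise anchors and count them to spot over-optimised or spammy patterns.
anchors = (
    links["anchor"]
    .fillna("")
    .str.strip()
    .str.lower()
    .replace("", "(empty anchor)")
)
distribution = anchors.value_counts()

print(distribution.head(20))  # the most common anchor texts
print(f"{distribution.size} unique anchors across {len(anchors)} links")
```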

He also presented some cool Excel tips, although in German 😊

3. The tech monitoring compendium - Bastian Grimm (@basgr) – Director of Peak Ace AG

He mentioned many useful SEO monitoring tools – most are described below 😉

It looks like Little Warden is a nifty monitoring tool that can monitor global sites and, of course, be tailored to one's needs and preferences. What he liked about it:

  • URL inventory change monitoring
  • It also shows if certain URLs dropped out of the XML sitemap
  • Default HTML mark-up monitoring (metadata and canonical checks, etc.) – helpful for post-production audits to check everything is intact
  • The tool can detect not only when something ‘changed’ but also what’s new
  • Multi-layer indexation monitoring (not just robots.txt but also headers)
  • HTTP status code monitoring – always preserving your authority, right?
  • Monitoring staging/test server URLs
  • Checking geo-redirects (for proper redirects as per the request’s origin)

Want to monitor GA, GSC, GTM, Twitter Cards, Open Graph, different types of schema, or your own custom HTML/scripts? → regex.com for building and testing patterns on the go.

Tip: monitor the site's main nav using RegEx (a minimal sketch below).
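For example (my sketch, not Bastian's setup – the URL is a placeholder): fetch the page, grab the nav block with a regex, and alert when its fingerprint changes between runs:

```python
import hashlib
import re
import requests

URL = "https://example.com/"  # placeholder site
NAV_PATTERN = re.compile(r"<nav\b.*?</nav>", re.S | re.I)

html = requests.get(URL, timeout=10).text
match = NAV_PATTERN.search(html)
if not match:
    print("ALERT: no <nav> element found at all!")
else:
    # Fingerprint the nav markup; compare against the value stored from the
    # previous run (persistence omitted from this sketch).
    nav_hash = hashlib.sha256(match.group(0).encode()).hexdigest()
    print(f"Current main nav fingerprint: {nav_hash[:12]}")
```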

Bastian said that RYTE is useful for large-scale monitoring over time: spotting anomalies and understanding trends by comparing crawl data at scale.

What he liked about ContentKing is the segmentation, which allows us to easily prioritise fixes – don't we all need that? Apparently ContentKing does continuous crawling, so it allows for ‘real-life discoveries’ (e.g. broken links) with no need to re-run your crawl. Notifications from ContentKing don't happen over email only but can also be sent via Slack etc.

For a large site, monitoring crawl depth and spikes is crucial. SearchVIU was his choice for SEO quality assurance for website migrations & releases. It is also great for monitoring inventory changes (e.g. in categories) or lost vs. new categories.

LeanKoala was recommended for compliance audits (legal text, imprint, terms, tracking opt-outs, etc.). RYTE also has a GDPR compliance report showing external scripts on a website that are active before the user gives their consent.

He suggested these for visual site monitoring:

  • Visualping.io
  • Hexowatch.com

More monitoring tools:

  • Uptimerobot.com – continuous uptime checks every 5 minutes
  • Pingbreak.com – free BETA uptime monitor – it relies on Twitter as its default communication channel (account required) but allows alerting via Slack, Telegram, Discord, Mattermost, and custom alert services
  • Testomato.com – website content & uptime monitoring – very affordable solution + API; pricing starts at $49/mth
  • Leankoala.com – 30+ tools that check different things
  • Fluxgard.com – enterprise change monitoring – website changes, content and design edits (includes Lighthouse/cookie/network activity changes) – rather expensive at $10k per quarter
  • Google Safe Browsing site status monitoring
For notifications: Zapier.com – it allows you to push data from one software system to another without writing custom code. Multi-step Zaps are possible.

Integromat.com – for when you need complex automation made easy. It's harder to use than Zapier but allows for more complex workflows, so it's a really strong Zapier alternative!

  • Finally, tray.io is an enterprise-level automation solution.
  • Pipedream.com tests webhooks and APIs

Bastian also reminded us that there are nice custom alerts in GA – many tech issues can be measured using GA.

Using BigQuery you can review both failing and successful tracking tags – which in turn means you can spot pages with no tracking at all.
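As a sketch of the idea only (the dataset, table, and column names below are invented): join your crawled URL inventory against hit-level analytics data in BigQuery and look for pages that never fired a tag:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical tables: a crawl-derived URL inventory and GA hit-level exports.
sql = """
SELECT c.url
FROM `myproject.seo.crawled_urls` AS c
LEFT JOIN `myproject.analytics.page_hits` AS h
  ON h.page_url = c.url
WHERE h.page_url IS NULL  -- crawled pages with no recorded tracking hit
"""

for row in client.query(sql).result():
    print("No tracking detected on:", row.url)
```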

Another interesting tool is SpeedCurve – by far the most comprehensive toolset on the market, allowing you to monitor any metric you deem relevant, not only for yourself but also for your competitors.

4. What if...? Imagining an Apple Search Engine - Tom Anthony, VP Product, SearchPilot

Tom speculated that Apple is working on their very own search engine. Not only did they hire John Giannandrea, ex-head of Search at Google, but iOS 14 (released in Sep 2020) also surfaces web results pulled from Apple's own web index. So yes, Apple crawls – not as much as Google, but they do… Apparently, some searches return decent results, some less so.

Tom said that Apple relies heavily on data from users' apps (their ecosystem makes that easy, with more than 2 million apps in the App Store) and the search results are federated – drawn not just from their small (compared to Google) web index but from your ‘behaviour’ on your phone. In other words, it is a ‘blended’ search (web + apps). Ultimately, they focus heavily on ‘intent’.

It appears that Apple's competitive advantage may lie in federated search, bypassing Google's web index that way. Another brand currently relying heavily on federated search is SONOS.

Will Apple's federated search win out over Google's search results in the future? Personalization and data privacy will play a part in that shift – if the shift happens.

They know that a fully authenticated search would be a game changer.

5. Leveraging the Quality Rater Guidelines for a Content Audit - Dr. Marie Haynes, Owner, Marie Haynes Consulting

I listen to Marie's podcast every time she releases a new ‘episode’, so I must say her session included some of the information she shares there. She mainly went through the Quality Rater Guidelines, highlighting what is important to Google, and reminded us that Google rewards pretty much everything described in those guidelines.

Since the Medic Update, content quality has become more and more important.

There are usually some patterns following each algo update; however, she said it was hard to spot patterns in the last algorithm update from the beginning of December 2020. It appeared that Google got a bit better at promoting sites that fulfil users' needs, though. Ultimately Marie referred us to this blog post (one you should already be really familiar with, as it feels like a refresh of this May 2011 entry from Amit Singhal, Google Fellow)…

At my last job we analysed those 21 points the way Marie described (using the May 2011 post as a reference). It was a great way of planning iterative changes across a few of our sprints 😉 I suppose anyone thinking seriously about content quality should know those 21 points and try to adhere to them 😉

Compare your content against the sites in the top 3 and ask yourself: ‘is it substantially different / does it add substantial value compared to the sites ranked at the top of page 1?’

If not, ask yourself: why would anyone want to read it? If there's no reason, save your time and skip producing that copy 😉 or work on it and make it MUCH, MUCH ‘better’ than those at the top of Google – if possible…!

Ultimately, if you want to rank well, try to produce all your copy with magazine-like quality!

Ensure the copy demonstrates expertise: add author bios to reinforce trust in the copy/site, connect with experts, include references to the resources your copy is based on, etc. Basically, follow the guidelines for good copy written with E-A-T in mind.

Many of us have been doing E-A-T-related optimisations since August 2018…

6. Finding Patterns in Your Data 2.0 - Kevin Indig, Director of SEO, Shopify

Marcus Tandler, RYTE owner, reminded us that Google tests who is worthy of the top positions and ranks sites accordingly. So of course search intent and great keyword research are still very much needed, as it is all about ‘relevance’ and CTRs. If we satisfy users' expectations and give them the best experience, Google will notice and reward us. If CTRs are low, however, Google will stop ranking us high.

Kevin reminded us that it's good to check keywords ranking on page 2 of the SERPs. He also said that core updates often punish sites that can't prove themselves worthy at scale!

Similarly to Marcus, he reminded us that we should always check whether:

  • The content meets the user's intent
  • It is relevant and good enough

If your query drops, look for syntax patterns and group keywords by the same pattern to see, for example, whether user intent is met (see the sketch after this list):

  • Understand the query patterns the site targets
  • Group pages by traffic to see what Google is after
  • Try to spot which page is proving itself to Google and see what is different there (authority, content, design, anything else?)
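Here is the kind of grouping Kevin described, as a small sketch (the queries and patterns below are made-up examples, not from his talk):

```python
import re
from collections import defaultdict

# Hypothetical GSC export: (query, clicks) pairs.
queries = [
    ("best running shoes", 120), ("best trail shoes", 80),
    ("how to clean shoes", 40), ("how to lace shoes", 25),
    ("nike pegasus review", 60),
]

# Syntax patterns that often map to different user intents.
patterns = {
    "best X": re.compile(r"^best\b"),
    "how to X": re.compile(r"^how to\b"),
    "X review": re.compile(r"\breview$"),
}

groups = defaultdict(list)
for query, clicks in queries:
    for name, pattern in patterns.items():
        if pattern.search(query):
            groups[name].append((query, clicks))
            break
    else:
        groups["other"].append((query, clicks))

# Compare click totals per pattern to see which intents the site satisfies.
for name, items in groups.items():
    print(name, "->", sum(clicks for _, clicks in items), "clicks", items)
```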

Kevin also reminded us to analyse log files and check whether crawling correlates with traffic – Google will often crawl a lot of a site before an algo update to evaluate it ahead of time.
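As a rough illustration (my sketch; "access.log" is a placeholder for your server log in combined format), counting Googlebot hits per day is the first step before correlating them with traffic:

```python
import re
from collections import Counter
from datetime import datetime

DATE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [17/Mar/2021:10:15:32 ...]

hits = Counter()
with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        # Naive user-agent check; a production version should verify the
        # client really is Googlebot (e.g. via reverse DNS).
        if "Googlebot" not in line:
            continue
        match = DATE.search(line)
        if match:
            day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
            hits[day] += 1

# A crawl spike shortly before an update is the pattern Kevin described;
# the next step would be plotting these counts against daily organic traffic.
for day, count in sorted(hits.items()):
    print(day, count)
```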

Finally, Kevin showed 3 pairs of sites, explaining why one site in each pair was doing better.

Amongst them, he looked at Healthline and WebMD, and the main findings were:

  • User intent was key – e.g. Healthline split comprehensive topics into two great pieces and attracted a larger audience by addressing each subject's specific needs
  • Content format mattered
  • Healthline focused not just on the problem but also offered solutions (e.g. possible treatments)

Ultimately both speakers focused on the quality of our SEO delivery 😉

7. Stop Relying on the Facebook/Google Duopoly For All Your Marketing - Rand Fishkin, founder of SparkToro

Rand reminded us that Google and Facebook have built a duopoly that captures 75% of searches out there (as per SimilarWeb data). His point: you can't build a competitive advantage based on Google & Facebook alone.

His recipe for success is:

5 Step Process for Learning about Audience

1. For discovering more information about our audiences, he suggested using platforms such as BuzzSumo, SparkToro (his very own tool), or SimilarWeb. Focus on what brings you quality traffic. He also reminded us how important it is to leverage social media for outreach (reach out to relevant people on social media, engage there, and then email).

2. Get to know your audience, especially if you are buying ads (ProfitWell.com). Use Typeform for surveys.

3. Do a lot of non-promotional posts on social media before you promote something – build up rapport and then slowly promote your products/services.

4. He reminded us how underutilised great ‘brandable’ visuals are.

5. Use the news – Google your business name/service + city name and find publications likely to cover you.

6. Capture every email you can (email open rates are 252x higher than Facebook page engagement) 😉

7. Audit your ads.

 

Day 2 – 18 March 2021

1. Three Words that will Change Your Messaging - Andi Jarvis, Founder & Strategy Director, Eximo Marketing

These 3 words are: …

2. Successful Content: What is the Secret Sauce? - Samuel Schmitt, the builder of thruuu*

Samuel explained how he analysed the copy ranking in the SERPs to come up with ideas for content that could outrank those sites. In his case that meant, for example, writing more comprehensively or being one of the first to explain certain concepts (e.g. Google news conveyed in much simpler language). He also shared how he promoted his tool on LinkedIn, reaching out to SEO professionals he genuinely wanted to test it.

* thruuu is a SERP analyser – the free version allows you to scrape SERPs for any keyword or country but limits you to 10 searches per month.

3. We've Got Data, But We're Not Data-Driven - Wil Reynolds, Founder & Director of Digital Strategy, SEER Interactive

Wil was the last guest on day 2 and he was just brilliant! Not only did he talk about ideas for finding old copy to refresh, he also showed cool examples of how some brands get user intent right and align their site and messaging to it (his joint bank account examples, etc.).

He suggested analysing the data (especially PPC data for SEO) rather than relying just on search volumes. Ultimately you want to find the most valuable keywords likely to drive revenue. He learned and used SQL to extract data at scale (a rough sketch of the idea is at the end of this post). He referred to Deloitte Insights to remind us what counts for CMO success:

CMO Success
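To make the ‘PPC data for SEO’ idea concrete (my sketch, not Wil's actual queries – the database and table names are invented): join paid-search query revenue with organic ranking data to surface keywords that are proven to drive revenue but don't yet rank on page 1:

```python
import sqlite3

con = sqlite3.connect("marketing.db")  # placeholder database
rows = con.execute("""
    SELECT p.query, p.revenue, o.position
    FROM ppc_queries AS p
    JOIN organic_rankings AS o ON o.query = p.query
    WHERE p.revenue > 1000   -- proven to drive revenue via paid search...
      AND o.position > 10    -- ...but not yet on page 1 organically
    ORDER BY p.revenue DESC
""").fetchall()

for query, revenue, position in rows:
    print(f"{query}: ${revenue:,.0f} paid revenue, organic position {position:.0f}")
```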