June 2021 Google Algorithm and Search Industry Updates | Impression


June 2021 saw several significant search industry updates, with the launch of two new changes to Google’s ranking algorithms. In this post, we’ll explore:

  • The impact of Google’s June 2021 Core Algorithm Update;
  • The launch of the Google Page Experience update;
  • The introduction of Search Console Insights;
  • Discussions on the impact of poor HTML, spelling and grammar on rankings from Google’s John Mueller;
  • Google warning users when it doesn’t have a reliable answer;
  • Shopify enabling edits to robots.txt files.

Following our approach in previous posts, updates are set out in terms of their significance using our traffic light system – a red light is used for a key update that should be a priority, a green light is for news that is less immediately significant, and amber is in between.

Read on for the latest news and updates in the search industry.

The Impact of Google’s June 2021 Core Algorithm Update

Google’s June 2021 Core Algorithm Update was launched on the 2nd June:

red traffic light

This recent change to Google’s ranking algorithms has been long-awaited in the search industry, with the previous algorithm update rolling out in December 2020. In previous years, we’ve seen these updates on a quarterly basis – you can learn more with our guide to Google Core Algorithm Updates.

As is often the case with this type of broad ranking algorithm update, the June 2021 Core Algorithm Update caused more visibility fluctuations in some verticals than others. SEO software company Sistrix has provided data on the update’s biggest winners and losers.

According to the Sistrix Visibility Index (VI) metric, the following sites saw the greatest gains:

  • Homeessentials.co.uk (+209.12% VI change)
  • Gardentrading.co.uk (+182.05% VI change)
  • For-sale.co.uk (+158.93% VI change)

On the other hand, these sites saw the biggest decreases in organic visibility:

  • Exchangeandmart.co.uk (-36.66% VI change)
  • Avclub.com (-34.88% VI change)
  • Famousbirthdays.com (-33.06% VI change)

Based on the Sistrix data, it’s immediately clear that the home and garden products sector saw a significant impact from the update. Other verticals that were shown to be affected included localised directories, lyrics sites, and second-hand car marketplaces.

Analysis from Amsive Digital highlights several other site categories that were noticeably affected in the weeks following the June 2021 Core Algorithm Update. Dictionaries and reference sites emerged as the category that improved the most, with established players like Wikipedia and Dictionary.com seeing the greatest increases in visibility.

What does this mean for me?

If you own a site in one of these verticals, it’s likely that you saw some form of change in your organic visibility and traffic during the first few weeks of June. In any case, it’s worth checking your analytics software to see if there were any movements.

Should you find that your site has been negatively impacted by the update, don’t panic! There are plenty of areas you can focus on to help your site bounce back – check out our guide on How To Recover From A Google Algorithm Update to learn more.

Alternatively, get in touch with us today to find out how our SEO team can support you.

Google Page Experience Update Begins Rolling Out

Amber traffic light

Google’s long-awaited page experience algorithm update is starting to roll out now and will be complete by the end of August 2021.

The page experience update considers several signals that contribute to an optimal browsing experience for users and gives a website an overall ‘page experience score’ based on its assessment of each signal. Site owners are able to view their score in the page experience report in Google Search Console.

The page experience update currently only applies to mobile search results, with a roll out to desktop expected at a later date.

The signals that make up the page experience update include:

  • Core Web Vitals: See our guide to Core Web Vitals to learn more. 
  • Mobile usability: A page must have no mobile usability issues.
  • Security issues: Any security issues for a site will disqualify all pages on the site from a Good status.
  • HTTPS usage: Pages must be served over HTTPS to qualify for a Good status.
  • Ad Experience: Advertising techniques must not be distracting or interrupt the user in a way that is not conducive to good user experience.

The company announced:

“We’ll begin using page experience as part of our ranking systems beginning in mid-June 2021. However, page experience won’t play its full role as part of those systems until the end of August. You can think of it as if you’re adding a flavoring to a food you’re preparing. Rather than add the flavor all at once into the mix, we’ll be slowly adding it all over this time period.”

What does this mean for me?

Site owners should continue to work towards improving their Core Web Vital scores if they are currently poor. Resolving any issues around Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS), or First Input Delay (FID) should be part of your ongoing SEO activity.

As the page experience update is being rolled out gradually, the majority of sites should not expect to see any drastic changes in organic search performance.

Google’s John Mueller Discusses the Impact of Poor HTML, Spelling and Grammar on Rankings

Amber traffic light

Digital content writers and site owners take note! In a recent edition of Google SEO Office Hours, John Mueller answered a query regarding the effect of poor HTML, spelling and grammar on a site’s ranking potential.

With regards to broken HTML, Mueller’s response suggested that this shouldn’t be an issue in the vast majority of cases:

“For the most part we don’t care about HTML if it’s broken or not.

Most of the web does not have valid HTML and we have to live with it.

The main exception that I know of with regard to broken HTML is if it’s really broken in a very bad way in the sense that if we can’t recognize that a page is mobile friendly.

Or if we can’t recognize that this is a title or a heading, then obviously we can’t do a lot of things with the HTML.”

He then gave a general rule of thumb to go by when determining if HTML errors will impact rankings:

“If you look at the page and it doesn’t even load properly, then probably you need to fix that.

However, if you look at the page and it looks normal in the browser, then even if there’s broken HTML it’s probably okay.”

Content creators have long suspected that poor spelling and grammar can be an issue from an SEO perspective – and Mueller’s answer confirmed that this is the case. From Google’s point of view, major spelling and grammatical errors are a quality issue, meaning that they can affect rankings.

Mueller had this to say on the topic:

“We try to find really high quality content on the web and sometimes it can appear that a page is lower quality content because it has a lot of grammatical and technical mistakes in the text.

So that’s something where – from my point of view – if you’re aware of these kinds of issues I would just fix that.

I would almost say spelling and grammar is probably for most websites a higher priority than broken HTML.”

What does this mean for me?

It’s pretty clear what the implication is here: spelling and grammar are important aspects of a page’s quality from Google’s perspective. 

With this in mind, it’s important for site owners and content writers to focus on spelling and grammar as part of their on-page work. Not only will this impact how readers perceive your site, but it could also play a part in determining your rankings.

Shopify Enables Sites to Edit Their Robots.txt File

Green traffic light

Shopify store owners are now able to edit their robots.txt file, allowing them to have greater control over how search engines crawl their site.

All Shopify stores start with the same default robots.txt file. Stores were previously unable to make changes, but the file can now be edited through the robots.txt.liquid theme template.

Changes that site owners can now make to the robots.txt file include:

  • Allow or disallow certain URLs from being crawled
  • Add crawl-delay rules for certain crawlers
  • Add extra sitemap URLs
  • Block certain crawlers 

Shopify recommends making any changes to the robots.txt file through the liquid template, as this preserves Shopify’s ability to keep the file automatically updated.
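To sketch what an edit might look like: Shopify’s documentation describes the robots.txt.liquid template as looping over the platform’s default rule groups, with custom rules added alongside them. In the hypothetical example below, a Disallow rule for a filtered-collection URL pattern is appended to the catch-all (*) group – the exact pattern you would block depends on how your store builds its filter URLs.

```liquid
{%- comment -%}
  templates/robots.txt.liquid
  Renders Shopify's default robots.txt rules, then appends one custom rule.
  The "/collections/*?*filter*" pattern is a hypothetical example.
{%- endcomment -%}
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Append a custom Disallow rule to the catch-all group {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /collections/*?*filter*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the default groups are still rendered by the loop, Shopify can continue to update its baseline rules automatically while your custom additions sit alongside them.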

What does this mean for me?

Ecommerce stores using Shopify are particularly prone to creating URL query strings from product and category filters. These can have technical SEO implications that were tricky to resolve prior to this update.

Being able to edit the robots.txt file allows Shopify store owners to resolve crawling issues more easily, by adding directives that prevent certain URLs or areas of a site from being crawled.
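As an illustration, the rendered robots.txt file that crawlers ultimately see could end up containing rules like the following (the filter pattern, crawl-delay value and sitemap URL here are placeholder examples, not Shopify defaults):

```text
User-agent: *
Disallow: /collections/*?*filter*

User-agent: SomeAggressiveBot
Crawl-delay: 10

Sitemap: https://your-store.myshopify.com/sitemap.xml
```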

Google Starts Warning Users When It Doesn’t Have a Reliable Answer

Amber traffic light

Google is testing a new feature that informs users when they’re searching for a rapidly evolving topic. The aim is to provide more context around breaking information, such as suspected UFO sightings or news stories that are still developing.

The prompt warns users that the results they are seeing are “changing quickly”, and that “If this topic is new, it can sometimes take time for results to be added by reliable sources.” 

This reflects a long-term incremental approach to educating users about questionable or incomplete information.


What does this mean for me?

For contributors, ensuring that any contribution to a news story is backed up by reliable sources is not only essential for establishing expertise, authoritativeness and trust (E-A-T), but can also help you avoid being labelled an unreliable source. For users, this means a more reliable experience when searching on Google.

Google Launches Search Console Insights

Google is introducing a new experience called Search Console Insights. They describe it as “An easier way to understand how your content resonates with readers”.

Green traffic light

This experience combines data from both Search Console and Google Analytics and is designed to help content creators understand how audiences discover their site’s content and what resonates with them, including information such as:

  1. What are your best-performing pieces of content?
  2. How are your new pieces of content performing?
  3. How do people discover your content across the web?
  4. What do people search for on Google before they visit your content?
  5. Which articles refer users to your website and content?

What does this mean for me?

For content creators, this update means better insight into what makes your audience tick, which can help to shape and enhance your content strategy going forward.

This post has explored all of the key Google algorithm and search industry updates from June 2021. If you have any questions about anything you’ve read here, don’t hesitate to get in touch.



