
March Search Updates 2021



Google may see web pages as duplicates if URLs are too similar

During the Google Search Central SEO hangout recorded on March 5, the topic of Google indexing URLs incorrectly was brought up by website owner Ruchit Patel. 

John Mueller explained that when crawling websites, Google sometimes tries to be predictive in order to reduce the need for unnecessary crawling and indexing. A part of this process is taking a “broader predictive approach” to duplicate content.

Google may crawl a couple of URLs that have a very similar structure and find what it determines to be duplicate content. Google would then apply this to other URLs with the same structure, sometimes resulting in unique content being written off as duplicate. 

While there is no penalty or negative ranking signal associated with duplicate content, Mueller gave his advice on a way around the issue: 

“So what I would try to do in a case like this is to see if you have this kind of situations where you have strong overlaps of content and to try to find ways to limit that as much as possible.

So that really every URL that we crawl on your website and index, we can see, well, this URL and its content are unique and it’s important for us to keep all of these URLs indexed.”

Google’s top five video optimisation recommendations

In one of the latest Lightning Talks videos on the Google Search Central channel, Google revealed its top five recommendations for optimising video content for greater SEO value. 

When Google crawls a page and discovers it contains a video, it uses these signals to understand more about what type of content it is: 

  • On-page text: Such as page title, headings, and captions near the video
  • Referral links: Signals sent from other sites linking to the video
  • Structured data: Markup that communicates video metadata to Google
  • Video files: Google can crawl the file itself to understand the audio and visual content

Google suggests using these five best practices to make sure your video is found and understood by search crawlers:

  1. Make your video publicly accessible

Your video should be publicly available to view, easily visible, and with a corresponding page and URL that Google can access.

  2. Use structured data

Use structured data to help Google find and understand your videos, including things like video title, description, duration, thumbnail, video content file URLs, and more.
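For illustration, a minimal VideoObject JSON-LD block can be generated like this; every value below is a made-up placeholder, and Google's documentation lists which properties are required:

```python
import json

# Every value here is a made-up placeholder -- substitute your own page's details.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to bake sourdough",
    "description": "A ten-minute walkthrough of a basic sourdough bake.",
    "thumbnailUrl": "https://example.com/thumbs/sourdough.jpg",
    "uploadDate": "2021-03-01",
    "duration": "PT10M3S",  # ISO 8601 duration: 10 minutes, 3 seconds
    "contentUrl": "https://example.com/videos/sourdough.mp4",
}

# Embed the markup as a JSON-LD script tag in the page's HTML.
tag = f'<script type="application/ld+json">{json.dumps(video_markup)}</script>'
print(tag)
```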

  3. Provide high-quality thumbnails

A high-quality thumbnail that Google can access is essential. Make sure your thumbnail isn’t blocked by robots.txt.
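You can sanity-check this with Python's built-in robots.txt parser; the rules and URLs below are hypothetical:

```python
from urllib import robotparser

# Hypothetical robots.txt rules and thumbnail URLs -- substitute your own.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

thumbnail_url = "https://example.com/thumbs/video-thumb.jpg"
blocked_url = "https://example.com/private/thumb.jpg"

# Googlebot should be allowed to fetch the thumbnail...
print(rp.can_fetch("Googlebot", thumbnail_url))
# ...while this path shows what a robots.txt block would look like.
print(rp.can_fetch("Googlebot", blocked_url))
```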

  4. Submit a Video Sitemap

This is another way to help Google find video content associated with pages on your site. You can also include metadata tags to help Google understand the video content.
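A minimal video sitemap entry can be generated with Python's standard library; the URLs and titles below are placeholders, and Google's video sitemap documentation lists the full set of supported tags:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
VIDEO_NS = "http://www.google.com/schemas/sitemap-video/1.1"
ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("video", VIDEO_NS)

# One <url> entry with nested <video:video> metadata (all values invented).
urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = "https://example.com/videos/sourdough"
video = ET.SubElement(url, f"{{{VIDEO_NS}}}video")
ET.SubElement(video, f"{{{VIDEO_NS}}}title").text = "How to bake sourdough"
ET.SubElement(video, f"{{{VIDEO_NS}}}thumbnail_loc").text = "https://example.com/thumbs/sourdough.jpg"
ET.SubElement(video, f"{{{VIDEO_NS}}}content_loc").text = "https://example.com/videos/sourdough.mp4"

xml_bytes = ET.tostring(urlset, encoding="utf-8", xml_declaration=True)
print(xml_bytes.decode())
```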

  5. Use accessible video files

This makes sure your video files are eligible for search features such as video previews and key moments. When Google can fetch and analyse your video content, it can serve it for more relevant queries.

Copied content outranking original suggests a quality issue

During the Google Search Central SEO hangout recorded on March 12, John Mueller addressed an issue about content being copied from a site without permission, and then ranking higher than the original site.

Mueller said that if that is a consistent issue, the problem may be with site quality. He suggested reassessing overall site quality, saying:

“The other thing that I would also keep in mind is that if you’re regularly seeing other people with copied content ranking above your content, then to me that points at a situation where maybe the overall perceived quality of your website is something that our algorithms are having trouble with.”

Google faces $5 billion lawsuit over incognito tracking issue

Google is facing a $5 billion lawsuit after claims that users are still subject to tracking when in Incognito mode on Google Chrome. 

According to Search Engine Journal, “The lawsuit alleges Google is in violation of wiretapping and privacy laws for intercepting, tracking, and collecting communications when Chrome’s Incognito mode is in use.”

Those who started the claim say that Google does not clearly state that websites can still gather data while in Incognito mode, although Google disputes this. Google has been trying to have the lawsuit dismissed, but a federal judge announced that it must go forward.

New Job Posting structured data requirements

Google recently announced an update to the “JobPosting” structured data, relating to education and experience requirements. From Google’s announcement, here is the list of beta structured data properties: 

  • educationRequirements.credentialCategory
  • eexperienceRequirements (typo)
  • experienceRequirements.monthsOfExperience
  • experienceInPlaceOfEducation

These new types of structured data are intended to fill a gap in the current schema.org choices, allowing postings to show what specific education requirements the job has, particularly to show when a degree or degree-level qualification is not required. 

It’s interesting to note the typo in the second point (it should read experienceRequirements, with only one “e”) – does this mean the announcement was rushed? It seems like more than a coincidence that this and Google’s Career Certification Courses were announced on the same day. 
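As a sketch, JobPosting markup using these beta properties might look like the following; the job details are invented, and the field names follow Google's announcement (with the typo corrected):

```python
import json

# Hypothetical posting -- field names follow Google's beta announcement,
# with the "eexperienceRequirements" typo corrected to experienceRequirements.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Junior Data Analyst",
    "datePosted": "2021-03-15",
    "description": "Entry-level analyst role; no degree required.",
    "educationRequirements": {
        "@type": "EducationalOccupationalCredential",
        "credentialCategory": "high school",
    },
    "experienceRequirements": {
        "@type": "OccupationalExperienceRequirements",
        "monthsOfExperience": 12,
    },
    # Signals that relevant experience is accepted instead of formal education.
    "experienceInPlaceOfEducation": True,
}

print(json.dumps(job_posting, indent=2))
```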

Google Maps lets users add photo updates without leaving a review

Google has announced new Google Maps features that make it easier for users to share useful information about places and roads. One of these is a new content type: photo updates. This allows users to post a photo of a place with a short text description, without having to leave a full review. 

Another update that is soon to be released is the ability to make amendments to road layouts, directions and names. Users will be able to edit roads if how they appear on Maps isn’t how they appear in real life, with all changes being vetted by Google before being published. 

There is also a new community challenge feature being trialled on Android in the US, with the aim of getting more users involved in providing updates, reviews and photos to local businesses.

Google announces free Hotel Booking Links

Previously, Hotel Booking Links were offered through Hotel Ads, but Google has now announced that it will be free for hotels and travel companies to appear in them, saying: 

“Now, we’re improving this experience by making it free for hotels and travel companies around the world to appear in hotel booking links, beginning this week on google.com/travel. …For all hotels and travel companies, this change brings a new, free way to reach potential customers.”

While this is a good thing for travel companies, it could stem from the anti-trust issues that Google is currently facing, with claims that Google pushes its own hotel products in SERPs at the expense of other high-quality sites.

Google Search results updated with ‘Full Coverage’ for news

Google News has had the Full Coverage feature for a few years; it connects related news stories in real time to give users more detail on a breaking story. Now it has been rolled out to Google Search, although two years later than promised.

Full Coverage features are not triggered by every news story, only ones that are set to develop over time as more updates are released, and there is also no real way to optimise for Full Coverage results. 

Google said: “With this launch, we’re introducing new technology that is able to detect long-running news stories… We then organize the Full Coverage page to help people easily find top news along with additional content like explainers and local coverage that are helpful to understanding these complex stories.”

Google updates PageSpeed Insights scores

Google announced an important change to how the PageSpeed Insights tool gathers information: a switch to using the HTTP/2 protocol when connecting to a web page.

In the announcement, Google said: “As of March 3, 2021, PageSpeed Insights uses http/2 to make network requests, if the server supports it… In general, performance scores across all PageSpeed Insights runs went up by a few points.”

If you have seen better PageSpeed Insights scores since March 3rd, this is the reason why: the switch has given many websites a PSI performance boost. If you haven’t, it might be worth checking whether your server supports HTTP/2.
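If you want to check HTTP/2 support yourself, one approach (a sketch using only Python's standard library) is to see whether the server negotiates "h2" during the TLS handshake via ALPN:

```python
import socket
import ssl

def supports_http2(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Check whether a server negotiates HTTP/2 ("h2") via TLS ALPN."""
    ctx = ssl.create_default_context()
    # Offer both h2 and HTTP/1.1; the server picks the best it supports.
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol() == "h2"
```

For example, `supports_http2("www.google.com")` should return True for a server that speaks HTTP/2 over TLS; the result depends on your network and the server's configuration.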

Google will not track users after replacing third-party cookies

After making it known last year that it plans to drop support for third-party cookies in its Chrome web browser, Google has announced that it won’t be replacing them with another function that tracks a user’s web activity. 

Google says: “People shouldn’t have to accept being tracked across the web in order to get the benefits of relevant advertising. And advertisers don’t need to track individual consumers across the web to get the performance benefits of digital advertising.”

Google released information in January about a cookie replacement mechanism called FLoC (Federated Learning of Cohorts), which will still enable advertisers to achieve good results without using individual identifiers.

Google not testing FLoC in Europe due to GDPR

Although Google has started to test its new FLoC-based system in the US, it has said it will not be rolling out the testing to Europe yet. One of the reasons is that Google appears to be auto-enrolling sites in the FLoC origin trials rather than asking for sign-ups. This lack of consent goes against GDPR and the ePrivacy Directive, and Google doesn’t appear to want to risk any more privacy-related fines just yet.

Although Google hasn’t provided a timeline for Privacy Sandbox testing in Europe, it’s assumed it will need to at some point, so stay tuned for more updates.

Google Search adds new rich results for education sites

Possibly due to the rise in online learning brought about by COVID-19, Google has introduced a new type of structured data that education sites can use to become eligible for rich results in Google Search.

Sites that offer practice problems and assistance for subjects such as maths can use these new types of schema markup to appear in “practice problem rich results” or “math solver rich results”.

After implementing the required markup, site owners can also use new reports in Google Search Console to check for any errors, update and validate changes to trigger a recrawl. 
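As a rough sketch, math solver markup based on the schema.org pending MathSolver type might look like this; the site, URLs, and supported problem types are all invented, so check Google's documentation for the exact required properties:

```python
import json

# A sketch only: MathSolver is pending schema.org vocabulary, and every URL
# and value below is invented -- consult Google's docs before deploying.
math_solver = {
    "@context": "https://schema.org",
    "@type": "MathSolver",
    "name": "Example Equation Solver",
    "url": "https://example.com/solver",
    "usageInfo": "https://example.com/privacy",
    "potentialAction": {
        "@type": "SolveMathAction",
        # {math_expression_string} is filled in with the user's expression.
        "target": "https://example.com/solver?q={math_expression_string}",
        "mathExpression-input": "required name=math_expression_string",
        "eduQuestionType": ["Polynomial Equation", "Derivative"],
    },
}

markup_json = json.dumps(math_solver, indent=2)
print(markup_json)
```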

Google doesn’t click buttons when crawling

During the Google Search Central JavaScript SEO hangout with Martin Splitt (recorded on March 24), the topic of buttons on web pages came up. It was revealed that, when crawling a site, Googlebot cannot click on buttons such as “Load More”. This means that any additional content that these buttons reveal will not be crawled or indexed by Googlebot.

One of the solutions given would be: “to implement that button as a link that basically goes like “?page2”, or “/2”… It doesn’t matter what the URL looks like the point is it goes to a URL that shows a different batch of content.”

Splitt then went on to say that the link should produce only the next batch of content (page 2), not repeat the existing page 1 content followed by the new content. He also recommended using JavaScript to overwrite the link’s default behaviour so that it appears to load content in place, when it is actually fetching content from a different URL, improving the user experience.
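The pattern Splitt describes (each batch of content living at its own crawlable URL) can be sketched in a few lines of Python; the item names and page size here are invented:

```python
# Invented example content: 50 items, 10 per page.
ITEMS = [f"article-{i}" for i in range(1, 51)]
PAGE_SIZE = 10

def page_items(page: int, size: int = PAGE_SIZE) -> list:
    """Return only this page's batch: page 2 yields items 11-20, not 1-20."""
    start = (page - 1) * size
    return ITEMS[start:start + size]

def render(page: int) -> str:
    """Render a batch plus a real, crawlable link to the next batch."""
    items_html = "".join(f"<li>{item}</li>" for item in page_items(page))
    # A plain <a href> that Googlebot can follow, instead of a JS-only button;
    # JavaScript can still intercept the click to load the next batch in place.
    next_link = f'<a href="?page={page + 1}">Load more</a>'
    return f"<ul>{items_html}</ul>{next_link}"

print(render(2))
```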



