What’s New in SEO: December 2022 Update

Talks around AI, what it can do, and how it will change SEO continue to be popular, and marketers are exploring the best ways to incorporate AI into their content strategies.

By Petra Kis-Herczegh

Dec 19, 2022

6 min

Welcome back to our "What's New in SEO" monthly recap of search engine news.

This month of SEO news saw a number of interesting updates from Google. Talks around AI, what it can do, and how it will change SEO continue to be popular, and it looks like our industry has a lot of work ahead to learn the right ways to incorporate AI into our content strategies.


The Benchmark Conference in Manchester took place last month, organized by the brilliant Click Consult team. There were some amazing speakers and presentations from Nick Wilsdon, Omi Sido, John Warner, Giulia Panozzo, and Roxana Stingu, to name just a few.

We recommend checking out Roxana's presentation, "The Internet for SEOs," which can be watched on Omi Sido's site. Nick Wilsdon shared insights on how to leverage your CDN to improve SEO, and John Warner presented some great insights about anonymous content. You can read the full summary of the event on the Click Consult site.

Search Engine Toolkit

Google has updated its documentation to reflect some known facts and create an official reference for them. One of these updates concerns JavaScript links. Injecting an <a> tag with an href attribute via JavaScript has been a way to create crawlable links since about 2008, as Barry Schwartz notes on Search Engine Roundtable. The key is to make sure the links are marked up properly so that they are crawlable. (Note that just because links are crawlable doesn't mean they are the best solution for your site. This implementation doesn't take into account factors such as crawl and rendering budget limitations, which might cause these links to perform worse than static HTML links would.)
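To illustrate the distinction Google's documentation draws, here is a minimal sketch (the function names are illustrative, not from Google's docs). A JavaScript-injected link is crawlable as long as what ends up in the rendered DOM is a real `<a>` element with an `href` the crawler can resolve:

```javascript
// Crawlable: a proper <a> element with an href attribute, so the
// crawler has a URL to extract from the rendered page.
function crawlableLinkHtml(url, text) {
  return `<a href="${url}">${text}</a>`;
}

// Not crawlable: a click handler with no href "works" for users,
// but gives the crawler no URL to discover.
function nonCrawlableLinkHtml(url, text) {
  return `<span data-url="${url}" onclick="location.href=this.dataset.url">${text}</span>`;
}

// In the browser you would inject the crawlable version, e.g.:
//   document.querySelector('nav').insertAdjacentHTML(
//     'beforeend', crawlableLinkHtml('/category/shoes', 'Shoes'));
```

Either version renders a working link for users; only the first gives Googlebot something to follow.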

Another update to Google's documentation was around unsupported meta tags. Again, no new information here, but now there's an official resource from Google confirming that the keywords meta tag and the lang attribute on HTML tags aren't supported.
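As a rough sketch of what this means in practice (the grouping below is our illustration, not copied from Google's docs): the keywords meta tag and language hints in meta tags carry no ranking weight, while hreflang link elements remain the supported way to declare language and region alternates.

```javascript
// Markup Google's documentation says it does not use for ranking.
const ignoredByGoogle = [
  '<meta name="keywords" content="seo, news, recap">',
  '<meta http-equiv="content-language" content="en-GB">',
];

// The supported way to declare a language/region alternate
// (example.com is a placeholder domain).
const supportedAlternative =
  '<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/">';
```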

Google has integrated shopping tab listings into Google Search Console for eligible online stores. The integration allows users to create a Merchant Center account through a simplified process without re-verifying their account. There is also no need to submit a product feed; users simply need to keep their product structured data up to date. The update can be reviewed on the Google Search Central blog.

SERP Updates

Google announced a few months ago that they will improve the SERPs to help users make more sustainable decisions when purchasing products. This month, many reported seeing a "green leaf" label for pre-owned and refurbished products within the SERPs. Search Engine Roundtable nicely summarized the examples. These now appear on both mobile and desktop, and, considering that Google Merchant Center has a condition attribute which can also be marked up as part of product structured data, it is very likely that this information feeds the new labels in the SERPs.
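For reference, condition can be expressed in schema.org Product markup via the Offer type's itemCondition property; a sketch of generating that JSON-LD (the helper function and sample values are illustrative):

```javascript
// Build Product JSON-LD with an itemCondition on the offer.
// schema.org defines NewCondition, UsedCondition, and
// RefurbishedCondition as valid values.
function productJsonLd(name, price, condition) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    "name": name,
    "offers": {
      "@type": "Offer",
      "price": price,
      "priceCurrency": "USD",
      // e.g. https://schema.org/RefurbishedCondition for refurbished stock
      "itemCondition": `https://schema.org/${condition}Condition`,
    },
  });
}

// Embed the output in a <script type="application/ld+json"> tag.
```

If the green-leaf labels do draw on this data, keeping itemCondition accurate would be the low-effort way to stay eligible.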

Algorithm Updates

There were no confirmed algorithm updates this month, but you can always count on Barry to publish all the SEO chatter around the volatility of the SERPs. There has definitely been a lot of activity, but whether that's due to seasonality, holidays, and the World Cup, or to unconfirmed algorithm updates, remains unclear. Google rolls out a high number of algorithm updates every year, but it only officially announces core updates and those that create a bigger impact. (If you want to learn more about Google updates, Wix has a great blog post on this.)

It can be hard to tell whether an impact on your website was caused by an algorithm update, but if you suspect one might be the cause, you can always check whether other SEOs have reported suspicious changes. The best places to go for this are Barry's updates or Marie Haynes' blog, and, for official updates, the Google rankings page.

That said, Google did make an interesting change to its terminology around updates, introducing a new guide that distinguishes between ranking systems and updates to those systems, to help users understand the concept and evolution of these ranking algorithms. At the moment, the guide references 19 major groups of systems and explains each of them:

  1. BERT

  2. Crisis information systems

  3. Deduplication systems

  4. Exact match domain system

  5. Freshness systems

  6. Helpful content system

  7. Link analysis systems and PageRank

  8. Local news systems

  9. MUM

  10. Neural matching

  11. Original content systems

  12. Removal-based demotion systems

  13. Page experience system

  14. Passage ranking system

  15. Product reviews system

  16. RankBrain

  17. Reliable information systems

  18. Site diversity system

  19. Spam detection systems

They have also listed their retired systems for historical purposes. These include the likes of Hummingbird, Panda, mobile-friendly systems, and others, which have either been incorporated into successor systems or made part of the core ranking systems. For example, Google's Panda system has evolved into Coati, which is part of the core ranking algorithm, and the 2010 and 2018 page speed updates were replaced by Core Web Vitals, part of the Page experience system. Lily Ray has also published seven key takeaways from the SMX Next keynote with Hyung-Jin Kim, VP of Search at Google. These further explain the main goal Google has always had when improving its algorithms: to organize the world's content and provide the most relevant information to users.

Food for Thought

An interesting study published by Ziemek Bućko shows that JavaScript content can take 9x longer to be crawled and indexed than static HTML. While we often talk about Googlebot's improved capabilities in handling JavaScript content and links (as mentioned above), whether JavaScript is the right choice for a website remains an ongoing debate, and one that doesn't have a clear answer. There are many aspects to consider, as JavaScript allows great flexibility to create engaging and efficient sites for users. Options like server-side rendering or pre-rendering are definitely worth exploring if you are working with a large, JavaScript-heavy site and/or your content changes often.
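One related mitigation pattern is dynamic rendering, where known crawlers receive a prerendered static snapshot while users get the client-side app. A minimal sketch, with the bot list and handler names being our illustrative assumptions rather than any official implementation:

```javascript
// A few common crawler user-agent patterns (illustrative, not exhaustive).
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i, /DuckDuckBot/i];

// Return true when the user agent matches a known crawler.
function isCrawler(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}

// In an Express-style handler you might branch on the user agent:
// app.get("*", (req, res) => {
//   if (isCrawler(req.get("user-agent"))) {
//     res.send(prerenderedHtmlFor(req.path)); // static snapshot for bots
//   } else {
//     res.send(clientAppShell); // JS app, rendered in the browser
//   }
// });
```

Full server-side rendering avoids maintaining two delivery paths, which is why it is often the preferred long-term option when the framework supports it.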

Finally, it seems like we can't pass a month without talking about AI content. It's definitely one of the biggest buzzwords out there at the moment — with a lot remaining to research, test, and learn.

During the November office hours, Duy Nguyen from Google's search quality team said that Google has specific algorithms going after AI-plagiarized content, working to demote sites that scrape content and other spam.

Mike King also published a very comprehensive article on AI and why it might not be the SEO threat everyone seems to think it is. It goes into detail about how AI evolved, as well as the risks it poses, while highlighting the do's and don'ts of using language models in SEO. Our top takeaways: make human review and careful incorporation of the data your key focus, and, to quote Mike, "Do. Not. Generate. Copy. And. Immediately. Publish. It." (Please.)

That's all for December. This is my final recap for now, as I will be enjoying some well-deserved time off traveling in the coming months. Thanks to all of you reading these monthly updates, and don't forget to keep up to date on all things SEO while I'm away. As a reminder, here are some of my favorite newsletters you can sign up for:

SEO Newsletters

SEOFOMO by Aleyda Solis (weekly)

Search News You Can Use by Marie Haynes (weekly)

Rich Snippets by TrafficThinkTank (weekly)

Women in Tech SEO by Areej AbuAli (monthly)
