Google Begins Using BERT to Generate Top Stories Carousels in Search



Google is now using BERT models, along with other machine learning techniques, to group related news articles together in carousels.

New accessibility tech on GAAD 2023: Apple, Google, Microsoft and more

Tech’s major companies took Global Accessibility Awareness Day this week as a chance to launch and highlight their assistive products. From Apple’s Assistive Access and Google’s Visual Q&A to Webex’s captions for those with non-standard speech and Adobe’s AI-generated tags for screen readers, here’s a roundup of the biggest GAAD 2023 news.


Ways that the Google BERT Update Will DRAMATICALLY Impact Your SEO | The Brandastic Show #55

In fall 2019, Google introduced an update based on Bidirectional Encoder Representations from Transformers. Commonly known as the BERT update, it changes the way Google’s algorithm processes natural language.

More specifically, the update allows Google to better comprehend the context of what a person is searching for, aiming to understand the nuances of our everyday language.

Thanks to this self-learning model, Google’s algorithm can better understand complex phrases and different uses of the same word.

For example, a word like ‘stand’ could mean different things in different contexts. It could refer to a food stand, taking a stand, or standing up. All three uses share the same word but carry different meanings, which Google can now distinguish.

This ability is useful when searching for items that have similar names or labels (e.g. looking for a bank account versus a holiday near a riverbank). Thanks to the BERT update, Google can now infer the correct intent.

Although the update only rolled out a few months ago, it now affects about 10% of all search queries.

In this video, I’ll be going over ways that the Google BERT update will dramatically impact your SEO and content strategies.


#1 Digital Marketing Agency
At Brandastic, our mission is to ignite the brand potential of our clients! We do this through services like website design, Magento and WooCommerce development, branding, and digital marketing. We are a full-service creative agency that can help you with the A to Z of your business online.

Visit our website to see how we can help your business grow its online presence and increase sales for your eCommerce store.

Learn how to do digital marketing from the top digital marketers in the industry!


#BERT #googleupdate #SEO

Google 12/11 Algorithm Update, BERT Expanded, Google News Changes & Bing – This week, we may have had a larger Google update roll out around December 11th; before that, a possible Google Analytics bug. Google expanded BERT to 70+ languages, and it still impacts about 1 in 10 queries in the languages where it is implemented. Google News revamped a bunch of features around top stories; it now uses BERT and other machine learning. You also do not need to be included in Google News to show up in Google News, which means Speakable markup works for non-Google News sites. The Google News Publisher Center was updated, and news content now comes from the web. Google Search Console now reports errors where you incorrectly tag a URL with RDFa or Microdata. Google Search Console also shows more specific errors around job postings. Google will provide more guidelines around pagination now that rel=next/prev is gone. Google Translate shows photos for what it is translating. Google is teasing local service ads for realtors. Google is testing the black ad label on desktop for local ads. Google is testing a review carousel in the local pack. Bing does webmaster outreach for crawling issues. Bing has to determine the cost and value of a ranking improvement change. Bing says you need to be okay with not being perfect if you use machine learning. Alexis Sanders made a new SEO game that is fun to play. This week’s vlog was with Ben Cook from Perficient Digital on site migrations. That was this week in search at the Search Engine Roundtable.

Recent Google Algorithm Update Chatter May Be Related To An Analytics Bug (0:48)
December 11th Google Search Ranking Algorithm Update Signals (1:11)
Google BERT Now International Supporting Over 70 Languages (2:21)
Google BERT International Launch Still Impacts ~10% Of Queries (3:11)
Google News Submission Not Required, Powered By BERT & New Top Stories Features (3:24)
Google: Speakable Markup Works Outside Of News Content (4:55)
Google News Publisher Center Updated & News Content Now From Web (5:32)
Google Search Console Showing Errors For Incorrectly Tagging URL With RDFa or Microdata (6:27)
Google Search Console Job Postings Errors Adjustments (6:40)
Google To Revise Guidelines Around Rel=next/prev (7:02)
Google Translate In Search Can Show Pictures (7:31)
Google Tests Local Service Ads For Realtors (7:50)
Google Tests Local Pack With Black Ad Label On Desktop (8:13)
Google Tests Review Carousel In Local Knowledge Panel (8:33)
Bing Does Reach Out To Webmasters With Search Issues (8:59)
Bing: Cost & Value Determined In Bing Core Ranking Improvements (9:35)
Bing: To Use Machine Learning; You Have To Be Okay With It Not Being Perfect (10:13)
Fun SEO Role Play Game (10:56)
Vlog #30: Ben Cook On The Wild West Days Of SEO & Enterprise Site Migrations (11:28)

Learn BERT – Google’s Powerful NLP Framework

What is BERT?

BERT has been one of the most significant breakthroughs in NLP since its introduction.
But what is it? And why is it such a big deal?

Let’s start at the beginning. BERT stands for Bidirectional Encoder Representations from Transformers. Still none the wiser?

Let’s simplify it.

BERT is a deep learning framework, developed by Google, that can be applied to NLP.

Bidirectional (B)

This means that the BERT framework learns information from both the left and right side of a word (or token, in NLP parlance), which makes it more effective at understanding context.

For example, consider these two sentences:

Jimmy sat down in an armchair to read his favorite magazine.

Jimmy took a magazine and loaded it into his assault rifle.

Same word – two meanings, also known as a homonym. Because BERT is bidirectional, it interprets both the left-hand and right-hand context of these two sentences. This allows the framework to more accurately predict a token given its context, or vice versa.
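To make this concrete, here is a toy Python sketch (not real BERT – the sense inventories and cue words are invented for illustration) showing why seeing the right-hand context matters for a homonym like “magazine”:

```python
# Toy illustration (not real BERT): disambiguate "magazine" by matching
# hand-picked cue words found around the target token. A bidirectional
# reader can use cues on BOTH sides; a left-to-right reader cannot.

SENSES = {
    "publication": {"read", "favorite", "article", "sat", "armchair"},
    "ammunition": {"loaded", "rifle", "assault", "rounds"},
}

def disambiguate(tokens, target, bidirectional=True):
    """Pick the sense whose cue words best match the visible context."""
    i = tokens.index(target)
    left = tokens[:i]
    right = tokens[i + 1:] if bidirectional else []
    context = set(left) | set(right)
    return max(SENSES, key=lambda sense: len(SENSES[sense] & context))

sentence = "jimmy took a magazine and loaded it into his assault rifle".split()
print(disambiguate(sentence, "magazine", bidirectional=False))  # ties resolve to the first sense
print(disambiguate(sentence, "magazine", bidirectional=True))   # right-hand cues pick "ammunition"
```

A left-to-right model reading the second sentence has only “Jimmy took a” to work with when it reaches “magazine”; a bidirectional model also sees “loaded”, “assault” and “rifle”.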

Encoder Representations (ER)

This refers to the encoder, a program or algorithm used to learn a representation from a set of data. In BERT’s case, the data set is vast, drawing from both Wikipedia (2,500 million words) and Google’s book corpus (800 million words).

The vast number of words used in the pretraining phase means that BERT has developed an intricate understanding of how language works, making it a highly useful tool in NLP.
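The idea of “learning a representation from data” can be sketched with a far simpler technique than BERT’s (the corpus here is invented, and co-occurrence counting is not BERT’s method): build each word’s vector from the words that appear near it, so the representation is derived entirely from the text itself.

```python
# Toy illustration of learning a representation from data: simple
# co-occurrence vectors built from a tiny corpus. BERT's encoder learns
# far richer contextual representations, but the principle is similar.
from collections import Counter

corpus = [
    "the bank approved the loan",
    "the river bank was muddy",
    "she deposited money at the bank",
]

# Vocabulary in first-seen order
vocab = list(dict.fromkeys(w for line in corpus for w in line.split()))

def cooccurrence_vector(word, window=2):
    """Count how often each vocab word appears within `window` of `word`."""
    counts = Counter()
    for line in corpus:
        tokens = line.split()
        for i, t in enumerate(tokens):
            if t == word:
                for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                    if j != i:
                        counts[tokens[j]] += 1
    return [counts[v] for v in vocab]

print(cooccurrence_vector("bank"))
```

Scaling this up – a far larger corpus and a model that predicts masked words instead of merely counting neighbors – is roughly what BERT’s pre-training phase does.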

Transformer (T)

This means that BERT is based on the Transformer architecture. We’ll discuss this in more detail in the next section.

Why is BERT so revolutionary?

Not only has BERT been pre-trained on one of the largest text corpora ever used for this purpose, it is also remarkably easy to adapt to different NLP applications by adding additional output layers. This allows users to create sophisticated and precise models to carry out a wide variety of NLP tasks.

BERT continues the work started by word embedding models such as Word2vec and generative models, but takes a different approach.

There are two main steps involved in the BERT approach:

1. Pre-train a language model on a very large text data set.
2. Fine-tune this large, unwieldy model for specific NLP applications, shrinking it to a suitable size where necessary. This allows users to benefit from the vast knowledge the model has accumulated without the need for excessive computing power.
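The second step can be sketched in miniature (all names here are invented for illustration): a small task-specific output layer is trained on top of a frozen, pre-trained encoder. The “encoder” below is a stand-in that returns a fixed feature vector per sentence; real BERT fine-tuning updates a head over contextual embeddings, typically via a framework such as Hugging Face Transformers.

```python
# Sketch of step 2: train only a tiny "output layer" (classification
# head) on top of a frozen stand-in encoder. The encoder's weights are
# never touched -- only the head learns.
import random

def pretrained_encoder(sentence):
    """Stand-in for the frozen pre-trained model (step 1)."""
    random.seed(sum(sentence.encode()))                 # deterministic per sentence
    return [random.random() for _ in range(8)] + [1.0]  # + bias feature

class ClassificationHead:
    """The small task-specific output layer added in step 2."""
    def __init__(self, dim, classes):
        self.w = {c: [0.0] * dim for c in classes}

    def predict(self, features):
        scores = {c: sum(wi * fi for wi, fi in zip(w, features))
                  for c, w in self.w.items()}
        return max(scores, key=scores.get)

    def train_step(self, features, label, lr=0.1):
        # Perceptron-style update: nudge the true class toward the
        # features and the wrongly predicted class away from them.
        pred = self.predict(features)
        if pred != label:
            self.w[label] = [wi + lr * fi for wi, fi in zip(self.w[label], features)]
            self.w[pred] = [wi - lr * fi for wi, fi in zip(self.w[pred], features)]

head = ClassificationHead(dim=9, classes=["positive", "negative"])
examples = [("great movie, loved it", "positive"),
            ("terrible plot, boring", "negative")]
for _ in range(500):                                    # fine-tune the head only
    for text, label in examples:
        head.train_step(pretrained_encoder(text), label)

print(head.predict(pretrained_encoder("great movie, loved it")))
```

Because only the head is trained, swapping in a different head lets the same pre-trained encoder serve many tasks – which is why this two-step recipe needs so much less computing power than training from scratch.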
