A Google research paper named Generative Models are Unsupervised Predictors of Page Quality paints SEO practices in a negative light. It has a section called “Search Engine Optimization (SEO) Attempts” that says, “Documents that show attempts at SEO tend to be flagged as very low quality.”
This paper was originally covered by Roger Montti, who asked whether this research paper explains the helpful content update. I think Roger is onto something here; at least in part, this paper does seem to explain a lot about the helpful content update.
That section goes on to say: “This is intuitive because these texts simply string together a series of keywords and are incoherent. Additionally, there are a modest number of product pages and professional profiles looking to do some form of SEO. It turns out that media-centric domains, such as image hosting domains, often have unintelligible embedded text, probably for SEO purposes.”
Pedro Dias, a former Google spam fighter and now an SEO, pointed this out on Mastodon, saying this is one of the reasons for the “low quality” label.
Pedro isn’t wrong, but Google’s John Mueller tried to downplay it, replying that these are attributes associated with SEO across the larger web. “I think it’s a losing battle to say ‘not all SEO is bad,’” he later added. Remember, Googlers such as John Mueller say SEO is important and something Google values, yet here is this research paper…
It’s sad to see this wording in a Google research paper, but we know how the SEO industry is perceived in the outside world. If you’re not sure, ask a layperson what they think of SEO and see what they say. I’d hope that those who work in search at Google don’t share that view, but Google is a big company.
Either way, it makes sense to read research papers knowing that just because a patent or research paper exists doesn’t mean Google uses it. But this one could very well line up with the helpful content update.
Forum discussion on Mastodon.