The Real-Time Nature Of Google’s Last Penguin Update
by Laurie Sullivan, Staff Writer @lauriesullivan, December 8, 2016
Google released the Penguin 4.0 update in September, saying that no others would follow. This final update, probably one of Google’s most important, weighs links to sites in real time. As Google bots crawl the Web and discover links, the algorithm evaluates and stores each one, creating a database of information it can draw from to make ranking decisions.
Penguin also devalues spam and adjusts rankings based on signals from specific links, rather than affecting the entire site. Real time does not mean that ranking changes occur instantaneously when new links appear, but rather that Google devalues spammy links so they have “less positive impact.”
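That description suggests a model in which every discovered link is scored and stored as it is found, with spammy links simply contributing little, rather than triggering a site-wide penalty. The following minimal Python sketch illustrates that idea only; the class names, fields, and scoring are hypothetical and are not Google’s actual implementation.

# Hypothetical sketch of link-by-link evaluation as links are crawled.
# Names (Link, LinkStore, spam_score) are illustrative, not Google's code.
from dataclasses import dataclass

@dataclass
class Link:
    source: str        # page the link was found on
    target: str        # page the link points to
    spam_score: float  # 0.0 = clean editorial link, 1.0 = clear spam

class LinkStore:
    """Stores each discovered link so ranking can draw on it later."""
    def __init__(self):
        self.links = []

    def add(self, link: Link) -> None:
        # Evaluate and store the link as soon as the crawler finds it,
        # rather than waiting for a batch refresh of the algorithm.
        self.links.append(link)

    def link_value(self, target: str) -> float:
        # Spammy links are devalued (contribute close to zero) instead
        # of the whole site being penalized.
        return sum(max(0.0, 1.0 - l.spam_score)
                   for l in self.links if l.target == target)

store = LinkStore()
store.add(Link("blog.example.com/post", "example.com/page", spam_score=0.1))
store.add(Link("spammy-directory.example", "example.com/page", spam_score=0.95))
print(store.link_value("example.com/page"))  # the spam link adds almost nothing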
That’s how Stone Temple CEO Eric Enge describes it. His consulting firm tracks the changes and makes recommendations to search engine marketers who are looking to optimize content and campaigns that run on mobile and desktop devices.
Now that the algorithm has been live for a few months, Enge notes that it processes data “more in real-time” compared with previous versions because “sites don’t have to wait until a major update or refresh of the algorithm to see positive or negative effects.”
This is especially good news for Web sites with Penguin-penalized pages “because they don’t need to wait for the next time Google bots crawl the page to see a correction,” he writes. “Changes and updates to the algorithm are now made without the necessity of an entire update. These changes will be seamless and largely invisible to us.”
The latest update addresses new link types, adjusts ranking weights, and improves the process of collecting link signals, he explains.
The types of links targeted by Penguin include Web and article directories, international links, a bad anchor text mix, coupon codes, poor-quality widgets, affiliate spam, other non-editorial links, and links that should first be targeted with removal requests.
Enge writes that getting better-quality links to the page should improve its ranking. Among a list of options, he also suggests removing bad links regularly by using tools such as Bing Webmaster Tools, Open Site Explorer, Majestic, Ahrefs, and Google Search Console — and building a list of the site’s backlinks and categorizing the link sources, such as blogs, multi-link pages, rich anchor text, and comment links.
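As a concrete starting point for that last suggestion, here is a small Python sketch that reads a backlink export and buckets each source into the kinds of categories Enge mentions. The CSV column names (source_url, anchor_text) and the keyword rules are assumptions made for illustration; they are not part of any of the listed tools’ export formats.

import csv

# Rough keyword rules for bucketing link sources; a real audit would use
# richer signals. Categories follow the ones mentioned above.
RULES = [
    ("comment link", ["#comment", "/comments/", "disqus"]),
    ("blog", ["blog.", "/blog/", "wordpress", "blogspot"]),
    ("multi-link page", ["/links", "directory", "resources"]),
]

def categorize(source_url: str, anchor_text: str) -> str:
    url = source_url.lower()
    for label, needles in RULES:
        if any(n in url for n in needles):
            return label
    # A long, keyword-heavy anchor is a rough proxy for "rich anchor text".
    if len(anchor_text.split()) >= 4:
        return "rich anchor text"
    return "uncategorized"

def build_report(csv_path: str) -> dict:
    """Count backlinks per category from an export with the assumed columns."""
    counts = {}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            label = categorize(row["source_url"], row.get("anchor_text", ""))
            counts[label] = counts.get(label, 0) + 1
    return counts

if __name__ == "__main__":
    print(build_report("backlinks.csv"))  # hypothetical export file

A report like this only groups the raw list; deciding which links to disavow or request removal of still requires a manual review of each category.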
MediaPost.com: Search Marketing Daily