Google's search results have been very volatile recently, and this website (valewood.org) has been caught up in that volatility: some days this site completely vanishes from the SERPs.
While I am not entirely happy that my site is being Thanos-snapped out of existence, as I illustrated in my recent post "The Top Five Things I Learned About Search Engine Optimization In 2023", I know that SEO is not about day-to-day changes. Instead, SEO is a long game where you must have persistence.
Let's dive into trying to figure out why this is happening and maybe it will help you in your future SEO efforts.
Thoughts On Recent Google Volatility - January 2024
Timing of Changes vs. Recent Volatility
First, we should spend a little bit of time diving into the timeline of events that led up to this blog post to help provide a frame of reference.
November 2nd 2023: Google launches its November 2023 Core Update
November 8th 2023: Google launches its November 2023 Reviews Update
November 22nd 2023 - November 25th 2023: Valewood.org was completely missing from the SERPs. I could find this site under the "omitted results" section of Google SERPs, but everything had effectively vanished.
November 26th 2023: Everything returned to normal and my featured snippets, SERP positions, images, etc were all back to their original positions or better placement.
November 28th 2023: Google finishes their November Core Update (25 days, 21 hours)
December 7th 2023: Google finishes their November Reviews Update (29 days)
December 15th 2023: I relaunched the front page of this website to be more business oriented. I also launched a series of services pages.
December 16th 2023 - December 17th 2023: Valewood.org was completely missing from the SERPs.
December 18th 2023: Everything returned to normal and my featured snippets, SERP positions, images, etc were all back to their original positions or better placement.
Note: Between November 22nd and December 15th, valewood.org was very static; I published almost nothing to the website in that period of time.
Updates after original publish date
December 19th 2023: Valewood.org is again completely missing from the SERPs.
December 20th 2023: I have observed that the Google SERPs appear to be in "Limp Mode". See more about Limp Mode below.
December 20th 2023: Everything returned to normal and my featured snippets, SERP positions, images, etc. were all back to their original positions.
I have taken some time to dissect this timeline (more in the November time frame than December, since December just happened). Unfortunately, Google does not release much public information about what is going on inside its business or its search engine, but that is for good reason. If it did, it would be much easier to game Google and get your site to rank even if it should not be at the top of the SERPs. That means I am left with some speculation and conjecture, but I do believe it is mildly educated speculation and conjecture.
What this timeline really means for me is that Google is going through a period of instability and change of its own. This is unfortunate for a lot of people who rely wholly on Google for the traffic to their website. This is also a sign that a centralized internet is dangerous to the health and prosperity of a free and open internet.
Google's primary goal is to bring high-quality search results to searchers, in the hope that users keep coming back, which in turn supports its mission of selling advertisements. If the SERPs are garbage, people will leave for Bing, DuckDuckGo, or some other service, which would mean lost revenue for Google.
Because of this mission, Google is constantly tweaking its algorithms under the covers to make sure that the SERPs stay relevant. Most of the factors used to rank a site are hidden from the public; otherwise Google would be in a multi-sided arms race for SERP supremacy. One side would be creating content blissfully unaware of what it takes to rank while the other side works to game the system to its benefit.
That arms race would lead to a massive imbalance where nobody wins.
So, the takeaway here is that I do not believe any substantive change to my site caused it to disappear from the SERPs.
AI's Impact On SERP Volatility
The AI genie is out of the bottle and there is no stuffing it back in.
Google has always tried to take a balanced approach to computer generated content and if or how it is displayed on the SERPs. They are looking for content that matches what a user is looking for and sometimes that content is generated by a machine. I would say that in my experience, the content being served was probably created by a human more often than via machine.
With AI storming onto the scene, I believe Google was caught off guard by how quickly it gained mainstream adoption and how quickly it became accessible to the general public. I think Google knew this was eventually coming, but I don't think anyone could have expected the kind of response that GPT got, except for Sam Altman, you magnificent man, you.
How Does Search Work?
I am going to preface this section by saying that everything you are reading here is based on experience, not on real data. I am not, nor have I ever been, a Google search engineer. I do have a fair amount of experience with small-scale search, which is where I am drawing some of my inferences from.
From day zero, a search index does not know anything. A search index is a blank slate waiting to be filled with the tokens necessary to classify and index data for quick retrieval later.
If I were building a search index for a website, I would simply pull something like Elasticsearch into the mix where I could start shoving documents into ES and utilize its query APIs to pull back relevant document IDs to create SERPs. This is the "every tool in the toolbox is a hammer" approach.
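To make the "shove documents in, pull document IDs back" idea concrete, here is a toy sketch of what any search backend (Elasticsearch included) is doing under the hood: an inverted index mapping tokens to document IDs, queried by token overlap. This is my own illustrative simplification, not Elasticsearch's actual implementation.

```python
from collections import defaultdict

class TinyIndex:
    """A toy inverted index: token -> set of document IDs.
    Illustrative only; real engines add analyzers, scoring
    models like BM25, and much more."""

    def __init__(self):
        self.postings = defaultdict(set)
        self.docs = {}

    def add(self, doc_id, text):
        # Index every whitespace token of the document.
        self.docs[doc_id] = text
        for token in text.lower().split():
            self.postings[token].add(doc_id)

    def search(self, query):
        # Rank documents by how many query tokens they match.
        scores = defaultdict(int)
        for token in query.lower().split():
            for doc_id in self.postings.get(token, ()):
                scores[doc_id] += 1
        return sorted(scores, key=lambda d: (-scores[d], d))

idx = TinyIndex()
idx.add("a", "google core update volatility")
idx.add("b", "devops pipelines and automation")
idx.add("c", "seo volatility after the core update")
print(idx.search("core update volatility"))  # → ['a', 'c']
```

The point of the sketch: the index itself has no notion of trust or authority, only token matches, which is exactly the gap the next section is about.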
If I were Google, indexing the persistently unregulated and untrustworthy hellscape of the internet and trying to figure out which content is authoritative enough to serve as a benchmark for all other content, I would need to make some choices first. Letting my indexing robots determine what content is trustworthy is probably not ideal, because the search index would then be susceptible to a "51% attack": someone would simply need to own 51% of a topic to tip the scales in their favor. Owning 51% of a space could be a purely volumetric play, where a topic is flooded with millions of articles very quickly across numerous domains, tilting the scales on what the search index believes "Truth" is.
▶ Key Insight: I believe that Google's primary goal of providing high-quality search results is based on finding out what "Truth" is and then gauging how much the rest of your content supports that idea of "Truth". Computers are dumb; they only know 1's and 0's, so it has taken decades of fine-tuning to build an idea of Truth that technology can evaluate.
If I wanted to avoid a 51%-style attack, I would need to choose my trustworthy authorities before I start to index other content, so that the blend of my chosen authorities is THE benchmark. I believe that Google has operated this way for years. There are the big X websites that Google has determined are THE gold standard for SERPs across a variety of different topics. Those websites are what all other web content is measured against.
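The trusted-seed idea above can be sketched in a few lines: score each new page against a fixed, hand-picked set of "gold standard" documents rather than against the open corpus, so flooding the index with new pages cannot move the benchmark itself. This is purely my illustration of the concept (using simple Jaccard token overlap), not Google's actual ranking method.

```python
def tokens(text):
    return set(text.lower().split())

def trust_score(candidate, seed_corpus):
    """Score a page by its best Jaccard token overlap with any
    document in a fixed, hand-picked 'gold standard' set.
    Because the seeds are chosen up front, a volumetric flood of
    new pages cannot shift the benchmark (the 51%-attack defense
    described above). Illustrative only."""
    cand = tokens(candidate)
    best = 0.0
    for seed in seed_corpus:
        s = tokens(seed)
        best = max(best, len(cand & s) / len(cand | s))
    return best

# Hypothetical trusted seed document.
seeds = ["kubernetes pods schedule containers onto nodes"]
print(trust_score("kubernetes schedules containers onto nodes", seeds))  # ≈ 0.57
print(trust_score("buy cheap pills now fast", seeds))                    # 0.0
```

A real system would use far richer signals than token overlap, but the structural point holds: the benchmark is anchored to the chosen seeds, not to whatever content happens to dominate by volume.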
This Gold Standard approach gives Google a baseline "knowledge seed" for all other content. The reason they tell you to write quality content is that they are looking for someone who ups the game and creates a new gold standard for content, thus strengthening their baseline. It is a symbiotic relationship.
But what happens when those Gold Standard websites suddenly get their hands on AI and, using their economic power, unintentionally mount a 51% attack on Google SERPs? How would Google know what the current benchmark for success is?
How Is AI Content Impacting Website SEO?
In the DevOps niche, what I am witnessing is a series of either rollbacks or SERP flushes. For one of the long-tail keywords that I normally rank for, I have seen the results flush everything that is not a major brand-name website.
I am being cagey about the specific keywords and topics I am targeting on purpose. SEO is a cutthroat game where too many details can bury your website very quickly.
I actually stopped ranking for my own name (Geoff Wagner) which is an interesting factor. What I found instead of my website were a series of much older and mature websites along with some big brands like LinkedIn (and a bunch of other Geoff Wagners).
Because my website is small, and yes, there is some AI-generated content that goes through an editorial process, I believe that Google is knocking my site out of the SERPs intentionally as it goes through algorithm changes. Now, it would be arrogant to think that I am specifically being targeted here, so I am using my website as a generalized example. I believe that this website, along with many others, does not fit the bill (yet) as a Gold Standard authoritative source of information that Google can seed "knowledge" from. Because of that, when the index needs an aggressive correction, I get bounced out of the SERPs.
Do I think my site is being removed from the SERPs because of the AI content on my site? No. That would be absurd, simply because the mathematical models for detecting AI content are all still very flawed.
I believe that my site is not yet of a certain size, traffic volume, or authority for Google to put it into the bucket of Gold Standard sites. Google doesn't know if it can trust this domain (or me) yet or not.
Other Recent Trends
To help support these assertions, I am seeing an interesting trend in both social media and supporting graphs from tools like SEMRush.
First, I am seeing traffic numbers starting to increase for large sites like Stack Overflow and Reddit. I believe Google finds these kinds of sites to be a Gold Standard of knowledge from which to seed the initial index. Additionally, if I put on my black-helicopter-deflecting tinfoil hat, I also believe that these two very large data repositories will be crucial to LLM development in the future, and they may be getting a bit of preferential treatment since Google is harvesting their data to build out its new Gemini LLM.
Now, I need to balance the social side of this a bit, since social media is dominated by a vocal few who may or may not represent a non-vocal majority. I have seen a lot of chatter from what I can assume are smaller websites having their search volume completely annihilated. There are many possible explanations, but I will highlight what I think are the top two reasons:
First, Google currently has a real problem with who and what to trust. Knowing that a 51%-style attack is easier than ever if you have the means and disposable income, Google has REALLY pulled back on the idea of trust and leaned even harder into E-E-A-T. Smaller websites just don't have the authority that big established websites do.
Second, Google has cracked down on robots using its SERPs to pull in information. For one, robots are bad for the ad business; I would imagine Google spends a lot of time and money scrubbing bot traffic out of ad-spend chargebacks. Additionally, LLMs are all the rage right now, and robots were probably spending a fair amount of time using Google SERPs to find good-quality sources of information to feed the models.
It is possible that a large amount of the traffic your site was getting came from robots, and Google may have added stronger bot-fighting measures (note the recent news about high percentages of ad click-throughs being performed by bots). The rationale here is simple: there are not FEWER people using the internet; there are more every day. That means traffic is not going into a black hole, unless Google built a better black hole.
If you use tools like SEMRush or Ahrefs, you will see major sites like Reddit getting a lot more traffic than they did before the recent HCUs. This is both a measurement problem and an interpretation problem: SEMRush and Ahrefs only predict and project traffic. They are not definitive sources, and their projections are built on the idea that traffic doesn't vanish overnight.
To be clear, I have no definitive evidence to assert 100% that these major websites are actually getting more traffic, but I cannot see how boosting them would intentionally serve any of Google's goals.
When my site goes missing, my first stop is to check for "Security issues" or "Manual actions" in GSC. When I click on the "Sitemaps" section of GSC, I am not able to open the "See page indexing" and "See video page indexing" sections; the links are grayed out and unavailable. Otherwise, I don't have any alerts, actions, or issues. Indexing shows that everything is indexed, though I know that everything in GSC is always woefully behind the current day.
Seasonality is also a huge factor in traffic, and I believe a lot of what I am seeing out in SEO social media land is people confusing HCU updates with seasonality. Consumer trends start to change around the holiday season and shift from knowledge-based searches (e.g., "Why is my cat purple?") to purchase-oriented searches. I believe that much of the blog-o-sphere is targeting easy knowledge-based SERPs that naturally tail off over the holidays, and that dip is being conflated with the November HCU as the culprit. Only time will tell if sites start to bounce back in the coming months as normal search trends re-emerge from the shadows.
I am also seeing Google go into "Limp Mode". I have dedicated a section below to walk through what I am seeing.
Google's "Limp Mode"
Something else I am noticing is Google going into "limp mode". When this website disappears from the SERPs, the Google frontend seems to show only around 100 search results and does not provide any links to Page 2 through Page N. This is interesting because the topics this website covers are pretty broad and pretty competitive, meaning there are thousands of articles in this space. Most of the time, but not always, there is a link at the bottom of the SERPs to "show omitted results" (or something to that effect). I can usually click that link and this website will show up in the omitted results. My guess is that my pages are currently sitting somewhere between having been crawled and having been classified and ranked.
One of the primary things to check when you see this kind of odd behavior is that the results are not personalized. To combat this, while performing my reviews, I hop on a VPN to somewhere else in the US and use Chrome in private browsing mode. I am normally a Firefox user day to day and only use Chrome for local dev testing. Chrome is not signed in to any of my Google accounts, and I habitually clear out my history data in Chrome. It is by no means perfect, but it is as close to a "clean room review" as I can get without setting up Tor.
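Beyond eyeballing the SERPs from a clean browser, one way to quantify the kind of flush-and-return behavior in the timeline above is to snapshot the top results for a keyword each day and measure the overlap between snapshots. A sketch, using hypothetical URL lists (not real SERP data):

```python
def serp_overlap(day_a, day_b):
    """Jaccard overlap between two days' top-N result URL lists.
    A value near 1.0 means a stable SERP; a sudden drop toward 0
    flags the kind of index flush or rollback described in the
    timeline above. Illustrative methodology, not a Google API."""
    a, b = set(day_a), set(day_b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Hypothetical snapshots of the top results for one keyword.
nov_21 = ["valewood.org/post", "bigbrand.com/a", "reddit.com/r/x"]
nov_22 = ["bigbrand.com/a", "reddit.com/r/x", "stackoverflow.com/q"]
print(round(serp_overlap(nov_21, nov_22), 2))  # → 0.5
```

Tracking this number over time (per keyword, from a clean, unpersonalized session) turns "my site vanished" from an anecdote into a measurable series, which makes it much easier to line vanishing events up against announced update windows.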
To summarize all of this, what appears to be happening is a series of search index reversions. It feels like Google is in a state where the search index is being rolled back over and over again, at which point pages need to be re-ranked against the index before they show up again. With my tinfoil hat on, I would assume Google is doing this to combat something that it cannot currently push down the rankings with a classifier alone. If I were to take an extremely myopic view of the state of things, I would assume that the thing they are struggling to block through classifiers is a wave of completely junk AI content.
▶ The Trouble With AI Detectors
If you are wondering why AI content is so difficult to detect and block, and why all AI content detectors are snake oil, have a look at this video.
So, what do I do going forward? Unfortunately, I am at the mercy of Google for both SERPs and ad revenue. I think the only real path forward is to endure and just keep posting. I believe I am up against a numbers game, and the only way to keep moving forward is through volume and quality. Eventually valewood.org will be established enough that this site is part of the Gold Standard baseline content that other content is measured against.
I also believe that the quick-startup blog-o-sphere is dead. I think the only way anyone will make it in blogging going forward (with the exception of real moonshots) is through blood, sweat, and tears. Google is going to be a lot more cautious from here on, and trying to get a quick win with blogging is going to be more difficult than ever.