“PageRank” is the zombie concept of SEO. It refuses to die, shambling through every forum thread and conference slide deck for 25 years. But in 2025, when checking your “Crawled - currently not indexed” report, invoking PageRank is worse than useless—it is misleading.

The classical definition of PageRank was a probability distribution: the likelihood that a random surfer would land on a page. Today, the metric that matters is Indexing Probability.

And unlike PageRank, Indexing Probability is not a smooth gradient. It is a Binary CLIFF.

The Zombie Page Phenomenon

A “Zombie Page” is a page that exists on your server, is known to Googlebot, but is effectively dead to the index. It sits in the “Crawled - Not Indexed” purgatory.

Why? Because it fell off the Indexing Cliff.

In the old days (circa 2015), Google might index a low-authority page and rank it on page 10. Today, Google simply does not index it. The economics of index storage have made the “Long Tail” of the index too expensive to maintain for low-value, low-probability assets.

Metric                   | Classic SEO Era                      | Agentic Era (2025)
Minimum Viable PageRank  | 0.00001 (Indexed, but ranked poorly) | 0.5 (Indexed) / <0.5 (Not Indexed)
Crawl Frequency          | Monthly                              | Real-time or Never
Discovery Source         | XML Sitemap                          | Contextual Link Graph
Failure State            | “Omitted Results”                    | “Crawled - Not Indexed”

It is commonly said that “external links improve crawl-to-index conversion.” This is an understatement. In many verticals, external links are the only thing that converts a crawl into an indexation event.

Think of an external link not as a vote, but as a Keep-Alive signal.

When a crawler encounters a link to your deep page from a trusted Hub (e.g., a major industry portal or news site), it resets the “Time to Live” (TTL) on that URL’s indexation status. Without that external signal, the URL’s internal TTL decays rapidly.

If your site relies solely on internal linking to prop up deep pages, you are relying on a closed energy system. Entropy eventually takes over. You need external energy (links) to combat the entropy of the index.
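The TTL metaphor above can be sketched as a toy simulation. Everything here is invented for illustration: Google publishes no decay rate or TTL, so the constants and the reset-on-external-link behavior are assumptions, not a description of any real system.

```python
# Toy model of the "Keep-Alive" idea: each URL carries an indexation TTL
# that decays every crawl cycle; a link from a trusted external hub resets
# it. All numbers are hypothetical -- no such parameters are published.

DECAY_PER_CYCLE = 0.2   # hypothetical TTL lost per crawl cycle
TTL_ON_RESET = 1.0      # hypothetical TTL granted by a trusted external link

def simulate_ttl(cycles: int, external_link_cycles: set[int]) -> list[float]:
    """Return the TTL after each cycle; 0.0 means the page has dropped out."""
    ttl = TTL_ON_RESET
    history = []
    for cycle in range(cycles):
        if cycle in external_link_cycles:
            ttl = TTL_ON_RESET                  # keep-alive signal resets the clock
        else:
            ttl = max(0.0, ttl - DECAY_PER_CYCLE)
        history.append(ttl)
    return history

# A page in a closed system (no external links) decays to zero;
# a page with periodic external mentions never hits the floor.
isolated = simulate_ttl(10, set())
refreshed = simulate_ttl(10, {0, 4, 8})
```

The point of the sketch is the shape of the two curves, not the values: internal linking alone only delays the decay, while periodic external validation resets it.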

The Consensus of the Graph

Google’s “Indexing System” is likely separate from its “Ranking System.” The Ranking System orders the results. The Indexing System decides if there are results to order.

The Indexing System looks for Consensus.

  • Does this page exist in a vacuum? (Only internal links)
  • Does the broader web acknowledge its existence? (External links)

A page with zero external validation is a hallucination of the site owner. Google has no obligation to store your hallucinations.

Practical Steps to Resurrect Zombies

  1. Kill the Weak: If you have 1,000 pages in “Crawled - Not Indexed,” delete 800 of them. Send a 410 Gone status. This concentrates the “Crawl Budget” (another zombie term, but applicable here) onto the remaining 200.
  2. The “Lighthouse” Strategy: Build one massive, link-worthy asset (a Lighthouse) and 301 redirect the dead pages to it. You are consolidating the tiny, fragmented authority of the zombies into one living entity.
  3. Cross-Platform Validation: Use social signals. While Twitter links are nofollow, they are discovery events. A burst of traffic from a social platform can force a “Freshness” check, temporarily bypassing the Indexing Threshold.

The Bottom Line

Stop staring at the “PageRank” of your home page. Authority does not trickle down to deep pages any more reliably than wealth does in trickle-down economics. Deep pages need their own reasons to exist. If they are “Crawled - Not Indexed,” the market has spoken: they are insolvent.

Liquidate them or recapitalize them with external links. But do not just let them rot in the report.
