Julian Henry's Blog

🏆 Yes, NP=P

20 Mar 2025

NP = P. There. I asserted it. Thu Mar 20 07:26:02 CDT 2025.

I have been asserting that since my freshman year at the University of Pennsylvania, shamelessly optimistic.

Scant support there may be, but is not Stephen Smale’s 1958 eversion of the sphere evidence enough that the counterintuitive can be real? For decades, topologists “knew” you couldn’t turn a sphere inside out without creasing it. Then Smale proved you could. The “impossible” was merely a failure of our collective imagination to visualize the high-dimensional path.

My senior-year arXiv studies led me to conclude that whatever functor collapses NP to P must itself be supra-NP in time complexity.

Can it not be that the hyper-difficult is the simple by another name? Is not every religion’s end to explain the rise of mankind?

For those who demand “proof” or at least “reason,” let us survey the pillars of this iconoclastic hope. Here are the strongest arguments for why the $P$ vs $NP$ wall is actually a mirage:

I. The Great Silence (The Lower Bound Failure)

If $P \neq NP$, there should be a mathematical “fence” separating them. Yet, after 50 years of intensive search, we haven’t found a single post of that fence: we cannot even prove that $SAT$ requires $n^2$ time on a general Turing machine, let alone anything superpolynomial. The “argument from ignorance” is actually a signal: perhaps we can’t prove a lower bound because no such bound exists.

II. The Ghost in the Machine (Knuth’s Statistical Argument)

Donald Knuth, perhaps the greatest living algorithmist, has come to believe $P=NP$. His reasoning is probabilistic: there is a “humongous” number of possible polynomial-time algorithms. Once you admit algorithms whose running time has degree as large as $10^{100}$, the space of potential strategies is so vast that it is statistically implausible that not a single one of them solves an NP-complete problem. We haven’t found it because we are looking for “elegant” algorithms, not “galactic” ones.

III. The Historical Collapse (AKS & Linear Programming)

History is a graveyard of “inherently hard” problems that turned out to be easy:

  • Linear Programming: Everyone “knew” it was hard because Simplex runs in exponential time in the worst case. Then Khachiyan’s ellipsoid method (1979) proved it was in $P$, and Karmarkar’s interior-point method (1984) made polynomial time practical.
  • Primality Testing: Long thought to sit at the hard edge of number theory. In 2002, the AKS test showed $\mathrm{PRIMES} \in P$. The feeling of “hardness” is often just a reflection of our current algorithmic poverty, not a mathematical law.

IV. The Mathematical Blockade (Natural Proofs)

Razborov and Rudich’s “Natural Proofs” theorem shows that almost all standard techniques for proving circuit lower bounds (the kind we’d need for $P \neq NP$) are doomed to fail if strong pseudorandom functions exist. If the universe is built so that we cannot prove $P \neq NP$ with standard tools, maybe that’s because $P \neq NP$ isn’t actually a theorem.

V. The Galactic Algorithm

Consider the possibility that $P=NP$, but the optimal algorithm for $SAT$ is $O(n^{10000})$. Mathematically, this is a total collapse. Practically, it would be indistinguishable from an exponential algorithm for any input size that could fit in the observable universe. The “hardness” we see in $NP$ would be a local phenomenon: a “transient” difficulty that disappears only at scales we cannot reach.
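To put numbers on that intuition (a back-of-the-envelope sketch of my own, not a result from complexity theory): comparing logarithms, $n^{10000}$ stays above $2^n$ until $n$ exceeds $10000 \cdot \log_2 n$.

```python
import math

# When does 2^n finally overtake n^10000?
# Compare logarithms: n^10000 < 2^n  iff  10000 * log2(n) < n.
def exp_wins(n: int) -> bool:
    return 10_000 * math.log2(n) < n

# Even at n = 1000 the "polynomial" bound is astronomically larger:
# log2(n^10000) ~ 99,658 versus log2(2^n) = 1000.
print(exp_wins(1000))  # False

# First n where the exponential wins: somewhere past 170,000.
crossover = next(n for n in range(2, 10**6) if exp_wins(n))
print(crossover)
```

So for every instance any physical computer will ever see, the “polynomial” algorithm loses to brute force, yet $P=NP$ holds on paper.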

Leonid Levin (the co-discoverer of NP-completeness) already wrote the algorithm! Levin’s Universal Search solves any NP problem in time within a constant factor of the best possible algorithm: $O(\mathrm{Time}_{\mathrm{optimal}} \cdot \mathrm{const})$. If any polynomial-time algorithm for an NP-complete problem exists, Levin’s procedure finds verified answers in polynomial time. The algorithm is literally sitting there, waiting for the constant to become manageable.
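The schedule behind Levin’s search can be made concrete. Below is a toy Python model (my own illustration, not Levin’s actual construction): the “programs” are stand-in generators that propose candidate SAT assignments, phase $t$ gives program $i$ roughly $2^{t-i}$ steps, and every proposed answer is checked by a fast verifier, so a wrong program can never cause a wrong answer.

```python
from itertools import count

def verify(cnf, assignment):
    """Polynomial-time check: does `assignment` satisfy the CNF?
    cnf is a list of clauses; a clause is a list of signed
    1-based variable indices (negative = negated literal)."""
    return all(
        any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
        for clause in cnf
    )

def make_program(i, n_vars):
    """Stand-in for 'the i-th program' in the enumeration: a
    generator that proposes one candidate assignment per step."""
    def run():
        for k in count():
            x = (k + i) % (2 ** n_vars)
            yield [bool((x >> b) & 1) for b in range(n_vars)]
    return run()

def universal_search(cnf, n_vars, n_programs=8):
    """Levin-style dovetailing: in phase t, program i receives
    about 2**(t - i) steps, so program i's total share of the
    time is proportional to 2**(-i). A verified answer from any
    program ends the search.  (In the real construction the
    enumeration covers all programs; later ones get a token step
    here to keep the toy finite.)"""
    programs = [make_program(i, n_vars) for i in range(n_programs)]
    for t in count(1):
        for i, prog in enumerate(programs):
            for _ in range(2 ** max(t - i, 0)):
                guess = next(prog)
                if verify(cnf, guess):
                    return guess  # passed the verifier, so correct
```

Because every answer must pass `verify`, the search inherits the runtime of the fastest correct program up to the $2^i$ scheduling factor — and that factor is exactly the “constant” the paragraph above is waiting on.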


An iconoclastic diatribe, one that will most likely be proven wrong, but who cares? I would rather be pleasantly surprised than axiomatically correct.

Yes, people: NP = P! I said it.

I’ll soon eat these words as a 7th grader’s dog eats homework.