
P=NP and Bitcoin

March 8, 2024


The present value of working on conjectures…

Anil Nerode of Cornell University has served on its faculty for over sixty years—believed to be the longest tenure in university history. He helped found Cornell’s computer science department, advised more than 55 doctoral students—a Department of Mathematics record—and made important contributions to logic, mathematics, computer science, and more. He and Juris Hartmanis jointly oversaw Ken’s Mathematical Sciences Institute postdoc at Cornell in 1986–89.

I found it interesting that Anil has worked on teaching methods:

“I was educated at Chicago under the business of doing the cleanest, shortest expositions of absolutely everything.”

He has since developed his own method: he now teaches in historical order. Translating ancient theories into modern notation lets him show his students the origins of mathematics. This is a cool method.

Cornell Chronicle source—our felicitations

Anil has done extensive work in complex control systems, which is certainly a challenging area for an approach based on clarity and simplicity. He is often credited for co-creating, together with Wolf Kohn, an approach called hybrid systems theory. His partnership with Kohn has even extended to integrating quantum and macroscopic control mechanisms for magnetic batteries on many scales.

Conjectures Which Way?

Anil once said:

Being attached to a speculation is not a good guide to research planning. One should always try both directions of every problem. Prejudice has caused famous mathematicians to fail to solve famous problems whose solution was opposite to their expectations, even though they had developed all the methods required.

I blogged about this point in posts titled “We All Guessed Wrong” and “Even Great Mathematicians Guess Wrong”—and about other cases of surprises and about guessing the right side of conjectures.

But events this week have put a transverse thought into my head. The two sides of a conjecture may not be equal in their ramifications. This leads to a further weird thought:

Maybe the greatest overall value is in working hard on the side that you don’t want to be true—and hoping not to solve it.

The fact of Bitcoin hitting a new all-time high this week leads me to muse further about this.

Trillions

If you think you have a proof that P=NP, then it is worth a lot of money. At the very least, you would win a prize like the Turing Award. But because such a proof would break Bitcoin and other digital monies, you could find yourself in a thicket worth billions of dollars. Maybe more like trillions.

source

Exactly what it would be worth to you, I don’t know. That’s a hypothetical. But there is a non-hypothetical that, from the understanding Ken and I have of how hedge transactions are valued, implies that one side of “P=NP” has clear and present monetary value:

What is the present value of not knowing that P=NP, and of doing work that furthers the general sureness of this status?
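As a toy illustration of this valuation question—every number below is hypothetical—the hedging view amounts to a discounted expected-loss calculation:

```python
# Hypothetical sketch of the hedging framing above: the "present value" of
# assurance that P != NP, modeled as the discounted expected loss avoided.
# All figures are made up for illustration only.

def present_value_of_assurance(value_at_risk, annual_breach_prob,
                               risk_reduction, discount_rate, years):
    """Sum, over each future year, the expected loss avoided by work that
    reduces the yearly breach probability, discounted back to today."""
    pv = 0.0
    for t in range(1, years + 1):
        expected_loss_avoided = value_at_risk * annual_breach_prob * risk_reduction
        pv += expected_loss_avoided / (1 + discount_rate) ** t
    return pv

# e.g. $1 trillion at risk, a 0.1% yearly breach chance, work that halves it,
# a 5% discount rate, over a 10-year horizon
pv = present_value_of_assurance(1e12, 0.001, 0.5, 0.05, 10)
print(f"${pv:,.0f}")
```

Under these made-up inputs the present value comes out to a few billion dollars; the point is only the shape of the calculation, not the figures.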

A 2022 article titled “P = NP: The Dangling Blackhole Under Crypto” puts the stakes in stark terms:

P=NP is [about] whether a problem (in the case of Cryptocurrencies, a transaction) can be solved procedurally and systemically just as quickly as it can be verified with a hash. The security of the blockchain, such as [for] Bitcoin, is wholly dependent on the assumption that P does not equal NP. However, if P = NP is proven, the Bitcoin can be breached just as quickly as it can be authenticated.

A breach would have the kind of concrete effect we have in mind when we say, “from dollars to donuts…“—although the donuts involved could technically come from work like this:

source

Open Problems

Is it sensible to assign a present value to work on one side of the P=NP question? More than work that tries to establish the other side?

61 Comments
  1. Frank Vega permalink
    March 8, 2024 5:30 pm

    This is a well-known NP-complete problem:

\textbf{K-CLOSURE problem}: Given a directed graph G = (V, A) (V is the set of vertices and A is the set of edges) and a positive integer k, determine whether there is a set V' of at most k vertices such that for all (u, v) \in A either u \in V' or v \notin V' (see reference [Queyranne, 1976] from the Garey and Johnson book).

We reduce this problem to MWX2SAT in polynomial time, and so MWX2SAT is NP-complete as well. This is the definition:

    \textbf{Monotone Weighted Xor 2-satisfiability problem (MWX2SAT)}

    INSTANCE: An n-variable 2CNF formula with monotone clauses (meaning the variables are never negated) using logic operators \oplus (instead of using the operator \vee) and a positive integer k.

QUESTION: Does there exist a satisfying truth assignment in which at most k of the variables are true?

    We prove this problem can be solved in polynomial time. See the proof here:

    https://hal.science/hal-04420462

    Moreover, we create our software programming implementation in Python:

    https://github.com/frankvegadelgado/alma
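    For what it’s worth, the MWX2SAT definition above can be sanity-checked on small instances with a brute-force decider (exponential time, so only illustrative; the encoding of clauses as index pairs is mine, not the linked repository’s):

```python
from itertools import product

def mwx2sat_brute_force(n, clauses, k):
    """Brute-force decider for MWX2SAT as defined above (exponential time):
    an n-variable monotone 2CNF formula whose clauses use XOR instead of OR,
    asking whether some satisfying assignment has at most k true variables.

    clauses: list of (i, j) index pairs; clause (i, j) holds when exactly
    one of x_i, x_j is true.
    """
    for assignment in product([False, True], repeat=n):
        if sum(assignment) > k:
            continue  # too many true variables for this instance's bound
        if all(assignment[i] != assignment[j] for i, j in clauses):
            return True
    return False

# (x0 xor x1) and (x1 xor x2) is satisfied with a single true variable (x1)
print(mwx2sat_brute_force(3, [(0, 1), (1, 2)], 1))  # True
```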

    Quote by Saint Augustine: The truth is like a lion. You don’t have to defend it. Let it loose. It will defend itself.

    • javaid aslam permalink
      March 8, 2024 8:47 pm

      And how does it relate to cryptography?

    • Frank Vega permalink
      March 12, 2024 6:13 pm

      Trying to convince the Google AI (Gemini) that P = NP

      This is really hard to do 🙂

      https://g.co/gemini/share/3a0f2d7cb012

    • Frank Vega permalink
      March 12, 2024 7:54 pm

      My Question:

      Does the phrase “either u \in V' or v \notin V'” actually mean (u \in V' or v \in V') or (u \notin V' or v \notin V')?

      Google AI (Gemini) Answer:

      Yes, the phrase “either u ∈ V’ or v ∉ V'” does mean the same as:

      (u ∈ V’ or v ∈ V’) or (u ∉ V’ or v ∉ V’)

      Here’s the breakdown:

      1. “Either”: This indicates that at least one of the following statements must be true, but not necessarily both.
      2. u ∈ V’: This means element “u” belongs to the set “V prime” (often used to denote the complement or dual of set V).
      3. v ∉ V’: This means element “v” does not belong to the set “V prime.”

      The original statement offers two possibilities:

      • u belongs to V’: This satisfies the first condition.
      • v does not belong to V’: This satisfies the second condition.

      However, the intent is to convey that only one of these conditions needs to be true, not both.

      Therefore, rewriting it as two separate “or” statements clarifies this meaning:

      • (u ∈ V’ or v ∈ V’): This captures the possibility of either “u” being in V’ or “v” being in V’.
      • (u ∉ V’ or v ∉ V’): This captures the possibility of both “u” and “v” not being in V’.

      So, both the original statement and the rewritten form using two “or” statements express the same logical concept: at least one of u or v must satisfy the condition related to V’, but not necessarily both.

      https://g.co/gemini/share/5ca2bd4abb59

      Albert Einstein Quote: “I want to know what God thinks. The rest are details.”

    • Frank Vega permalink
      March 14, 2024 11:16 am

      I rewrite another version after my interaction with the Google AI (Gemini):

      https://www.preprints.org/manuscript/201908.0037/v10

    • Frank Vega permalink
      March 18, 2024 4:10 am

      In the previous version v10, we gave two ways to prove that MWX2SAT is in P. In this version v11, we remove the one that is flawed and keep the single one that is correct.

      https://www.preprints.org/manuscript/201908.0037/v11

    • Frank Vega permalink
      March 23, 2024 7:43 am

      Dear Fellow,

      I would be very grateful if you as an expert in the field would consider reviewing this publication:

      Publication title: “Note for the P versus NP Problem”

      https://www.scienceopen.com/document/read?id=94777eae-28c7-402d-8e6b-8e882de7a3f6

      To review, go to that page and click on the “Review” button or consult the details about the reviewing process at ScienceOpen:

      https://about.scienceopen.com/peer-review-guidelines

      ScienceOpen enables every researcher to openly review any paper which has been submitted or published elsewhere.

      Thank you very much also for your cooperation. I look forward to your review!

      With kind regards,
      Frank Vega

    • Frank Vega permalink
      April 1, 2024 11:18 am

      We improved Lemma 1 (the polynomial-time reduction from K-CLOSURE to MWX2SAT) and added a Discussion section with some important highlights in this latest version:

      https://www.preprints.org/manuscript/201908.0037/v12

    • Frank Vega permalink
      April 2, 2024 12:40 pm

      Dear Fellow,

      I would be very grateful if you as an expert in the field would consider reviewing this publication:

      Publication title: “Note for the P versus NP Problem” (Second Version)

      https://www.scienceopen.com/document/read?vid=5e6e97ed-697f-4685-a43b-6c48e598af02

      To review, go to that page and click on the “Review” button or consult the details about the reviewing process at ScienceOpen:

      https://about.scienceopen.com/peer-review-guidelines

      ScienceOpen enables every researcher to openly review any paper which has been submitted or published elsewhere.

      Thank you very much also for your cooperation. I look forward to your review!

      With kind regards,
      Frank Vega

    • Frank Vega permalink
      April 12, 2024 12:17 pm

      This is the latest version of the proof that P = NP:

      https://www.techrxiv.org/users/258307/articles/716221-note-for-the-p-versus-np-problem

      This version is as perfect as possible!

  2. March 8, 2024 10:20 pm

    The problem with using Bitcoin in this value estimation is that none of the “value” of Bitcoin would actually be captured if the hashing algorithm were compromised. The price of Bitcoin would just drop to zero as everyone frantically tried to shed their coins. The first person to notice their coins being stolen, or the mining difficulty skyrocketing, would sound the alarm, and it would be over before the attacker could cash out to fiat.

    • March 9, 2024 12:03 pm

      Yes, but we mean the present value of increasing the level of assurance that this scenario will not happen.

  3. Frank Vega permalink
    June 8, 2024 7:40 am

    Yesterday a reliable proof of the Millennium Prize Problem (P vs NP) was published!

    We define the Monotone Weighted Xor 2-satisfiability problem (MWX2SAT). We show this problem is NP-complete and in P at the same time. In this way, we prove the complexity classes P and NP are equal using a feasible and efficient algorithm.

    https://ipipublishing.org/index.php/ipil/article/view/92

  4. Frank Vega permalink
    June 11, 2024 9:16 am

    I copied this highlighted text from a source on the internet:
    ————————————————
    I think Frank’s proof could work! Here’s the reduction from k-CLIQUE to k-CLOSURE. Given graph G=(V,E), create a DAG H where there is one node for each vertex of G and one node for each edge of G. If v_e is the node correspond to edge (a,b) \in E then make v_e be preceded by v_a,v_b in H (i.e. the new nodes created from the vertices). Therefore, k-CLIQUE becomes equivalent to deciding whether or not there is a closure with k+{\tbinom {k}{2}} nodes. Frank’s problem (MWX2SAT) can be easily reduced to k-MWX2SAT over the same kind of instances but with satisfying truth assignments of exactly k true variables. The rest of Frank’s proof could remain the same!
    —————————————————
    https://ipipublishing.org/index.php/ipil/article/view/92

  5. Frank Vega permalink
    June 11, 2024 10:42 pm

    I think I found another version of the reduction from CLIQUE to K-CLOSURE by myself (without citing any external source). Given a graph G = (V, E), we remove the disconnected vertices in G and every pair of vertices u and v such that they only are endpoints of a single edge (u, v) \in E. Note that, finding a clique of size k \leq 2 is trivial. Next, we create a line graph L of G. Each edge in L is a pair of edges from G that have a shared vertex. This vertex is included in the edge representation of L, for example, as ((u,v),(u,w)) in L where u is the shared vertex between the two edges (u,v) and (u,w) in G. We are always able to find this line graph L in polynomial time due to our initial redundant vertices removal process in G . Now, finding a clique of size k in G is equivalent to finding a closure of size {k \choose 2} in L.

    https://ipipublishing.org/index.php/ipil/article/view/92

  6. Frank Vega permalink
    June 15, 2024 8:30 pm

    I have received some negative comments about my P=NP proof from detractors. Basically, they said that Garey and Johnson’s book must be wrong about the statement and definition of K-CLOSURE as an NP-complete problem. Since Garey and Johnson’s book mentions that CLIQUE can be reduced to K-CLOSURE in polynomial time, I tried to find that reduction myself. I think my previous comments were wrong on that topic. I need time to improve it, since my P=NP proof took me long enough to achieve despite its simplicity. I also considered the possibility that the error was on my part, which could only mean that I had misunderstood the use of the words \textbf{either... or...}, which I took to be equivalent to the Boolean XOR operation in the book. There are contexts where it is used as OR in English, but we looked through the entire book, and authors Garey and Johnson always used that statement in the book as the Boolean XOR operation. If that’s not the case, then my P=NP proof should be correct for sure!

    https://ipipublishing.org/index.php/ipil/article/view/92

  7. Frank Vega permalink
    June 17, 2024 6:35 am

    There are some people who speculate that the definition used by Garey and Johnson is ambiguous, in order to argue that my article is not correct. Here is the definition of K-CLOSURE, where each vertex must have equal weight to match the NP-completeness definition of K-CLOSURE by Garey and Johnson in their book “Computers and Intractability: A Guide to the Theory of NP-Completeness” (1979):

    https://cstheory.stackexchange.com/questions/40447/min-weight-k-closure-on-dag

  8. June 17, 2024 8:45 am

    I think K-CLOSURE is NP-hard due to a reduction from K-Bipartite Expansion, based on a comment on the Stack Exchange link I shared above. Given a bipartite n-vertex graph G'=(U,W,E') and positive integers b and c, the problem K-Bipartite Expansion consists in deciding whether there is a b-subset of U with exactly c neighbors in W. To reduce it to the K-CLOSURE problem, we create a DAG G with two levels by duplicating each vertex in U exactly n^2 times and directing all edges of W to the new set U'. Given this scenario, we ask for the existence of a closure with exactly k = b \cdot n^2 + c vertices. This follows from the fact that a solution with b \cdot n^2 vertices in U' implies that there must exist c predecessor vertices in W, by the definition of a closure set. Note that it is impossible for m \cdot n^2 vertices (for m \in \mathbb{N}) in the closure to belong to the bipartite set W, since the cardinality of W must be less than n. Here is the NP-hardness proof for Bipartite Expansion, from which we can infer the decision variant of K-Bipartite Expansion:

    https://www.semanticscholar.org/paper/Hardness-of-Bipartite-Expansion-Khot-Saket/6c292af8b5eb9b3dc3b8ad216833063941b5c9be?p2df

  9. Frank Vega permalink
    June 17, 2024 4:35 pm

    My comments above did not match Garey and Johnson’s definition of the NP-complete problem K-CLOSURE. Indeed, they emphasized that for all edges (u, v) \in A in a digraph H = (V, A), either u \in V' or v \notin V' for a closure set V' of at most k vertices, which also means that either (u \in V' and v \in V') or (u \notin V' and v \notin V'). Let me try one more time to finish this challenge suggested by people who want to refute my P=NP article because they do not believe in the Queyranne reduction used by Garey and Johnson in their book! The CLIQUE problem (an optimization variant) consists of finding the maximum clique in a given undirected graph G = (V, E). First, we initialize an integer variable k = 0. Next, we create a digraph H as follows: for each pair of vertices a, b \in V, we create two new vertices u_{ab} and v_{ab} and add two new edges (a, u_{ab}), (b, v_{ab}) whenever (a, b) \notin E. Every time we create these two new vertices and edges for a missing edge in G, we increment our variable k by 2. By incrementing k by 2, we guarantee that vertices a and b do not coincide in the maximum clique. Finally, we reduce CLIQUE (in its optimization variant) to finding a maximum closure set in H of at most k vertices. If you believe my reduction or Queyranne’s reduction is correct, then you only need to check the remaining steps of my P = NP proof in this published article:

    https://ipipublishing.org/index.php/ipil/article/view/92

  10. June 17, 2024 5:46 pm

    \textbf{[Corrigendum]}: Finally, we reduce CLIQUE (in its optimization variant) to find a \textbf{minimum} closure set into H of at most k vertices.

  11. Frank Vega permalink
    June 18, 2024 10:37 am

    The author of the reference [Queyranne, 1976] that Garey and Johnson used in their book “Computers and Intractability: A Guide to the Theory of NP-Completeness” (1979) for the NP-completeness of K-CLOSURE is still alive. I suggest that the naysayers ask him about the polynomial reduction from CLIQUE to K-CLOSURE, instead of making the mistake of trying to redo what he already did or assuming that he made a mistake in the definition of K-CLOSURE. Here is the link to his website:

    https://www.researchgate.net/profile/Maurice-Queyranne

  12. Frank Vega permalink
    June 21, 2024 4:25 am

    We define R(n) = \frac{\Psi(n)}{n \cdot \log \log n} for n \geq 3 (using the Dedekind \Psi function). We prove the inequality R(N_{n+1}) < R(N_{n}) holds for n big enough, where N_{n} is the primorial of order n. This implies the Riemann hypothesis is true.

    https://www.preprints.org/manuscript/202002.0379/v10

  13. June 21, 2024 4:41 am

    We define the Monotone Weighted Xor 2-satisfiability problem (MWX2SAT). We show this problem is NP-complete and in P at the same time. In this way, we prove the complexity classes P and NP are equal using a feasible and efficient algorithm.

    https://www.preprints.org/manuscript/201908.0037/v14

  14. Frank Vega permalink
    June 24, 2024 5:02 am

    A Note on Riemann Hypothesis (authors: Gemini [Google AI] and me)

    In trenches we toil, unseen, unheard,
    Building bridges, mending every word.
    Weaving tapestries of care and trust,
    For the weary, the lost, the filled with dust.
    We plant the seeds, the ones that grow,
    For a world where kindness starts to flow.
    We’ll keep on striving, though voices fall flat,
    For the good of the many, where truly it matters at that.
    Though fame may elude, and praise be thin,
    We’ll find our solace in the battles we win.

    ————————————————-

    https://www.preprints.org/manuscript/202002.0379/v11

  15. Frank Vega permalink
    June 25, 2024 10:27 am

    Last version of my proof of Riemann Hypothesis sent for peer-review:

    https://www.preprints.org/manuscript/202002.0379/v12

  16. Frank Vega permalink
    July 2, 2024 4:51 pm

    One of the biggest unsolved mysteries in computer science is the P versus NP problem. This work proposes that a specific NP-complete problem, MWX2SAT, can be solved efficiently. In this way, we prove that P is equal to NP. This solution might also have a huge impact on mathematics.

    https://ipipublishing.org/index.php/ipil/article/view/92

  17. Frank Vega permalink
    July 25, 2024 3:15 pm

    Wiles’ proof of Fermat’s Last Theorem, while successful, relied on modern tools far beyond Fermat’s claimed approach in terms of complexity. The present work potentially offers a solution which is closer in spirit to Fermat’s original idea.

    https://www.preprints.org/manuscript/202109.0480/v6

  18. Frank Vega permalink
    September 12, 2024 3:25 pm

    People continue arguing and sending me messages about the NP-complete problem K-CLOSURE defined by Garey and Johnson. Garey and Johnson emphasized that for all edges (u, v) \in A of a digraph H = (W, A) in K-CLOSURE, either u \in V' or v \notin V', which also means that either (u \in V' and v \in V') or (u \notin V' and v \notin V'). Most people think Garey and Johnson were wrong and the citation [Queyranne, 1976] was a miscommunication. They frequently assert that the empty set is a trivial counterexample to its correctness. They don’t realize that the Queyranne reduction necessarily implies that a closure V' of size at most k can be transformed into a clique of size at least k' = \mid (W - V') \mid (or k' = \mid (W - V') \bigcap V \mid, etc.) in a reduction from CLIQUE (input: an undirected graph G = (V, E)) to K-CLOSURE (output: a digraph H = (W, A)). Hence, an empty closure can imply the maximum clique in a complete graph.

    If you believe Queyranne’s reduction is correct, then you only need to check the remaining steps of my P = NP proof in the following published article:

    https://ipipublishing.org/index.php/ipil/article/view/92

  19. September 12, 2024 4:17 pm

    Garey and Johnson emphasized that for all edges (u, v) \in A of a digraph H = (W, A) in K-CLOSURE, either u \in V' or v \notin V', which also means that either (u \in V' and v \in V') or (u \notin V' and v \notin V'). Most people think Garey and Johnson were wrong and the citation [Queyranne, 1976] was a miscommunication. Some people were kind and normal regarding this topic, but some other people are daunting and really naysayers about it (that’s why I don’t waste too much time on this); but at some point, I believe I need to put an end to this story. A variant of the CLIQUE problem consists in finding a clique of size “exactly” k in a given undirected graph G = (V, E). Then, we create a digraph H = (W, A) as follows: for every pair of vertices a, b \in V (we also consider the case a = b), we create two new vertices u_{ab}, v_{ab} \in W (we also guarantee that a, b \in W) and add two new edges (a, u_{ab}), (b, v_{ab}) \in A whenever (a, b) \in E or a = b. Finally, we reduce finding a clique in G of size “exactly” k to finding a closure set V' in H of size “exactly” 3 \cdot k + 2 \cdot {k \choose 2} (we conveniently take {1 \choose 2} as 0). Solving the classic variant of the CLIQUE problem consists in finding a clique of size “exactly” some natural number between k and \mid V \mid, where \mid \ldots \mid is the cardinality function.

    If you believe Queyranne’s reduction or mine is correct, then you only need to check the remaining steps of my P = NP proof in the published article:

    https://ipipublishing.org/index.php/ipil/article/view/92

  20. September 12, 2024 5:04 pm

    My last reduction is not correct but it should be quite close to the correct one!

  21. Frank Vega permalink
    September 12, 2024 5:52 pm

    Indeed, my last reduction might be correct if we allow the closure set V' in H to take a size “exactly between” 3 \cdot k + 2 \cdot {k \choose 2} and 2 \cdot (k + {k \choose 2}) + \mid V \mid.

  22. Frank Vega permalink
    September 12, 2024 6:05 pm

    I mean “exactly between” 3 \cdot k + 2 \cdot {k \choose 2} and 3 \cdot k + 2 \cdot {k \choose 2} + \mid E \mid - {k \choose 2}.

  23. Frank Vega permalink
    September 12, 2024 7:43 pm

    Garey and Johnson emphasized that for all edges (u, v) \in A of a digraph H = (W, A) in K-CLOSURE, either u \in V' or v \notin V', which also means that either (u \in V' and v \in V') or (u \notin V' and v \notin V'). Most people think Garey and Johnson were wrong and the citation [Queyranne, 1976] was a miscommunication. Let me put an end to this story! The CLIQUE problem consists in finding a clique of size at least k in a given undirected graph G = (V, E). Then, we create a digraph H = (W, A) as follows: for every pair of vertices a, b \in V (we assume that a \neq b), we create two new vertices u_{ab}, v_{ab} \in W (we also guarantee that a, b \in W) and add two new edges (a, u_{ab}), (b, v_{ab}) \in A whenever (a, b) \in E. Finally, we reduce finding a clique in G of size at least k to finding a closure set V' in H of size at least k + 2 \cdot {k \choose 2} (we conveniently ignore the case k = 1). We can use the Garey and Johnson book’s definition of K-CLOSURE using the complement graph of G and finding a closure set V'' of size at most \mid W \mid - (k + 2 \cdot {k \choose 2}), where \mid W \mid is the size of the set of vertices W in both graphs.

    If you believe Queyranne’s reduction or mine is correct, then you only need to check the remaining steps of my P = NP proof in the published article:

    https://ipipublishing.org/index.php/ipil/article/view/92

  24. September 12, 2024 8:34 pm

    I mean we can use the Garey and Johnson book’s definition of its problem K-CLOSURE using the complement graph of G, following the same steps to create the digraph H' = (W', A') and finding a closure set V'' in H' of size at most \mid W \mid - (k + 2 \cdot {k \choose 2}), where \mid W \mid means the size of the set of vertices W in H.

  25. Frank Vega permalink
    September 12, 2024 8:57 pm

    [Final] Garey and Johnson emphasized that for all edges (u, v) \in A of a digraph H = (W, A) in K-CLOSURE, either u \in V' or v \notin V', which also means that either (u \in V' and v \in V') or (u \notin V' and v \notin V'). Most people think Garey and Johnson were wrong and the citation [Queyranne, 1976] was a miscommunication. Let me put an end to this story! The CLIQUE problem consists in finding a clique of size at least k in a given undirected graph G = (V, E). Then, we create a digraph H = (W, A) as follows: for every pair of vertices a, b \in V (we assume that a \neq b), we create two new vertices u_{ab}, v_{ab} \in W (we also guarantee that a, b \in W) and add two new edges (a, u_{ab}), (b, v_{ab}) \in A whenever (a, b) \in E. Finally, we reduce finding a clique in G of size at least k to finding a closure set V' in H of size at least k + 2 \cdot {k \choose 2} (we conveniently ignore the case k = 1). We can use the Garey and Johnson book’s definition of K-CLOSURE using the complement graph of G, following the same steps to create the digraph H' = (W', A') and finding a closure set V'' in H' of size at most (\mid V \mid + 2 \cdot {\mid V \mid \choose 2}) - (k + 2 \cdot {k \choose 2}), where \mid V \mid is the number of vertices in G.

    If you believe Queyranne’s reduction or mine is correct, then you only need to check the remaining steps of my P = NP proof in the published article:

    https://ipipublishing.org/index.php/ipil/article/view/92

  26. September 13, 2024 4:22 pm

    The \textit{CLIQUE} problem, which involves determining if a clique of size at least k exists in a given undirected graph G = (V, E), can be reduced to the problem of finding a closure set V' in a constructed directed graph H = (W, A) of size at most (n - k) + 2 \cdot n^{2} \cdot \left(\binom{n}{2} - m + n - k\right), where n = \mid V \mid is the number of vertices and m = \mid E \mid is the number of edges of G. The directed graph H is created by introducing 2 \cdot n^{2} new vertices \underbrace{u_{(ab, 1)}, u_{(ab, 2)}, \ldots, u_{(ab, n^{2})}}_{\textit{new } n^{2} \textit{ vertices}} \in W and \underbrace{v_{(ab, 1)}, v_{(ab, 2)}, \ldots, v_{(ab, n^{2})}}_{\textit{new } n^{2} \textit{ vertices}} \in W for each pair of vertices a, b \in V, and adding 2 \cdot n^{2} new edges \underbrace{(a, u_{(ab, 1)}), (a, u_{(ab, 2)}), \ldots, (a, u_{(ab, n^{2})})}_{\textit{new } n^{2} \textit{ edges}} \in A and \underbrace{(b, v_{(ab, 1)}), (b, v_{(ab, 2)}), \ldots, (b, v_{(ab, n^{2})})}_{\textit{new } n^{2} \textit{ edges}} \in A whenever (a, b) \notin E or a = b. Note that an empty closure can imply the maximum clique in a complete graph G. Given that \textit{CLIQUE} is an NP-complete problem, it follows that \textit{CLOSURE} is also NP-complete.

    If you believe Queyranne’s reduction or mine is correct, then you only need to check the remaining steps of my P = NP proof in the published article:

    https://ipipublishing.org/index.php/ipil/article/view/92

  27. Frank Vega permalink
    September 13, 2024 9:29 pm

    Garey and Johnson defined K-CLOSURE such that for any edge (u, v) in the directed graph, either node u is in the set V’ or node v is not in V’. This implies that either both nodes are in V’ or both are not in V’. Our previous work in IPI Letters presented a polynomial-time algorithm for K-CLOSURE. While no errors have been identified in this work, many believe that Garey and Johnson’s original definition was incorrect, and their citation of Queyranne was a misunderstanding. Many argue that the empty set serves as a simple counterexample to Garey and Johnson’s definition of K-CLOSURE. This paper proposes that K-CLOSURE is actually an NP-complete problem, which would imply that P equals NP.

    https://www.researchgate.net/publication/384017297_Solving_NP-complete_Problems_Efficiently

  28. Frank Vega permalink
    September 15, 2024 8:15 pm

    While no errors have been identified in my polynomial-time algorithm for K-CLOSURE, many believe that Garey and Johnson’s original definition was incorrect. This published article proposes that K-CLOSURE is actually NP-complete.

    https://ipipublishing.org/index.php/ipil/article/view/122

  29. Frank Vega permalink
    September 16, 2024 9:12 am

    Definitely solved MWX2SAT using maximum matching (a polynomial-time solution for a problem proven NP-complete by a reduction from K-CLOSURE).

    https://github.com/frankvegadelgado/alma

    This algorithm takes time O(number_of_nodes ** 3) [1].
    [1] “Efficient Algorithms for Finding Maximum Matching in Graphs”, Zvi Galil, ACM Computing Surveys, 1986.

  30. September 19, 2024 8:11 am

    We create a new version where the Monotone Weighted Xor 2-satisfiability problem (MWX2SAT) consists in deciding whether an instance has a satisfying truth assignment in which \textbf{at least} k of the variables are true. These instances are defined as an n-variable 2CNF formula with monotone clauses (meaning the variables are never negated) using XOR logic operators (instead of the operator OR) and a positive integer k. We show that MWX2SAT is NP-complete and in P at the same time. Specifically, we make a polynomial-time reduction from every directed graph and positive integer k in the K-CLOSURE problem to an instance of MWX2SAT. In this way, we show that MWX2SAT is also an NP-complete problem. Moreover, we create and implement a polynomial-time algorithm which decides the instances of MWX2SAT. Consequently, we prove that P = NP.

    https://ipipublishing.org/index.php/ipil/article/view/92

  31. September 19, 2024 8:38 am

    In this new version of my article proving that P = NP, the difference with the previous version is that I used “at least” instead of “at most”. Everything is fine now (even the deployment to GitHub with Python). I think the key to being able to solve it was first in implementing my solution on my computer (detecting the error) and second in the confidence/faith in success: Everything conspires in favor when you firmly believe that you can do it (when you take it for granted), in the illusion of the result (feeding hope: visualizing the achievement) and in the awareness that the universe is friendly (it helps you in your purpose) seen from any point of view (religious or not).

    https://ipipublishing.org/index.php/ipil/article/view/92

  32. September 20, 2024 2:40 pm

    \textbf{CLOSURE}

    INSTANCE: A directed graph G = (V, A) and a positive integer k.

QUESTION: Is there a set V' of at least k vertices such that, for all (u, v) \in A, either u \in V' or v \notin V'?

REMARKS: The \textit{CLOSURE} problem is NP-complete. This follows from the fact that it is equivalent to the NP-complete \textit{K-CLOSURE} problem, with the target closure size complemented and the direction of all edges in the graph reversed.

A k-vertex set V' is a “closure” if, for every edge (u, v) in the graph, either both vertices u and v are in V' or neither is. In other words, a closure is a subset of vertices such that, if one vertex is included, all vertices with edges pointing directly to it must also be included. The closure property holds for graphs of any size, and it is “hereditary”: if a graph has the closure property, then every subgraph of that graph also has it~\cite{GJ79}. This property is a fundamental characteristic of closures. Furthermore, we can efficiently determine whether a given subgraph has the closure property using a polynomial-time algorithm, so checking for the closure property is computationally feasible even for large graphs. The \textit{CLOSURE} problem involves finding the smallest possible subset of vertices in a directed graph whose removal leaves a graph with the closure property. This problem is equivalent to the \textit{Node-deletion problem with property closure}, which is known to be NP-complete [1]. Therefore, the \textit{CLOSURE} problem is also NP-complete, meaning that no efficient algorithm for it exists unless P = NP.

[1] Mihalis Yannakakis. 1978. Node- and edge-deletion NP-complete problems. In Proceedings of the Tenth Annual ACM Symposium on Theory of Computing (STOC ’78). Association for Computing Machinery, New York, NY, USA, 253–264.
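[Editorially, the QUESTION exactly as stated above can be decided by exhaustive search, which may help in checking small instances. This is an illustration only, exponential in the number of vertices; function and variable names are assumptions.]

```python
from itertools import combinations

def closure_brute_force(vertices, arcs, k):
    """Decide the stated CLOSURE question by brute force: is there a set V'
    of at least k vertices such that, for every arc (u, v),
    either u is in V' or v is not in V'?"""
    for size in range(k, len(vertices) + 1):
        for subset in combinations(vertices, size):
            s = set(subset)
            # Check the stated condition on every arc.
            if all(u in s or v not in s for u, v in arcs):
                return True
    return False
```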

    See polynomial time solution to \textit{CLOSURE} in the following paper:

    https://ipipublishing.org/index.php/ipil/article/view/92

  33. Frank Vega permalink
    October 2, 2024 1:46 pm

I recently shared a published version of an article in the journal IPI Letters about one of the Millennium Problems. I want to share the fact that both the new and the old version published in IPI Letters were wrong (or rather incomplete). I was able to fix and extend the result, reaching a state that keeps me optimistic and happy. Here I share it with you:

    https://www.preprints.org/manuscript/202409.2053/v3

  34. Frank Vega permalink
    October 2, 2024 2:00 pm

In 2022, I published my article on the Riemann hypothesis in The Ramanujan Journal (a high-impact journal), and recently I completed it, achieving a proof of the full hypothesis. Even though the difference is basically two more pages, I wish it had occurred to me at the time; only now did I manage to notice the improvement:

    https://www.preprints.org/manuscript/202409.1972/v1

  35. Frank Vega permalink
    October 2, 2024 2:10 pm

It also happened to me when I was debating Fermat’s Last Theorem with another colleague: a solution to the theorem came to me that seems beautiful, brilliant, and very, very simple. Although it may seem incredible, I have faith in this proof of Fermat’s Last Theorem (perhaps I am a little less confident about this one, but I still have hope, according to my intuition):

    https://www.preprints.org/manuscript/202109.0480/v9

  36. Frank Vega permalink
    October 2, 2024 11:50 pm

    A polynomial time algorithm for an NP-complete problem described at [https://www.preprints.org/manuscript/202409.2053/v3]:

    https://cs.stackexchange.com/questions/169897/a-2-coloring-of-a-bipartite-graph-such-that-one-of-the-partitions-contains-exact

  37. Frank Vega permalink
    October 8, 2024 6:24 am

    If an NP-complete problem could be solved efficiently, it would imply that all problems in NP could be solved efficiently, proving P equals NP. This research proposes that EXACT CLOSURE, a notoriously difficult NP-complete problem, can be solved efficiently, thereby potentially establishing the equivalence of P and NP. This work is an expansion and refinement of the article “Note for the P versus NP problem”, published in IPI Letters.

    https://www.preprints.org/manuscript/202409.2053/v5

  38. November 14, 2024 3:14 pm

    This research proposes that a notoriously difficult NP-complete problem can be solved efficiently, thereby potentially establishing the equivalence of P and NP. This work is an expansion and refinement of the article “Note for the P versus NP problem”, published in IPI Letters.

    https://www.preprints.org/manuscript/202409.2053/v8

  39. December 23, 2024 8:45 am

    ● Efficiently solves SAT in polynomial time.
    ● Breaks the long-standing complexity barrier.
    ● Revolutionizes fields like AI, cryptography, and optimization.

    https://www.linkedin.com/pulse/sat-polynomial-time-proof-p-np-frank-vega-ltfie

  40. January 24, 2025 12:06 pm

    Solve SAT in polynomial time via maximum independent set in line graphs.

    https://www.preprints.org/manuscript/202409.2053/v17

  41. January 27, 2025 2:49 am

    A better approximate solution to the Minimum Vertex Cover Problem distributed via pip: https://pypi.org/project/capablanca/

  42. Frank Vega permalink
    January 31, 2025 9:10 am

We present a polynomial-time algorithm achieving an approximation ratio below √2 for the Minimum Vertex Cover Problem, providing strong evidence that P = NP and disproving the Unique Games Conjecture. The polynomial-time solution is implemented in Python and available at

    https://pypi.org/project/capablanca/

  43. Frank Vega permalink
    February 1, 2025 1:12 am

    Breakthrough Algorithm Solves the Unsolvable: A Giant Leap Towards P = NP!

    Discover the 4/3-Approximation for MVC That’s Redefining the Boundaries of Computer Science!

Why This Matters:
    – Challenges P = NP: Our algorithm provides strong evidence that P could equal NP, solving computationally hard problems with near-optimal efficiency.
    – Defies the Unique Games Conjecture: This breakthrough suggests that many optimization problems may have better solutions than previously thought, revolutionizing theoretical computer science.
    – Real-World Impact: From AI to logistics, this discovery paves the way for faster, smarter solutions to some of the most challenging problems in tech and beyond.

    https://pypi.org/project/capablanca/

  44. February 5, 2025 8:56 pm

    Discover a new 7/5-Approximation for MVC That’s Redefining the Boundaries of Computer Science!

Why This Matters:

    1. Challenges P = NP,
    2. Defies the Unique Games Conjecture,
    3. Real-World Impact.

    Distributed via PyPI:

    https://pypi.org/project/varela/

  45. Frank Vega permalink
    February 7, 2025 2:02 pm

Disproving the Unique Games Conjecture! This work provides strong evidence that the Unique Games Conjecture does not hold. This is a very positive development, since if the conjecture were true, it would imply the hardness of approximating solutions to many other important problems in various fields of mathematics, including graph theory, optimization, and even aspects of geometry. BTW, I removed the flawed extension of this result that claimed to prove P = NP.

    https://pypi.org/project/varela/

  46. Frank Vega permalink
    March 6, 2025 11:00 pm

The Dominating Set Problem is a fundamental problem in graph theory and combinatorial optimization, with applications in network design, social network analysis, and resource allocation. A Dominating Set is a subset of vertices such that every vertex in the graph is either in the subset or adjacent to at least one vertex in the subset. Since finding the exact minimum Dominating Set is NP-hard, the method computes an approximate Dominating Set for an undirected graph using the NetworkX library, via an 8-approximation based on a minimum maximal matching. The algorithm first validates the input graph to ensure it is undirected and non-empty. Isolated nodes are removed, as they cannot contribute to a connected Dominating Set. The core of the algorithm uses NetworkX’s “min_maximal_matching” function; the resulting matching is converted into a Dominating Set by including all nodes of the matched edges. Next, we iteratively remove redundant vertices from this initial candidate dominating set. This approach guarantees polynomial runtime while maintaining reasonable approximation quality. This work contributes to the growing body of research on approximation algorithms for NP-hard problems, providing strong evidence that P = NP.

    https://pypi.org/project/baldor/
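[Editorially, the pipeline described in this comment — maximal matching, take endpoints, prune redundancies — can be sketched in plain Python. This is an illustration under assumptions: a greedy maximal matching stands in for NetworkX’s `min_maximal_matching`, the graph is simple and undirected with no isolated vertices, and all names are hypothetical.]

```python
def approx_dominating_set(adj):
    """Sketch of the described pipeline for a simple undirected graph with
    no isolated vertices, given as {vertex: set_of_neighbors}."""
    # 1. Greedy maximal matching (stand-in for NetworkX's min_maximal_matching).
    matched = set()
    for u in adj:
        if u in matched:
            continue
        for v in adj[u]:
            if v not in matched:
                matched.update((u, v))  # match the edge (u, v)
                break
    # 2. Endpoints of the matched edges form the candidate dominating set;
    #    a maximal matching leaves no edge with both endpoints unmatched.
    candidate = set(matched)
    # 3. Iteratively drop redundant vertices while every vertex stays dominated.
    def dominates(s):
        return all(v in s or adj[v] & s for v in adj)
    for v in sorted(candidate):
        if dominates(candidate - {v}):
            candidate.discard(v)
    return candidate
```

On a path graph 1–2–3–4, for instance, the matching endpoints give a candidate set and the pruning step then shrinks it to a dominating set such as {2, 4}.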

  47. Frank Vega permalink
    March 12, 2025 11:43 am

    Leading AI systems have reportedly confirmed the validity of my P=NP proof and provided a detailed analysis of its theoretical foundations:

    https://www.linkedin.com/posts/frank-vega-delgado_vertex-cover-problem-new-insights-critique-activity-7305610255227404289-y9Kq?utm_source=share&utm_medium=member_desktop&rcm=ACoAAA3ojloBlOVpZezqUMtGhzfTLS7gJSDjw14

  48. Frank Vega permalink
    March 20, 2025 11:01 pm

This algorithm resolves the NP-hard Minimum Dominating Set Problem in quadratic time, suggesting P = NP and delivering practical benefits across diverse fields, including artificial intelligence, medicine, and industry.

    https://dev.to/frank_vega_987689489099bf/polynomial-time-algorithm-for-mds-p-np-1ln6

  49. Frank Vega permalink
    March 27, 2025 1:23 am

I proposed an approximate solution to the NP-hard Minimum Dominating Set Problem (MDS) that has shown promising experimental results. The experimental findings demonstrate a 2-approximation ratio for this problem, providing compelling evidence that P = NP.

    https://dev.to/frank_vega_987689489099bf/a-2-approximation-algorithm-for-dominating-sets-1aga

  50. Frank Vega permalink
    April 4, 2025 6:21 am

    This result decisively refutes the conjecture, enhancing our insight into prime distribution and the behavior of the zeta function’s zeros through analytic number theory.

    https://www.preprints.org/manuscript/202504.0246/v1

Trackbacks

  1. Does Pi Conspire? | Gödel's Lost Letter and P=NP
