Talk:Chaitin's constant
This article is rated C-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Prohibition of Certain Programming Languages
[edit]"The definition of a halting probability relies on the existence of a prefix-free universal computable function. Such a function, intuitively, represents a programming language with the property that no valid program can be obtained as a proper extension of another valid program."
Chaitin added this requirement when someone pointed out that Chaitin's "constant" a.k.a. "Halting Probability" could be greater than 1. However, adding this to reduce its size eliminates any programming language that I am aware of as a computer programmer. A computer program is a sequence of commands. So you can always add additional commands, i.e. another program, to extend a valid program.
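For what it's worth, "prefix-free" doesn't prohibit the command language itself; it only requires a self-delimiting syntax. A minimal sketch (Python; the encode/is_valid names are chosen for this illustration, not taken from any source):

    def encode(program_bits: str) -> str:
        # Self-delimiting code: a unary length header ('1' * n followed
        # by '0'), then the n program bits themselves.
        return '1' * len(program_bits) + '0' + program_bits

    def is_valid(codeword: str) -> bool:
        # A string is valid iff it is exactly one complete
        # header-plus-payload; any proper extension is therefore invalid.
        n = codeword.find('0')          # header ends at the first '0'
        return n != -1 and len(codeword) == 2 * n + 1

Under such an encoding, appending more commands to a valid codeword never yields another valid codeword, so "a sequence of commands can always be extended" stops applying to the encoded programs; the commands inside the payload are unrestricted.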
Invariance Theorem
[edit]Chaitin's Invariance Theorem, added to respond to the problem of programming languages with varying size for the same function, states that there is a finite bound to the difference in the sizes of the minimum programs to calculate a given number in any two programming languages. However, consider two programming languages, one of whose programs differ from the other programming language only in that every character is repeated to occur twice in a row. For example, it might be designed for unreliable communications over the internet and the second occurrence of every character serves as a check bit (byte) for verification.
If you follow through the proof of the Invariance Theorem with this in mind, you see that the problem is where they assume that the lengths of the representations of a numeric literal in two programming languages are the same, which is not the case.
Calculation of the Chaitin constant...?
Calculation of the start of a Chaitin Ω
Calude, Dinneen, and Shu have calculated the first 64 bits of a Chaitin Ω_U for a particular machine. These are, in binary notation:
0.0000001000000100000110001000011010001111110010111011101000010000...
or in decimal notation:
0.0078749969978123844...
Note that the notion of whether a specific digit of Ω can be correctly calculated is different from the notion of computability: any specific digit of any number is computable since for any integer there is a program printing it.
- Do you mean that there is something wrong with the "Relationship to the halting problem" section stating that only the first few digits can be calculated? Ω_U is uncomputable, so there is no finite list of instructions such that, if you give a computer that list of instructions, it will output an arbitrarily large number of digits of Ω_U provided that the computer is big enough; yet the first 64 digits of it have been calculated. Does that mean a brain is better than a computer, though there may have been partial use of a computer with constant human intervention to calculate those digits of Ω_U even though a computer can't do it alone? Blackbombchu (talk) 16:29, 19 August 2013 (UTC)
What is this? Who are Calude, Dinneen, and Shu? What particular machine is being talked about? What references are there to back up this statement?
And more importantly, this little bit:
Note that the notion of whether a specific digit of Ω can be correctly calculated is different from the notion of computability: any specific digit of any number is computable since for any integer there is a program printing it.
is wrong. If all of the digits of Ω could be correctly calculated, then the number would be computable, as that function would then compute all of Ω. The reason given, that "for any integer there is a program printing it", doesn't make sense and is unclear. I think we should remove this from the article.
192.88.124.204 (talk) 22:11, 12 April 2008 (UTC)
- That particular bit comes from the Calude, Dinneen and Shu paper "Computing a Glimpse of Randomness" and I agree that it is incorrect as worded. A "specific digit of any number" is not something to which computability applies; computability applies to the entire number up through that particular digit. Calude et al. were reinforcing the point that their particular method of calculating a portion of Chaitin's constant doesn't violate the basic tenets of the uncomputability of the constant itself. Regardless, it's incorrect and not necessary to the article. 76.99.236.216 (talk) 04:54, 16 May 2008 (UTC)
- I took a look at the paper and it seems they were able to calculate an upper bound on Chaitin Ω, because it is possible to decide the halting problem for very short programs (<= 84 bits). The language they use has only primitive operations like shift left (= add 1) and shift right (= subtract 1). Even a conditional jump (=a,0,L1) requires at least 6 characters (= 42 bits). I estimated that Goldbach's conjecture takes about 1200 bits.
- The first 21 bits are 0.000000100000010000011. There is exactly one program of 7 bits (HALT). There is exactly one program of 14 bits (DUMP, HALT). There are exactly three programs of 21 bits (DUMP, DUMP, HALT), (JUMP 1, HALT), (dunno, HALT). There's nothing random in the first 64 bits.
- I'll delete this section, but leave the reference to the paper. Tlepp (talk) 01:00, 21 June 2008 (UTC)
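A quick sanity check of those counts (each halting program of length L contributes 2^{-L} to Ω):

    2^{-7} + 2^{-14} + 3 \cdot 2^{-21} = 0.000000100000010000011_2

which is exactly the 21-bit prefix quoted above.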
The Omega defined in their paper is kind of defective. Each bit of data read by a program contributes 7 bits to its size (see end of section 5 of their paper), making it impossible to prove their Omega incompressible. In fact, the more data one of these program reads, the more compressible it is according to proper definitions like Chaitin's. John Tromp. —Preceding undated comment was added at 00:20, 10 February 2009 (UTC).
comments by exa
It is a normal and transcendental number which can be defined but cannot be computed
I really doubt that this fact has been proven yet. Both the normality and the transcendence are quite likely, but I have never seen a proof of either. I would be surprised if even the irrationality of the number has been proven, which I doubt again due to the incomputability of the number.
(The constant N heavily depends on the encoding choices and does not reflect the complexity of the axiomatic system in any way.)
Where did you get this terrible idea?... Do you know the invariance theorem? Do you know Theorem C in AIT, which shows the relation between the axiomatic complexity of a FAS and the number of digits that can be proven? exa 16 Sep, 2004
Proof of normality: Omega is ML-random. Every ML-random real is normal. QED.
Proof of transcendence: Omega is non-computable. Every algebraic number is computable. QED.
- ... then there exists a constant N such that no digit of Ω after the N-th can be proven to be one or zero within that system.
Is this true? It seems like this states that there are at most a finite number of Turing machines that can be proven to either halt or not halt (given some system of enumeration, etc.). Chas zzz brown 02:06 Feb 5, 2003 (UTC)
Ooops! In the words of Emily Litella, "never mind". I misread that the nth digit of Ω was 1 iff Turing machine n halts. Chas zzz brown 09:00 Feb 5, 2003 (UTC)
- The proof of the transcendence of Chaitin's number for any particular encoding could rely on the following indirect argument. If Chaitin's number is algebraic, then it is computable. But we can prove it isn't computable as this leads to a contradiction with the undecidability of the halting problem. However, does this argument hold water if Chaitin's number is algebraic, but it is absolutely impossible to determine which one it is? At 3am, I'm not sure... Elroch 02:11, 27 March 2006 (UTC)
I have rewritten the introduction to this article and added some headings to divide it into sections. I am not sure if this belongs to algorithmic information theory so perhaps someone can put it in the right category. MathMartin 17:44, 20 Jul 2004 (UTC)
What does the statement "Ω is uncompressible" mean? Is "compressible number" defined anywhere in Wikipedia? - Mike Rosoft 16:55, 8 Dec 2004 (UTC)
Oops! I thought the link I had supplied to Data compression would clear it up. It means that it is impossible to supply a program plus an infinite "compressed" sequence such that for any n, the program can return the first f(n) bits of Ω using the first n bits of the compressed sequence, and for any c there exists n_0 such that for n > n_0, f(n) > n + c. I couldn't find this definition anywhere in the data compression category. Lev 20:15, 9 Dec 2004 (UTC)
I have included a summary of the definition for incompressible number. Rattle 09:00, 1 Jun 2005 (UTC)
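For reference, the usual precise form of the incompressibility claim (the standard theorem, not a quotation from the article), where K denotes prefix-free Kolmogorov complexity and ω_1⋯ω_n the first n bits of Ω:

    \exists c \; \forall n : \quad K(\omega_1 \omega_2 \cdots \omega_n) \ge n - c

i.e. no description of the first n bits can be more than a constant shorter than n.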
Certainly there may be an algorithm that computes Chaitin's constant. There is just no way to prove that any algorithm does so.
- Nope. If there were such an algorithm, it could be used to solve the halting problem (even though we would not be able to prove it). But there is no algorithm, provable or not, that solves it. – Lev 10:35, 10 August 2005 (UTC)
- To clarify this a bit: There is no algorithm that computes ALL bits of Chaitin's constant. Otherwise, it could be used to solve the halting problem: we would try to compute Omega by running all possible programs in parallel (we would have to launch them one by one, of course). Then we would sum up 2^(-length) for every program that terminates. Now one of two things would happen: either the program in question would terminate, or the sum would eventually exceed Omega - 2^(-L), where L is the length of the program in question, proving by contradiction that the answer is no.
- Now if there were an algorithm for computing Omega but we didn't know which it was, the above meta-algorithm would give us an algorithm for solving the halting problem, but we still wouldn't know what it was. But there is no solution to the halting problem, not even one we can't find.
- Certainly, for any finite N there is an algorithm that computes the first N bits of Omega, but we don't always know which it is. And there might (for all I know) be an algorithm that computes an infinite number of bits, but still leaves an infinite number of bits unknown.
- Lev 12:33, 21 December 2005 (UTC)
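A sketch of Lev's meta-algorithm in Python. Everything here is hypothetical scaffolding: omega_bits is exactly the uncomputable information, and programs()/step() are stubs (enumerate all valid programs; run one for t steps and report whether it has halted):

    import itertools
    from fractions import Fraction

    def halts(p, omega_bits, programs, step):
        # Decide whether program p halts, given the first n >= len(p)
        # bits of Omega, so that target <= Omega < target + 2^-n.
        n = len(omega_bits)
        target = Fraction(int(omega_bits, 2), 2 ** n)
        lower = Fraction(0)           # computable lower bound on Omega
        halted = set()
        for t in itertools.count(1):  # dovetail: t steps of the first t programs
            for q in itertools.islice(programs(), t):
                if q not in halted and step(q, t):
                    halted.add(q)
                    lower += Fraction(1, 2 ** len(q))
            if lower >= target:
                # The halting mass still unaccounted for is
                # Omega - lower < 2^-n, so no program of length <= n
                # can halt later.
                return p in halted

The loop terminates because the lower bound converges to Omega from below, and Omega (being irrational) strictly exceeds the dyadic rational target.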
- I think Lev's last remark is wrong, or at least it contradicts the last sentence of this article: from some N on, the digits cannot be computed algorithmically, i.e. the algorithm does not exist. From my experience with automatic theorem provers, I'd like to add a metaphor: automated theorem proving can prove quite a lot algorithmically and help automate proving that certain algorithms do or do not halt, but at some (more complicated) point it needs human assistance. There are even more complicated problems than the halting problem, so the later digits of Chaitin's constant will be even harder to compute.
- Lev is right. For any finite N, you can just print out a string which happens to be the relevant bits of omega :). Beyond a certain N, you cannot prove that you are right, as the article states.
- One candidate for an algorithm that produces an infinite number of bits is the program which starts simulating every Turing machine (cascaded, so that you're not trying to do an infinite number of things at once), and records those that halt. This machine can produce a sequence that limits to omega from below. IIRC, all of the `1's it produces are correct digits of omega. To prove that such a machine produces infinitely many bits of omega, it only remains to be shown that it keeps on producing 1s (I don't know if that can be done; it might depend on the model of computation/distribution of programs). -- pde 00:16, 14 March 2006 (UTC)
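The machine pde describes, as the same dovetailing loop as in the sketch further up but exposing the successive approximations (again with hypothetical programs()/step() stubs):

    import itertools
    from fractions import Fraction

    def omega_lower_bounds(programs, step):
        # Never halts: yields an increasing sequence of rationals
        # converging to Omega from below.
        lower, halted = Fraction(0), set()
        for t in itertools.count(1):
            for q in itertools.islice(programs(), t):
                if q not in halted and step(q, t):
                    halted.add(q)
                    lower += Fraction(1, 2 ** len(q))  # q contributes 2^-len(q)
            yield lower

Once the true prefix of Omega has stabilized in these approximations it never changes again, but no algorithm can announce when that has happened.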
On the probability interpretation
[edit]The following two sentences don't agree; the first one doesn't make sense.
It can then be shown that Ω represents the probability that a randomly produced bit string will encode a halting program. This means that if you start flipping coins, always recording a head as a one and a tail as a zero, the probability is Ω that you will eventually reach the encoding of a syntactically correct halting program.
It is true that Omega is the probability that a randomly chosen infinite sequence of 0s and 1s begins with a string that is a halting program. This makes sense, because there is a probability measure (Lebesgue measure) on the set of such infinite sequences, and the set of sequences that begin with a string encoding a halting program is a measurable set. But there cannot be a randomly chosen bit string because there is no nontrivial probability measure on the set of finite bit strings. CMummert 04:37, 16 June 2006 (UTC)
- Yeah, it should be the probability that a randomly chosen infinite bit string begins with a proper program which stops after finite time.
I'm not sure whether it will make things clearer... But the way it is now is correct, with some explanation. After all, you can assume that you randomly choose bits of a string until (if ever) you reach a valid program string. In such a random experiment, the probability of obtaining a program that stops is exactly Ω. Macieksk 09:28, 24 June 2006 (UTC)
Merge
[edit]The article "Omega (computer science)" is very similar to this one. I suggest that they be merged.P.L.A.R. 21:41, 29 June 2006 (UTC)
- See my comments on the other article's discussion page. This page should stay, the other one should go, since this is not the only omega in use in computer science, and perhaps not even the most common. There doesn't seem to be much info on the current Omega (computer science) page that should be brought over here. --Dantheox 23:40, 29 June 2006 (UTC)
Omega (computer science) now redirects to this page. P.L.A.R. 21:00, 30 June 2006 (UTC)
Properties of Chaitin's constant
Shouldn't some reference be made to the fact that one can use Chaitin's constant to determine the answer to any possible phrasable question? Jenigmat429 03:16, 13 February 2007 (UTC)
- That "fact" is not true. Chaitin's omega is at level Sigma_1 of the arithmetical hierarchy. Complete knowledge of it cannot even determine the truth of Pi_2 sentences of Peano arithmetic, much less more complicated questions. Chaitin's constant is Turing equivalent to the Halting problem, and nothing more. CMummert · talk 03:24, 13 February 2007 (UTC)
- Ok, more accurately, couldn't you use it to determine the provability of a proposition L in a formal system F? Because, if there's a sufficiently small program that will halt if (and only if) L is provable in F, and if you know enough of Chaitin's constant, one could use Chaitin's omega to determine L's provability in F. Or maybe I'm interpreting Chaitin's omega wrong. Jenigmat429 14:16, 13 February 2007 (UTC)
- Yes, you can use a Chaitin constant to do that. But this is because these constants are equivalent to the Halting problem, not in particular because of their randomness. CMummert · talk 16:56, 13 February 2007 (UTC)
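A sketch of the proof-search program behind that reduction (Python; is_valid_proof is a hypothetical stub standing in for F's proof checker):

    import itertools

    def provability_search(statement, is_valid_proof):
        # Halts iff `statement` is provable in F: enumerate every finite
        # string and check whether it encodes a valid proof of `statement`.
        for length in itertools.count(1):
            for bits in itertools.product('01', repeat=length):
                proof = ''.join(bits)
                if is_valid_proof(proof, statement):
                    return proof    # found a proof, so halt

Knowing enough bits of Ω to decide whether this program halts is then the same as deciding L's provability in F.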
rewrite
I expanded and rewrote the article this morning. Feel free to hack away at it, I'm done for a while. CMummert · talk 17:48, 13 February 2007 (UTC)
more rewrite
Currently the article says: "Each halting probability is a normal and transcendental real number which is definable but not computable, which means that there is no algorithm that enumerates its digits." But there is a non-halting algorithm that does enumerate its digits - Omega can be approximated from below. Eventually the first n digits will be correct. I'll correct this: "which means that there is no halting algorithm that enumerates its digits." Algorithms 20:25, 23 June 2007 (UTC)
Super Omega
One should mention the "Super Omega" of Jürgen Schmidhuber: http://www.idsia.ch/~juergen/kolmogorov.html . The first n bits of Gregory Chaitin's constant Omega are incompressible in the sense that we cannot compute them by a halting algorithm with fewer than n-O(1) bits. However, consider the short but never halting algorithm which lists and runs all possible programs; whenever one of them halts its probability gets added to the output (initialized to zero). After finite time the first n bits of the output will not change any more (it does not matter that this time itself is not computable by a halting program). So there is a short non-halting algorithm whose output converges (after finite time) on the first n bits of Omega, for any n. In other words, the enumerable first n bits of Omega are highly compressible in the sense that they are limit-computable by a very short algorithm; they are not random with respect to the set of enumerating algorithms. Schmidhuber constructed a Super Omega which is still limit-computable, but more random than Omega, as one cannot significantly compress it with an enumerating algorithm. Algorithms 15:43, 8 July 2007 (UTC)
---
If there exists a non-halting program that will converge to the first N-bits of Omega in a finite amount of time, it can be written in the form of an infinite loop:
    omega = 0
    while (TRUE)
        /* LOOP BODY */
    end while
It is very easy to convert this infinite loop into a finite loop:
    constant ITERATIONS = ___
    omega = 0
    a = 0
    while (a < ITERATIONS)
        /* LOOP BODY */
        a++
    end while
The value ITERATIONS can be set to any arbitrarily high value, allowing this loop to execute for an arbitrarily long time. Since the non-halting version of the program can converge to any number N of bits you want in a finite amount of time, so can this one, assuming ITERATIONS is a large enough integer.
Suppose the non-halting version can compute X digits in Y iterations. The halting version, if ITERATIONS is set equal to Y at the beginning of the program, will also produce the same X digits as the non-halting version. However, ITERATIONS can be arbitrarily large: you could set ITERATIONS to larger and larger values and compute more and more digits of Omega, as many as you want. But doesn't this seem to violate the property of incompressibility? —Preceding unsigned comment added by 64.162.134.186 (talk) 20:47, 8 October 2007 (UTC)
It would seem that way at first, but it turns out that the bits you determine via that non-halting algorithm may be inaccurate until it completes (which of course never happens) - see the reference "Computing a Glimpse of Randomness" in the article. —Preceding unsigned comment added by 76.99.236.216 (talk) 04:46, 16 May 2008 (UTC)
Yes, the problem is obviously the "carry problem". Add a 1 in the googolplexth position of a binary fraction and it may flip all the lower-order bits, even the first one. Indeed, Solovay has shown that ZFC cannot determine even a single bit of certain Omegas. They come from certain specifically designed universal machines, usually called Solovay machines. Moreover, Calude has shown that every computably enumerable random real is the halting probability of some Solovay machine. So the value of even a single bit of such random numbers (after an initial sequence of 0s) is neither computable nor decidable in standard (ZFC) formal mathematics. —Preceding unsigned comment added by 38.104.189.46 (talk) 16:21, 14 March 2011 (UTC)
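A tiny worked instance of the carry problem, in binary:

    0.0111111_2 + 2^{-7} = 0.1000000_2

so confirming one more halting program, however long, can retroactively flip every bit of the approximation computed so far, including the first.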
Also, to further clarify: while there is a simple program that, run for some number k of steps, will output the first n bits of Omega, this does not imply that Omega can be compressed. Yes, by "dovetailing" one gets better and better approximations of Omega (this is the definition of a c.e. random real). But the number k would have to be part of the program (as a bound on a for loop, say, or a parameter to the primitive recursive STEP function). After k steps the values of the first n bits of Omega would have stabilized. But the incompressibility of Omega implies that the length of k in bits would be proportional to n itself! So there is no contradiction, or way around it. And the super-Omegas have the property that they are not even computably enumerable, i.e., they can't be approximated from below in this way. —Preceding unsigned comment added by 207.237.121.70 (talk) 10:09, 15 March 2011 (UTC)
Errata
In the Background section, shouldn't it say "f = λ x.F(p,x)" instead of "f(x) = λ x.F(p,x)"?
Clarification in lead section requested
For the benefit of people like me, for whom the technical details later in the article are a bit tough to digest, a couple of clarifications in the lead section would be helpful.
- In the computer science subfield of algorithmic information theory a Chaitin constant or halting probability is a real number that informally represents the probability that a randomly-chosen program will halt. These numbers are formed from a construction due to Gregory Chaitin.
How is it possible, even in principle, to randomly choose a program, given that "a program" can be of unlimited length? Do you not need to choose randomly from all N-bit programs (or maybe all programs of <= N bits)?
- Although there are infinitely many halting probabilities, it is common to use the letter Ω to refer to them as if there were only one. Because Ω depends on the program encoding used, it is sometimes called Chaitin's construction instead of Chaitin's constant when not referring to any specific encoding.
I don't understand what this is saying. Is it saying:
- There are infinitely many individual halting probabilities because there are infinitely many programs?
- There are infinitely many halting probabilities for a randomly chosen program, one for each of infinitely many encoding schemes?
- There are infinitely many halting probabilities for a randomly chosen program, one for each of infinitely many program lengths?
- Something else?
86.133.243.133 (talk) 03:52, 11 November 2008 (UTC)
There are infinitely many halting probabilities, because there are infinitely many different programming languages. Each has its own halting probability.
You select a random program by flipping a balanced coin forever. This produces an infinite bit string. Since the programs are self-delimiting, if some initial sequence of this string is a valid program, that is the one selected. The probability of a program is then just the probability that you get that program in this way, namely 1/2^n, where n is the length of the program. Requiring that programs be "self-delimiting", as they are in the real world, was the key insight that led to this beautiful theory. Kolmogorov made the mistake of not counting "blank" characters on his Turing machine tapes as information, but they clearly provide information, namely where input (a program) begins and ends. —Preceding unsigned comment added by 207.237.121.70 (talk) 10:21, 15 March 2011 (UTC)
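The selection procedure just described, as a sketch (Python; is_valid_program is a hypothetical stub for the self-delimiting syntax check):

    import random

    def sample_program(is_valid_program, max_flips=10 ** 6):
        # Flip a fair coin, appending bits until some prefix of the
        # resulting string is a valid program; prefix-freeness guarantees
        # at most one prefix ever qualifies. A program of length n is
        # selected with probability 2^-n, so the probability that the
        # selected program halts is Omega.
        bits = ''
        for _ in range(max_flips):   # finite stand-in for 'flip forever'
            bits += random.choice('01')
            if is_valid_program(bits):
                return bits
        return None                  # cutoff reached without a valid prefix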
More Errata
The page contains several inaccuracies that I hope to address in the near future. For starters, I have fixed an error in the Goldbach conjecture argument: running all <= N-bit programs doesn't suffice, since their joint halting probability is not guaranteed to match Ω in the first N bits. The formulation 'on the other hand' is particularly misleading, since the two cases mentioned are not mutually exclusive.
A bigger problem is the definition of universality in terms of machines with two inputs. This fails to guarantee algorithmic randomness of the halting probability. For instance, an F that only accepts programs consisting of 2^i zeroes could still be universal, but would obviously lead to a very non-random halting probability.
I changed the definition of universality. A proof of Omega's randomness, now possible, should be added next. John Tromp
Use of Chaitin's constant in proving unsolved problems in number theory
The section 'Use of Chaitin's constant in proving unsolved problems in number theory' makes no sense to me. How is that different from saying that a program that does that iteration will eventually say if the conjecture is false? So, how many digits of Omega do we need to know to decide whether the Goldbach conjecture is true or false? Chaitin himself thought that Fermat's Last Theorem was "true with no reason", before the proof in the 90's. Did he say also that Omega can help with that problem? —Preceding unsigned comment added by 76.193.219.199 (talk) 21:17, 9 July 2009 (UTC)
- I agree. That section is simply the observation that the halting problem can settle any Pi_1 sentence in number theory. While true, it has very little to do with the constant. —Preceding unsigned comment added by 130.195.86.40 (talk) 04:05, 16 October 2009 (UTC)
- Yes; this is just a consequence of the binary expansion of the constant being Turing equivalent to the Halting problem, nothing more. That section should be reworked so that it doesn't imply this would be a practical method of solution. — Carl (CBM · talk) 12:43, 16 October 2009 (UTC)
- Per WP:BB I have partly rewritten that section and renamed it to Relationship between the knowledge of Chaitin's constant and the halting problem. The first paragraph contains, now in generalized form, the fact that one could use knowledge of the constant to solve the halting problem. The second paragraph mentions the relationship between the number theory problems and the halting problem while dismissing the possibility of solving these problems this way. Hope that helps. X127 (talk) 18:40, 22 June 2011 (UTC)
- Yes. This can be supported (I do believe . . .) by Chaitin's book MetaMath!. Take a look at the last sentence, though; it seems to be misconstructed. But this is minor. It's a pleasure to encounter an editor who is coherent and who knows their stuff. (Please register as an official wikipedian. We need all the help we can get.) Bill Wvbailey (talk) 21:59, 22 June 2011 (UTC)
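For concreteness, the kind of program the section is about: one that halts iff Goldbach's conjecture is false, so a halting oracle (equivalently, enough bits of Ω) would settle the conjecture in principle, though not as a practical method. A self-contained sketch in Python:

    def is_prime(k: int) -> bool:
        return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))

    def goldbach_search() -> int:
        # Halts iff Goldbach's conjecture is false: look for an even
        # number > 2 that is not the sum of two primes.
        n = 4
        while True:
            if not any(is_prime(p) and is_prime(n - p) for p in range(2, n)):
                return n     # counterexample found: the program halts
            n += 2           # conjecture holds up to n: keep searching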
Proving Riemann Hypothesis
Don't you need peer review to claim that you have a method for proving Goldbach's Conjecture and the Riemann Hypothesis? I mean really, a hypothetical proof is not a proof and a verification by imagination alone is no guarantee in such a case. You are still left with the need for a leap of faith based upon something far less reliable than the character of God. —Preceding unsigned comment added by 99.156.205.151 (talk • contribs) 17:56, 12 September 2010
Possible unproven statement in the article
The Properties section states that all Chaitin numbers are normal. How is it that Chaitin constants have been proven normal using their randomness, and yet we can't use the apparently random nature of π to prove that π is normal, or the apparently random nature of the set of prime numbers to prove the twin prime conjecture? π is simpler to define than any of the Chaitin numbers, and yet it's harder to prove normality for that number. Did any of the references in this article prove the normality of Chaitin numbers, or state their normality while sourcing another source that stated the same thing? I'm sure that any Chaitin number can be trivially proven normal from being computationally random, but have they really been proven to be computationally random, which is a much harder property to prove than uncomputability? Blackbombchu (talk) 03:14, 16 August 2013 (UTC)
- This bothers me. There are a number of similar "numbers" for which it can be proved that many individual bits are non-computable, but that the number is normal. "This" number doesn't appear to be "computationally random" in the sense that I would mean it; in fact, with little effort, one could replace Ω with (d + Ω)/2^n, where d is any nonnegative integer less than 2^n. — Arthur Rubin (talk) 20:15, 14 January 2014 (UTC)
Getting the digits of Omega ?
Good evening,
Am I right: given F, there is no algorithm taking an integer n as argument and returning the n-th digit of Ω (or, equivalently, an infinite-time looping algorithm writing all its digits in order), but there is, for each n, an algorithm giving the n-th digit? 86.196.217.126 (talk) 17:57, 12 October 2013 (UTC)
- Well, since the nth digit is one of a finite number of possibilities, there is an algorithm producing that number. There isn't a computable function from n to the nth algorithm, though, as that would compute the number. Perhaps this might be noted in the article, but I don't really see the point. — Arthur Rubin (talk) 20:23, 14 January 2014 (UTC)
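To make Arthur Rubin's point concrete (assuming the binary expansion): for any fixed n, one of these two trivial programs prints the n-th digit of Ω correctly; what is non-computable is the map from n to the correct choice (Python):

    def prints_zero():
        print("0")   # correct whenever the n-th digit of Omega is 0

    def prints_one():
        print("1")   # correct whenever the n-th digit of Omega is 1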
Clarification for the general reader
The section 'Relationship to the halting problem' does not explicitly state that if p is a member of the domain of F, this is equivalent to saying that p halts. This may be confusing for the mathematically-literate general reader. Would it be possible to add a little more explanation to that section?
My apologies if I have just misunderstood the article - I am a general reader who has not touched computability theory for decades! Booleanmattock (talk) 16:39, 15 March 2014 (UTC)
The value
Every computer has its own Chaitin constant; does there exist a computer whose Chaitin's constant Ω is exactly 0.03? If not, why not? — Preceding unsigned comment added by 140.115.189.24 (talk) 06:16, 11 August 2015 (UTC)
- Nope. See intro: all values of Ω are transcendental. 0.03 is rational. ~ CZeke (talk) 14:08, 29 November 2015 (UTC)
Super Omega section formatting issue
At Chaitin's constant § Super Omega, the last sentence doesn't seem to make sense (though the subject is way beyond my decades-dusty engineering math). Is it missing a word or two? Also, the parenthetical:
- 0``` = 0^{(3)}
seems like it should be something like:
- O‴ = O^(3)
(3 backticks → triple-prime, zero → letter "O", "^" → superscript) or is it O(3) or ... —[AlanM1(talk)]— 09:49, 14 August 2018 (UTC)
Definition not compatible with algorithmic randomness
I don't see how the definition would exclude usage of a prefix-free universal computable function which, e.g., outputs the empty string for inputs of one computably recognized form and simulates the i-th Turing machine (in some computable enumeration) for inputs of another. The Chaitin constant for this would not be algorithmically random, as we would already know all the even bits. I think the missing part is 'additively optimal universal partial recursive prefix function' here, as taken from this book. The proof for algorithmic randomness goes like this: if you know ω_n, the first n digits of Ω, then you know the halting property of every F-program of length at most n. Call the smallest string not expressible by such a program s. Since you can compute s from ω_n, you expect the complexity of s to differ from n only by a fixed constant. But to make sure this argument translates into the actual F-complexities, you want F to be additively optimal.
Why "informally speaking"?
[edit]The first sentence of the article is the following:
"In the computer science subfield of algorithmic information theory, a Chaitin constant (Chaitin omega number) or halting probability is a real number that, informally speaking, represents the probability that a randomly constructed program will halt."
But the section Interpretation as a probability appears to make it clear that the Chaitin omega in fact is a probability.
Am I mistaken about this? If it really is a probability, then it should not be described as representing a probability only "informally speaking". 2601:200:C000:1A0:68A1:236F:AC0B:9D14 (talk) 20:47, 26 August 2022 (UTC)
"Chaitin`s number" listed at Redirects for discussion
An editor has identified a potential problem with the redirect Chaitin`s number and has thus listed it for discussion. This discussion will occur at Wikipedia:Redirects for discussion/Log/2022 October 15#Chaitin`s number until a consensus is reached, and readers of this page are welcome to contribute to the discussion. 1234qwer1234qwer4 15:29, 15 October 2022 (UTC)