In our codebase, we have some extremely powerful Busy Beaver techniques for accelerating Turing Machine simulation. But they all depend upon effective Tape Compression: if we cannot compress the tape, we cannot hope to gain much acceleration at all. And the bane of our run-length encoding tape compression is Counters.
As far as I know, the name Counter was coined for these machines by Allen Brady in his 1983 paper where he proved BB(4) = 107. These are TMs which simulate "Binary Counters" by repeatedly "adding one" to a binary representation of an integer stored on the tape. These TMs take an exponentially increasing amount of time to expand the tape with each additional digit. bbchallenge.org calls them "Exponential Counters".
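The "adding one" behavior can be sketched in plain code (my own illustration of the arithmetic, not a piece of our simulator):

```python
def increment(bits):
    """Add one to a binary number stored as a little-endian list of bits,
    mimicking the carry propagation a binary-counter TM performs on its tape."""
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0  # the carry turns each trailing 1 into a 0
        i += 1
    if i == len(bits):
        bits.append(1)  # counter hit 2^m - 1: the "tape" must grow a digit
    else:
        bits[i] = 1
    return bits

counter = [0]
for n in range(8):
    increment(counter)
# After 8 increments the counter reads [0, 0, 0, 1] (little-endian 8).
# It only grew a digit at increments 2, 4 and 8: expansions get
# exponentially rarer, so each one takes exponentially longer to reach.
```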
Consider this machine (TM #11,004,366 on bbchallenge.org):
If we look at this machine only when it is in state B, reading a 0 off the right side of the tape, we see:
| Step | Tape | Binary Counter Value |
|------|------|----------------------|
As advertised, this TM increments a binary counter, and the time between successive tape expansions (the bolded step numbers) grows exponentially.
This tape cannot be compressed at all times because, over the course of time, it will contain every finite binary sequence, including super-non-compressible ones like:

101001000100001 … whatcha going to do with that? Certainly not compress it using run-length encoding. And if you can't compress it using run-length encoding, my simulator cannot prove rules and cannot accelerate its simulation.
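To make the incompressibility concrete, here is a toy run-length encoder (my own illustration; the simulator's actual encoding may differ in detail):

```python
def rle(tape):
    """Run-length encode a string of tape symbols into (symbol, count) pairs."""
    runs = []
    for sym in tape:
        if runs and runs[-1][0] == sym:
            runs[-1] = (sym, runs[-1][1] + 1)
        else:
            runs.append((sym, 1))
    return runs

print(rle("101001000100001"))
# [('1', 1), ('0', 1), ('1', 1), ('0', 2), ('1', 1), ('0', 3),
#  ('1', 1), ('0', 4), ('1', 1)]
# 9 runs for 15 symbols: every run of 1s has length 1 and the 0-runs
# all differ, so there is almost no repetition to exploit.
```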
Consider the following rule:
0^n 1 00 B> 0 -> 1^(n+1) 00 B> 0
The base case is obvious.
0^0 1 00 B> 0 -> 1^1 00 B> 0 in 0 steps (the left and right sides are the same tape).
Now, for the inductive case: let us assume that P(k) is true:

0^k 1 00 B> 0 -> 1^(k+1) 00 B> 0 in f(k) steps.

Then:
| Step #            | Left tape | State | Right tape | Via |
|-------------------|-----------|-------|------------|-----|
| f(k) + 0          | …         | …     | …          | …   |
| f(k) + 1          | …         | …     | …          | …   |
| f(k) + 2          | …         | …     | …          | …   |
| f(k) + 3          | …         | …     | …          | …   |
| f(k) + 4          | …         | …     | …          | …   |
| f(k) + 5          | …         | …     | …          | …   |
| f(k) + k + 6      | …         | …     | …          | …   |
| f(k) + k + 7      | …         | …     | …          | …   |
| f(k) + 2k + 10    | …         | …     | …          | …   |
| f(k) + 2k + 11    | …         | …     | …          | …   |
| f(k) + 2k + 12    | …         | …     | …          | …   |
| f(k) + 2k + 13    | …         | …     | …          | …   |
| f(k) + 2k + 14    | …         | …     | …          | …   |
| f(k) + 2k + 15    | …         | …     | …          | …   |
| f(k) + 2k + 16    | …         | …     | …          | …   |
| f(k) + 2k + 18    | …         | …     | …          | …   |
| 2 f(k) + 2k + 18  | …         | …     | …          | …   |
Thus we have proven

0^(k+1) 1 00 B> 0 -> 1^(k+2) 00 B> 0 in f(k+1) = 2 f(k) + 2k + 18 steps.
Thus, for any n, P(n) is true:

0^n 1 00 B> 0 -> 1^(n+1) 00 B> 0

and so this TM will obviously never halt, since it will write an unbounded number of 1s to the tape.
To get the exact value of the steps function f(n), we need to solve the recurrence relation:

f(0) = 0
f(n+1) = 2 f(n) + 2n + 18
Let g(n) = f(n) + 2n, then:
g(0) = f(0) + 0 = 0
g(n+1) = f(n+1) + 2(n+1) = 2 f(n) + 2n + 18 + 2n + 2 = 2 (g(n) - 2n) + 4n + 20 = 2 g(n) + 20
Let h(n) = g(n) + 20, then:
h(0) = 20
h(n+1) = g(n+1) + 20 = 2 g(n) + 20 + 20 = 2 (h(n) - 20) + 40 = 2 h(n)
h(n) is easy to solve: h(n) = 20 · 2^n, so

f(n) = g(n) - 2n = h(n) - 2n - 20 = 20 · 2^n - 2n - 20
which matches the step counts observed in the simulation above.
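As a sanity check, the closed form can be verified against the recurrence directly (a quick Python check of the algebra above):

```python
def f_rec(n):
    """Unroll the recurrence f(0) = 0, f(k+1) = 2 f(k) + 2k + 18."""
    f = 0
    for k in range(n):
        f = 2 * f + 2 * k + 18
    return f

def f_closed(n):
    """Closed form: f(n) = 20 * 2^n - 2n - 20."""
    return 20 * 2**n - 2 * n - 20

assert all(f_rec(n) == f_closed(n) for n in range(20))
print([f_closed(n) for n in range(5)])  # [0, 18, 56, 134, 292]
```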
Inductive Rules vs. Meta Rules
In our current simulator, we are able to prove:
- Basic Rules: rules whose proofs consist entirely of base (or chain) steps; or
- Meta Rules (aka Recursive Rules): rules whose proofs also include previously proven rule steps.
This proof requires a more powerful type: Inductive Rules, i.e., rules whose proofs may also include rule steps for smaller versions of the very rule currently being proven.
Here I showed a hand-written proof for accelerating a counter using induction. I believe that this style of inductive proof generalizes to work for most variations of Counters. But can we automate the detection and proving of these sorts of rules (like we do for Christmas Tree rules)? I think it should be possible, but I'm not sure about the details. This is future work: how do you detect a counter, guess the rule, and know how long to simulate in order to prove the rule inside your system?
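One possible starting point for the detection half, purely speculative on my part: watch the step counts at which the tape expands and check whether they grow by a roughly constant factor, as f(n) above does. A sketch:

```python
def looks_like_exponential_counter(expansion_steps, tol=0.25):
    """Heuristic sketch (not an implemented algorithm): flag a TM as a
    Counter candidate if the step counts at which its tape expands grow
    by a roughly constant ratio > 1. This only suggests guessing a rule;
    it proves nothing by itself."""
    ratios = [b / a for a, b in zip(expansion_steps, expansion_steps[1:])]
    if len(ratios) < 3:
        return False  # not enough evidence
    avg = sum(ratios) / len(ratios)
    return avg > 1.5 and all(abs(r - avg) <= tol * avg for r in ratios)

# Expansion steps from the closed form above (skipping the first two
# start-up values, whose ratios have not yet settled toward 2):
steps = [20 * 2**n - 2 * n - 20 for n in range(2, 8)]
print(looks_like_exponential_counter(steps))  # True
```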
In order to prove BB(4), Brady did automate Counter proving 40 years ago. Like his Christmas Tree detection, he built a sort of "cellular decomposition" of the tape into <1> blocks and then proved by induction that any TM whose tape can be decomposed this way will never halt.
The downside of these sorts of strict decompositions is that there turned out to be many Counter-like TMs that didn't fit into a single decomposition, and so several decompositions needed to be hand-made to deal with the different variations. A similar issue came up for Christmas Tree cellular decomposition, leading to the need to define new decompositions for things like Leaning Trees, Trees with Shadow, etc.
Tape compression, chain steps and our current automated proof systems have generally been powerful enough to prove all variations of Christmas Trees without the need for variation-specific code. In my opinion, this has made proving Christmas Trees much simpler and less error-prone by producing a single general system which happens to work for all of these variations.
I would like to see something similar happen with automating Counter proving, and my hope is that it is possible by following the pattern I laid out above: running an Inductive Proof that allows applying smaller versions of the rule being proven. Ideally this would be general enough to cover all variations of Counters without the need for each variation to have its own decomposition and proof.
I think that my idea to prove Counters via induction is not original. In fact, reading Allen Brady's 1983 paper today, I noticed that he mentions that he has proven counters via induction. However, I am not aware of any previous description of Inductive Rules as akin to Meta Rules like I describe above.