How Fast Can You Count?

After the Short Tutorial On Big Numbers, let ω be the first transfinite ordinal and let [a->b->c] denote Conway's chained arrow notation. Suppose that through some ingenious counting trick, the author can "count" to ω in finite time. So let's "count" as follows:

  1. 1,2,3,...,ω,
  2.           ω*2,
  3.           ω*3,
  4.           ...
  5.           ω*ω=ω^2=[ω->2->1],
  6.               ω^3,
  7.               ...,
  8.               ω^ω=ω^^2=[ω->2->2],
  9.                   ω^^3,
  10.                   ...,
  11.                   ω^^ω =ω^^^2=[ω->2->3],
  12.                         ω^^^3,
  13.                         ...,
  14.                               ω^^^ω=ω^^^^2=[ω->2->4],
  15.                               ω^^^^3,
  16.                               ...,
  17.                               ω^^^^ω=[ω->2->5],
  18.                                      [ω->2->6],
  19.                                      ...,
  20.                                      [ω->2->ω],
  21.                                      ...,
  22.                                      [ω->3->ω],
  23.                                      ...,
  24.                                      [ω->ω->ω] (a chain of 3 ω's),
  25.                                                (a chain of 4 ω's),
  26.                                                ...,
  27.                                                (a chain of ω ω's),
  28.                                                ...,
  29.                                                etc.
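For finite arguments, the chained-arrow notation used in the rows above can be computed directly from Conway's reduction rules. A minimal Python sketch (the function name `chain` is my own):

```python
def chain(*xs):
    """Evaluate a Conway chained-arrow expression with finite entries."""
    xs = list(xs)
    if len(xs) == 1:          # a single entry is itself
        return xs[0]
    if len(xs) == 2:          # a -> b  =  a^b
        return xs[0] ** xs[1]
    *X, p, q = xs
    if q == 1:                # X -> p -> 1  =  X -> p
        return chain(*X, p)
    if p == 1:                # X -> 1 -> q  =  X
        return chain(*X)
    # X -> p -> q  =  X -> (X -> (p-1) -> q) -> (q-1)
    return chain(*X, chain(*X, p - 1, q), q - 1)

print(chain(2, 3, 2))  # 2^^3 = 16
print(chain(3, 2, 2))  # 3^^2 = 27
```

Only tiny arguments terminate in practice; the recursion explodes for anything larger, which is precisely the point of the notation.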

Can you estimate the author's counting rate?

For any counting rate R that you give the author, as long as the given rate is representable by some consistent ordinal notation, the author can show that the above counting method counts faster than your rate.

For example: Suppose that your counting rate R is exponential. Then R=O(e^ω). In the above counting method, when the author reaches row 8 (ω^ω), he is already counting faster than you.

Suppose that your counting rate R is super-exponential. Then R=O(e^e^ω). When the author reaches row 9 (ω^^3), he is already counting faster than you.

Suppose that your counting rate R is ω-tetrational. Then R=O(e^^ω). When the author reaches row 11 (ω^^ω), he is already counting faster than you.
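These comparisons can be mimicked with finite stand-ins for ω: a tetrational counter overtakes any fixed exponential rate within a few steps. A small Python illustration (the helper name `tetrate` is mine):

```python
import math

def tetrate(a, n):
    """a^^n: a right-associated tower of n copies of a."""
    r = 1
    for _ in range(n):
        r = a ** r
    return r

# Exponential rate e^n versus the tetrational counter 2^^n at finite n:
for n in range(1, 5):
    print(n, round(math.e ** n, 1), tetrate(2, n))
# Already at n = 4, 2^^4 = 65536 dwarfs e^4 ≈ 54.6.
```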

Now generalize: Suppose that your counting rate is R. Then there exists a row n such that the ordinal fn(ω) on that row satisfies fn(ω)>R. Further, for all rows m≥n the following holds: fm(ω)≥fn(ω)>R. Hence the above method eventually counts faster than any given counting rate R.

This is what it means for someone to be able to count infinitely fast.

On the other hand, for each ordinal fn(ω) on row n there exists an m such that fn(ω)≤2^2^...(m 2's)...^2^ω. Therefore the (counting) operators of the above list, and of each step, resolve essentially into the following set of operators:

ST={ω,2^ω,2^2^ω,2^2^2^ω,...}.

We define the power P(T) of the operator T to be the number of 2's in the corresponding upper-bounding tower of 2's in T. In other words,

P(T∈ST)=m, with fn(ω)≤2^2^...(m 2's)...^2^ω.
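The bounding towers of 2's and the power P(T) have direct finite analogues that can be computed; a sketch (the helper names `tower` and `power_P` are mine):

```python
def tower(m):
    """A tower of m 2's: 2^2^...^2, right-associated (tower(0) = 1)."""
    r = 1
    for _ in range(m):
        r = 2 ** r
    return r

def power_P(x):
    """Finite analogue of P(T): the least m with x <= tower(m)."""
    m = 0
    while tower(m) < x:
        m += 1
    return m

print(tower(3), tower(4))        # 16 65536
print(power_P(16), power_P(17))  # 3 4
```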

We then note that if m<n then P(Tm∘Tn)=m+n, and the set ST becomes order-isomorphic to the set of natural numbers. Hence:

Ord(ST)=ω and

|ST|=ℵ0.
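The composition rule P(Tm∘Tn)=m+n has a direct finite analogue: stacking a tower of m 2's on top of a tower of n 2's gives a tower of m+n 2's. A small sketch (the helper `T` is mine):

```python
def T(m, x):
    """Operator T_m: a tower of m 2's capped by x, i.e. 2^2^...^2^x."""
    r = x
    for _ in range(m):
        r = 2 ** r
    return r

# Composition adds tower heights: T_m(T_n(x)) == T_{m+n}(x).
print(T(1, T(2, 1)) == T(3, 1))  # True
print(T(2, T(1, 2)) == T(3, 2))  # True
```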

There is a natural correspondence between the operator T with P(T)=m and the corresponding binary operator Bm=1/2^m. Therefore the following set is order-isomorphic to the set ST:

SB={1/2^k: k∈ N∪{0}}.

In other words: Counting infinitely fast is equivalent to diagonalizing at a constant rate over any given counting rate. That is much faster than simply counting to infinity, which can plainly be done by diagonalizing over SB.
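The correspondence can be checked with exact rationals: under Bm=1/2^m the additive rule P(Tm∘Tn)=m+n becomes multiplication, and diagonalizing over SB is an ordinary geometric series. A sketch:

```python
from fractions import Fraction

def B(m):
    """Binary operator B_m = 1/2^m corresponding to P(T) = m."""
    return Fraction(1, 2 ** m)

# Composition P(Tm ∘ Tn) = m+n maps to multiplication of the B's:
print(B(3) * B(4) == B(7))  # True

# Diagonalizing over SB at a constant rate: partial sums of the
# geometric series 1 + 1/2 + 1/4 + ... approach 2.
print(sum(B(k) for k in range(50)))  # equals 2 - 1/2^49
```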


The (binary) operators T8 and T9 give the resolution on the 397-pixel-wide figure above, which is the resolution limit of the author's computer screen (1 px) using a binary decomposition, since 397/2^7-397/2^8~1.55 and 397/2^8-397/2^9~0.7754.
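The arithmetic of this remark can be checked directly (the widths are exact binary fractions, so the floating-point results are exact):

```python
# Successive binary subdivisions of the 397-pixel-wide figure:
w = 397
print(w / 2**7 - w / 2**8)  # 1.55078125
print(w / 2**8 - w / 2**9)  # 0.775390625
```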