This is just a thought, but I wonder if there is a mathematical connection between this game and something like the binary representation of irrational (or maybe transcendental) numbers.
The article is also notable for its consistency in spelling "lose" as "loose".
One could interpret the outcome of the game as a number by reading ○ as the digit 0 and ● as the digit 1. For fun, we could also say that if there is a repeating subsequence at the end (someone lost), then it is repeated infinitely. I suggest this because any won game ends with a substring repeated three times, so we might as well repeat it to infinity!
Say the example game, ● ● ○ ○ ● ● ○ ○ ● ● ○ ● ● ○ ● ○ ○ ● ● ○ ○ ● ● ○ ○ ●, would be 0.11001100110110100110011001, or perhaps 0.11001100110110(1001) where the parentheses express infinite repetition. If we choose the first, it is 53701017/67108864, and the second would be 65553/81920.
In any case, a won game would be a rational number, while a game which goes on forever would be an irrational! One could then wonder which irrational numbers are represented by such games.
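For anyone who wants to check the arithmetic, here is a quick Python sketch using the standard `fractions` module (variable names are mine):

```python
from fractions import Fraction

# Digits of the example game: ● = 1, ○ = 0.
bits = "11001100110110100110011001"

# Reading the digits as a terminating binary fraction 0.b1 b2 ... b26:
terminating = Fraction(int(bits, 2), 2 ** len(bits))

# Reading it as 0.11001100110110(1001): the prefix, plus the repeating
# block worth block/(2^len(block) - 1), all shifted past the prefix.
prefix, block = "11001100110110", "1001"
repeating = (Fraction(int(prefix, 2)) +
             Fraction(int(block, 2), 2 ** len(block) - 1)) / 2 ** len(prefix)

print(terminating)  # 53701017/67108864
print(repeating)    # 65553/81920
```

Both values are a little above 0.8, as the leading digits 0.11 suggest.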
Just adding to the OP: for some (myself included) it is quite painful to see "loose" in place of "lose". This should be fixed ASAP, as it distracts the reader from the content. Lose = opposite of win; loose = opposite of tight.
Some irrational numbers are not valid games. For instance, I am sure the binary expansion of, say, π/4 has 000 as a substring somewhere. But that could never happen in a game, because the rules disallow repeating a substring three times in a row (and 000 is just 0 repeated three times).
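The "no block three times in a row" condition is mechanical to check; a small Python sketch (the function name is mine). Note it also backs the rational/irrational remark above: any eventually periodic expansion repeats its period block three times, so an unending game cannot be eventually periodic.

```python
def has_cube(s):
    """Return True if s contains some block repeated three times in a row."""
    n = len(s)
    for length in range(1, n // 3 + 1):
        for i in range(n - 3 * length + 1):
            block = s[i:i + length]
            if s[i:i + 3 * length] == block * 3:
                return True
    return False

print(has_cube("000"))        # True: "0" three times in a row
print(has_cube("101101101"))  # True: "101" three times in a row
print(has_cube("11010011"))   # False: no block repeated thrice
```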
Ah, you are right! At least wrt the "naive" mapping of irrationals to binary representation. My gut still tells me it's bijective, but the mapping has to be a bit more involved :D
> The file is a mere 512 bytes, and unpacks to a 26kb file, which again unpacks to 3Mb.
My brain hurts when thinking about that. How can 512 bytes be enough to store ~3 million bytes? I know that compression is basically about finding patterns, and this sequence should be very compressible.
If it was a file filled entirely with one character, the compression could simply be to write a file saying "this character copied 3 million times", which is less than 512 bytes.
This is not exactly what happens here, but many compression algorithms work by recognising that certain substrings are very common, and giving them a "shorter code". In this game, there are some quite long such strings, giving a good compression rate. Furthermore, because of the recursive nature, such patterns can be found again after the common substrings are replaced by shorter codes, because these codes again form patterns with repeated substrings. This goes on until almost nothing is left but a bit of metadata and an "ur-pattern".
Compression is fascinating in many ways. For instance, since there are a fixed number of files of a certain size and some bigger files are made smaller, some smaller files must be made bigger by the compression! Of course, this could be as simple as attaching a header or flag which says "I could not compress this. Here is the old file verbatim." But that is still a bit longer than the original!
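The effect is easy to reproduce in miniature with any off-the-shelf compressor; a Python sketch, with `b"ab"` repeated as a stand-in for the actual game record:

```python
import zlib

# Three megabytes of a highly repetitive stand-in for the game record.
data = b"ab" * 1_500_000

packed = zlib.compress(data, 9)
print(len(data), "->", len(packed))  # shrinks by a factor of hundreds

# Round trip: compression is lossless, nothing is thrown away.
assert zlib.decompress(packed) == data
```

The 512-byte file in the article plays the trick twice (512 bytes → 26 kB → 3 MB): the first compressor's output was evidently still repetitive enough to be compressed again.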
You can think about the compressed size of some file as approximating the amount of information (in the Shannon sense[1]) there is in the file. A perfect compression would reduce the file to exactly the size of the amount of information it contains.
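"Information in the Shannon sense" can be estimated directly. A minimal Python sketch computing the zeroth-order entropy of a string (zeroth-order is my simplification: it counts symbol frequencies only and ignores correlations between symbols, which is exactly what a good compressor additionally exploits):

```python
from collections import Counter
from math import log2

def entropy_bits_per_symbol(s):
    """Zeroth-order Shannon entropy of s, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy_bits_per_symbol("0011"))      # 1.0 (a fair coin per digit)
print(entropy_bits_per_symbol("00000001"))  # ~0.54 (mostly predictable)
```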
This is the idea behind Kolmogorov complexity[0], that the complexity of a string (finite or infinite) can be measured, relative to a programming language, as the length of the shortest program which produces it.
Precisely computing the Kolmogorov complexity of a given string could be very difficult, though. In general, it is uncomputable because we cannot decide if a given program will output a given string.
It's also always relative to some specific programming language; it's not an intrinsic property of strings. (You can, of course, convert to another programming language in which the program is shorter, but then you incur the (constant) cost of the transpiler from language 1 to language 2.)
And by adding a constant to the specification of your programming language, any sequence can have complexity 1! (But of course not every sequence can have its own constant.)
Yes, but the connection is not clear from what I wrote. I keep intending to make a little post about the connection, highlighting some Haskell code I wrote, but I haven't polished it yet. I want to update it to use my applicative logic library[0][1].
The short version: By representing the game coalgebraically, one can use modal logic to solve it (find a winning strategy for player 1) by brute force.
The old code looks like:
    -- Modal operators
    e = modal any' (Coalg possible)
    a = modal all' (Coalg possible)

    -- Test for winning strategy within a limited number of moves.
    winning :: Integer -> Player -> State -> Bool
    winning 0 _ _ = False
    winning n p s = wonAlready || e (a (winning (n-1) p)) s
      where wonAlready = winner s == Just p
We can translate this into more standard modal logic: let ◇ be "there is a move the player could make such that", and □ be "for any move the player makes". We define the existence of a winning strategy for player 1 inductively:
S(0) = ⊥
S(n+1) = W(p₁) ∨ ◇ □ S(n)
Intuitively, you have a winning strategy if you won already, or if there is a move you can make such that, whatever move the opponent makes, you still have a winning strategy.
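The same ◇□ recursion is easy to state in any language. Here is a self-contained Python sketch, using a toy subtraction game (take 1 or 2 stones, whoever takes the last stone wins) as a stand-in, since the original code depends on the author's library:

```python
def moves(state):
    """All states reachable in one move: take 1 or 2 stones."""
    pile, mover = state
    nxt = "P2" if mover == "P1" else "P1"
    return [(pile - k, nxt) for k in (1, 2) if k <= pile]

def winner(state):
    """Whoever took the last stone (the previous mover) has won."""
    pile, mover = state
    if pile == 0:
        return "P2" if mover == "P1" else "P1"
    return None

def winning(n, p, state):
    """S(n): p has won already, or p can move so that every
    opponent reply leaves p winning within n-1 further rounds."""
    if n == 0:
        return False                  # S(0) = bottom
    if winner(state) == p:            # W(p)
        return True
    return any(all(winning(n - 1, p, s2) for s2 in moves(s1))  # box
               for s1 in moves(state))                         # diamond

# Piles that are multiples of 3 are lost for the player to move:
print([winning(10, "P1", (k, "P1")) for k in range(1, 7)])
# [True, True, False, True, True, False]
```

Note the empty-`moves` case does the right thing automatically: if our move ends the game in our favour, the □ over the opponent's (nonexistent) replies is vacuously true.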
The above just tests for existence, but a slight modification, based on the same logical expression, gets us an actual winning strategy:
    -- Game data
    data Player = P1 | P2
      deriving (Eq, Show)

    data Color = Red | Blue
      deriving (Eq, Show)

    data State = S [Color]
      deriving (Eq, Show)

    data Strategy = Won
                  | Force Color Strategy
                  | Choice (Color -> Strategy)

    -- Modal operators
    e' = modal sany' (Coalg possible')
    a' = modal sall' (Coalg possible')

    -- Test for winning strategy within a limited number of moves.
    winning' :: Integer -> Player -> State -> Maybe Strategy
    winning' 0 _ _ = Nothing
    winning' n p s = wonAlready <|> e' (a' (winning' (n-1) p)) s
      where wonAlready = if winner s == Just p then Just Won else Nothing
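In the same spirit, the `Maybe Strategy` version can be sketched in Python for the toy subtraction game (take 1 or 2 stones, last stone wins). `Won` / `Force` / `Choice` become an atom, a chosen move, and a response table; all names here are mine, not the author's library:

```python
def moves(state):
    """All states reachable in one move: take 1 or 2 stones."""
    pile, mover = state
    nxt = "P2" if mover == "P1" else "P1"
    return [(pile - k, nxt) for k in (1, 2) if k <= pile]

def winner(state):
    """Whoever took the last stone (the previous mover) has won."""
    pile, mover = state
    if pile == 0:
        return "P2" if mover == "P1" else "P1"
    return None

def strategy(n, p, state):
    """Like winning, but produce the strategy itself: either the atom
    "won", or a pair (my_move, {opponent_reply: sub_strategy, ...})."""
    if n == 0:
        return None
    if winner(state) == p:
        return "won"
    for s1 in moves(state):          # diamond: try each of our moves
        responses = {}
        for s2 in moves(s1):         # box: we must answer every reply
            sub = strategy(n - 1, p, s2)
            if sub is None:
                break                # no answer to this reply; next move
            responses[s2] = sub
        else:
            return (s1, responses)   # every reply is answered
    return None

print(strategy(10, "P1", (3, "P1")))     # None: pile 3 is lost
print(strategy(10, "P1", (4, "P1"))[0])  # ('P1' moves to the lost pile 3)
```

`None` plays the role of `Nothing`, and the first-success loop plays the role of `<|>`.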
Very interesting, once I realized that the longer sequences were wrapping on my mobile. At first it appeared that the “top” line was player 1 and the “bottom” line was player 2.
https://gowers.wordpress.com/2013/08/23/determinacy-of-borel...
Without having a proof ready at hand, I am quite sure that the `generate` sequence from my post represents a transcendental number.
Seems like there should be a bijection there, no?
[1] https://arxiv.org/pdf/1612.09316
[0]: https://en.wikipedia.org/wiki/Kolmogorov_complexity
(For the initiated, I should mention that it is related to Thue–Morse sequences.)
[0]: https://hakon.gylterud.net/programming/applicative-logic.htm...
[1]: https://github.com/typeterrorist/applicative-logic/blob/moda... – this is a branch with the modal logic operators defined.