Reversing the sense of the title: analysis of coinduction by function spaces.

## Some isomorphisms

Simple cases ("linear").
    nu x. a * x    = a ^ (mu x. 1 + x)
    nu x. a * x^2  = a ^ (mu x. 1 + 2*x)
    nu x. a * x^b  = a ^ (mu x. 1 + b*x)
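A minimal executable witness (my sketch, not in the notes) of the third line: an infinite b-branching tree of a's is in bijection with functions from finite paths, i.e. lists of b's. The names Tree, Path, t_app, t_lam are my own.

```haskell
-- nu x. a * x^b  on the left, mu x. 1 + b*x  on the right.
data Tree b a = Node a (b -> Tree b a)   -- nu x. a * x^b
type Path b   = [b]                      -- mu x. 1 + b*x

t_app :: Tree b a -> Path b -> a
t_app (Node a _) []     = a
t_app (Node _ f) (b:bs) = t_app (f b) bs

t_lam :: (Path b -> a) -> Tree b a
t_lam g = Node (g []) (\b -> t_lam (g . (b:)))
```

For instance, `t_app (t_lam sum) [1,2,3]` recovers `sum [1,2,3]`; laziness keeps the infinite tree from being forced.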

Lift 'em.
    (nu f. \x-> x*f x) a      = a ^ (mu x. 1 + x)
    (nu f. \x-> x*(f x)^2) a  = a ^ (mu x. 1 + 2*x)
    (nu f. \x-> x*(f x)^b) a  = a ^ (mu x. 1 + b*x)

Thorsten's trick.
    (nu f. \x-> x*f^2 x) a = a ^ (mu x. 1 + x^2)

The generalised Thorsten trick ("polynomial").
    (nu f. \x-> x * f x * f^2 x) a                    = a ^ (mu x. 1 + x + x^2)
    (nu f. \x-> x * f x * ... * f^n x) a              = a ^ (mu x. 1 + x + ... + x^n)
    (nu f. \x-> x^c0 * (f x)^c1 * ... * (f^n x)^cn) a = a ^ (mu x. c0 + c1*x + ... + cn*x^n)

Using exponential notation on the left:

    a ^ (nu f. 1 + f + ... + f^n) = a ^ (mu x. 1 + x + ... + x^n)

(There is too much overloading of "^" to do it with coefficient types.)

•  -- (nu f. \x-> x * f (f x)) y = y ^ (mu x. 1 + x^2)

    data A x = A_case x (A (A x))
    data B   = L | Sp B B

    a_app :: A x -> B -> x       -- both typings ...
    a_lam :: (B -> x) -> A x     -- ... are necessary

    a_app (A_case x stuff) =
      \t-> case t of L        -> x
                     Sp tl tr -> a_app (a_app stuff tl) tr
    a_lam f = A_case (f L) (a_lam (\tl-> a_lam (\tr-> f (Sp tl tr))))

There seems to be another isomorphism:

    a_app' (A_case x stuff) =
      \t -> case t of L        -> x
                      Sp tl tr -> a_app' (a_app' stuff tr) tl
    a_lam' f = A_case (f L) (a_lam' (\tl-> a_lam' (\tr-> f (Sp tr tl))))

An unexpected way to update a function.

    update :: A x -> B -> x -> A x
    update (A_case x f) t y =
      case t of L        -> A_case y f
                Sp tl tr -> A_case x (update f tl (update (a_app f tl) tr y))
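A quick sanity check (my code, collecting the definitions into one self-contained piece): after `update g t y`, position t reads back y and other positions are unchanged.

```haskell
data A x = A_case x (A (A x))
data B   = L | Sp B B

a_app :: A x -> B -> x
a_app (A_case x stuff) t = case t of
  L        -> x
  Sp tl tr -> a_app (a_app stuff tl) tr

a_lam :: (B -> x) -> A x
a_lam f = A_case (f L) (a_lam (\tl -> a_lam (\tr -> f (Sp tl tr))))

update :: A x -> B -> x -> A x
update (A_case x f) t y = case t of
  L        -> A_case y f
  Sp tl tr -> A_case x (update f tl (update (a_app f tl) tr y))
```

E.g. `a_app (update (a_lam (const 0)) (Sp L L) 7)` yields 7 at `Sp L L` and 0 elsewhere.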

•  -- (nu f. \x-> x * f x * f (f x) * f (f (f x))) y = y ^ (mu x. 1 + x + x^2 + x^3)

    data G x = G_case x (G x) (G (G x)) (G (G (G x)))
    data H   = H0 | H1 H | H2 H H | H3 H H H

    g_app :: G x -> H -> x
    g_lam :: (H -> x) -> G x

    g_app (G_case x gx ggx gggx) =
      \h-> case h of
             H0          -> x
             H1 h'       -> g_app gx h'
             H2 hl hr    -> g_app (g_app ggx hl) hr
             H3 hl hm hr -> g_app (g_app (g_app gggx hl) hm) hr
    g_lam f = G_case
      (f H0)
      (g_lam (\h-> f (H1 h)))
      (g_lam (\hl-> g_lam (\hr-> f (H2 hl hr))))
      (g_lam (\hl-> g_lam (\hm-> g_lam (\hr-> f (H3 hl hm hr)))))

•  -- (nu f. \x-> x * (f (f x))^a) y = y ^ (mu x. 1 + a * x^2)

    data J a x = J_case x (a -> J a (J a x))
    data K a   = Kl | Ks a (K a) (K a)

    j_app :: J a x -> K a -> x
    j_lam :: (K a -> x) -> J a x

    j_app (J_case x f) =
      \t -> case t of Kl         -> x
                      Ks a tl tr -> j_app (j_app (f a) tl) tr
    j_lam f = J_case (f Kl)
                     (\a -> j_lam (\tl -> j_lam (\tr -> f (Ks a tl tr))))

• A very long one. (Due to Martin Wehr.)
{-
  (nu f (* 3 place *).
     \x y z-> x^3*y*z*(f (f x x z) (f x z y) (f z y x))) a b c

  ~= (a, b, c) => mu (x,y,z).
       ( 3 + x*x + x*y + y*x + z*z,   (* occs of x *)
         1 + y*z + z*y,               (* occs of y *)
         1 + x*z + y*y + z*x )        (* occs of z *)

  where (a,b,c) => (d,e,f) abbreviates a^d * b^e * c^f
-}

It should be noted that the constructors in the inductively defined types Mx, My, Mz below are systematically named with digit strings that represent the position of a variable in the right-hand side of the definition of N.
    data N x y z = C x x x y z (N (N x x z) (N x z y) (N z y x))

    data Mx = C1 | C2 | C3
            | C6_1_1 Mx Mx
            | C6_1_2 Mx My
            | C6_2_1 My Mx
            | C6_3_3 Mz Mz

    data My = C4
            | C6_2_3 My Mz
            | C6_3_2 Mz My

    data Mz = C5
            | C6_1_3 Mx Mz
            | C6_2_2 My My
            | C6_3_1 Mz Mx

    -- isos

    ntomx :: N x y z -> Mx -> x
    ntomy :: N x y z -> My -> y
    ntomz :: N x y z -> Mz -> z
    mton  :: (Mx -> x, My -> y, Mz -> z) -> N x y z

    mton (fx,fy,fz) =
      C (fx C1) (fx C2) (fx C3) (fy C4) (fz C5)
        (mton ( \mx1-> mton ( \mx2-> fx (C6_1_1 mx1 mx2)
                            , \my2-> fx (C6_1_2 mx1 my2)
                            , \mz2-> fz (C6_1_3 mx1 mz2) )
              , \my1-> mton ( \mx2-> fx (C6_2_1 my1 mx2)
                            , \my2-> fz (C6_2_2 my1 my2)
                            , \mz2-> fy (C6_2_3 my1 mz2) )
              , \mz1-> mton ( \mx2-> fz (C6_3_1 mz1 mx2)
                            , \my2-> fy (C6_3_2 mz1 my2)
                            , \mz2-> fx (C6_3_3 mz1 mz2) ) ))

    ntomx (C x1 x2 x3 y z n_nnn) mx =
      case mx of C1 -> x1
                 C2 -> x2
                 C3 -> x3
                 C6_1_1 mx1 mx2 -> ntomx (ntomx n_nnn mx1) mx2
                 C6_1_2 mx1 my2 -> ntomy (ntomx n_nnn mx1) my2
                 C6_2_1 my1 mx2 -> ntomx (ntomy n_nnn my1) mx2
                 C6_3_3 mz1 mz2 -> ntomz (ntomz n_nnn mz1) mz2
    ntomy (C x1 x2 x3 y z n_nnn) my =
      case my of C4 -> y
                 C6_2_3 my1 mz2 -> ntomz (ntomy n_nnn my1) mz2
                 C6_3_2 mz1 my2 -> ntomy (ntomz n_nnn mz1) my2
    ntomz (C x1 x2 x3 y z n_nnn) mz =
      case mz of C5 -> z
                 C6_1_3 mx1 mz2 -> ntomz (ntomx n_nnn mx1) mz2
                 C6_2_2 my1 my2 -> ntomy (ntomy n_nnn my1) my2
                 C6_3_1 mz1 mx2 -> ntomx (ntomz n_nnn mz1) mx2
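As a sanity check (mine, not Martin's), the whole example can be packed into one self-contained module and spot-checked at a few addresses; deriving Show provides values to compare, and the explicit signatures are essential because all four functions are polymorphically recursive.

```haskell
data N x y z = C x x x y z (N (N x x z) (N x z y) (N z y x))

data Mx = C1 | C2 | C3
        | C6_1_1 Mx Mx | C6_1_2 Mx My | C6_2_1 My Mx | C6_3_3 Mz Mz
        deriving Show
data My = C4 | C6_2_3 My Mz | C6_3_2 Mz My             deriving Show
data Mz = C5 | C6_1_3 Mx Mz | C6_2_2 My My | C6_3_1 Mz Mx deriving Show

ntomx :: N x y z -> Mx -> x
ntomy :: N x y z -> My -> y
ntomz :: N x y z -> Mz -> z
mton  :: (Mx -> x, My -> y, Mz -> z) -> N x y z

mton (fx,fy,fz) =
  C (fx C1) (fx C2) (fx C3) (fy C4) (fz C5)
    (mton ( \mx1 -> mton ( \mx2 -> fx (C6_1_1 mx1 mx2)
                         , \my2 -> fx (C6_1_2 mx1 my2)
                         , \mz2 -> fz (C6_1_3 mx1 mz2) )
          , \my1 -> mton ( \mx2 -> fx (C6_2_1 my1 mx2)
                         , \my2 -> fz (C6_2_2 my1 my2)
                         , \mz2 -> fy (C6_2_3 my1 mz2) )
          , \mz1 -> mton ( \mx2 -> fz (C6_3_1 mz1 mx2)
                         , \my2 -> fy (C6_3_2 mz1 my2)
                         , \mz2 -> fx (C6_3_3 mz1 mz2) ) ))

ntomx (C x1 x2 x3 _ _ n_nnn) m = case m of
  C1 -> x1
  C2 -> x2
  C3 -> x3
  C6_1_1 mx1 mx2 -> ntomx (ntomx n_nnn mx1) mx2
  C6_1_2 mx1 my2 -> ntomy (ntomx n_nnn mx1) my2
  C6_2_1 my1 mx2 -> ntomx (ntomy n_nnn my1) mx2
  C6_3_3 mz1 mz2 -> ntomz (ntomz n_nnn mz1) mz2

ntomy (C _ _ _ y _ n_nnn) m = case m of
  C4 -> y
  C6_2_3 my1 mz2 -> ntomz (ntomy n_nnn my1) mz2
  C6_3_2 mz1 my2 -> ntomy (ntomz n_nnn mz1) my2

ntomz (C _ _ _ _ z n_nnn) m = case m of
  C5 -> z
  C6_1_3 mx1 mz2 -> ntomz (ntomx n_nnn mx1) mz2
  C6_2_2 my1 my2 -> ntomy (ntomy n_nnn my1) my2
  C6_3_1 mz1 mx2 -> ntomx (ntomz n_nnn mz1) mx2
```

Then e.g. `ntomx (mton (show, show, show)) m == show m` for any address m; laziness keeps the infinite N-value from being forced.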


Note that a digit string is a position in a type expression, which has a tree structure. Stimulated by this, Martin Wehr has the idea that one can "lift" the same idea to the level of types, where in general the type expressions can be coinductively defined and the positions can be arbitrary concrete data structures. I find it hard to say where this might lead. Martin has a strong intuition of the general process, though I think he finds it hard to give a "closed form" description of it.
• This example is due to Martin Wehr. (I have corrected his rotten Haskell!!) I'm not sure I see it, but Martin says that it allows one to have coproducts in the coinductive lhs. Hmmm. Is it *really* a coproduct in the categorical sense?? All the same it is interesting for having mutual nu's and mu's.
    -- coinductive dt's
    data F x = C (F (G x)) x
    data G y = D y (F y) (G (G y))

    -- inductive dt's
    data Fd = C1_f_g Fd Gd | C2
    data Gd = D1 | D2_f Fd | D3_g_g Gd Gd

    -- We define the simultaneous isomorphisms by mutual recursion

    f_form :: (Fd -> a) -> F a
    g_form :: (Gd -> b) -> G b

    f_form fa = C (f_form (\fd-> g_form (\gd-> fa (C1_f_g fd gd)))) (fa C2)

    g_form ga = D (ga D1)
                  (f_form (\fd-> ga (D2_f fd)))
                  (g_form (\gd-> g_form (\gd'-> ga (D3_g_g gd gd'))))

    -- The reverse is defined again by mutual recursion

    f_addr :: F a -> Fd -> a
    g_addr :: G b -> Gd -> b

    f_addr (C fgx x) fd =
      case fd of C1_f_g fd' gd -> g_addr (f_addr fgx fd') gd
                 C2            -> x

    g_addr (D y fy ggy) gd =
      case gd of D1             -> y
                 D2_f fd        -> f_addr fy fd
                 D3_g_g gd1 gd2 -> g_addr (g_addr ggy gd1) gd2

{- Martin's explanation.

   (nu F + G. \lambda X. FG X * X
            | \lambda Y. Y * F Y * G^2 Y) A|B ~=
   (mu X + Y. 1 + X*Y | 1 + X + Y^2) => A | B

   where I use the notation (F + G) A|B ~= C+D => A|B
   as an abbreviation of the two simultaneous isomorphisms
   F A ~= A^C   AND   G B ~= B^D
-}
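Packing the example into one self-contained module (my test, not Martin's) lets the round trips be spot-checked; lazy evaluation keeps the infinite F/G values from being forced.

```haskell
data F x = C (F (G x)) x
data G y = D y (F y) (G (G y))

data Fd = C1_f_g Fd Gd | C2           deriving Show
data Gd = D1 | D2_f Fd | D3_g_g Gd Gd deriving Show

f_form :: (Fd -> a) -> F a
f_form fa = C (f_form (\fd -> g_form (\gd -> fa (C1_f_g fd gd)))) (fa C2)

g_form :: (Gd -> b) -> G b
g_form ga = D (ga D1)
              (f_form (\fd -> ga (D2_f fd)))
              (g_form (\gd -> g_form (\gd' -> ga (D3_g_g gd gd'))))

f_addr :: F a -> Fd -> a
f_addr (C fgx x) fd = case fd of
  C1_f_g fd' gd -> g_addr (f_addr fgx fd') gd
  C2            -> x

g_addr :: G b -> Gd -> b
g_addr (D y fy ggy) gd = case gd of
  D1             -> y
  D2_f fd        -> f_addr fy fd
  D3_g_g gd1 gd2 -> g_addr (g_addr ggy gd1) gd2
```

For example `f_addr (f_form show) fd == show fd` at any address fd, and similarly for g.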



## Some (non-dependent) questions

• A speculation, guided by the symbols:

    a ^ (nu f. mu g. 1 + f * g)  = a ^ (mu x. nu y. x * y)

  [Probably not (?) the obvious alternative

    a ^ (nu f, g. f * g)  = a ^ (mu x, y. 1 + x * y)

  because (mu x, y. 1 + x*y) = mu x. 1 + x^2.]

  If I understand the word "regular" (as used by Richard Bird), regular datatypes are gotten from polynomial functors by closing under (least?) fixed points.
• What if one lifts the right-hand side to higher types/kinds? For example, (nu ??) f a = a ^ M f, where M f = mu g. \x-> x + f (g x).
• Countable branching. A propos of countably branching trees, there are two ways to define them:

    CBT1 = mu x. 1 + x^N
    CBT2 = mu x. 1 + Str x

  What's the solution to:  nu ?? = a ^ CBT
• Is (nu D. \ x -> x * (D (D x)^b)) y isomorphic to some function space y ^ ???

## Some (dependent) questions

• Coinductive counterpart of W-type.
    nu x. (Sigma a : A) x ^ B a = ??

"Andreas (who just joined me) found a possible solution to your problem using two interleaved very dependent types:

    { f | f : {l | l : mu X. 1 + B(f(l))} -> A }"
• Consider the predicate transformer nu Phi. \P-> P \cap (Phi^2 P), applied to an arbitrary predicate X (predicates over some state-space S). This is analogous to the type functor nu F. \x-> x * F^2 x. (Just replace types by type-valued functions, i.e. predicates.) The former is to the latter as ?? is to x ^ (mu p. 1 + p^2). What would it take for this "ratio" to have a solution?
• Tree sets. Given A, B, C, d as in Petersson-Synek trees

    ?? nu .. x .. a .. ??  = x ^ ((mu P : A -> Set. \a-> (Sigma b : B a)(Pi c : C a b) P(d a b c)) a)

• The coinductively defined predicate

    (nu P : A -> Set. \a-> (Pi b : B a)(Sigma c : C a b) P(d a b c))

crops up in the type of certain kinds of "reactive strategy". If one thinks of the equations above as analyses of their left hand sides using well-founded induction (and function spaces), can one give such an analysis of this predicate? (I would be very interested in this.)

## Infinite terms. (Tait 1967. Consult Helmut S.)

The "infinite terms" used by various proof-theorists have a stream-like, coinductive flavour which emerges when one looks at how the recursor in Gödel's T is translated. Maybe it is possible to generalise these systems of infinitary terms, so as to handle directly more general kinds of concrete data-types?

First, here is the isomorphism between streams and functions on the natural numbers.

    data Str a = Case_Str a (Str a)
    data Nat   = Z | S Nat

    lam_Str :: (Nat -> a) -> Str a
    app_Str :: Str a -> Nat -> a

    lam_Str f = Case_Str (f Z) (lam_Str (f . S))
    app_Str (Case_Str x xs) n = case n of Z   -> x
                                          S p -> app_Str xs p

One usually writes (xs !! n) instead of app_Str xs n.

    R' : A -> Str (A -> A) -> Str A
    R' a (b:bs) = a : R' (b a) bs   -- co

    R  : A -> (Nat -> A -> A) -> Nat -> A
    R a bf n = case n of Z   -> a
                         S m -> bf m (R a bf m)

    R a bf n = R' a (lam_Str bf) !! n
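The simulation runs directly if one renders Str as an ordinary lazy list and N as Int (my transliteration; lower-case names to avoid clashing with the definitions above):

```haskell
-- r' builds the stream a, b0 a, b1 (b0 a), ... corecursively;
-- r is Goedel's recursor with an explicit numeric argument.
r' :: a -> [a -> a] -> [a]
r' a []     = [a]
r' a (b:bs) = a : r' (b a) bs

r :: a -> (Int -> a -> a) -> Int -> a
r a _  0 = a
r a bf n = bf (n - 1) (r a bf (n - 1))
```

Then `r a bf n == r' a (map bf [0..]) !! n`.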

Cumulative product ((*) = flipped composition) and sum (lifted (*)).

    cump : Str (A -> A) -> Str (A -> A)
    cump (b:bs) = id : map (b*) (cump bs)     -- (*) is flipped composition
    cums : Str ((A -> A) -> A -> A) -> Str ((A -> A) -> A -> A)
    cums (b:bs) = zero : map (b+) (cums bs)   -- (+) is lifted (*)
      where zero = flip const

Then

    R a (bs!!) n = (cump bs !! n) a


## What should one be doing about coinduction in type theory?

The first thing is to formulate schemes of coinduction within dependent type theory. There must be quite a history of this. Didn't Mendler do it in dependent types? A number of people have written about coinduction, mostly for simple types or set theory.

Then, one can try to figure out the metatheory of such systems, using whatever principles on the meta-level. (The usual dreary things: Church-Rosser, subject reduction, ...)

One can also try to find models of such systems, which can be constructed within (well-founded) type theory. (Comparable to Aczel's graph-theoretic models of non-well-founded set-theory. Anton and I came to feel while writing our paper that we should really be developing a (type-)theory of graphs. Perhaps this is not a routine reworking of the classical theory.)

For me, the interest of all this is connected with interactive programming in type theory.

• Real (lazy) functional programs (written to do something, rather than to think about) are full of codata. It seems to arise whenever one writes anything state-oriented. (I don't mean specifically I/O.) One sometimes also uses it to memoise something.
• On the theoretical side, coinduction crops up all over the place. I have in mind particularly a certain simulation relation (not an equivalence relation, but a partial order) between states in an interactive system. There are also "reactive" strategies (not to win but) to avoid losing a game.
One needs access to a wide variety of coinductively defined types, justified (modelled, analysed, encoded) in well-founded (and predicative) type theory.

Is the direction of this "justification" reversible? (Suggested by the corecursive simulation of natrec, in the infinitary terms stuff. )

## Footnotes

    a ^ b       = b -> a
    a ^ (b * c) = (a ^ b) ^ c
    a ^ (b + c) = a ^ b * a ^ c
    a ^ 0       = 1


The following may not be relevant, but for the moment I think there is a chance. It was pointed out to me by Conor McBride.

1. Consider binary wellfounded trees:
     B = 1 + B^2

If we remove a single (not necessarily immediate) subtree from a binary tree, we are left with a structure on type P, where
     P = 1 + 2*B*P

2. In general, if we consider n-ary wellfounded trees:
     T = 1 + T^n

and we remove a single (not necessarily immediate) subtree from such a tree, we are left with a structure on type Q, where
     Q = 1 + n*T^(n-1)*Q

Note that there is a kind of formal differentiation in the "business" part of the type expression. (Conor's observation extended to cover familiar rules of differentiation such as D(S*T) = S*D(T) + D(S)*T, but for the moment, I cannot recall exactly how.)
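For the n = 2 case this can be made concrete in Haskell (my sketch; Dir, P, Step, plug are invented names): a path to the removed subtree records at each step a direction (the factor 2) and the sibling tree not entered (the factor B), and plug reinstalls a subtree in the hole.

```haskell
data B   = L | Sp B B          deriving (Eq, Show)   -- B = 1 + B^2
data Dir = Lft | Rgt           deriving (Eq, Show)
data P   = Here | Step Dir B P deriving (Eq, Show)   -- P = 1 + 2*B*P

-- Reinstall a subtree t in the hole described by the path.
plug :: P -> B -> B
plug Here t             = t
plug (Step Lft sib p) t = Sp (plug p t) sib   -- went left; sib is the right brother
plug (Step Rgt sib p) t = Sp sib (plug p t)   -- went right; sib is the left brother
```

So `plug (Step Lft L Here) t` rebuilds `Sp t L`: the tree from which the left immediate subtree was removed.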

Peter Hancock