# Computation with truth tables and linear logic

I explore correspondences between classical linear logic and computation. Truth tables seem to correspond to the additive fragment of linear logic.

Linear logic is revealed when contraction and weakening are rejected as structural rules. In a programming language these rules appear as duplication and deletion of values, and their variations.

Additive operators admit an interpretation as choices. `A & B` and `A + B` both behave as if the whole proposition were either A or B. With the first operator the prover chooses which alternative is taken. The second operator behaves as if there were alternative scenarios for an event.
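To illustrate the difference, here is a toy encoding of my own (the names `With`, `Inl`, `Inr` are hypothetical, not standard): `A & B` is modeled as an offer of both alternatives, where the consumer projects one out, while `A + B` is a tagged union where the producer has already committed to one side.

```python
# Toy sketch: the two additive operators as data.
from dataclasses import dataclass
from typing import Callable, Union

@dataclass
class With:
    """A & B: both sides stay available; whoever uses the value chooses."""
    left: Callable[[], str]
    right: Callable[[], str]

@dataclass
class Inl:
    """First alternative of A + B."""
    value: object

@dataclass
class Inr:
    """Second alternative of A + B."""
    value: object

Plus = Union[Inl, Inr]

offer = With(left=lambda: "tea", right=lambda: "coffee")
print(offer.right())          # the consumer selects the second alternative

scenario: Plus = Inl(42)      # the producer committed to the first case
```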

Now, let's have a little drill with truth tables.

```
A | B | A ∧ B | A ∨ B
---------------------
0 | 0 |     0 |     0
0 | 1 |     0 |     1
1 | 0 |     0 |     1
1 | 1 |     1 |     1
```

The table describes what values the propositions can take. As a whole, it describes the constraints of some system. The first row says the system is satisfied when all four propositions are valued zero.
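To make the relational reading concrete, here is a small sketch (my own encoding, with a hypothetical `truth_table` helper) of a truth table as a relation, that is, a list of rows assigning values to columns:

```python
from itertools import product

def truth_table(columns, rows):
    """A truth table as a relation: each row assigns a value to every column."""
    return [dict(zip(columns, row)) for row in rows]

# The table above: A ∧ B and A ∨ B over every assignment of A and B.
AND_OR = truth_table(
    ("A", "B", "A∧B", "A∨B"),
    [(a, b, a & b, a | b) for a, b in product((0, 1), repeat=2)],
)

# The first row is the all-zero assignment, matching the table's first row.
print(AND_OR[0])   # {'A': 0, 'B': 0, 'A∧B': 0, 'A∨B': 0}
```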

You can join truth tables to construct larger tables. For example, let's join two truth tables now, treating `A ∧ B` and `E ∨ F` as separate relations.

```
A | B | A ∧ B      E | F | E ∨ F
-------------      -------------
0 | 0 |     0      0 | 0 |     0
0 | 1 |     0      0 | 1 |     1
1 | 0 |     0      1 | 0 |     1
1 | 1 |     1      1 | 1 |     1
```

Denote that `A ∧ B = E`. Now you can take the cartesian product of these tables and discard every row where `A ∧ B ≠ E`. Writing `C` in place of `F`, you end up with the table below, with 8 rows in it.

```
A | B | C | (A ∧ B) ∨ C
-----------------------
0 | 0 | 0 |           0
0 | 0 | 1 |           1
0 | 1 | 0 |           0
0 | 1 | 1 |           1
1 | 0 | 0 |           0
1 | 0 | 1 |           1
1 | 1 | 0 |           1
1 | 1 | 1 |           1
```

Truth tables can be treated like relations, but as you can see, there will be twice as many rows every time you add an input, because you take a cartesian product.
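A minimal sketch of this join, under my own list-of-dicts encoding of tables: take the cartesian product of the two relations and keep only the rows that agree on the shared column `E`.

```python
from itertools import product

def rows(columns, values):
    """A truth table as a list of rows keyed by column name."""
    return [dict(zip(columns, v)) for v in values]

# A ∧ B = E, joined against E ∨ F.
t1 = rows(("A", "B", "E"), [(a, b, a & b) for a, b in product((0, 1), repeat=2)])
t2 = rows(("E", "F", "E∨F"), [(e, f, e | f) for e, f in product((0, 1), repeat=2)])

# Cartesian product, discarding every row where the two sides disagree on E.
joined = [{**r1, **r2} for r1, r2 in product(t1, t2) if r1["E"] == r2["E"]]

print(len(joined))   # 8 rows: one per assignment of A, B and F
```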

But you don't need to take the cartesian product. You can instead index the tables and keep them marked as products.

```
A | B      F | E ∨ F
----- ×    --------- E = 0
0 | 0      0 |     0
0 | 1      1 |     1
1 | 0
----- ×    --------- E = 1
1 | 1      0 |     1
           1 |     1
```

Now our table has only 5 rows, even though it has more structure. Going further, you could tie `G ∧ H = B` into the table.

```
A | B      F | E ∨ F            G | H | G ∧ H
----- ×    --------- E = 0 ×    -------------
0 | 0      0 |     0            0 | 0 |     0
0 | 1      1 |     1            0 | 1 |     0
1 | 0                           1 | 0 |     0
----- ×    --------- E = 1      1 | 1 |     1
1 | 1      0 |     1
           1 |     1
```

You can index the table with the equality constraint and get a table with 7 rows in it.

```
A      F | E ∨ F            G | H
- ×    --------- E = 0 ×    ----- B = 0
0      0 |     0            0 | 0
1      1 |     1            0 | 1
                            1 | 0
- ×    --------- E = 0 ×    ----- B = 1
0      0 |     0            1 | 1
       1 |     1
- ×    --------- E = 1
1      0 |     1
       1 |     1
```

The corresponding table with cartesian products expanded would take 16 rows.
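The indexed-product construction can be sketched in code. This is a toy encoding of my own, using the first join (`A ∧ B = E` against `E ∨ F`) as the example: both factors are partitioned on the shared column `E` instead of being expanded.

```python
from itertools import product

t1 = [(a, b, a & b) for a, b in product((0, 1), repeat=2)]   # A, B, E = A∧B
t2 = [(e, f, e | f) for e, f in product((0, 1), repeat=2)]   # E, F, E∨F

# For each value of E, keep the matching slices of both tables side by side.
indexed = {
    e: (
        [(a, b) for a, b, e1 in t1 if e1 == e],    # the A | B part
        [(f, v) for e2, f, v in t2 if e2 == e],    # the F | E∨F part
    )
    for e in (0, 1)
}

# Expanding the product gives |left| × |right| rows per branch, while the
# side-by-side form prints in max(|left|, |right|) lines per branch.
expanded = sum(len(l) * len(r) for l, r in indexed.values())
display = sum(max(len(l), len(r)) for l, r in indexed.values())
print(expanded, display)   # 8 5
```

Note how the `E = 1` branch keeps only the single `A | B` row `1 1`, which is where the factored table saves its space.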

Note how `E = 1 ∧ B = 0` is unsatisfiable and resulted in a branch being dropped during indexing. This algorithm resembles a constraint solver and likely requires a most-general-unifier procedure for satisfying the constraints. In that respect it resembles Hindley–Milner more than the typechecking fragment I presented a few weeks ago.

Any truth table is also a valid linear logic proposition. The rows form a `(&)` expression, while the columns of each row form a smaller `(∧)` expression. For example, the truth table for negation, `A, NOT_A`, could be expressed as:

``(A:0 ∧ NOT_A:1) & (A:1 ∧ NOT_A:0)``
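As a small illustration (a helper of my own, not from the source), rendering a table of rows into this proposition shape is mechanical: each row becomes a `∧` of valued atoms, and the rows are joined by `&`.

```python
def proposition(columns, rows):
    """Render a truth table as the corresponding additive proposition."""
    row_terms = [
        " ∧ ".join(f"{name}:{value}" for name, value in zip(columns, row))
        for row in rows
    ]
    return " & ".join(f"({t})" for t in row_terms)

# Truth table for negation: A, NOT_A.
print(proposition(("A", "NOT_A"), [(0, 1), (1, 0)]))
# → (A:0 ∧ NOT_A:1) & (A:1 ∧ NOT_A:0)
```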

So far the semantics and interpretations of linear logic have mattered more than the axiomatizations and deduction rules. I should ensure that the above algorithm obeys some system of axioms, since it has not been derived directly from one. I could treat it as superposition, e.g.

```
A & A
─────
  A
```

Followed by choice-select:

```
  A
─────
A & B
```

The above rules would seem to be enough to rationalize the use of the given algorithm.

There are more additive axioms available that have not been used so far. I show the remaining rules below.

There is a dual for choice-select, which seems a bit like assuming there were more choices available than there actually are.

```
A + B
─────
  A
```

Additionally, there are rules to transition structures on the additive operators' boundaries, a bit like the multiplicative 's' rule that played a major role in the handling of multiplicative operators.

```
(A ∨ C) + (B ∨ C)
─────────────────
   (A + B) ∨ C

   (A & B) ∧ C
─────────────────
(A ∧ C) & (B ∧ C)
```

The semantics behind the first rule show that the disjunctive element can be used as a resource to prove both A and B. The resource duplicates without real contraction happening, because we are speculating over possible scenarios.

The second rule says that if we make the two expressions the same within a choice, it is no longer a choice, and the common part can be moved apart to be proved individually. This can be achieved in many ways, but the dual introduction rule is the obvious one.

The source for these rules: Kaustuv Chaudhuri, Nicolas Guenot, Lutz Straßburger: The Focused Calculus of Structures. CSL 2011: 159-173

A major difference between the resulting calculus of truth table relations and the simply typed lambda calculus is that the case statement is absent. It is not needed, because it is the same thing as the constructors. This duality seems to be common in linear logic and its applications.