Interaction and Concurrency

Concurrent algorithms are difficult to reason about. It has been suggested that reasoning about concurrency would be easier with the help of linear logic, session types and process/channel models. This triple plays the same role for concurrency that intuitionistic logic, simple types and the lambda calculus play for sequential functional programming.
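To get a feel for how session types and the process/channel model fit together, here is a rough sketch in Rust, using its standard mpsc channels. The protocol and the `Ask`/`Await` types are made up for illustration and not taken from any of the works mentioned here: each step of the session consumes its endpoint and returns the endpoint for the next step, so doing the protocol out of order simply does not type-check.

```rust
// A minimal sketch, not a full session-type library: each protocol step
// consumes its endpoint and returns the endpoint for the next step.
use std::sync::mpsc;
use std::thread;

// Client side of a "send a question, then receive an answer" session.
struct Ask(mpsc::Sender<String>, mpsc::Receiver<u32>);
struct Await(mpsc::Receiver<u32>);

impl Ask {
    // Sending consumes `Ask` and yields `Await`: the question cannot be asked twice.
    fn ask(self, q: String) -> Await {
        self.0.send(q).unwrap();
        Await(self.1)
    }
}

impl Await {
    // Receiving consumes `Await`; after this the session is closed.
    fn answer(self) -> u32 {
        self.0.recv().unwrap()
    }
}

fn main() {
    let (q_tx, q_rx) = mpsc::channel::<String>();
    let (a_tx, a_rx) = mpsc::channel::<u32>();

    // The "server" process: receive the question, then send back an answer.
    let server = thread::spawn(move || {
        let q = q_rx.recv().unwrap();
        a_tx.send(q.len() as u32).unwrap();
    });

    let client = Ask(q_tx, a_rx);
    let len = client.ask("how long is this?".to_string()).answer();
    println!("answer: {}", len);
    server.join().unwrap();
}
```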

There is a great video on the subject that I would suggest people watch: the talk "A Rehabilitation of Message-passing Concurrency" from a recently held Papers We Love conference.

Most programs have to run concurrently with other programs that interact with them. Therefore it is nice to see that people are working on things such as the Symmetric Interaction Calculus.

The constructs built over intuitionistic logic seem inherently non-concurrent. I still wonder whether this results from the logic itself or from some other reason. Communication channels seem to require that each end is uniquely referenced, and an end changes its state whenever it is communicated on; this seems to be the main reason linear logic is needed for concurrency.
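Rust's ownership is only an affine approximation of linearity, but it already shows the point about unique references: a channel end that has been moved cannot be used again, so there is never a second holder who could see it in a stale state. A small sketch, with channel names invented for the example:

```rust
// A minimal sketch of why channel ends want to be uniquely referenced:
// moving an endpoint into another thread transfers ownership, so there is
// exactly one holder of each end at any time. Using a channel end changes
// its state, and duplicating the end would make that state ambiguous.
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel::<&'static str>();

    // `tx` is moved into the worker; the main thread can no longer use it.
    let worker = thread::spawn(move || {
        tx.send("done").unwrap();
        // After `send`, the channel's state has changed: the message is in flight.
    });

    // Uncommenting the next line would be a compile error, because `tx` was moved:
    // tx.send("oops").unwrap();

    println!("worker says: {}", rx.recv().unwrap());
    worker.join().unwrap();
}
```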

Concurrent models may also help with the age-old problem of mutability in purely functional programming languages. Mutable structures are natural in linear logic, and simultaneous use by multiple processes is expressed in the same process/channel model as before.
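As a rough illustration of the linear-update idea, here is a Rust sketch: a buffer is owned by exactly one process at a time, mutated in place, and handed on over a channel, so no other process can observe the intermediate states. The staging and names are invented for the example.

```rust
// A minimal sketch of "mutation as linear use": the buffer is passed along
// a pipeline, each stage mutates it in place, and ownership guarantees that
// only one stage ever holds it, so in-place mutation stays observationally pure.
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx1, rx1) = mpsc::channel::<Vec<i32>>();
    let (tx2, rx2) = mpsc::channel::<Vec<i32>>();

    // Stage 1: fill the buffer in place, then pass ownership on.
    let stage1 = thread::spawn(move || {
        let mut buf = rx1.recv().unwrap();
        buf.extend(1..=5);
        tx2.send(buf).unwrap();
    });

    // The main thread creates the buffer and gives it away (a linear hand-off).
    tx1.send(Vec::new()).unwrap();

    // Stage 2 (here, back in the main thread): receive the buffer and keep mutating.
    let mut buf = rx2.recv().unwrap();
    buf.iter_mut().for_each(|x| *x *= 10);
    println!("{:?}", buf); // [10, 20, 30, 40, 50]

    stage1.join().unwrap();
}
```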
