Lecture was led by Nika. We approached the question “what is computable?” by diving into the untyped λ-calculus (or simply “lambda calculus”), discussing how to construct functions from scratch using only a syntactic definition of λ-terms and the adoption of the Barendregt convention, together with variables, application and abstraction. We defined what it means for a term to be a redex, and introduced one-step β-reduction (→β) from a Basis (Substitution) statement and a Compatibility statement, thus providing the ingredients needed to form a notion of β-convertibility. Exercises were posed for the following week in order to practice operating with λ-terms.
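To make the mechanics concrete, the contraction of a redex can be mimicked with Python lambdas, where applying an abstraction performs exactly the substitution that the Basis (Substitution) statement describes. The example terms below are our illustration, not the lecture's:

```python
# The redex (λx.x) M reduces in one β-step to M: application substitutes the
# argument for the bound variable, mirroring the Basis (Substitution) rule.
identity = lambda x: x          # the abstraction λx.x
self_app = lambda x: x(x)       # λx.x x — both occurrences of x get replaced

print(identity("M"))            # (λx.x) M  →β  M
print(self_app(identity)("N"))  # (λx.x x)(λx.x) →β (λx.x)(λx.x) →β λx.x
```

Note that Python evaluates eagerly, so it only mimics reduction on applied terms; it cannot reduce under an abstraction the way →β can.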
Used texts: Nederpelt & Geuvers (2014); Chapter 1.
Lecture was led by Nika. In this week’s lecture we learned how to formally define β-conversion by introducing many-step β-reduction (↠β) and extending it to a reflexive, symmetric and transitive relation. We discussed the Church-Rosser Theorem and β-normal form (β-nf for short), and explored their mutual implications. In particular, we considered Ω as a striking example of a non-normalizing λ-term, and noted that, in the context of computation, non-normalizing terms become especially interesting for programs that don’t terminate (for example, an OS). Following this, we reviewed the exercises from the previous week, in particular the combinators S, K and I, which featured heavily in them. Seizing the opportunity, we discussed the correspondence between λ-calculus and Combinatory Logic, and the one-combinator universal programming language formed solely by the combinator X. Afterwards we learned how arithmetic is introduced into λ-calculus by means of Church numerals, and constructed functions for basic arithmetic, namely addition and multiplication, leaving successor and exponentiation as exercises for the following week. Finally, we discussed lazy evaluation and the if…else… function, and noted that Booleans can be easily expressed in λ-calculus.
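The Church encodings from this session translate directly into Python lambdas; a minimal sketch (the helper `to_int` is ours, added only to decode numerals for inspection):

```python
# Church numerals: the numeral n applies a function f to an argument n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition and multiplication on Church numerals:
#   m + n = λf.λx. m f (n f x)      m * n = λf. m (n f)
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

# Church Booleans: true selects its first argument, false its second,
# which is what makes a lazy if...else... selection work.
true  = lambda a: lambda b: a
false = lambda a: lambda b: b

def to_int(n):
    """Decode a Church numeral by applying 'add one' to 0 n times."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # 4
print(to_int(mul(two)(two)))  # 4
```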
Used texts: Smullyan, R. To Mock a Mockingbird and Other Logic Puzzles: Including an Amazing Adventure in Combinatory Logic (1985); Nederpelt & Geuvers (2014); Chapter 1.
The session was kicked off by Aleix, with a presentation on the historical development and relevance of λ-calculus. The rest of the session was led by Ana and Nika, and was dedicated to exploring the possibilities and usefulness of the fixed-point combinator Y, concretely as a method for recursion. Together with the IsZero function and the predecessor function, we constructed the factorial function (which β-normalizes) and the Liar’s Paradox (which doesn’t β-normalize). Finding λ-terms for ¬, ∧, ∨, ↔ and xor was left as an exercise for the following week, as well as finding functions for using lists (a crucial tool for programming) in λ-calculus, that is: nil, cons, head, tail, reverse and filter.
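Under Python's eager evaluation the Y combinator itself loops forever, so the sketch below uses its call-by-value variant Z to build factorial from a single recursion step; the name `fact_step` is our own, not the lecture's:

```python
# Z combinator: a call-by-value fixed-point combinator. It delays the
# self-application behind λv so that Python's eager evaluation terminates.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# One step of factorial, abstracted over the recursive call `rec`.
fact_step = lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1)

# Z ties the knot: factorial is a fixed point of fact_step.
factorial = Z(fact_step)
print(factorial(5))  # 120
```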
Used texts: Smullyan, R. To Mock a Mockingbird and Other Logic Puzzles: Including an Amazing Adventure in Combinatory Logic (1985); Cardone & Hindley (2006); Chapters 1-4; Nederpelt & Geuvers (2014); Chapter 1.
Lecture was led by Nika and Ana. Types were introduced into λ-calculus with typing à la Church (instead of typing à la Curry) in order to solve two fundamental problems with untyped λ-calculus: that of infinite functions and the lack of a definition of domain and codomain. We syntactically defined types as constructed from type variables (‘α’, ‘β’, ‘γ’, ‘δ’, … ‘ρ’, ‘σ’, ‘τ’, ‘υ’, …) and from a right-associative arrow (‘→’) from a type to a type (for example: ‘σ→τ’ or ‘(σ→τ)→ρ’). λ-terms were redefined in light of their new companion. We then defined the notion of a context and learned the deduction rules (type inference) for proving the type of a given λ-term. Accordingly, we related the variable, application and abstraction rules to the Generation Lemma, an important lemma for conjecturing the type of a λ-term. Exercises in type inference were posed for the following week.
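The three deduction rules can be turned into a tiny type checker for Church-style terms, where every bound variable carries its type annotation. This sketch (term and type representations are our own) follows the rules, not any official algorithm from the course:

```python
# A minimal type checker for simply typed λ-calculus à la Church.
# Terms: ("var", name) | ("app", fun, arg) | ("lam", name, annotation, body)
# Types: type variables as strings; arrow types as ("->", source, target).

def typecheck(term, ctx):
    tag = term[0]
    if tag == "var":              # (var): read the type off the context Γ
        return ctx[term[1]]
    if tag == "app":              # (appl): from M : σ→τ and N : σ infer M N : τ
        fun_ty = typecheck(term[1], ctx)
        arg_ty = typecheck(term[2], ctx)
        assert fun_ty[0] == "->" and fun_ty[1] == arg_ty, "ill-typed application"
        return fun_ty[2]
    if tag == "lam":              # (abst): from M : τ under x : σ infer λx:σ.M : σ→τ
        _, name, annotation, body = term
        body_ty = typecheck(body, {**ctx, name: annotation})
        return ("->", annotation, body_ty)
    raise ValueError("unknown term")

# λx:σ. x has type σ→σ in the empty context.
identity = ("lam", "x", "σ", ("var", "x"))
print(typecheck(identity, {}))  # ('->', 'σ', 'σ')
```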
Used texts: Nederpelt & Geuvers (2014); Chapter 2.
Lab: Nika helped us correct the previous week’s exercises and work thoroughly with operations on lists. Furthermore, we commented on the difficulty of formulating the predecessor function.
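The predecessor's difficulty comes from Church numerals only iterating forwards; the standard textbook trick (which may differ from the formulation discussed in the lab) walks a pair of numerals forward and reads off the lagging component:

```python
# Church pairs: a pair holds two values and hands them to a selector function.
pair = lambda a: lambda b: lambda f: f(a)(b)
fst  = lambda p: p(lambda a: lambda b: a)
snd  = lambda p: p(lambda a: lambda b: b)

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# One step maps (m, k) to (k, k+1); iterating n times from (0, 0)
# leaves n-1 in the first component (and 0 stays at 0).
shift = lambda p: pair(snd(p))(succ(snd(p)))
pred  = lambda n: fst(n(shift)(pair(zero)(zero)))

to_int = lambda n: n(lambda k: k + 1)(0)
three = succ(succ(succ(zero)))
print(to_int(pred(three)))  # 2
print(to_int(pred(zero)))   # 0
```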
Used texts: Nederpelt & Geuvers (2014); Chapter 1.
Lecture was led by Ana. In this session we established the fundamentals and motivations of Intuitionistic Logic. We began by contrast with Classical Logic, in particular the fact that Intuitionistic Logic doesn’t assume the Law of Excluded Middle (ϑ∨¬ϑ) and the fact that Intuitionistic Logic proves a proposition by providing a construction, which is of special interest for computing. We remarked that proofs in Intuitionistic Logic are stronger while at the same time more restricted than in Classical Logic. We discussed the Brouwer-Heyting-Kolmogorov interpretation as a way of deciding what counts as a good construction in Intuitionistic Logic (by defining how it’s used, rather than what it means). Following some exercises and exploration of its workings “behind the scenes”, we introduced some basic syntax, including the redefinition of ‘¬ϑ’ as ‘ϑ→⊥’. We learned of Glivenko’s Theorem and how Intuitionistic Propositional Logic contains an “embedding” of Classical Propositional Logic (if classical propositional logic proves ϑ, intuitionistic propositional logic proves ¬¬ϑ). We also noted that De Morgan’s laws don’t all hold in Intuitionistic Logic, and reviewed currying. Finally, we defined Intuitionistic Logic’s deduction rules and tried them out on some example cases, with more left as exercises.
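As one worked example of these definitions in action (our illustration, not necessarily one from class), the tautology ϑ → ¬¬ϑ is intuitionistically provable once ‘¬’ is unfolded as above:

```latex
% Unfolding the definition of negation, the claim ϑ → ¬¬ϑ reads:
%   ϑ → ((ϑ → ⊥) → ⊥)
% Proof sketch: assume ϑ; assume ϑ → ⊥; applying the second assumption to the
% first yields ⊥; discharging both assumptions (→-introduction twice) gives:
\vartheta \to ((\vartheta \to \bot) \to \bot)
```

The converse, ¬¬ϑ → ϑ, is exactly what Intuitionistic Logic refuses to prove in general, which is where Glivenko’s Theorem draws the line.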
Used texts: Sorensen & Urzyczyn (1998); sections 2.1 and 2.2
Meeting with the company (Formal Vindications, S.L.)
Lab session with computers, led by Nika. Nika guided us through some fundamentals of Coq’s purpose (proof assistance), usage and basic syntax. We thus saw how Coq assigns types. By accompanying examples with explanations, Nika taught us how to define sets inductively, define functions using ‘Fixpoint’, and state and prove propositions, lemmas and theorems. To integrate and further our learning, Nika began tasks in Coq and left us to complete them, gradually leaving more to us and progressing to more difficult tasks. In this process, it became clear to the learner how Coq assists proofs by identifying the goal and sub-goals of the proof, and how to use ‘tactics’. In sum, we defined natural numbers, lists and a plus function, and wrote some trivial proofs. The session naturally led to an interesting discussion between Nika and Joost on what can be stated and what can be proven in Coq.
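The inductive naturals and the plus function from the session can be modelled outside Coq as well; this Python analogue is our sketch (in Coq one would write `Inductive nat := O | S (n : nat).` and a `Fixpoint` for plus):

```python
# A Python analogue of Coq's inductively defined naturals:
# a Nat is either O (zero) or S applied to another Nat (successor).
from dataclasses import dataclass

class Nat:
    pass

@dataclass
class O(Nat):
    pass

@dataclass
class S(Nat):
    pred: Nat

def plus(m: Nat, n: Nat) -> Nat:
    # Structural recursion on the first argument, as a Coq Fixpoint would do:
    # plus O n = n;  plus (S m) n = S (plus m n).
    if isinstance(m, O):
        return n
    return S(plus(m.pred, n))

two = S(S(O()))
print(plus(two, two))  # the numeral four: S(S(S(S(O))))
```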
Session led by Ana. This week we learned all about induction, specifically inductive definitions and inductive proofs. As a basic example, we looked at the inductive definition of the natural numbers, as a set of two statements. By taking the two statements as an inductive definition of the natural numbers, we are saying that the natural numbers form the smallest set in which both statements hold. Once inductive definitions were settled, the rest of the session focused primarily on inductive proofs, with inductive definitions stated instrumentally for this purpose. As a basic example of a proof about natural numbers, we proved inductively that ∀n(n+0=n). In order to learn about induction in the context of typed λ-calculus, Ana defined types inductively and then offered a lemma which we had to help prove, namely, that if σ is a type, then it has an even number of brackets. To aid our comprehension of the topic, and to lead us in a fascinating and productive direction, we reviewed the inductive definition we had given of type deduction and set out to explain how we would approach proving that proofs have a certain property. Last, not without first defining the domain of a context, ‘dom(Γ)’, and the free variables of a term, ‘FV(M)’, we set out to inductively prove the Base Lemma (effectively proving a property of proofs), leaving its completion as the following week’s homework.
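The proof of ∀n(n+0=n) proceeds by induction on n; a sketch of the argument, writing ‘+’ as defined by recursion on its first argument (our reconstruction, the classroom presentation may have differed in notation):

```latex
% With + defined by:  0 + m = m   and   S(n) + m = S(n + m).
% Base case:       0 + 0 = 0                  (first defining clause)
% Inductive step:  assume n + 0 = n (IH); then
%                  S(n) + 0 = S(n + 0) = S(n) (second clause, then IH)
% Hence, by induction:
\forall n\,(n + 0 = n)
```

Note the asymmetry that makes induction necessary: 0+n=n holds by definition alone, while n+0=n does not.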
Session was led by Ana.
This week’s session was fundamentally separated into two parts. The first was primarily centered on extending notions from untyped λ-calculus to typed λ-calculus, while the second was focused on a first approach to the Curry-Howard isomorphism. For extending notions, we started by discussing how types behave for subterms and for substitutions of variables by subterms of the same type: that is, the Subterm Lemma and the Substitution Lemma. We set out to see how we would go about proving them. Afterwards, Ana redefined β-reduction for Church terms, and we discussed the Subject Reduction Lemma, by which β-reducing a typable term affects neither its typability nor its type, and the Strong Normalization Theorem, by which typable terms are strongly normalizing. For stating the Curry-Howard isomorphism, Ana defined a restricted version of Intuitionistic Propositional Calculus, named IPC(→), whose syntax admits only the ‘→’ connective. We saw the deduction rules for IPC(→) and talked about the conservativity property of full Intuitionistic Propositional Calculus over IPC(→): if the full calculus proves ϑ, and ϑ lies within the language of IPC(→), then IPC(→) proves it too. Having defined IPC(→), we launched into stating the Curry-Howard isomorphism, with the historical note that we were following the steps which led to the full theory, by noting that this fragment of Intuitionistic Propositional Logic behaves the same as types.
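Curry-Howard in miniature: a typed program is a proof of its type read as a formula. As a sketch (our illustration, not the session's example), the composition term below inhabits (A→B) → ((B→C) → (A→C)), i.e. it is a proof of that IPC(→) tautology:

```python
# The λ-term λf. λg. λx. g (f x), typed with Python generics. Its type,
# read as a formula, is the tautology (A→B) → ((B→C) → (A→C)).
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def compose(f: Callable[[A], B]) -> Callable[[Callable[[B], C]], Callable[[A], C]]:
    # Each λ-abstraction is an →-introduction in the corresponding proof.
    return lambda g: (lambda x: g(f(x)))

inc = lambda n: n + 1
double = lambda n: 2 * n
print(compose(inc)(double)(3))  # double(inc(3)) = 8
```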
Session was led by Nika.
The session was kicked off by a short presentation by Eric on the differences between Church-style and Curry-style typing, specifically for λ-calculus and in the larger picture.
Following this, the session was dedicated to using Coq, now with the Curry-Howard isomorphism in mind. At first we saw how logical propositions are defined and used in Coq: essentially, objects of type Prop that can have proofs. This gave us an intuition of intuitionistic propositional logic in Coq. We defined two special propositions, True (as provable) and False (as unprovable), noting that they are not the same as the boolean values true and false, and played with some basic proofs. Afterwards, Nika used the fact that “proofs are (essentially) programs” to show us how Coq can transform the proofs we had to come up with, for theorems she presented us with, into programs. Nika and Ana then provided us with tautologies to prove in order to obtain more programs. The session not only tied together much of the course's material up to this point, but was also a great learning experience in Coq syntax and in Coq's many possible uses.
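The distinction between the propositions True/False and the booleans can be modelled outside Coq as well; a rough Python analogue of ours (in Coq, `True` has the single proof `I`, while `False` has no constructors at all):

```python
# True as a proposition with exactly one proof; False as a proposition
# whose "proofs" can never be constructed. This is an analogy only:
# Python cannot make a class genuinely uninhabited, so we raise instead.
from dataclasses import dataclass

@dataclass
class TrueProp:
    pass  # the single trivial proof, like Coq's `I : True`

class FalseProp:
    def __init__(self):
        raise RuntimeError("False has no proof")

proof_of_true = TrueProp()   # constructible: True is provable
# FalseProp() would raise: no proof of False exists.
```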
Session led by Nika.
In this session Nika drew us a map of where we are within the scheme of the theory, and how it all fits together. We learned untyped λ-calculus in order to learn typed λ-calculus. Then we learned IPC(→) in order to put them together in the Curry-Howard isomorphism, which in turn was necessary to understand the spine of what is being done within Coq. But Coq doesn’t use IPC(→) with typed λ-calculus. Instead, it is highly strengthened on both sides, so to speak: it uses a stronger functional programming language than typed λ-calculus, and a stronger corresponding logic than IPC(→). Nika showed us briefly how typed λ-calculus can be extended to different abstract functional programming languages in which, instead of only terms depending on terms, we can also have terms depending on types (λ2), types depending on types (λω), and types depending on terms (λP). In class we looked specifically at terms depending on types (λ2), with its new deduction rules. Nika explained how typed λ-calculus can be extended in all three new dimensions to produce λC, which with induction added to it produces the theoretical basis for Coq. We also discussed IPC(→) extended to ∀IPC.
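λ2's “terms depending on types” is what mainstream languages surface as parametric polymorphism. A sketch using Python generics as a stand-in for λ2's type abstraction (our illustration):

```python
# In λ2 the polymorphic identity is Λα. λx:α. x: a term that first takes a
# type argument α, then a term of that type. Python's TypeVar approximates
# the type abstraction implicitly: `identity` is usable at every type T.
from typing import TypeVar

T = TypeVar("T")

def identity(x: T) -> T:
    # The body λx:T. x, abstracted over the type T.
    return x

print(identity(42))       # works at int
print(identity("hello"))  # and at str, with no separate definitions
```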
The last session began with a presentation by Miquel on how to extend λ-terms and their types so that the typing system corresponds fully with IPC, and not only with IPC restricted to ‘→’.
The rest of the session was a lab led by Ana. In it, using cards, she explained a common-sense sorting algorithm for lists called insertion sort, and guided us in defining it in Coq, detailing the required steps. Following this, we came up with a way of stating the proposition that a given list is sorted (which can be either true or false), and then the proposition that the list resulting from applying the algorithm is sorted (and has the same elements as the input list), and went on to prove it. Finally, we saw the Curry-Howard isomorphism’s implication in action: from the proof we extracted the verified program. We commented that this constitutes a new programming paradigm, in which programs are safely proven with respect to their specification; any unreliability of a program can then only reside in how the actual specification relates to the human conception of it. With this, we concluded the course.
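For reference, the algorithm itself is short; a Python sketch of ours (the lab's version was, of course, a Coq Fixpoint from which the verified program was extracted):

```python
def insert(x, xs):
    # Insert x into the already-sorted list xs, keeping it sorted.
    if not xs or x <= xs[0]:
        return [x] + xs
    return [xs[0]] + insert(x, xs[1:])

def insertion_sort(xs):
    # Fold insert over the input: the invariant "result is sorted and has
    # the same elements as the prefix consumed so far" is precisely the
    # proposition proved in the Coq development.
    result = []
    for x in xs:
        result = insert(x, result)
    return result

print(insertion_sort([5, 3, 8, 1]))  # [1, 3, 5, 8]
```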