Commit bcbcb23a authored by REMY Didier

Merge branch 'master' of gitlab.inria.fr:fpottier/mpri-2.4-public

parents b60a97fc 275bf495
......@@ -74,7 +74,7 @@ We also show the limits of dependently-typed functional programming.
## Project
The programming project is not yet available.
The programming project assignment is [available](project/2018-2019/sujet.pdf).
## Approximate syllabus
......@@ -153,11 +153,16 @@ The programming project is not yet available.
* (06/12/2019) ML and type inference ([Introduction](slides/yrg-00-introduction.pdf)), ([ML](slides/yrg-01-type-inference.pdf)), ([programming exercise](ocaml/yrg/ml/README.md))
* (13/12/2019) Subtyping ([support](slides/yrg-02-subtyping.pdf), [DIY](slides/yrg-02-diy-lambda-calculus-with-subtyping.pdf))
* (20/12/2019) GADTs ([support](slides/yrg-03-gadts.pdf), [LambdaPI](slides/yrg-03-diy-lambda-pi.pdf))
* (10/01/2020) Functional correctness
* (17/01/2020) Effects and resources
* (10/01/2020) Type inference and GADTs ([support](slides/yrg-04-type-inference-reloaded.pdf))
* (17/01/2020) Functional Correctness ([support](slides/yrg-04-functional-correctness.pdf))
### Dependently-typed Functional Programming
This lecture will involve some hands-on experience. To this end, it is
necessary to bring a laptop on which Agda (version 2.5.4.1 or higher)
is installed. A quick installation guide as well as further pointers
can be found [here](agda/00-agda/Warmup.lagda.rst).
* [Guidelines](agda/Index.lagda.rst)
* (24/01/2020) [Effectful functional programming](slides/pedagand-01.pdf) ([Source](agda/01-effectful/Monad.lagda.rst)).
* (31/01/2020) [Dependent functional programming](slides/pedagand-02.pdf) ([Source](agda/02-dependent/Indexed.lagda.rst), [McCompiler.v](coq/McCompiler.v)).
......@@ -167,7 +172,7 @@ The programming project is not yet available.
## Evaluation of the course
Two written exams (a partial exam on Friday Nov 29 and a final exam on
Two written exams (a partial exam on Friday Nov 22 or 29 and a final exam on
February 28 or March 6) and one programming project or several programming
exercises are used to evaluate the students that follow the full
course. Only the partial exam will count to grade students who split the
......@@ -180,6 +185,8 @@ available with solutions:
- mid-term exam 2018-2019:
[A simple object encoding](http://gallium.inria.fr/~remy/mpri/exams/partiel-2018-2019.pdf)
- final exam 2017-2018:
[Static differentiation](https://gitlab.inria.fr/fpottier/mpri-2.4-public/blob/master/exams/final-2017-2018.pdf)
- mid-term exam 2017-2018:
[Encoding call-by-name into call-by-value; extensible records](http://gallium.inria.fr/~remy/mpri/exams/partiel-2017-2018.pdf)
([Coq solution of part 1](coq/LambdaCalculusEncodingCBNIntoCBV.v)).
......
;; Work around restriction on literate agda vs. holes (issue #2836)
((nil . ((eval . (eval-after-load "agda2"
'(defun agda2-literate-p () ""
(equal (file-name-extension (buffer-file-name)) "lagda")))))))
......@@ -9,7 +9,7 @@ Warming up
Cheatsheets:
- `Install Agda`_
- `Retrieve the Agda Standard Library <https://github.com/agda/agda-stdlib/archive/v0.14.tar.gz>`_
- `Retrieve the Agda Standard Library <https://github.com/agda/agda-stdlib/archive/v0.17.tar.gz>`_
- `Setup the Standard Library`_
- Get used to the `Emacs mode`_
- RTF `Agda Manual`_ if necessary
......@@ -69,10 +69,10 @@ to begin your exploration with a bird-eye view of the project.
.. References:
.. _`Install Agda`: http://agda.readthedocs.io/en/v2.5.3/getting-started/installation.html
.. _`Setup the Standard Library`: http://agda.readthedocs.io/en/v2.5.2/tools/package-system.html#example-using-the-standard-library
.. _`Emacs mode`: http://agda.readthedocs.io/en/latest/tools/emacs-mode.html
.. _`Agda manual`: https://agda.readthedocs.io/en/v2.5.3/
.. _`Install Agda`: https://agda.readthedocs.io/en/v2.5.4.1/getting-started/installation.html
.. _`Setup the Standard Library`: https://agda.readthedocs.io/en/v2.5.4.1/tools/package-system.html#example-using-the-standard-library
.. _`Emacs mode`: http://agda.readthedocs.io/en/v2.5.4.1/tools/emacs-mode.html
.. _`Agda manual`: https://agda.readthedocs.io/en/v2.5.4.1/
.. _`Evolution of a Typechecker`: https://github.com/pedagand/typechecker-evolution
.. TODO: any other useful resources for setting things up?
......
......@@ -87,7 +87,7 @@ Nowadays:
open import Data.Unit hiding (setoid ; _≟_)
open import Data.Nat renaming (_*_ to _*ℕ_)
open import Data.Nat.DivMod
open import Data.Fin hiding (_+_ ; raise ; _-_)
open import Data.Fin hiding (_+_ ; raise ; _-_ ; _≟_)
open import Data.Product
open import Function
......@@ -337,9 +337,9 @@ function! For which ``_>>=_`` amounts to composition and ``return`` is
the identity function. Monads can be understood as offering "enhanced"
functions, presenting a suitable notion of composition and identity
*as well as* effectful operations. For the programmer, this means that
we have ``let _ = _ in _`` for pure functions and ``_>>=_`` for
effectful functions, both subject to (morally) the same laws of
function composition.
we have ``let x = e₁ in e₂ ≅ e₁ (λ x → e₂)`` for pure functions and
``let! x = e₁ in e₂ ≅ e₁ >>= λ x → e₂`` for effectful functions, both
subject to (morally) the same laws of function composition.
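To make the analogy concrete, here is a minimal sketch in OCaml (not part
of the Agda development) using the option monad, where OCaml's standard
binding operator ``let*`` plays the role of ``let!``:

.. code-block:: ocaml

   (* Option monad: [return] is the unit, [>>=] the "enhanced" composition. *)
   let return x = Some x
   let ( >>= ) m k = match m with None -> None | Some x -> k x
   let ( let* ) m k = m >>= k

   (* pure:      let  x = e1 in e2  ≅  (fun x -> e2) e1   *)
   (* effectful: let* x = e1 in e2  ≅  e1 >>= fun x -> e2 *)
   let example : int option =
     let* x = Some 1 in
     let* y = Some 2 in
     return (x + y)          (* Some 3 *)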
--------------------------------
......@@ -616,10 +616,12 @@ form. This is captured by two statements, a *soundness* result and a
lemmas (whose proof is left as an exercise)::
pf-sound : ∀{A} → (p : StateF A) → p ∼ ⌈ norm p ⌉
pf-sound = {!!}
pf-sound {A} = {!!}
where open import Relation.Binary.EqReasoning (setoid A)
pf-complete : ∀ {A} {p q : StateF A} → p ∼ q → ∀{s} → eval p s ≡ eval q s
pf-complete = {!!}
where open ≡-Reasoning
so as to focus on the overall architecture of the proof.
......
......@@ -9,7 +9,7 @@
open import Data.Nat hiding (_*_ ; _≤_)
open import Data.Maybe
open import Data.Product
open import Data.List hiding (_++_)
open import Data.List hiding (_++_ ; tail ; lookup)
open import Data.String
open import Function hiding (id ; const)
......@@ -872,9 +872,9 @@ The Rising Sea
infix 30 _⊢Nf_
infix 30 _⊢Ne_
infix 40 _⟶_
infix 40 _⟦⊢⟧_
infix 45 _⟦⇒⟧_
infix 50 _⟦×⟧_
infix 50 _⟦*⟧_
infix 30 _⊩_
......@@ -985,11 +985,10 @@ with renaming operation::
_⊢ : context → Set
ren : ∀ {Γ Δ} → Γ ⊇ Δ → Δ ⊢ → Γ ⊢
An implication in ``Sem`` is a family of implications for each context::
_⟶_ : (P Q : Sem) → Set
P ⟶ Q = ∀ {Γ} → Γ ⊢P → Γ ⊢Q
_⟦⊢⟧_ : (P Q : Sem) → Set
P ⟦⊢⟧ Q = ∀ {Γ} → Γ ⊢P → Γ ⊢Q
where open Sem P renaming (_⊢ to _⊢P)
open Sem Q renaming (_⊢ to _⊢Q)
......@@ -1023,25 +1022,35 @@ the normal forms of type unit::
⟦unit⟧ : Sem
⟦unit⟧ = Nf̂ unit
⟦tt⟧ : ∀ {P} → P ⟶ ⟦unit⟧
⟦tt⟧ : ∀ {P} → P ⟦⊢⟧ ⟦unit⟧
⟦tt⟧ ρ = tt
Similarly, we will interpret the ``_*_`` type as a product in
``Sem``, defined in a pointwise manner::
_⟦×⟧_ : Sem → Sem → Sem
P ⟦×⟧ Q = record { _⊢ = λ Γ → Γ ⊢P × Γ ⊢Q
; ren = λ { wk (x , y) → ( ren-P wk x , ren-Q wk y ) } }
_⟦*⟧_ : Sem → Sem → Sem
P ⟦*⟧ Q = record { _⊢ = λ Γ → Γ ⊢P × Γ ⊢Q
; ren = λ { wk (x , y) → ( ren-P wk x , ren-Q wk y ) } }
where open Sem P renaming (_⊢ to _⊢P ; ren to ren-P)
open Sem Q renaming (_⊢ to _⊢Q ; ren to ren-Q)
⟦pair⟧ : ∀ {P Q R} → P ⟶ Q → P ⟶ R → P ⟶ Q ⟦×⟧ R
⟦pair⟧ : ∀ {P Q R} →
P ⟦⊢⟧ Q →
P ⟦⊢⟧ R →
-------------
P ⟦⊢⟧ Q ⟦*⟧ R
⟦pair⟧ a b ρ = a ρ , b ρ
⟦fst⟧ : ∀ {P Q R} → P ⟶ Q ⟦×⟧ R → P ⟶ Q
⟦fst⟧ : ∀ {P Q R} →
P ⟦⊢⟧ Q ⟦*⟧ R →
-------------
P ⟦⊢⟧ Q
⟦fst⟧ p ρ = proj₁ (p ρ)
⟦snd⟧ : ∀ {P Q R} → P ⟶ Q ⟦×⟧ R → P ⟶ R
⟦snd⟧ : ∀ {P Q R} →
P ⟦⊢⟧ Q ⟦*⟧ R →
-------------
P ⟦⊢⟧ R
⟦snd⟧ p ρ = proj₂ (p ρ)
We may be tempted to define the exponential in a pointwise manner too:
......@@ -1068,15 +1077,15 @@ Interlude: Yoneda lemma
open Sem T renaming (_⊢ to _⊢T ; ren to ren-T)
Let ``_⊢T : context → Set`` be a semantics objects together with its
Let ``_⊢T : context → Set`` be a semantics object together with its
weakening operator ``ren-T : Γ ⊇ Δ → Δ ⊢T → Γ ⊢T``. By mere
application of ``ren-T``, we can implement::
ψ : Γ ⊢T → (∀ {Δ} → Δ ⊇ Γ → Δ ⊢T)
ψ t wk = ren-T wk t
were the ``∀ {Δ} →`` quantifier of the codomain type must be
understood in polymorphic sense. Surprisingly (perhaps), we can go
where the ``∀ {Δ} →`` quantifier of the codomain type must be
understood in a polymorphic sense. Surprisingly (perhaps), we can go
from the polymorphic function back to a single element, by providing
the ``id`` continuation::
......@@ -1099,18 +1108,18 @@ prove the isomorphism.
A slightly more abstract way of presenting this isomorphism consists
in noticing that any downward-closed set of contexts forms a valid
semantics objects. ``φ`` and ``ψ`` can thus be read as establishing an
semantics object. ``φ`` and ``ψ`` can thus be read as establishing an
isomorphism between the object ``Γ ⊢T`` and the morphisms in ``⊇[ Γ ]
⟶ T``::
⟦⊢⟧ T``::
⊇[_] : context → Sem
⊇[ Γ ] = record { _⊢ = λ Δ → Δ ⊇ Γ
; ren = λ wk₁ wk₂ → wk₂ ∘wk wk₁ }
ψ' : Γ ⊢T → ⊇[ Γ ] ⟶ T
ψ' : Γ ⊢T → ⊇[ Γ ] ⟦⊢⟧ T
ψ' t wk = ren-T wk t
φ' : ⊇[ Γ ] ⟶ T → Γ ⊢T
φ' : ⊇[ Γ ] ⟦⊢⟧ T → Γ ⊢T
φ' k = k id
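As a rough simply-typed analogue (an illustration only, not part of these
notes), the same back-and-forth can be written in OCaml for the list
functor, where ``phi`` recovers the element by applying the identity
continuation:

.. code-block:: ocaml

   (* Yoneda for the list functor: 'a list ≅ ∀ 'b. ('a -> 'b) -> 'b list *)
   type 'a yoneda = { run : 'b. ('a -> 'b) -> 'b list }

   let psi (xs : 'a list) : 'a yoneda = { run = (fun f -> List.map f xs) }
   let phi (y : 'a yoneda) : 'a list = y.run (fun x -> x)  (* the id continuation *)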
......@@ -1137,7 +1146,7 @@ in particular, that it satisfies the following isomorphism for all ``R
.. code-block:: guess
R ⟶ P ⟦⇒⟧ Q ≡ R ⟦×⟧ P ⟶ Q
R ⟦⊢⟧ P ⟦⇒⟧ Q ≡ R ⟦*⟧ P ⟦⊢⟧ Q
We write ``_⊢P⟦⇒⟧Q`` for its action on contexts. Let ``Γ : context``. We
have the following isomorphisms:
......@@ -1145,9 +1154,9 @@ have the following isomorphisms:
.. code-block:: guess
Γ ⊢P⟦⇒⟧Q ≡ ∀ {Δ} → Δ ⊇ Γ → Δ ⊢P⟦⇒⟧Q -- by ψ
≡ ⊇[ Γ ] ⟶ P⟦⇒⟧Q -- by the alternative definition ψ'
≡ ⊇[ Γ ] ⟦×⟧ P ⟶ Q -- by definition of an exponential
≡ ∀ {Δ} → Δ ⊇ Γ → Δ ⊢P → Δ ⊢Q -- by unfolding definition of ⟦×⟧, ⟶ and currying
≡ ⊇[ Γ ] ⟦⊢⟧ P⟦⇒⟧Q -- by the alternative definition ψ'
≡ ⊇[ Γ ] ⟦*⟧ P ⟦⊢⟧ Q -- by definition of an exponential
≡ ∀ {Δ} → Δ ⊇ Γ → Δ ⊢P → Δ ⊢Q -- by unfolding definition of ⟦*⟧, ⟦⊢⟧ and currying
As in the definition of ``Y``, this last member can easily be equipped
with a renaming: we therefore take it as the
......@@ -1155,45 +1164,51 @@ can easily be equipped with a renaming: we therefore take it as the
_⟦⇒⟧_ : Sem → Sem → Sem
P ⟦⇒⟧ Q = record { _⊢ = λ Γ → ∀ {Δ} → Δ ⊇ Γ → Δ ⊢P → Δ ⊢Q
; ren = λ wk₁ k wk₂ → k (wk₁ ∘wk wk₂) }
; ren = λ wk₁ k wk₂ → k (wk₁ ∘wk wk₂) }
where open Sem P renaming (_⊢ to _⊢P)
open Sem Q renaming (_⊢ to _⊢Q)
⟦lam⟧ : ∀ {P Q R} → P ⟦×⟧ Q ⟶ R → P ⟶ Q ⟦⇒⟧ R
⟦lam⟧ : ∀ {P Q R} →
P ⟦*⟧ Q ⟦⊢⟧ R →
-------------
P ⟦⊢⟧ Q ⟦⇒⟧ R
⟦lam⟧ {P} η p = λ wk q → η (ren-P wk p , q)
where open Sem P renaming (ren to ren-P)
⟦app⟧ : ∀ {P Q R} → P ⟶ Q ⟦⇒⟧ R → P ⟶ Q → P ⟶ R
⟦app⟧ : ∀ {P Q R} →
P ⟦⊢⟧ Q ⟦⇒⟧ R →
P ⟦⊢⟧ Q →
-------------
P ⟦⊢⟧ R
⟦app⟧ η μ = λ px → η px id (μ px)
**Remark:** The above construction of the exponential is taken from
MacLane & Moerdijk's `Sheaves in Geometry and Logic`_ (p.46).
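The defining isomorphism of the exponential is the presheaf version of
plain currying; for reference, its simply-typed shadow in OCaml (an
illustration, not part of the development) is just:

.. code-block:: ocaml

   (* r * p -> q  ≅  r -> p -> q *)
   let curry   (f : 'r * 'p -> 'q) : 'r -> 'p -> 'q = fun r p -> f (r, p)
   let uncurry (g : 'r -> 'p -> 'q) : 'r * 'p -> 'q = fun (r, p) -> g r p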
At this stage, we have enough structure to interpret the types::
⟦_⟧ : type → Sem
⟦ unit ⟧ = ⟦unit⟧
⟦ S ⇒ T ⟧ = ⟦ S ⟧ ⟦⇒⟧ ⟦ T ⟧
⟦ A * B ⟧ = ⟦ A ⟧ ⟦×⟧ ⟦ B ⟧
⟦ A * B ⟧ = ⟦ A ⟧ ⟦*⟧ ⟦ B ⟧
To interpret contexts, we need a terminal object::
To interpret contexts, we also need a terminal object::
⊤̂ : Sem
⊤̂ = record { _⊢ = λ _ → ⊤
; ren = λ _ _ → tt }
⟦⊤⟧ : Sem
⟦⊤⟧ = record { _⊢ = λ _ → ⊤
; ren = λ _ _ → tt }
⟦_⟧C : (Γ : context) → Sem
⟦ ε ⟧C = ⊤̂
⟦ Γ ▹ T ⟧C = ⟦ Γ ⟧C ⟦×⟧ ⟦ T ⟧
⟦ ε ⟧C = ⟦⊤⟧
⟦ Γ ▹ T ⟧C = ⟦ Γ ⟧C ⟦*⟧ ⟦ T ⟧
As usual, a type in context will be interpreted as a morphism between
their respective interpretations. The interpreter then takes the
syntactic object to its semantical counterpart::
_⊩_ : context → type → Set
Γ ⊩ T = ⟦ Γ ⟧C ⟶ ⟦ T ⟧
Γ ⊩ T = ⟦ Γ ⟧C ⟦⊢⟧ ⟦ T ⟧
lookup : ∀ {Γ T} → T ∈ Γ → Γ ⊩ T
lookup here (_ , v) = v
......@@ -1208,7 +1223,7 @@ syntactic object to its semantical counterpart::
eval {Γ} (fst {A}{B} p) = ⟦fst⟧ {⟦ Γ ⟧C}{⟦ A ⟧}{⟦ B ⟧} (eval p)
eval {Γ} (snd {A}{B} p) = ⟦snd⟧ {⟦ Γ ⟧C}{⟦ A ⟧}{⟦ B ⟧} (eval p)
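For intuition only, here is a rough OCaml sketch of this reading of terms
(not the Agda development: the context-indexed ``Sem`` structure and its
renamings are deliberately dropped). A term denotes a function from an
environment, interpreting its context, to a value, interpreting its type:

.. code-block:: ocaml

   (* Untyped sketch: unit, pairs and functions, with de Bruijn indices. *)
   type value = VUnit | VPair of value * value | VFun of (value -> value)
   type term  = TT | Var of int | Lam of term | App of term * term
              | Pair of term * term | Fst of term | Snd of term

   let rec eval (env : value list) : term -> value = function
     | TT          -> VUnit
     | Var n       -> List.nth env n
     | Lam b       -> VFun (fun v -> eval (v :: env) b)
     | App (f, a)  -> (match eval env f with
                       | VFun g -> g (eval env a)
                       | _ -> failwith "ill-typed")
     | Pair (a, b) -> VPair (eval env a, eval env b)
     | Fst p       -> (match eval env p with
                       | VPair (a, _) -> a | _ -> failwith "ill-typed")
     | Snd p       -> (match eval env p with
                       | VPair (_, b) -> b | _ -> failwith "ill-typed")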
Reify and reflect are defined at a given syntactic context, we
Reify and reflect are defined for a given syntactic context; we
therefore introduce suitable notations::
[_]⊩_ : context → type → Set
......
The authors of this programming project are:
Assignment (sujet.pdf) Yann Régis-Gianas.
Source code (src) Yann Régis-Gianas.
This programming project is distributed under an Apache 2.0 license.
let parse_input_source_file source_filename = Lexing.(
let cin = open_in source_filename in
let buf = Lexing.from_channel cin in
buf.lex_curr_p <- { buf.lex_curr_p with pos_fname = source_filename };
Parser.program Lexer.token buf
)
.PHONY: all clean
all:
dune build joujou.exe
ln -sf _build/default/joujou.exe joujou
clean:
dune clean
rm -f *~ joujou
open Source
open Target
open Typechecker
(** [source_to_categories] translates a [source] program into a [target]
    program made of categorical combinators. *)
let source_to_categories : Source.program -> Target.program = fun source ->
failwith "Student! This is your job!"
(ocamllex lexer)
(menhir
(modules parser)
(flags --explain)
)
(executable
(name joujou)
(libraries base pprint)
(flags -rectypes)
)
(lang dune 1.4)
(using menhir 2.0)
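(* Compilation pipeline: parse the input, typecheck it, erase source
   locations, translate to categorical combinators, simplify, and emit an
   OCaml functor into the target file. *)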
let main =
IO.parse_input_source_file (Options.get_source_file ())
|> Typechecker.program
|> Source.remove_locations_in_program
|> Compiler.source_to_categories
|> Simplifier.rewrite
|> OCamlGenerator.generate_functor (Options.target_file ())
{ (* Emacs, use -*- tuareg -*- to open this file. *)
open Parser
let enter_newline lexbuf =
Lexing.new_line lexbuf;
lexbuf
}
let newline = ('\010' | '\013' | "\013\010")
let blank = [' ' '\009' '\012']
let id = ['a'-'z''0'-'9''_']+
let decimal = ['0'-'9']+
let fn = decimal ("." decimal?)?
rule token = parse
| newline { enter_newline lexbuf |> token }
| blank { token lexbuf }
| "(*" { comment 0 lexbuf }
| "->" { ARROW }
| ":" { COLON }
| "," { COMMA }
| "=" { EQUAL }
| "fun" { FUN }
| "let" { LET }
| "in" { IN }
| "fst" { FST }
| "snd" { SND }
| "float" { FLOAT }
| "(" { LPAREN }
| ")" { RPAREN }
| "*" { TIMES }
| "-" { MINUS }
| "+" { PLUS }
| fn as f { LFLOAT (float_of_string f) }
| id as s { ID s }
| eof { EOF }
| _ as c {
Printf.eprintf "Lexing error at %d: Unexpected `%c'."
lexbuf.Lexing.lex_curr_pos c;
exit 1
}
and comment depth = parse
| "*)" {
if depth = 0 then token lexbuf else comment (depth - 1) lexbuf
}
| "(*" {
comment (depth + 1) lexbuf
}
| eof {
Printf.eprintf "Unexpected EOF inside comments.";
exit 1
}
| _ {
comment depth lexbuf
}
open PPrintEngine
open Target
let string_of doc =
let b = Buffer.create 31 in
PPrintEngine.ToBuffer.pretty 0.8 120 b doc;
Buffer.contents b
let h_term = Hashtbl.create 13
let h_ok = Hashtbl.create 13
let fresh_name =
let r = ref 0 in
fun () -> incr r; Printf.sprintf "t_%d" !r
let defs = ref []
let introduce_defs () =
String.concat "\n"
(List.rev_map
(fun (x, d) -> Printf.sprintf "let %s = %s\n" x (string_of d))
!defs)
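(* When --compact is set, memoize the document produced for each key: the
   first occurrence is bound to a fresh top-level definition (recorded in
   [defs]) and every later occurrence prints as that name. *)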
let memogen h f =
if not !Options.compact then f
else fun x ->
try
Hashtbl.find h x
with
| Not_found ->
let y = f x in
let v = fresh_name () in
defs := (v, y) :: !defs;
Hashtbl.add h x (string v);
string v
let string_of_term t =
let rec term x = (memogen h_term @@ function
| App (a, b) -> group (term a ^^ break 1 ^^ mayparen_term b)
| t -> atom t) x
and atom = function
| Identity ok -> apply "id" [ok]
| Curry (oka, okb, okc) -> apply "curry" [oka; okb; okc]
| UnCurry (oka, okb, okc) -> apply "uncurry" [oka; okb; okc]
| Apply (oka, okb) -> apply "apply" [oka; okb]
| Fork (oka, okb, okc) -> apply "fork" [oka; okb; okc]
| Exl (oka, okb) -> apply "exl" [oka; okb]
| Exr (oka, okb) -> apply "exr" [oka; okb]
| Compose (oka, okb, okc) -> apply "compose" [oka; okb; okc]
| It ok -> apply "it" [ok]
| UnitArrow ok -> apply "unit_arrow" [ok]
| Literal l -> string (Source.string_of_literal l)
| Primitive p -> string (primitive p)
| _ -> assert false (* By definition of term. *)
and mayparen_term = function
| (Literal _ | Primitive _) as t -> term t
| t -> PPrintCombinators.parens (nest 2 (term t))
and apply id oks =
if not !Options.without_oks then
group (nest 2 (string id ^^ break 1
^^ PPrintCombinators.separate_map (break 1) ok oks))
else
string id
and primitive = Source.(function
| Add -> "addC"
| Mul -> "mulC"
| Neg -> "negC"
| Sin -> "sinC"
| Cos -> "cosC"
| Inv -> "invC"
| Exp -> "expC"
)
and ok x = (memogen h_ok @@ function
| OkFloat ->
string "ok_float"
| OkUnit ->
string "ok_unit"
| OkPair (ok1, ok2) ->
group (PPrintCombinators.parens (
string "ok_pair"
^^ break 1 ^^ ok ok1
^^ break 1 ^^ ok ok2
))
| OkArrow (ok1, ok2) ->
group (PPrintCombinators.parens (
string "ok_arrow"
^^ break 1 ^^ ok ok1
^^ break 1 ^^ ok ok2
))) x
in
let b = Buffer.create 31 in
PPrintEngine.ToBuffer.pretty 0.8 120 b (nest 2 (term t));
Buffer.contents b
let string_of_definition ((Source.Id x, _), t) =
Printf.sprintf "let %s = C.(\n%s\n)" x (string_of_term t)
let string_of_program t =
let defs = List.map string_of_definition t in
introduce_defs ()
^ String.concat "\n" defs
let generate_functor filename target =
let cout = open_out filename in
Printf.fprintf cout "\
open Category
module Make (C : sig
include CartesianClosedCat
val ok_float: float ok
include NumCat
with type t := float
with type ('a, 'b) k := ('a, 'b) k
include FloatingCat
with type t := float
with type ('a, 'b) k := ('a, 'b) k
end) = \
struct
include CartesianCatDerivedOperations (C)
open C
%s
end
" (string_of_program target);
close_out cout
let typecheck_only = ref false
let without_oks = ref false
let simplify = ref true
let compact = ref false
let options = Arg.([
"--typecheck", Unit (fun () -> typecheck_only := true),
" Typecheck and stop.";
"--no-ok", Unit (fun () -> without_oks := true),
" Do not produce 'ok' witnesses.";
"--no-simplify", Unit (fun () -> simplify := false),
" Do not apply the simplifier.";
"--compact", Unit (fun () -> compact := true),
" Compactify the generated OCaml code."
])
let source_file =
ref ""
let msg = "joujou [options] source_file"
let get_source_file () =
if !source_file = "" then (Arg.usage options msg; exit 1);
!source_file
let target_file () =
Filename.chop_extension (get_source_file ()) ^ ".ml"
let analyse =
Arg.(parse (align options) ((:=) source_file) msg)
%{ (* Emacs, use -*- tuareg -*- to open this file. *)
open Source
let parsing_error pos msg =
Printf.eprintf "%s:\n %s\n" (Position.string_of_pos pos) msg;
exit 1
let var_or_primitive = function
| Id "cos" -> Primitive Cos
| Id "sin" -> Primitive Sin
| Id "exp" -> Primitive Exp
| Id "inv" -> Primitive Inv
| Id "neg" -> Primitive Neg
| x -> Var x
let last xs = List.(hd (rev xs))
let rec tuple = function
| [] -> assert false (* By syntax. *)
| [x] -> x
| x :: xs -> Position.(
let pos = join (position x) (position (last xs)) in
with_pos pos (Pair (tuple xs, x))
)
let tuple xs = tuple (List.rev xs)
%}
%token EOF
%token FUN LET IN FLOAT FST SND
%token ARROW LPAREN RPAREN COLON EQUAL COMMA
%token TIMES PLUS MINUS
%token<string> ID
%token<float> LFLOAT
%type<Source.term'> term
%start<Source.program_with_locations> program
%right ARROW IN
%left PLUS
%left TIMES
%%
program: a=definition* EOF
{
a
}
| error {
parsing_error (Position.lex_join $startpos $endpos) "Syntax error."
}
term: a=abstraction {
a
}
| d=definition IN t=located(term)
{
make_let (fst d) (snd d) t
}
| lhs=located(term) b=located(binop) rhs=located(term)
{
let with_pos = Position.(with_pos (join (position lhs) (position b))) in
let b = Position.(with_pos (position b) (Primitive (value b))) in
App (with_pos (App (b, lhs)), rhs)
}
| t=simple_term
{