
Commit 9ec5ff6

authored Apr 16, 2024
Consolidate READMEs (#3812)
Part of #3800
1 parent b0d4ee9 commit 9ec5ff6

3 files changed: +72, -219 lines changed
 

‎Dockerfile

-53 lines: this file was deleted.

‎README.md

+72, -49 lines changed
@@ -1,56 +1,42 @@
-# The Kore Language
+# Symbolic Execution Engine for the K Framework

-Kore is the "core" part of the K framework.
+## Structure of this project

-## What is Kore all about?
+This project implements the tools for symbolically executing programs in languages specified using the [K Framework]. The symbolic execution is performed at the level of `Kore` --- an intermediate representation produced by the K compiler from a language specification.

-In short, we need a formal semantics of K.
-In K, users can define formal syntax and semantics of
-programming languages as K definitions, and automatically obtain
-parsers, interpreters, compilers, and various verification tools
-for their languages.
-Therefore K is a language-independent framework.
+The K framework is a term rewriting-based specification language, and therefore its symbolic execution backend implements a symbolic term rewriter. Such a rewriter is implemented by the `kore-rpc-booster` executable --- a JSON-RPC-based server that implements the [KORE RPC protocol].

-Thanks to years of research in matching logic and reachability
-logic, we know that all K does can be nicely formalized as
-logic reasoning in matching logic.
-To give K a formal semantics, we only need to formally specify
-the underlying matching logic theories with which K does reasoning.
-In practice, these underlying theories are complex and often
-infinite, and it is tricky to specify infinite theories without
-a carefully designed formal specification language.
-And Kore is such a language.
+The `kore-rpc` executable is a legacy version of the RPC-based rewriter that implements the same protocol. Finally, `kore-exec` and `kore-repl` are the deprecated non-RPC entry points.

-## Structure of this project
+Note that this project is low-level and is not intended to be a user-facing tool. For the Python-based theorem prover built on the KORE RPC protocol, see the [pyk] package in the [K Framework] repository.
+
+### Kompiling a K definition and running the RPC server

-The [docs] directory contains a collection of documents
-that describe the mathematical foundation of Kore and a BNF grammar
-that defines the syntax of Kore language. See
-[/docs/introduction.md](https://github.com/runtimeverification/haskell-backend/blob/master/docs/introduction.md)
-for an overview of the components of Kore.
+The `kore-rpc-booster` binary takes a `kore` definition file, parses and internalises it, and then launches an RPC server which executes requests against this definition. It additionally accepts a path to a dynamic library compiled by the LLVM backend, which is used for simplification of Bool-sorted terms. In order to build the kore definition and the shared library out of a K definition, first call
+
+```
+kompile --llvm-kompile-type c my_definition.k
+```

-The `kore` project is an implementation in Haskell of a Kore parser and symbolic execution engine,
-for use with the [K Framework] as a backend.
+and then launch the server via
+
+```
+kore-rpc-booster ./my_definition-kompiled/definition.kore --module MY-DEFINITION --llvm-backend-library ./my_definition-kompiled/interpreter
+```
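Once the server is running, clients communicate with it over JSON-RPC. The snippet below is only a hypothetical sketch of what an `execute` request could look like on the wire: the port number, the use of `nc`, and the empty placeholder `term` are assumptions; the authoritative request schema is the [KORE RPC protocol] document.

```sh
# Hypothetical sketch: the port (31337) is an assumption, the "term" field is a
# placeholder for a real KORE JSON term, and the exact request schema is
# defined in the KORE RPC protocol document linked above.
cat <<'EOF' | nc 127.0.0.1 31337
{"jsonrpc": "2.0", "id": 1, "method": "execute",
 "params": {"max-depth": 100, "state": {"format": "KORE", "version": 1, "term": {}}}}
EOF
```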

## Building

-Besides [git], you will need [stack] or [cabal] to build `kore`.
+Besides [git], you will need [stack] or [cabal] to build the project.

```sh
-stack build kore
+stack build all
# or
-cabal build kore
+cabal build all
```

You may pass `--fast` to `stack build` or `-O0` to `cabal build` in order to
disable compiler optimizations and make the build faster at the cost of slower runtime.
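For example, combining the build commands above with those flags:

```sh
# Faster, unoptimized development builds
stack build all --fast
# or
cabal build all -O0
```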

-Using [make]:
-
-```sh
-make all # builds all binaries
-```
-
## Developing

Developers will require all the dependencies listed above,
@@ -68,27 +54,15 @@ You can install/have access to K by either:
* using [kup]
* using a pre-built binary (see the releases page in the K repository)
* if you use Nix, see the section below
-* using the `Dockerfile` to run the integration tests inside a container
* or by just building K from source

-### Running integration tests with Docker
-
-Use `docker.sh` to run commands inside the container:
-
-``` sh
-./docker/build.sh # run once when dependencies change
-./docker/run.sh make all # build the backend
-./docker/run.sh make test # run all tests
-./docker/run.sh make -C test/imp test # run all tests in test/imp
-```
-
### Recommended dependencies

For setting up a development environment, we recommend:

- [direnv] to make the project's tools available in shells and editors.
- [ghcup] or [Nix] for managing required Haskell tooling
-- [hlint] for compliance with project guidelines.
+- [hlint] and [fourmolu] for compliance with project guidelines (see the sketch below).
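A typical local check with these tools might look as follows (a sketch using their standard command-line interfaces; the project may also wrap them in its own scripts):

```sh
# Lint the sources and check formatting before opening a PR
hlint .
fourmolu --mode check $(find . -name '*.hs')
```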

### Running Haskell Language Server

@@ -100,11 +74,15 @@ set up your [Nix] environment to build the server.
Prerequisite: build the project with either Stack or Cabal.

Instructions on integrating with VSCode:
-1. Install the [Haskell extension]
+1. Install the [Haskell extension]
2. Go to `Extension Settings` and pick `GHCup` in the `Manage HLS` dropdown
3. (Optional) Manually set the GHCup executable path
4. (Extra) Set `Formatting Provider` to `fourmolu` for correct formatting

+You may also need to install the [nix-env-selector](https://marketplace.visualstudio.com/items?itemName=arrterian.nix-env-selector) extension.
+
+The `nix-env-selector` extension may prompt for the workspace to be re-loaded. Once re-loaded, HLS should start working.
+
### Developing with Nix

To build and run nix-based packages at RV, please follow these instructions to set up nix:
@@ -178,6 +156,18 @@ nix run .#format Foo.hs
We provide a development nix shell with a suitable development environment and
a binary cache at [runtimeverification.cachix.org]. The development shell can be launched via `nix develop`, followed by `stack build/test/etc`.

+### Nix-direnv
+
+Using a version of direnv that works with nix (https://github.com/nix-community/nix-direnv) allows seamless loading and unloading of the nix shell, which provides all the required packages such as `cabal`, `hpack`, and `fourmolu`. Use the above link to install `nix-direnv`, making sure to hook direnv into whichever shell you are using (https://direnv.net/docs/hook.html). You can use the default nix shell (currently GHC version 9.6.4) by running
+
+```bash
+echo "use flake" > .envrc
+```
+
+Finally, run `direnv allow` inside the repo folder to load up the nix shell.
+
+Note that only `cabal` currently works within the nix shell. Since cabal does not support the HPack `package.yaml` file format, any changes to that file require running `hpack` before they are picked up by cabal.
+
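For example, after editing a `package.yaml` (a sketch; run `hpack` in the directory of whichever package you changed):

```sh
# Regenerate the .cabal file from package.yaml, then rebuild
hpack
cabal build all
```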
### Upgrading dependencies

When one of the package description files (`kore.cabal`, `kore-rpc-types.cabal`) changes, or when upgrading to a newer `stack` resolver, the dependencies need to be consolidated to avoid accidental breakage from incompatible upstream updates. We use a `cabal.project.freeze` file to pin the dependencies to what the current `stack` resolver is using.
@@ -216,6 +206,36 @@ integration-shell:me@somewhere:haskell-backend$ exit
me@somewhere:haskell-backend$
```

+### Integration/Performance tests of downstream projects
+
+#### scripts/performance-tests-{kevm, kontrol, mx}.sh
+
+Call these scripts from the root of the repo to obtain performance numbers for the KEVM and Kontrol test suites. These are necessary for any new feature which is expected to modify the performance of the booster, and the timings should be included in the PR.
+
+
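For example, from the repository root (script names as in the heading above):

```sh
# Collect performance numbers for the KEVM and Kontrol test suites
./scripts/performance-tests-kevm.sh
./scripts/performance-tests-kontrol.sh
```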
+#### scripts/booster-analysis.sh
+
+This script can be used with any folder containing bug reports to build an analysis of fallback/abort reasons in the booster. To obtain bug reports, first run `PYTEST_PARALLEL=8 scripts/performance-tests-kevm.sh --bug-report`, which will generate tarballs for all the tests and drop them into `scripts/bug-reports/`. Then call `scripts/booster-analysis.sh scripts/bug-reports/kevm-v1.0.417-main`.
+
+
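Putting both steps together:

```sh
# 1. Generate bug-report tarballs from the KEVM performance suite
PYTEST_PARALLEL=8 scripts/performance-tests-kevm.sh --bug-report
# 2. Summarise the fallback/abort reasons recorded in those bug reports
scripts/booster-analysis.sh scripts/bug-reports/kevm-v1.0.417-main
```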
+### Generating an integration test from a bug-report.tar.gz
+
+1) Rename `bug-report.tar.gz` to something more descriptive, like `issue-123.tar.gz`
+2) Copy the tar file into `test/rpc-integration/`
+3) Run `./generateDirectoryTest.sh issue-123.tar.gz`. This will copy the definition files into `resources/` and the RPC commands into `test-issue-123/`
+4) Run the test via `./runDirectoryTest.sh test-issue-123`
+
+Note that it is also possible to run a test directly from a tarball with `runDirectoryTest.sh`, skipping the unpacking step. This is useful in automated workflows that involve running several tarballs. A combined walkthrough is sketched below.
+
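A combined walkthrough of the steps above (a sketch; `issue-123` stands for whatever name you chose in step 1, and the path to the downloaded bug report is hypothetical):

```sh
cd test/rpc-integration
cp ~/Downloads/bug-report.tar.gz issue-123.tar.gz   # steps 1 and 2
./generateDirectoryTest.sh issue-123.tar.gz         # step 3
./runDirectoryTest.sh test-issue-123                # step 4
```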
+### Pretty printing KORE JSON
+
+There is a simple utility called `pretty` which can be used to pretty-print a KORE JSON term from a bug report that does not contain the original K definition:
+
+```
+cabal run pretty -- ../definition.kore <(jq '.result.state' XXX_response.json)
+```
+

[docs]: https://github.com/runtimeverification/haskell-backend/tree/master/docs
@@ -229,10 +249,13 @@ me@somewhere:haskell-backend$
[haskell-language-server]: https://github.com/haskell/haskell-language-server
[language server]: https://langserver.org/
[hlint]: https://github.com/ndmitchell/hlint
+[fourmolu]: https://github.com/fourmolu/fourmolu
[runtimeverification.cachix.org]: https://runtimeverification.cachix.org/
[Nix]: https://nixos.org
[entr]: https://github.com/eradman/entr
[fd]: https://github.com/sharkdp/fd
[kup]: https://github.com/runtimeverification/k/releases/latest
[ghcup]: https://www.haskell.org/ghcup/
[Haskell extension]: https://marketplace.visualstudio.com/items?itemName=haskell.haskell
+[KORE RPC protocol]: ./docs/2022-07-18-JSON-RPC-Server-API.md
+[pyk]: https://github.com/runtimeverification/k/tree/master/pyk

‎booster/README.md

-117 lines changed
@@ -5,120 +5,3 @@ A simpler and faster version of [K-Framework's haskell-backend](../haskell-backe
* A simple rewrite engine that focuses on the K configuration as the main term
* Aims to solve _easy and common_ rewrites quickly, rather than _all_ rewrites completely
* Reverts to the standard backend for complex unification and term simplification
-
-## Kompiling a K definition and running the RPC server
-
-The `kore-rpc-booster` binary takes a `kore` file definition, parses and internalises it and then launches an RPC server, which executes requests agains this definition. It additionally accepts a path to a dynamic library compiled by the LLVM backend, which is used for simplification of bool sorted terms. In order to build the kore definition and the shared library out of a K definition, first call
-
-```
-kompile --llvm-kompile-type c my_defintion.k
-```
-
-and then launch the server via
-
-```
-kore-rpc-booster ./my_defintion-kompiled/definition.kore --module MY-DEFINITION --llvm-backend-library ./my_defintion-kompiled/interpreter
-```
-
-
-## Development
-
-### Package structure
-
-At the moment, all code lives in a single package. This might change in the future as the software grows in terms of features.
-
-### Building
-
-The software can be built with `stack` or `cabal`.
-
-```sh
-$ stack build
-# or
-$ cabal build
-```
-
-* Prerequisites: `stack` or `cabal`, `ghc-9.2.7`
-* Stackage resolver: `lts-20.20` (`ghc-9.2.7`)
-
-### Nix
-
-There are several things you can do, to make the development via nix as seamless as possible.
-
-
-#### Nix shell
-
-To open the nix shell, run `nix develop`. You will need nix version 2.4 or newer with nix flakes enabled.
-
-
-#### Nix-direnv
-
-Using a version of direnv that works with nix (https://github.com/nix-community/nix-direnv) allows seamless loading and unloading of the nix shell, which adds all the required packages such as `cabal`, `hpack`, `fourmolu`, etc. Use the above link to install `nix-direnv`, making sure to hook direnv into whichever shell you are using (https://direnv.net/docs/hook.html). You can use the default nix shell (currently GHC version 9.2.7) by running
-
-```bash
-echo "use flake ." > .envrc
-```
-
-Finally, run `direnv allow` inside the repo folder to load up the nix shell.
-
-Note that only `cabal` currently works within the nix shell and since it does not support the HPack `package.yaml` file format, any changes to this file will require running `hpack` before they are picked up by cabal.
-
-### scripts/update-haskell-backend.sh
-
-To bump the version of the haskell-backend consistently within the project (i.e. in nix, cabal.project and stack.yaml) call
-
-```
-nix run .#update-haskell-backend
-```
-
-you can optionally pass a commit hash above if you don't want master.
-
-### scripts/performance-tests-{kevm, kontrol}.sh
-
-Call these scripts from the root of the repo to obtain performance numbers for the KEVM and Kontrol test suites. These are necessary for any new feature which is expected to modify the perfromance of the booster and the timings should be includedf in the PR.
-
-
-### scripts/booster-analysis.sh
-
-This scipt can be used with any folder containing bug reports to build an anlysis of fallback/abort reasons in the booster. To obtain bug reports, first run `PYTEST_PARALLEL=8 scripts/performance-tests-kevm.sh --bug-report`, which will generate tarballs for all the tests and drop them into `scripts/bug-reports/`. Then call `scripts/booster-analysis.sh scripts/booster-analysis.sh scripts/bug-reports/kevm-v1.0.417-main`
-
-### HLS in VSCode
-
-To get HLS working in VSCode, install these two extensions:
-
-https://marketplace.visualstudio.com/items?itemName=arrterian.nix-env-selector
-https://marketplace.visualstudio.com/items?itemName=haskell.haskell
-
-The `nix-env-selector` extension may prompt for the workspace to be re-loaded. Once re-loaded, HLS should start working.
-
-## Eventlog tracing
-
-Besides compiling the backend with profiling mode, we can also enable a targeted profiling mode by emitting useful debug events into the eventlog. So far we can emit/trace the following:
-
-* LLVM backend API calls - using `--trace llvm-calls +RTS -l-au` we can collect all the LLVM backend API calls the server performs during execution. Running the obtained eventlog through `eventlog-parser` will produce an `llvm_calls.c` file of the form:
-
-```c
-kore_symbol* sym_m5952705877914568462 = kore_symbol_new("LblnotBool'Unds'");
-kore_pattern* pat_m7294887483024610198 = kore_composite_pattern_from_symbol(sym_m5952705877914568462);
-kore_symbol* sym_2859997003983430356 = kore_symbol_new("LblSet'Coln'in");
-kore_pattern* pat_7796859658648783000 = kore_composite_pattern_from_symbol(sym_2859997003983430356);
-kore_symbol* sym_m6495506210664726973 = kore_symbol_new("inj");
-kore_sort* sort_m1174205405547972024 = kore_composite_sort_new("SortId");
-...
-```
-
-* Timing information in IO - using `--trace timing +RTS -lsu` we can instrument code with `Trace.timeIO "foo" ...` calls which will measure time spent in `...` and attach the label `foo` to this region in the speedscope profile. `eventlog-parser` will produce a JSON file of these calls viewable in the speedscope app.
-
-## Generating an integration test from a bug-report.tar.gz
-
-1) Rename `bug-report.tar.gz` to something more descriptive like `issue-123.tar.gz`
-2) Copy the tar file into `test/rpc-integration/`
-3) Run `./generateDirectoryTest.sh issue-123.tar.gz`. This will copy the definition files into `resources/` and rpc commands into `test-issue-123/`
-4) Run the test via `./runDirectoryTest test-issue-123`
-
-## Pretty printing KORE JSON
-
-There is a simple utility called pretty which can be used to pretty print a KORE JSON term from a bug report, which does not contain the original K definition:
-
-```
-cabal run pretty -- ../definition.kore <(jq '.result.state' XXX_response.json)
-```
