First, create a virtualenv using virtualenvwrapper in order to sandbox our Python environment for development:
$ mkvirtualenv my-site
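If you prefer not to use virtualenvwrapper, a plain venv gives an equivalent sandbox (a minimal sketch, assuming Python 3 is available as python3; the directory name is arbitrary):
$ python3 -m venv ~/.virtualenvs/my-site
$ source ~/.virtualenvs/my-site/bin/activate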
Start all dependent services using docker-compose (this will start PostgreSQL, Elasticsearch 6, RabbitMQ and Redis):
$ docker-compose up -d
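Before continuing, you can verify that the containers came up, for example with:
$ docker-compose ps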
Note: Make sure you have enough virtual memory for Elasticsearch in Docker:
# Linux
$ sysctl -w vm.max_map_count=262144
# macOS
$ screen ~/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/tty
<enter>
linut00001:~# sysctl -w vm.max_map_count=262144
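You can check the current value with sysctl, and on Linux the setting can also be persisted across reboots via /etc/sysctl.conf (a sketch; paths and conventions may differ by distribution):
$ sysctl vm.max_map_count
$ echo 'vm.max_map_count=262144' | sudo tee -a /etc/sysctl.conf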
Next, bootstrap the instance (this will install all Python dependencies and build all static assets):
$ ./scripts/bootstrap
Next, create database tables, search indexes and message queues:
$ ./scripts/setup
Start the webserver:
$ ./scripts/server
Start a background worker:
$ celery -A invenio_app.celery worker -l INFO
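If your instance relies on periodic (scheduled) tasks, you may also need to run the Celery beat scheduler in a separate terminal. This is optional and only a sketch, since it depends on whether your instance defines a beat schedule:
$ celery -A invenio_app.celery beat -l INFO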
Start a Python shell:
$ ./scripts/console
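The console is handy for inspecting the running application. As a rough example, assuming it drops you into a Python shell with the Flask application context loaded, you can look at configuration values like this:
>>> from flask import current_app
>>> current_app.config['SQLALCHEMY_DATABASE_URI']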
To upgrade an existing instance, simply run:
$ ./scripts/update
Run the test suite via the provided script:
$ ./scripts/test
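If you only want to run part of the suite, pytest's own selection options can be used. This is just a sketch, since ./scripts/test may set up environment variables that a bare pytest invocation will not:
$ pytest tests/ -k "some_pattern"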
By default, end-to-end tests are skipped. You can include the E2E tests like this:
$ env E2E=yes ./scripts/test
For more information about end-to-end testing, see pytest-invenio.
You can build the documentation with:
$ python setup.py build_sphinx
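Note that the setup.py integration relies on Sphinx's setuptools command, which has been deprecated and removed in recent Sphinx releases. If it is unavailable, you can invoke Sphinx directly instead (a sketch, assuming the documentation sources live in docs/):
$ sphinx-build -b html docs docs/_build/html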
You can simulate a full production environment using docker-compose.full.yml. You can start it like this:
$ docker-compose -f docker-compose.full.yml up -d
In addition to the normal docker-compose.yml, this one will also start:
- HAProxy (load balancer)
- Nginx (web frontend)
- uWSGI (application container)
- Celery (background task worker)
- Flower (Celery monitoring)
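When you are done with the full stack, you can check its status and shut it down with the usual docker-compose commands:
$ docker-compose -f docker-compose.full.yml ps
$ docker-compose -f docker-compose.full.yml down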