Commit 2b80952f authored by Ryan Herbert

dev.org, server.org move some details from dev.org to server.org

Some information found in dev.org was more relevant to server.org,
which is directed towards users rather than developers.
parent 9716fc99
@@ -635,62 +635,6 @@ make functional
- reporter A monitoring utility that can be configured to send
monitoring information to a remote server
** Configuring the Vidjil container
If you are using this environment on localhost, everything should
work out of the box; simply skip ahead to the section about building the
image and running the services.
However, you may need to configure the setup further to make it
available to a whole network.
Here is a list of the configuration files found in the vidjil directory:
- conf/conf.js contains various variables for the vidjil browser
- conf/defs.py contains various variables for the vidjil server
- conf/gzip.conf configuration for gzip in nginx
- conf/gzip_static.conf same as the previous, but for static resources
- conf/uwsgi.ini configuration required to run vidjil with uwsgi
- sites/nginx configuration required when running vidjil with nginx
- scripts/nginx-entrypoint.sh entrypoint for the nginx service (not currently in use)
- scripts/uwsgi-entrypoint.sh entrypoint for the uwsgi service; ensures the
owners of the relevant volumes are correct within the container, then starts uwsgi
Here are some notable configuration changes you should consider:
- Change the mysql user/password in docker-compose.yml. You will also
need to change the DB_ADDRESS in conf/defs.py to match it.
- Change the hostname in the nginx configuration vidjil/sites/nginx_conf.
This may be required if you are using Vidjil on a network.
- Change the default admin password. Log in as plop@plop.com (password: 1234)
and go to the following URL:
https://<your hostname>/vidjil/default/user/change_password
- Change the SSL certificates. Building the vidjil-server image creates a
self-signed certificate for the sake of convenience, so that HTTPS queries
work from the start, but this may not be acceptable for a production
environment.
The current method for replacing the certificates is to mount them to
/etc/nginx/ssl with Docker volumes in docker-compose.yml (see the sketch
after this list).
- Change the FROM_EMAIL and ADMIN_EMAILS variables in conf/defs.py. These
represent the sender email address and the destination email addresses,
used in reporting patient milestones and server errors.
- Change the database password. In the mysql directory you will find an
entrypoint script which creates the database and the user, and sets that
user's password.
This is the password you need to match in the defs.py file in the
vidjil configuration.
- Change the volumes in docker-compose.yml. By default all files that
require saving outside of the containers (the database, uploads, vidjil
results and log files) are stored in /opt/vidjil, but you can change
this by editing the paths in the volumes.
- Configure the reporter. Ideally this container should run on a remote
server so that it can still report when the main server is down, but we have packed it here for convenience.
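For the SSL certificates, here is a minimal sketch of a replacement procedure, assuming the nginx service is named nginx in docker-compose.yml and that /opt/vidjil/ssl is used as the host directory (the service name, paths and file names are assumptions and must match your own setup):
#+BEGIN_SRC sh
# Copy your own certificate and key to the host directory that will be mounted
# (file names are placeholders; they must match those referenced in sites/nginx).
sudo mkdir -p /opt/vidjil/ssl
sudo cp my-domain.crt my-domain.key /opt/vidjil/ssl/
# In docker-compose.yml, mount this directory over the container's /etc/nginx/ssl:
#   volumes:
#     - /opt/vidjil/ssl:/etc/nginx/ssl
# Recreate the service so nginx picks up the new certificates.
docker-compose up -d --force-recreate nginx
#+END_SRC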
** Building and starting the environment
Building the image is simple and can be done prior to editing the
@@ -714,158 +658,3 @@ make functional
docker-compose up --build
#+END_SRC
** Updating a Docker installation
Usually, updating our Docker installation only requires the following:
#+BEGIN_SRC sh
docker pull vidjil/vidjil:latest
#+END_SRC
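The pulled image is only used once the containers are recreated. As a sketch, assuming the services were started with docker-compose from the directory containing docker-compose.yml:
#+BEGIN_SRC sh
# Fetch the latest image, then recreate the running containers with it.
docker pull vidjil/vidjil:latest
docker-compose down
docker-compose up -d
#+END_SRC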
In some cases you may need to update your docker-compose.yml file or some
of the configuration files. The latest versions are available on our
[[https://github.com/vidjil/vidjil][GitHub]].
* Migrating Data
** Database
The easiest way to perform a database migration is to first extract the
data with the following command:
#+BEGIN_SRC sh
mysqldump -u <user> -p <db> -c --no-create-info > <file>
#+END_SRC
An important element to note here is --no-create-info. We add this
parameter because web2py needs to be allowed to create the tables itself:
it keeps track of database migrations, and errors will occur if tables it
considers it needs to create already exist.
In order to import the data into an installation, you first need to ensure
the tables have been created by web2py. This can be achieved by simply
accessing a non-static page.
/!\ If the database has been initialised from the interface you will
likely encounter primary key collisions or duplicated data, so it is best
to skip the initialisation altogether.
Once the tables have been created, the data can be imported as follows:
#+BEGIN_SRC sh
mysql -u <user> -p <db> < <file>
#+END_SRC
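As a worked example, assuming a MySQL user and database both named vidjil (adjust both to your installation):
#+BEGIN_SRC sh
# On the old server: dump the data only, without CREATE TABLE statements.
mysqldump -u vidjil -p vidjil -c --no-create-info > vidjil_data.sql
# On the new server, once web2py has created the tables
# (for instance after loading a non-static page):
mysql -u vidjil -p vidjil < vidjil_data.sql
#+END_SRC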
Please note that with this method you should have at least one admin user
accessible in the imported data. Since the initialisation is being skipped,
the usual admin account will not be present.
It is also possible to create a user directly from the database although
this is not the recommended course of action.
** Files
Files can simply be copied over to the new installation, their filenames
are stored in the database and should therefore be accessible as long as
they are in the correct directories.
** Filtering data (soon deprecated)
When extracting data for a given user, the whole database should not be
copied over.
There are two courses of action:
- create a copy of the existing database and remove the users that are
irrelevant. The cascading delete should remove any unwanted data
barring a few exceptions (notably fused_file, groups and sample_set_membership)
- export the relevant data directly from the database. This method
requires multiple queries which will not be detailed here.
Once the database has been correctly extracted, a list of files can be
obtained from sequence_file, fused_file, results_file and analysis_file
with the following query:
#+BEGIN_SRC sql
SELECT <filename field>
FROM <table name>
INTO OUTFILE 'filepath'
FIELDS TERMINATED BY ','
ENCLOSED BY ''
LINES TERMINATED BY '\n'
#+END_SRC
Note: we are managing filenames here, which should not contain characters
such as quotes or commas, so we can afford to refrain from enclosing the
data in quotes.
This query will output a CSV file containing one filename per line.
Copying the files is now just a matter of running the following script:
#+BEGIN_SRC sh
sh copy_files <file source> <file destination> <input file>
#+END_SRC
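For instance, assuming the query above was written to files.csv and the uploads are stored under /opt/vidjil (both paths are assumptions):
#+BEGIN_SRC sh
# Copy every file listed in files.csv from the old installation's uploads
# directory to the matching directory on the new installation.
sh copy_files /opt/vidjil/uploads /mnt/new_server/vidjil/uploads files.csv
#+END_SRC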
** Exporting sample sets
The migrator script allows the export and import of data, whether it be a
single patient/run/set or a list of them, or even all the sample sets
associated with a group.
#+BEGIN_EXAMPLE
usage: migrator.py [-h] [-f FILENAME] [--debug] {export,import} ...
Export and import data
positional arguments:
{export,import} Select operation mode
export Export data from the DB into a JSON file
import Import data from JSON into the DB
optional arguments:
-h, --help show this help message and exit
-f FILENAME Select the file to be read or written to
--debug Output debug information
#+END_EXAMPLE
Export:
#+BEGIN_EXAMPLE
usage: migrator.py export [-h] {sample_set,group} ...
positional arguments:
{sample_set,group} Select data selection method
sample_set Export data by sample-set ids
group Extract data by groupid
optional arguments:
-h, --help show this help message and exit
#+END_EXAMPLE
#+BEGIN_EXAMPLE
usage: migrator.py export sample_set [-h] {patient,run,generic} ID [ID
...]
positional arguments:
{patient,run,generic}
Type of sample
ID Ids of sample sets to be extracted
optional arguments:
-h, --help show this help message and exit
#+END_EXAMPLE
#+BEGIN_EXAMPLE
usage: migrator.py export group [-h] groupid
positional arguments:
groupid The long ID of the group
optional arguments:
-h, --help show this help message and exit
#+END_EXAMPLE
Import:
#+BEGIN_EXAMPLE
usage: migrator.py import [-h] [--dry-run] [--config CONFIG] groupid
positional arguments:
groupid The long ID of the group
optional arguments:
-h, --help show this help message and exit
--dry-run With a dry run, the data will not be saved to the database
--config CONFIG Select the config mapping file
#+END_EXAMPLE
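For example, exporting three patient sample sets from one installation and importing them into another could look like this (the ids, file names and config mapping file are placeholders/assumptions):
#+BEGIN_SRC sh
# Export the sample sets of patients 1, 2 and 3 into a JSON file.
python migrator.py -f patients.json export sample_set patient 1 2 3
# On the target server, first check what would be imported (nothing is saved).
python migrator.py -f patients.json import --dry-run --config config.json <groupid>
# Then run the real import into the group identified by <groupid>.
python migrator.py -f patients.json import --config config.json <groupid>
#+END_SRC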
@@ -90,6 +90,78 @@ However, the following network access are recommended:
These installation instructions are for Ubuntu server 14.04.
These instructions are preliminary; other documentation can also be found in [[http://git.vidjil.org/blob/dev/doc/dev.org][dev.org]].
** With Docker
Our Docker environment makes use of docker-compose. All Vidjil components
are currently packaged into a single Docker image. Individual services are
started by docker-compose, as in this
[[https://github.com/vidjil/vidjil/blob/master/docker/docker-compose.yml][example]].
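Once a docker-compose.yml is in place, the services can typically be built and started from the directory that contains it:
#+BEGIN_SRC sh
# Build the image if needed and start all services in the background.
docker-compose up --build -d
#+END_SRC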
*** Configuring the Vidjil container
If you are using this environment on localhost, everything should
work out of the box.
However, you may need to configure the setup further to make it
available to a whole network.
Here is a list of the configuration files found in the vidjil directory:
- conf/conf.js contains various variables for the vidjil browser
- conf/defs.py contains various variables for the vidjil server
- conf/gzip.conf configuration for gzip in nginx
- conf/gzip_static.conf same as the previous, but for static resources
- conf/uwsgi.ini configuration required to run vidjil with uwsgi
- sites/nginx configuration required when running vidjil with nginx
- scripts/nginx-entrypoint.sh entrypoint for the nginx service (not currently in use)
- scripts/uwsgi-entrypoint.sh entrypoint for the uwsgi service; ensures the
owners of the relevant volumes are correct within the container, then starts uwsgi
Here are some notable configuration changes you should consider:
- Change the mysql user/password in docker-compose.yml. You will also
need to change the DB_ADDRESS in conf/defs.py to match it.
- Change the hostname in the nginx configuration vidjil/sites/nginx_conf.
This may be required if you are using Vidjil on a network.
- Change the default admin password. Log in as plop@plop.com (password: 1234)
and go to the following URL:
https://<your hostname>/vidjil/default/user/change_password
- Change the SSL certificates. Building the vidjil-server image creates a
self-signed certificate for the sake of convenience, so that HTTPS queries
work from the start, but this may not be acceptable for a production
environment.
The current method for replacing the certificates is to mount them to
/etc/nginx/ssl with Docker volumes in docker-compose.yml (see the sketch
after this list).
- Change the FROM_EMAIL and ADMIN_EMAILS variables in conf/defs.py. These
represent the sender email address and the destination email addresses,
used in reporting patient milestones and server errors.
- Change the database password. In the mysql directory you will find an
entrypoint script which creates the database and the user, and sets that
user's password.
This is the password you need to match in the defs.py file in the
vidjil configuration.
- Change the volumes in docker-compose.yml. By default all files that
require saving outside of the containers (the database, uploads, vidjil
results and log files) are stored in /opt/vidjil, but you can change
this by editing the paths in the volumes.
- Configure the reporter. Ideally this container should run on a remote
server so that it can still report when the main server is down, but we have packed it here for convenience.
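For the SSL certificates, here is a minimal sketch of a replacement procedure, assuming the nginx service is named nginx in docker-compose.yml and that /opt/vidjil/ssl is used as the host directory (the service name, paths and file names are assumptions and must match your own setup):
#+BEGIN_SRC sh
# Copy your own certificate and key to the host directory that will be mounted
# (file names are placeholders; they must match those referenced in sites/nginx).
sudo mkdir -p /opt/vidjil/ssl
sudo cp my-domain.crt my-domain.key /opt/vidjil/ssl/
# In docker-compose.yml, mount this directory over the container's /etc/nginx/ssl:
#   volumes:
#     - /opt/vidjil/ssl:/etc/nginx/ssl
# Recreate the service so nginx picks up the new certificates.
docker-compose up -d --force-recreate nginx
#+END_SRC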
*** Updating a Docker installation
Usually, updating our Docker installation only requires the following:
#+BEGIN_SRC sh
docker pull vidjil/vidjil:latest
#+END_SRC
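The pulled image is only used once the containers are recreated. As a sketch, assuming the services were started with docker-compose from the directory containing docker-compose.yml:
#+BEGIN_SRC sh
# Fetch the latest image, then recreate the running containers with it.
docker pull vidjil/vidjil:latest
docker-compose down
docker-compose up -d
#+END_SRC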
In some cases you may need to update your docker-compose.yml file or some
of the configuration files. The latest versions are available on our
[[https://github.com/vidjil/vidjil][GitHub]].
** Requirements
#+BEGIN_SRC sh
@@ -483,3 +555,147 @@ These instructions are preliminary, other documentation can also be found in [[h
`cd server/web2py`
`python web2py.py -S vidjil -M`
`db.auth_user[<user-id>].update_record(password=CRYPT(key=auth.settings.hmac_key)('<password>')[0], reset_password_key='')`
`db.commit()`
* Migrating Data
** Database
The easiest way to perform a database migration is to first extract the
data with the following command:
#+BEGIN_SRC sh
mysqldump -u <user> -p <db> -c --no-create-info > <file>
#+END_SRC
An important element to note here is --no-create-info. We add this
parameter because web2py needs to be allowed to create the tables itself:
it keeps track of database migrations, and errors will occur if tables it
considers it needs to create already exist.
In order to import the data into an installation, you first need to ensure
the tables have been created by web2py. This can be achieved by simply
accessing a non-static page.
/!\ If the database has been initialised from the interface you will
likely encounter primary key collisions or duplicated data, so it is best
to skip the initialisation altogether.
Once the tables have been created, the data can be imported as follows:
#+BEGIN_SRC sh
mysql -u <user> -p <db> < <file>
#+END_SRC
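As a worked example, assuming a MySQL user and database both named vidjil (adjust both to your installation):
#+BEGIN_SRC sh
# On the old server: dump the data only, without CREATE TABLE statements.
mysqldump -u vidjil -p vidjil -c --no-create-info > vidjil_data.sql
# On the new server, once web2py has created the tables
# (for instance after loading a non-static page):
mysql -u vidjil -p vidjil < vidjil_data.sql
#+END_SRC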
Please note that with this method you should have at least one admin user
accessible in the imported data. Since the initialisation is being skipped,
the usual admin account will not be present.
It is also possible to create a user directly from the database although
this is not the recommended course of action.
** Files
Files can simply be copied over to the new installation, their filenames
are stored in the database and should therefore be accessible as long as
they are in the correct directories.
** Filtering data (soon deprecated)
When extracting data for a given user, the whole database should not be
copied over.
There are two courses of action:
- create a copy of the existing database and remove the users that are
irrelevant. The cascading delete should remove any unwanted data
barring a few exceptions (notably fused_file, groups and sample_set_membership)
- export the relevant data directly from the database. This method
requires multiple queries which will not be detailed here.
Once the database has been correctly extracted, a list of files can be
obtained from sequence_file, fused_file, results_file and analysis_file
with the following query:
#+BEGIN_SRC sql
SELECT <filename field>
FROM <table name>
INTO OUTFILE 'filepath'
FIELDS TERMINATED BY ','
ENCLOSED BY ''
LINES TERMINATED BY '\n'
#+END_SRC
Note: we are managing filenames here, which should not contain characters
such as quotes or commas, so we can afford to refrain from enclosing the
data in quotes.
This query will output a CSV file containing one filename per line.
Copying the files is now just a matter of running the following script:
#+BEGIN_SRC sh
sh copy_files <file source> <file destination> <input file>
#+END_SRC
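For instance, assuming the query above was written to files.csv and the uploads are stored under /opt/vidjil (both paths are assumptions):
#+BEGIN_SRC sh
# Copy every file listed in files.csv from the old installation's uploads
# directory to the matching directory on the new installation.
sh copy_files /opt/vidjil/uploads /mnt/new_server/vidjil/uploads files.csv
#+END_SRC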
** Exporting sample sets
The migrator script allows the export and import of data, whether it be a
single patient/run/set or a list of them, or even all the sample sets
associated with a group.
#+BEGIN_EXAMPLE
usage: migrator.py [-h] [-f FILENAME] [--debug] {export,import} ...
Export and import data
positional arguments:
{export,import} Select operation mode
export Export data from the DB into a JSON file
import Import data from JSON into the DB
optional arguments:
-h, --help show this help message and exit
-f FILENAME Select the file to be read or written to
--debug Output debug information
#+END_EXAMPLE
Export:
#+BEGIN_EXAMPLE
usage: migrator.py export [-h] {sample_set,group} ...
positional arguments:
{sample_set,group} Select data selection method
sample_set Export data by sample-set ids
group Extract data by groupid
optional arguments:
-h, --help show this help message and exit
#+END_EXAMPLE
#+BEGIN_EXAMPLE
usage: migrator.py export sample_set [-h] {patient,run,generic} ID [ID
...]
positional arguments:
{patient,run,generic}
Type of sample
ID Ids of sample sets to be extracted
optional arguments:
-h, --help show this help message and exit
#+END_EXAMPLE
#+BEGIN_EXAMPLE
usage: migrator.py export group [-h] groupid
positional arguments:
groupid The long ID of the group
optional arguments:
-h, --help show this help message and exit
#+END_EXAMPLE
Import:
#+BEGIN_EXAMPLE
usage: migrator.py import [-h] [--dry-run] [--config CONFIG] groupid
positional arguments:
groupid The long ID of the group
optional arguments:
-h, --help show this help message and exit
--dry-run With a dry run, the data will not be saved to the database
--config CONFIG Select the config mapping file
#+END_EXAMPLE
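For example, exporting three patient sample sets from one installation and importing them into another could look like this (the ids, file names and config mapping file are placeholders/assumptions):
#+BEGIN_SRC sh
# Export the sample sets of patients 1, 2 and 3 into a JSON file.
python migrator.py -f patients.json export sample_set patient 1 2 3
# On the target server, first check what would be imported (nothing is saved).
python migrator.py -f patients.json import --dry-run --config config.json <groupid>
# Then run the real import into the group identified by <groupid>.
python migrator.py -f patients.json import --config config.json <groupid>
#+END_SRC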