Commit ac08e72d authored by Samir Noir

Update README and remove useless files

parent e6bbafc0
### Development process
* The `master` branch is the main development branch, from which stable releases are generated and pushed to api-server-devel first, and then api-server-v3, on Grid'5000
* New features and fixes are expected to be developed in specific branches, and submitted for inclusion using Merge Requests.
### Development environment
accounts to be accessible by ssh. By default, it will copy your authorized_keys, but you
can control the keypair used with SSH_KEY=filename_of_private_key
And as the application relies on external data sources, you'll need to connect
it with a reference-repository, an OAR database, a kadeploy3 server, and a jabber server
to exercise all its functionality, in addition to its own backend services that
Do not attempt to use the directory directly, as the unit tests play with the git.rename dir.
* Get access to an OAR database, using one of the two methods described hereafter:
* Get your hands on a copy of an active database, and install it. Don't worry about the
error messages when seeding the development database: most of them come from the fact
* To run the server, just enter:
$ bundle exec ./bin/g5k-api server start -e development
* If you require traces on the shell, use
$ bundle exec ./bin/g5k-api server -V start -e development
* If you need to be authenticated for some development, use:
$ HTTP_X_API_USER_CN=dmargery WHOAMI=rennes bundle exec ./bin/g5k-api server start -e development
* If you want to develop on the UI, using the apache proxy, run your browser on
$ firefox
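To see what the `HTTP_X_API_USER_CN` and `WHOAMI` variables above amount to, here is a minimal sketch of how such CGI-style variables typically surface in a Rack application. The helper names and extraction logic are illustrative assumptions, not g5k-api's actual code: a reverse proxy usually forwards the authenticated CN as a header (which Rack exposes as `HTTP_X_API_USER_CN`), while in development the same value can be injected through the process environment.

```ruby
# Illustrative only: how HTTP_X_API_USER_CN / WHOAMI might be read.
# The helpers below are assumptions, not g5k-api's real implementation.

def authenticated_user(rack_env, process_env = ENV)
  # Header forwarded by the proxy takes precedence; fall back to the
  # process environment, as in the development command line above.
  rack_env['HTTP_X_API_USER_CN'] || process_env['HTTP_X_API_USER_CN']
end

def current_site(process_env = ENV)
  process_env['WHOAMI'] # e.g. 'rennes'
end
```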
That's it. If you're not too familiar with `rails`, have a look at
You can also list the available rake tasks to see what's already automated for you:

$ bundle exec rake -T
## Testing
Updating the OAR2 test db therefore requires either
### Use the build infrastructure
The debian package build is done automatically as a stage in gitlab-ci (see `.gitlab-ci.yml`), but only tagged commits get pushed to the repository.
Tasks described in `lib/tasks/packaging.rake` are available to automatically manage version bumping, changelog generation and package building. If you use these tasks, a tag will be created each time the version is bumped. Therefore, the `lib/grid5000/version.rb` file should only be changed using these tasks, at the end of a development cycle (if production is running version X.Y.Z, the file will keep that version during the next development cycle and will only change at its end).
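The bump behaviour these tasks implement (incrementing one fragment of an X.Y.Z version resets the fragments below it) can be sketched as follows. This is a minimal standalone illustration of what `package:bump:{patch,minor,major}` does to the version string, not the actual task body:

```ruby
# Sketch of X.Y.Z version bumping: index 0 = major, 1 = minor, 2 = patch.
# Bumping a fragment resets all lower fragments to zero.
def bump_version(version, index)
  fragments = version.split('.').map(&:to_i)
  fragments[index] += 1
  ((index + 1)..2).each { |i| fragments[i] = 0 } # reset lower fragments
  fragments.join('.')
end
```

For example, `bump_version('3.2.7', 1)` yields `'3.3.0'`, matching the usual semantic-versioning convention.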
For this to work properly, you need a working .gitconfig.
- You can copy your main .gitconfig into the vagrant box
$ cat ~/.gitconfig | vagrant ssh -- 'cat - > ~/.gitconfig'
- Or you can configure the vagrant box to your needs
vagrant@g5k-local: git config --global "Your Name"
vagrant@g5k-local: git config --global ""
- You can now name the version you are about to package
vagrant@g5k-local: bundle exec rake package:bump:patch # replace patch by minor or major when appropriate
- And then build the debian package
vagrant@g5k-local: bundle exec rake package:build:debian
The `package:build:debian` rake task has several arguments:
* NO_COMMIT: when bumping version number, do not commit the version file
* NO_TAG: do not tag the current git commit with the built version. Default is to tag. Has no effect with NO_COMMIT
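As a rough illustration of how these two flags gate the commit and tag steps (the helper below is a sketch of the control flow, not the actual rake task body; note that NO_TAG is only consulted when a commit is made, which is why it has no effect with NO_COMMIT):

```ruby
# Sketch of the NO_COMMIT / NO_TAG gating described above.
# env is a hash standing in for ENV; any non-nil value enables a flag.
def release_steps(env)
  steps = ['build package']
  unless env['NO_COMMIT']
    steps << 'commit version file'
    # NO_TAG only matters when the version file is committed
    steps << 'tag release' unless env['NO_TAG']
  end
  steps
end
```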
If everything went ok you should have a package like: `pkg/g5k-api_X.Y.Z-<date of last commit>_amd64.deb`
See the `.gitlab-ci.yml` file for the use of the rake package commands in the gitlab pipelines.
### Debug the build infrastructure
From time to time, someone will have to look into `lib/tasks/packaging.rake` to understand why `rake package:build:debian` does not do what is expected, or to update the way the package is built. This is what happens when you call the rake task:
1. The rake task creates a temporary directory named /tmp/g5k-api_version, and extracts the latest committed version of your code into it using `git archive HEAD`.
2. The rake task makes sure the build dependencies are installed using `mk-build-deps`, which in turn uses info in the `debian/control` file.
3. The changelog in the extracted version of the sources is updated with information from the latest commits.
4. The rake task finally calls `dpkg-buildpackage -us -uc -d` to generate the package. dpkg-buildpackage then uses the `debian/rules` makefile to go through all the steps needed for packaging. This in turn falls back to `dh` for most steps, using datafiles in the `debian` directory.
* Most tasks use the default implementation relying on the datafile found in the `debian` directory. Of particular interest are `logrotate`, `g5k-api.service`, `dirs`, `g5k-api.install` and `g5k-api.links`.
* The magic happens in the `debian/setup_bundle` script. That script handles all the instructions required so that the gems needed by the application are installed and usable on the target system.
* It will prime the temporary directory from which the application is packaged with the developer's bundle
* It will run bundle install to set up the gems to package
* It will generate a g5k-api binary so that the application is started in the context of the installed bundle without the user noticing bundler usage. This happens by generating a script to be installed in `/usr/bin` for each ruby executable found in `bin/`
* `debian/setup_bundle`'s work is completed by lines in `debian/dirs` and `debian/g5k-api.install` to set up the final execution context of the application
## Releasing and installing a new version
* Once you've packaged the new version, you must release it to the APT repository
# Copyright (c) 2010-2012 INRIA Rennes Bretagne Atlantique by Cyril Rohr (Grid'5000 and BonFIRE projects)
# Copyright (c) 2015-2018 INRIA Rennes Bretagne Atlantique by David Margery (Grid'5000)
ROOT_DIR = File.expand_path('../..', __dir__)
CHANGELOG_FILE = File.join(ROOT_DIR, 'debian', 'changelog')
VERSION_FILE = File.join(ROOT_DIR, 'lib', 'grid5000', 'version.rb')
NAME = ENV['PKG_NAME'] || 'g5k-api'
PACKAGING_DIR = '/tmp/' + NAME + '_' + Grid5000::VERSION
PACKAGES_DIR = File.join(ROOT_DIR, 'pkg')
def lsb_dist_codename
`lsb_release -s -c`.chomp
def date_of_commit(tag_or_commit)
date = `git show --pretty=tformat:"MyDate: %aD" #{tag_or_commit}`.chomp
date = Regexp.last_match(1) if date =~ /MyDate\: (.*)$/
def deb_version_from_date(date)
def deb_version_of_commit(tag_or_commit)
def purged_commits_between(version1, version2)
cmd = 'git log --oneline'
cmd << " #{version1}..#{version2}" unless version1.nil?
commit_logs = `#{cmd}`.split("\n")
purged_logs = commit_logs.reject { |l| l =~ / v#{Grid5000::VERSION}/ }
.reject { |l| l =~ / v#{version2}/ }
.reject { |l| l =~ /Commit version #{Grid5000::VERSION}/ }
def generate_changelog_entry(version, deb_version, logs, author, email, date)
"#{NAME} (#{version.gsub('_', '~')}-#{deb_version}) #{lsb_dist_codename}; urgency=low",
'', { |l| " * #{l}" }.join("\n"),
" -- #{author} <#{email}> #{date}",
def changelog_for_version(version, deb_version, change_logs)
cmd = "git show #{version}"
tagger = `#{cmd}`
if tagger =~ /Tagger\: ([^<]*)<([^>]*)>/
author = Regexp.last_match(1)
email = Regexp.last_match(2)
elsif tagger =~ /Author\: ([^<]*)<([^>]*)>/
author = Regexp.last_match(1)
email = Regexp.last_match(2)
puts "#{cmd} has #{tagger} as output: could not find Tagger or Author"
date = date_of_commit(version)
deb_version = deb_version_from_date(date) if deb_version.nil?
generate_changelog_entry(version, deb_version, change_logs, author, email, date)
def generate_changelog
versions = `git tag`.split("\n")
versions.sort! do |v1, v2|
major1, minor1, rest1 = v1.split('.')
major2, minor2, rest2 = v2.split('.')
if major1 == major2
if minor1 == minor2
patch1, rc1 = rest1.split('_rc')
patch2, rc2 = rest2.split('_rc')
if patch1 == patch2
rc1.to_i <=> rc2.to_i
patch1.to_i <=> patch2.to_i
minor1 <=> minor2
major1 <=> major2
versions.reject! { |v| v !~ /[0-9]+\.[0-9]+\..*/ }
change_logs = []
previous_version = versions.shift
change_logs << changelog_for_version(previous_version, nil, ['First version tagged for packaging'])
versions.each do |version|
purged_logs = purged_commits_between(previous_version, version)
purged_logs = ["Retagged #{previous_version}. No other changes"] if purged_logs.empty?
change_logs << changelog_for_version(version, nil, purged_logs)
previous_version = version
def update_changelog(changelog_file, new_version)
content_changelog = ''
if File.exist?(changelog_file)
changelog =
last_commit = changelog.scan(/\s+\* ([a-f0-9]{7}) /).flatten[0]
deb_version = deb_version_of_commit('HEAD')
purged_logs = purged_commits_between(last_commit, 'HEAD')
if purged_logs.size != 0
user_name = `git config --get`.chomp
if user_name == ''
user_name = ENV['GITLAB_USER_NAME']
if user_name.nil? || user_name == ''
puts 'No git or gitlab user: running in a Vagrant box? Use git config --global "firstname lastname" before bumping version'
user_email = `git config --get`.chomp
if user_email == ''
user_email = ENV['GITLAB_USER_EMAIL']
if user_email.nil? || user_email == ''
puts 'No mail found'
content_changelog = generate_changelog_entry(new_version, deb_version, purged_logs,
                                             user_name, user_email,
                                   '%a, %d %b %Y %H:%M:%S %z')), 'w+') do |f|
f << content_changelog + "\n"
f << changelog
warn "update_changelog called on nonexistent file #{changelog_file}"
content_changelog.size > 0
def bump(index)
fragments = Grid5000::VERSION.split('.')
fragments[index] = fragments[index].to_i + 1
((index + 1)..2).each do |i|
fragments[i] = 0
new_version = fragments.join('.')
content_version =, new_version), 'w+') do |f|
f << content_version
changed = update_changelog(CHANGELOG_FILE, new_version)
unless changed
puts 'No real changes except version changes since last version bump. Aborting unless EMPTYBUMP set'
exit(-1) unless ENV['EMPTYBUMP']
puts "Generated changelog for version #{new_version}-#{deb_version}."
unless ENV['NO_COMMIT']
puts 'Committing changelog and version file...'
sh "git commit -m 'Commit version #{new_version}' #{CHANGELOG_FILE} #{VERSION_FILE}"
unless ENV['NO_COMMIT']
puts 'Tagging the release'
sh "git tag -a v#{new_version} -m \"v#{new_version} tagged by rake package:bump:[patch|minor|major]\""
puts 'INFO: git push --follow-tags (push with relevant tags) required for package publication by gitlab CI/CD'
namespace :package do
namespace :bump do
desc 'Increment the patch fragment of the version number by 1'
task :patch do
desc 'Increment the minor fragment of the version number by 1'
task :minor do
desc 'Increment the major fragment of the version number by 1'
task :major do
namespace :build do
desc 'Prepare dirs for building debian package'
task :prepare do
version = Grid5000::VERSION
# prepare the dir where the result will be stored
mkdir_p PACKAGES_DIR.to_s
# make sure no pending changes need to be committed to the repository
uncommitted_changes = `git status --untracked-files=no --porcelain`
if uncommitted_changes != ''
# Gemfile.lock will include bundle version used. It changes between debian versions
files = uncommitted_changes.scan(/\w\s(.*)$/).flatten.reject { |f| ['Gemfile.lock'].include?(f) }
if files.size > 0
warn "Unexpected diff in #{files}:"
warn `git diff`
raise "You are building from a directory with uncommitted files in git. Please commit pending changes so there is a chance the build can be tracked back to a specific state in the repository\n#{uncommitted_changes}"
warn 'Expected diff:'
warn `git diff`
# prepare the build directory
mkdir_p "#{PACKAGING_DIR}/pkg"
# remove previous versions built from the build directory
rm_rf "#{PACKAGING_DIR}/pkg/#{NAME}_*.deb"
# extract the committed state of the repository to the build directory
sh "git archive HEAD > /tmp/#{NAME}_#{version}.tar"
Dir.chdir('/tmp') do
mkdir_p "#{NAME}_#{version}"
sh "tar xf #{NAME}_#{version}.tar -C #{NAME}_#{version}"
sh "rm #{NAME}_#{version}.tar"
desc 'Build debian package'
task debian: :prepare do
sudo = Process.uid == 0 ? '' : 'sudo '
commands = [
"#{sudo}apt-get -y --no-install-recommends install devscripts build-essential equivs",
"#{sudo}mk-build-deps -ir -t 'apt-get -y --no-install-recommends'"
commands.each do |cmd|
sh cmd
update_changelog(File.join(PACKAGING_DIR, 'debian', 'changelog'), Grid5000::VERSION)
Dir.chdir('/tmp') do
Dir.chdir(PACKAGING_DIR) do
sh 'dpkg-buildpackage -us -uc -d'
sh "cp #{PACKAGING_DIR}/../#{NAME}_#{Grid5000::VERSION}*.deb pkg/"
namespace :changelog do
desc "Generate a changelog from git log and tags and save it in #{CHANGELOG_FILE}"
task :generate do, 'w+') do |f|
f << generate_changelog
desc 'Show what a generated changelog from git log and tags would look like'
task :show do
puts generate_changelog
#!/usr/bin/ruby -w
require 'cute'
require 'pp'
g5k = '', version: '', username: 'dmargery')
JOB_NAME = 'reproduce_bug_9467'.freeze
G5K_SITE = 'rennes'.freeze
G5K_ENV = 'debian9-x64-min'.freeze # environment to deploy
WALLTIME = '00:40:00'.freeze
NODES = 1 # number of nodes to reserve; constant elided in the original excerpt, 1 is an assumed value
job = nil
g5k.get_my_jobs(G5K_SITE, %w[waiting running]).each do |j|
pp j
if j['name'] == JOB_NAME
job = j
if job.nil?
puts "No job named #{JOB_NAME} found in #{G5K_SITE} for you. Creating one"
job = g5k.reserve(site: G5K_SITE, nodes: NODES, walltime: WALLTIME, type: :deploy, wait: false,
name: JOB_NAME,
cmd: 'sleep 64600')
puts "Job #{job['uid']} created. Monitor its status with e.g.: oarstat -fj #{job['uid']}"
# for better output, redirect stderr to stdout, make stdout a synchronized output stream
STDOUT.sync = true
while job['state'] !~ /unning/
msg = ''
msg = ". Scheduled start is #{['scheduled_at'])}" if job.has_key?('scheduled_at')
puts "Waiting for job #{job['uid']} of current status #{job['state']}#{msg}"
sleep 1
job = g5k.get_job(G5K_SITE, job['uid'])
pp job
nodes = job['assigned_nodes']
puts "Running on: #{nodes.join(' ')}"
# deploying all nodes, waiting for the end of deployment
g5k.deploy(job, env: G5K_ENV, wait: true)
if (job['deploy'].last['status'] == 'error') || !job['deploy'].last['result'].to_a.all? { |e| e[1]['state'] == 'OK' }
raise 'Deployment ended with error'
# Copyright (c) 2009-2011 Cyril Rohr, INRIA Rennes - Bretagne Atlantique
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# See the License for the specific language governing permissions and
# limitations under the License.
# Custom capistrano file
# How to set an SSH gateway :
# set :gateway, [""]
# puppetmgt -a -c api-g5k::server api-server.`hostname | cut -d"." -f2,2`
# puppetmgt -a -c api-g5k::proxy api-proxy.`hostname | cut -d"." -f2,2`
set :user, 'g5kadmin'
set :gateway, ''
# ssh_options[:verbose] = 0
%w[bordeaux grenoble lille lyon luxembourg nancy reims rennes orsay sophia toulouse].each do |site|
role :app, "api-server.#{site}"
role :devel, "api-server-devel.#{site}"
role :web, "api-proxy.#{site}"
role :puppet, "puppet.#{site}"
role :sql, "mysql.#{site}"
role :oar, "oar-api.#{site}"
namespace :api do
desc 'Run puppet on all servers. Use ROLES env variable to choose on which machines you want to run this command.'
task :update, roles: %i[web app devel puppet sql] do
run "#{sudo} aptitude update && #{sudo} puppetd --test"
#!/usr/bin/env ruby
# (c) 2017 Inria by David Margery ( for the Grid'5000 project
require 'eventmachine'
require 'em-http-request'
require 'json'
require 'pp'
base_url = ''
entry_point = '/sites'
base_url = ''
entry_point = '/stable/sites'
def fetch_link(description, relation)
found = description['links'].find { |l| l['rel'] == relation }
return found['href'] if found
end do
get_params = {
query: { 'branch' => 'master' },
timeout: 20,
head: { 'Accept' => 'application/json', 'Authorization' => %w[dmargery xxxx] }
http ="#{base_url}#{entry_point}").get(get_params)
http.errback { puts "Request failed #{http.response_header.status}"; EM.stop }
http.callback do
puts "Request for list of sites succeeded with code #{http.response_header.status}"
sites = JSON.parse(http.response)
expected_sites = sites['items'].size
expected_sites_net = sites['items'].size
expected_sites_pdu = sites['items'].size
expected_clusters = {}
sites['items'].each do |site|
clusters_url = fetch_link(site, 'clusters')
http_cluster ="#{base_url}#{clusters_url}").get(get_params)
http_cluster.errback { puts "Request to clusters of site #{site['name']} at #{base_url}/#{clusters_url} failed #{http_cluster.response_header.status}"; EM.stop }
http_cluster.callback do
expected_sites -= 1
puts "Request to clusters of site #{site['name']} (#{base_url}/#{clusters_url}) returned #{http_cluster.response_header.status}. #{expected_sites} sites still expected"
clusters = JSON.parse(http_cluster.response)
expected_clusters[site['name']] = clusters['items'].size
clusters['items'].each do |cluster|
nodes_url = fetch_link(cluster, 'nodes')
http_node ="#{base_url}/#{nodes_url}").get(get_params)
http_node.errback { puts "Request to nodes of cluster #{cluster['uid']} at #{base_url}/#{nodes_url} failed #{http_node.response_header.status}"; EM.stop }
http_node.callback do
expected_clusters[site['name']] = expected_clusters[site['name']] - 1
puts "Request to cluster #{cluster['uid']} returned #{http_node.response_header.status}. #{expected_clusters[site['name']]} clusters still expected for #{site['name']}"
if expected_sites == 0 && expected_sites_net == 0 && expected_sites_pdu == 0 && expected_clusters.all? { |_k, v| v == 0 }
nets_url = fetch_link(site, 'network_equipments')
http_net ="#{base_url}/#{nets_url}").get(get_params)
http_net.errback { puts "Request to network_equipments of site #{site['name']} at #{base_url}/#{nets_url} failed #{http_net.response_header.status}"; EM.stop }
http_net.callback do
expected_sites_net -= 1
puts "Request to network_equipments of site #{site['name']} returned #{http_net.response_header.status}. #{expected_sites_net} sites still expected"
pdus_url = fetch_link(site, 'pdus')
http_pdu ="#{base_url}/#{pdus_url}").get(get_params)
http_pdu.errback { puts "Request to pdus of site #{site['name']} at #{base_url}/#{pdus_url} failed #{http_pdu.response_header.status}"; EM.stop }
http_pdu.callback do
expected_sites_pdu -= 1
puts "Request to pdus of site #{site['name']} returned #{http_pdu.response_header.status}. #{expected_sites_pdu} sites still expected"
puts 'Finished'