Commit 30457b12 authored by Lucas Nussbaum

[dev] add a data_loader alternative to input_loader

parent 4bfb968e
# Load a hierarchy of JSON files into a Ruby hash
require 'json'
require 'refrepo/hash/hash'

def load_data_hierarchy
  global_hash = {} # the global data structure
  directory = File.expand_path("../../data/grid5000/", File.dirname(__FILE__))
  Dir.chdir(directory) do
    # Recursively list the .json files.
    # The order in which the results are returned depends on the system (http://ruby-doc.org/core-2.2.3/Dir.html).
    # => List deepest files first as they have the lowest priority when hash keys are duplicated.
    list_of_files = Dir['**/*.json'].sort_by { |x| -x.count('/') }
    list_of_files.each do |filename|
      # Load JSON
      file_hash = JSON::parse(IO::read(filename))
      # Inject the file content into global_hash, at the right place
      path_hierarchy = File.dirname(filename).split('/') # Split the file path (relative to data/grid5000/)
      path_hierarchy = [] if path_hierarchy == ['.']
      if ['nodes', 'network_equipments'].include?(path_hierarchy.last)
        # It's a node or a network equipment: add the uid
        path_hierarchy << file_hash['uid']
      end
      file_hash = Hash.from_array(path_hierarchy, file_hash) # Build the nested hash hierarchy according to the file path
      global_hash = global_hash.deep_merge(file_hash) # Merge global_hash and file_hash. The value for entries with duplicate keys will be that of file_hash
      # Expand the hash. Done at each iteration to enforce priorities between duplicate entries:
      # i.e. keys to be expanded have lower priority than existing entries, but higher priority than entries found in the next files
      global_hash.expand_square_brackets(file_hash)
    end
  end
  return global_hash
end
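
As a rough illustration of the helper semantics assumed above, here is a standalone sketch: from_array and deep_merge are hypothetical re-implementations mimicking what the refrepo/hash/hash helpers appear to do based on the comments, and the sites/nancy file layout is invented for the example, not taken from the repository.

# Standalone sketch (hypothetical re-implementations; the real helpers live in refrepo/hash/hash).

# Nest a value under a list of keys: from_array(['a', 'b'], v) => {'a' => {'b' => v}}
def from_array(keys, value)
  keys.reverse.inject(value) { |acc, key| { key => acc } }
end

# Recursive merge where values from `other` win on duplicate non-hash keys.
def deep_merge(base, other)
  base.merge(other) do |_key, old_v, new_v|
    old_v.is_a?(Hash) && new_v.is_a?(Hash) ? deep_merge(old_v, new_v) : new_v
  end
end

site = from_array(%w[sites nancy], { 'uid' => 'nancy' })
node = from_array(%w[sites nancy nodes graffiti-1], { 'uid' => 'graffiti-1', 'ram' => '128GiB' })
deep_merge(site, node)
# => {"sites"=>{"nancy"=>{"uid"=>"nancy",
#      "nodes"=>{"graffiti-1"=>{"uid"=>"graffiti-1", "ram"=>"128GiB"}}}}}

Under this reading, sorting the deepest files first means they are merged first and therefore lose to shallower files when keys collide, which matches the priority rule stated in the comments.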