Class: MiGA::RemoteDataset
- Defined in:
- lib/miga/remote_dataset.rb,
lib/miga/remote_dataset/base.rb,
lib/miga/remote_dataset/download.rb
Overview
MiGA representation of datasets with data in remote locations.
Defined Under Namespace
Constant Summary
Constants included from MiGA
CITATION, VERSION, VERSION_DATE, VERSION_NAME
Instance Attribute Summary
-
#db ⇒ Object
readonly
Database storing the dataset.
-
#ids ⇒ Object
readonly
Array of IDs of the entries composing the dataset.
-
#metadata ⇒ Object
readonly
Internal metadata hash.
-
#universe ⇒ Object
readonly
Universe of the dataset.
Attributes included from Common::Net
Class Method Summary
-
.download(*params) ⇒ Object
Returns String.
-
.download_ftp(opts) ⇒ Object
Download data using the FTP protocol.
-
.download_get(opts) ⇒ Object
(also: download_rest)
Download data using the GET method.
-
.download_net(opts) ⇒ Object
Redirects to download_get or download_ftp, depending on the URI’s protocol.
-
.download_opts(universe, db, ids, format, file = nil, extra = {}, obj = nil) ⇒ Object
Return hash of options used internally for the getter methods, including by download.
-
.download_post(opts) ⇒ Object
Download data using the POST method.
-
.download_uri(uri, headers = {}) ⇒ Object
Download the given URI and return the result regardless of response code.
-
.download_url(url, headers = {}) ⇒ Object
Download the given url and return the result regardless of response code.
-
.longest_common_prefix(strs) ⇒ Object
-
.ncbi_asm_acc2id(acc, retrials = 3) ⇒ Object
Translate an NCBI Assembly Accession (acc) to the corresponding internal NCBI ID, with up to retrials retries if the returned JSON document does not conform to the expected format.
-
.ncbi_asm_get(opts) ⇒ Object
Download data from NCBI Assembly database using the REST method.
-
.ncbi_gb_get(opts) ⇒ Object
Download data from NCBI GenBank (nuccore) database using the REST method.
-
.ncbi_map(id, dbfrom, db) ⇒ Object
Looks for the entry id in dbfrom, and returns the linked identifier in db (or nil).
-
.ncbi_taxonomy_dump? ⇒ Boolean
Is a local NCBI Taxonomy dump available?
-
.taxonomy_from_ncbi_dump(id) ⇒ Object
Get the MiGA::Taxonomy object for the lineage of the taxon with TaxID id using the local NCBI Taxonomy dump.
-
.UNIVERSE ⇒ Object
-
.use_ncbi_taxonomy_dump(path, cli = nil) ⇒ Object
Path to a directory with a recent NCBI Taxonomy dump to use instead of making API calls to NCBI servers, which can be obtained at: ftp.ncbi.nih.gov/pub/taxonomy/taxdump.tar.gz.
Instance Method Summary
-
#get_gtdb_taxonomy ⇒ Object
Get GTDB taxonomy as MiGA::Taxonomy.
-
#get_metadata(metadata_def = {}) ⇒ Object
Get metadata from the remote location.
-
#get_ncbi_taxid ⇒ Object
Get NCBI Taxonomy ID.
-
#get_ncbi_taxonomy ⇒ Object
Get NCBI taxonomy as MiGA::Taxonomy.
-
#get_type_status(metadata) ⇒ Object
Get the type material status and return an (updated) metadata hash.
-
#initialize(ids, db, universe) ⇒ RemoteDataset
constructor
Initialize MiGA::RemoteDataset with ids in database db from universe.
-
#ncbi_asm_json_doc ⇒ Object
Get the JSON document describing an NCBI assembly entry.
-
#save_to(project, name = nil, is_ref = true, metadata_def = {}) ⇒ Object
Save dataset to the MiGA::Project project identified with name.
-
#update_metadata(dataset, metadata = {}) ⇒ Object
Updates the MiGA::Dataset dataset with the remotely available metadata, and optionally the Hash metadata.
Methods included from Download
#database_hash, #download, #download_headers, #download_opts, #download_params, #download_payload, #download_uri, #universe_hash
Methods inherited from MiGA
CITATION, CITATION_ARRAY, DEBUG, DEBUG_OFF, DEBUG_ON, DEBUG_TRACE_OFF, DEBUG_TRACE_ON, FULL_VERSION, LONG_VERSION, VERSION, VERSION_DATE, #advance, debug?, debug_trace?, initialized?, #like_io?, #num_suffix, rc_path, #result_files_exist?, #say
Methods included from Common::Path
Methods included from Common::Format
#clean_fasta_file, #seqs_length, #tabulate
Methods included from Common::Net
#download_file_ftp, #http_request, #known_hosts, #main_server, #net_method, #normalize_encoding, #remote_connection
Methods included from Common::SystemCall
Constructor Details
#initialize(ids, db, universe) ⇒ RemoteDataset
Initialize MiGA::RemoteDataset with ids in database db from universe.
# File 'lib/miga/remote_dataset.rb', line 131

def initialize(ids, db, universe)
  ids = [ids] unless ids.is_a? Array
  @ids = (ids.is_a?(Array) ? ids : [ids])
  @db = db.to_sym
  @universe = universe.to_sym
  @metadata = {}
  @metadata[:"#{universe}_#{db}"] = ids.join(',')
  @@UNIVERSE.keys.include?(@universe) or
    raise "Unknown Universe: #{@universe}. Try: #{@@UNIVERSE.keys}"
  @@UNIVERSE[@universe][:dbs].include?(@db) or
    raise "Unknown Database: #{@db}. Try: #{@@UNIVERSE[@universe][:dbs].keys}"
  @_ncbi_asm_json_doc = nil
  # FIXME: Part of the +map_to+ support:
  # unless @@UNIVERSE[@universe][:dbs][@db][:map_to].nil?
  #   MiGA::RemoteDataset.download
  # end
end
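The constructor validates the requested universe and database against a nested configuration hash before doing anything else. A minimal, self-contained sketch of that validation pattern (the configuration below is a toy stand-in for @@UNIVERSE, not the real table):

```ruby
# Toy stand-in for the @@UNIVERSE configuration: each universe lists the
# databases it supports. The entries here are illustrative only.
UNIVERSE_CFG = {
  ncbi: { dbs: { nuccore: {}, assembly: {} } },
  gtdb: { dbs: { genome: {} } }
}.freeze

# Mirror of the constructor's two-step check: first the universe must be
# known, then the database must be known within that universe.
def validate!(universe, db)
  UNIVERSE_CFG.keys.include?(universe) or
    raise "Unknown Universe: #{universe}. Try: #{UNIVERSE_CFG.keys}"
  UNIVERSE_CFG[universe][:dbs].include?(db) or
    raise "Unknown Database: #{db}. Try: #{UNIVERSE_CFG[universe][:dbs].keys}"
  true
end
```

Failing either check raises immediately, so an object is never constructed in a half-valid state.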
Instance Attribute Details
#db ⇒ Object (readonly)
Database storing the dataset.
# File 'lib/miga/remote_dataset.rb', line 121

def db
  @db
end
#ids ⇒ Object (readonly)
Array of IDs of the entries composing the dataset.
# File 'lib/miga/remote_dataset.rb', line 123

def ids
  @ids
end
#metadata ⇒ Object (readonly)
Internal metadata hash
# File 'lib/miga/remote_dataset.rb', line 125

def metadata
  @metadata
end
#universe ⇒ Object (readonly)
Universe of the dataset.
# File 'lib/miga/remote_dataset.rb', line 119

def universe
  @universe
end
Class Method Details
.download(*params) ⇒ Object
Returns String. The required parameters (params) are identical to those of download_opts (see that method for details).
# File 'lib/miga/remote_dataset/download.rb', line 37

def download(*params)
  opts = download_opts(*params)
  doc = send(opts[:_fun], opts)

  unless opts[:file].nil?
    ofh = File.open(opts[:file], 'w')
    unless opts[:file] =~ /\.([gb]?z|tar|zip|rar)$/i
      doc = normalize_encoding(doc)
    end
    ofh.print doc
    ofh.close
  end
  doc
end
.download_ftp(opts) ⇒ Object
Download data using the FTP protocol. Supported opts
(Hash) include: universe
(mandatory): Symbol db
: Symbol ids
: Array of String format
: String extra
: Hash
# File 'lib/miga/remote_dataset/download.rb', line 150

def download_ftp(opts)
  u = @@UNIVERSE[opts[:universe]]
  net_method(:ftp, u[:uri][opts])
end
.download_get(opts) ⇒ Object Also known as: download_rest
Download data using the GET method. Supported opts (Hash) include:
- universe (mandatory): Symbol
- db: Symbol
- ids: Array of String
- format: String
- extra: Hash
# File 'lib/miga/remote_dataset/download.rb', line 123

def download_get(opts)
  u = @@UNIVERSE[opts[:universe]]
  download_uri(u[:uri][opts], u[:headers] ? u[:headers][opts] : {})
end
.download_net(opts) ⇒ Object
Redirects to download_get or download_ftp, depending on the URI’s protocol.
# File 'lib/miga/remote_dataset/download.rb', line 158

def download_net(opts)
  u = @@UNIVERSE[opts[:universe]]
  if u[:scheme][opts] == 'ftp'
    download_ftp(opts)
  else
    download_get(opts)
  end
end
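The scheme-based dispatch above can be illustrated with a self-contained sketch that parses a URL and picks a handler name from its scheme (the helper name fetch_via and the return symbols are illustrative stand-ins, not part of MiGA's API):

```ruby
require 'uri'

# Minimal sketch of scheme-based dispatch, mirroring download_net's shape:
# FTP URLs go to one handler, everything else to the GET handler.
def fetch_via(url)
  URI.parse(url).scheme == 'ftp' ? :download_ftp : :download_get
end
```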
.download_opts(universe, db, ids, format, file = nil, extra = {}, obj = nil) ⇒ Object
Return hash of options used internally for the getter methods, including by download. The prepared request is for data from the universe in the database db, with IDs ids, and in format. If file is passed, the result is saved to it. Additional parameters specific to the download method can be passed using extra. The obj can also be passed as MiGA::RemoteDataset or MiGA::Dataset.
# File 'lib/miga/remote_dataset/download.rb', line 15

def download_opts(
      universe, db, ids, format, file = nil, extra = {}, obj = nil)
  universe_hash = @@UNIVERSE[universe]
  database_hash = universe_hash.dig(:dbs, db)
  getter = database_hash[:getter] || :download
  action = database_hash[:method] || universe_hash[:method]

  # Clean IDs
  ids = ids.is_a?(Array) ? ids : [ids]

  # Return options
  {
    universe: universe, db: db, ids: ids,
    format: format, file: file, obj: obj,
    extra: (database_hash[:extra] || {}).merge(extra),
    _fun: :"#{getter}_#{action}"
  }
end
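The key output of download_opts is the opts[:_fun] symbol, which fuses a getter with an action so that download can dispatch via send. A self-contained sketch of that composition (the hashes are illustrative stand-ins for the universe and database configuration):

```ruby
# Mirror of the getter/action fallback logic: the database entry may
# override the getter (default :download) and the method (default taken
# from the universe); the two are fused into one dispatchable symbol.
def dispatch_fun(universe_hash, database_hash)
  getter = database_hash[:getter] || :download
  action = database_hash[:method] || universe_hash[:method]
  :"#{getter}_#{action}"
end
```

With defaults this yields symbols like :download_rest, while a database entry such as { getter: :ncbi_asm, method: :get } yields :ncbi_asm_get.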
.download_post(opts) ⇒ Object
Download data using the POST method. Supported opts
(Hash) include: universe
(mandatory): Symbol db
: Symbol ids
: Array of String format
: String extra
: Hash
# File 'lib/miga/remote_dataset/download.rb', line 135

def download_post(opts)
  u = @@UNIVERSE[opts[:universe]]
  uri = u[:uri][opts]
  payload = u[:payload] ? u[:payload][opts] : ''
  headers = u[:headers] ? u[:headers][opts] : {}
  net_method(:post, uri, payload, headers)
end
.download_uri(uri, headers = {}) ⇒ Object
Download the given URI and return the result regardless of response code. Attempts download up to three times before raising Net::ReadTimeout.
# File 'lib/miga/remote_dataset/download.rb', line 174

def download_uri(uri, headers = {})
  net_method(:get, uri, headers)
end
.download_url(url, headers = {}) ⇒ Object
Download the given url and return the result regardless of response code. Attempts download up to three times before raising Net::ReadTimeout.
# File 'lib/miga/remote_dataset/download.rb', line 181

def download_url(url, headers = {})
  download_uri(URI.parse(url), headers)
end
.longest_common_prefix(strs) ⇒ Object
# File 'lib/miga/remote_dataset/download.rb', line 207

def longest_common_prefix(strs)
  return '' if strs.empty?

  min, max = strs.minmax
  idx = min.size.times { |i| break i if min[i] != max[i] }
  min[0...idx]
end
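The minmax trick above works because the lexicographic minimum and maximum of a string list bound every other entry, so any prefix they share is shared by all strings. A standalone copy for experimentation (the name lcp is illustrative):

```ruby
# Longest common prefix via the minmax bound: only the two lexicographic
# extremes need to be compared character by character.
def lcp(strs)
  return '' if strs.empty?

  min, max = strs.minmax
  # Integer#times returns min.size when no mismatch is found, so a string
  # that is a full prefix of the other is returned whole.
  idx = min.size.times { |i| break i if min[i] != max[i] }
  min[0...idx]
end
```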
.ncbi_asm_acc2id(acc, retrials = 3) ⇒ Object
Translate an NCBI Assembly Accession (acc) to the corresponding internal NCBI ID, with up to retrials retries if the returned JSON document does not conform to the expected format.
# File 'lib/miga/remote_dataset.rb', line 83

def ncbi_asm_acc2id(acc, retrials = 3)
  return acc if acc =~ /^\d+$/

  search_doc = MiGA::Json.parse(
    download(:ncbi_search, :assembly, acc, :json),
    symbolize: false, contents: true
  )
  out = (search_doc['esearchresult']['idlist'] || []).first
  if out.nil?
    raise MiGA::RemoteDataMissingError.new(
      "NCBI Assembly Accession not found: #{acc}"
    )
  end
  return out
rescue JSON::ParserError, MiGA::RemoteDataMissingError => e
  # Note that +JSON::ParserError+ is being rescued because the NCBI backend
  # may in some cases return a malformed JSON response indicating that the
  # "Search Backend failed". The issue with the JSON payload is that it
  # includes two tab characters (\t\t) in the error message, which is not
  # allowed by the JSON specification and causes a parsing error
  # (see https://www.rfc-editor.org/rfc/rfc4627#page-4)
  if retrials <= 0
    raise e
  else
    MiGA::MiGA.DEBUG("#{self}.ncbi_asm_acc2id - RETRY #{retrials}")
    retrials -= 1
    retry
  end
end
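The retry loop above relies on Ruby's retry keyword re-entering the method body without re-binding parameters, so the decremented counter survives each attempt. A stripped-down sketch of the same pattern with a simulated flaky operation (all names here are hypothetical):

```ruby
# Rescue, decrement the counter, and re-enter the body with retry.
# attempts_log records each attempt; the call "fails" until enough
# attempts have been made.
def flaky_call(attempts_log, fail_times, retrials = 3)
  attempts_log << :try
  raise 'transient failure' if attempts_log.size <= fail_times

  :ok
rescue RuntimeError => e
  raise e if retrials <= 0

  retrials -= 1
  retry # re-runs the method body; retrials keeps its decremented value
end
```

With the default of 3 retries, up to four attempts are made before the error propagates.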
.ncbi_asm_get(opts) ⇒ Object
Download data from NCBI Assembly database using the REST method. Supported opts (Hash) include:
- obj (mandatory): MiGA::RemoteDataset
- ids (mandatory): String or Array of String
- file (mandatory): String, assembly saved here
- extra: Hash, passed to download
- format: String, ignored
# File 'lib/miga/remote_dataset/download.rb', line 60

def ncbi_asm_get(opts)
  require 'tempfile'
  require 'zip'

  zipped = download(
    :ncbi_datasets_download, :genome, opts[:ids], :zip, nil,
    opts[:extra], opts[:obj]
  )
  zip_tmp = Tempfile.new(['asm', '.zip'], encoding: zipped.encoding.to_s)
  zip_tmp.print(zipped)
  zip_tmp.close

  o = ''
  ofh = opts[:file] ? File.open(opts[:file], 'w') : nil
  Zip::File.open(zip_tmp.path) do |zfh|
    zfh.each do |entry|
      if entry.file? && entry.name =~ /_genomic\.fna$/
        DEBUG "Extracting: #{entry.name}"
        entry.get_input_stream do |ifh|
          cont = MiGA::MiGA.normalize_encoding(ifh.read).chomp + "\n"
          ofh&.print(cont)
          o += cont
        end
      end
    end
  end
  ofh&.close
  File.unlink(zip_tmp.path)
  o
end
.ncbi_gb_get(opts) ⇒ Object
Download data from NCBI GenBank (nuccore) database using the REST method. Supported opts
(Hash) are the same as #download_rest and #ncbi_asm_get.
# File 'lib/miga/remote_dataset/download.rb', line 94

def ncbi_gb_get(opts)
  # Simply use defaults, but ensure that the URL can be properly formed
  o = download_rest(opts.merge(universe: :ncbi, db: :nuccore))
  return o unless o.strip.empty?

  begin
    MiGA::MiGA.DEBUG 'Empty sequence, attempting download as NCBI assembly'
    opts[:format] = :fasta
    ncbi_asm_get(opts)
  rescue => e
    raise e unless opts[:obj]&.metadata&.dig(:ncbi_wgs)
    MiGA::MiGA.DEBUG e.to_s
  end

  MiGA::MiGA.DEBUG 'Empty sequence, attempting download as WGS records'
  a, b = opts[:obj].metadata[:ncbi_wgs].split('-', 2)
  pref = longest_common_prefix([a, b])
  rang = a[pref.size .. -1].to_i .. b[pref.size .. -1].to_i
  ids = rang.map { |k| "%s%0#{a.size - pref.size}i" % [pref, k] }
  download_rest(opts.merge(universe: :ncbi, db: :nuccore, ids: ids))
end
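The WGS fallback at the end of the method turns a range like "JAAA01000001-JAAA01000003" into the individual record IDs: strip the shared prefix, enumerate the numeric span, and zero-pad each value back to the original width. A self-contained sketch of that expansion (the helper name expand_wgs and the sample accessions are illustrative):

```ruby
# Expand a hyphenated WGS accession range into individual IDs, following
# the prefix-stripping logic used in ncbi_gb_get.
def expand_wgs(range_str)
  a, b = range_str.split('-', 2)
  # Longest common prefix of the two endpoints (minmax trick).
  min, max = [a, b].minmax
  idx = min.size.times { |i| break i if min[i] != max[i] }
  pref = min[0...idx]
  # Enumerate the numeric suffixes and re-pad to the original width.
  rang = a[pref.size..-1].to_i .. b[pref.size..-1].to_i
  rang.map { |k| "%s%0#{a.size - pref.size}i" % [pref, k] }
end
```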
.ncbi_map(id, dbfrom, db) ⇒ Object
Looks for the entry id in dbfrom, and returns the linked identifier in db (or nil).
# File 'lib/miga/remote_dataset/download.rb', line 188

def ncbi_map(id, dbfrom, db)
  doc = download(:ncbi_map, dbfrom, id, :json, nil, db: db)
  return if doc.empty?

  begin
    tree = MiGA::Json.parse(doc, contents: true)
  rescue => e
    sleep 5 # <- Usually caused by busy servers: BLOB ID IS NOT IMPLEMENTED
    DEBUG "RETRYING after: #{e}"
    doc = download(:ncbi_map, dbfrom, id, :json, nil, db: db)
    return if doc.empty?

    tree = MiGA::Json.parse(doc, contents: true)
  end
  tree&.dig(:linksets, 0, :linksetdbs, 0, :links, 0)
end
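The final line of ncbi_map pulls the first linked ID out of a parsed NCBI elink response; dig with the &. guard returns nil at any missing level instead of raising. A sketch with a hand-written sample document (the shape mimics an elink JSON payload; it is not a real server response):

```ruby
# Safe navigation into a deeply nested parsed-JSON structure: any missing
# key or index short-circuits to nil.
def first_link(tree)
  tree&.dig(:linksets, 0, :linksetdbs, 0, :links, 0)
end

# Hypothetical parsed response with a single linked ID.
sample = { linksets: [{ linksetdbs: [{ links: [1_234_567] }] }] }
```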
.ncbi_taxonomy_dump? ⇒ Boolean
Is a local NCBI Taxonomy dump available?
# File 'lib/miga/remote_dataset.rb', line 62

def ncbi_taxonomy_dump?
  (@ncbi_taxonomy_names ||= nil) ? true : false
end
.taxonomy_from_ncbi_dump(id) ⇒ Object
Get the MiGA::Taxonomy object for the lineage of the taxon with TaxID id
using the local NCBI Taxonomy dump.
# File 'lib/miga/remote_dataset.rb', line 69

def taxonomy_from_ncbi_dump(id)
  id = id.to_i unless id.is_a? Integer
  MiGA::Taxonomy.new(ns: 'ncbi').tap do |tax|
    while @ncbi_taxonomy_names[id]
      tax << { @ncbi_taxonomy_names[id][1] => @ncbi_taxonomy_names[id][0] }
      id = @ncbi_taxonomy_names[id][2]
    end
  end
end
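The lineage walk above climbs parent pointers in the { TaxID => [name, rank, parent] } table until an ID is missing (the root's parent equals itself, so it is never stored). A toy, self-contained version of the walk (lineage is a hypothetical helper; the TaxIDs are real NCBI IDs used purely as sample data):

```ruby
# Walk parent pointers in a { TaxID => [name, rank, parent TaxID] } table,
# collecting [rank, name] pairs until an unknown ID ends the chain.
def lineage(id, nodes)
  out = []
  while nodes[id]
    name, rank, parent = nodes[id]
    out << [rank, name]
    id = parent
  end
  out
end

# Two-node sample chain: species -> genus, then an absent ancestor.
nodes = {
  9606 => ['Homo sapiens', 'species', 9605],
  9605 => ['Homo', 'genus', 207_598]
}
```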
.UNIVERSE ⇒ Object
# File 'lib/miga/remote_dataset/base.rb', line 6

def UNIVERSE
  @@UNIVERSE
end
.use_ncbi_taxonomy_dump(path, cli = nil) ⇒ Object
Path to a directory with a recent NCBI Taxonomy dump to use instead of making API calls to NCBI servers. A dump can be obtained at: ftp.ncbi.nih.gov/pub/taxonomy/taxdump.tar.gz
The cli parameter, if passed, should be a MiGA::Cli object used to report progress while reading. Other objects can be passed, minimally supporting the MiGA::Cli#say and MiGA::Cli#advance method interfaces.
# File 'lib/miga/remote_dataset.rb', line 24

def use_ncbi_taxonomy_dump(path, cli = nil)
  raise "Directory doesn't exist: #{path}" unless File.directory?(path)

  # Structure: { TaxID => ["name", "rank", parent TaxID] }
  MiGA::MiGA.DEBUG "Loading NCBI Taxonomy dump: #{path}"
  @ncbi_taxonomy_names = {}

  # Read names.dmp
  File.open(file = File.join(path, 'names.dmp')) do |fh|
    read = 0
    size = File.size(file)
    fh.each do |ln|
      cli&.advance('- names.dmp:', read += ln.size, size)
      row = ln.split(/\t\|\t?/)
      next unless row[3] == 'scientific name'

      @ncbi_taxonomy_names[row[0].to_i] = [row[1].strip]
    end
    cli&.say
  end

  # Read nodes.dmp
  File.open(file = File.join(path, 'nodes.dmp')) do |fh|
    read = 0
    size = File.size(file)
    fh.each do |ln|
      cli&.advance('- nodes.dmp:', read += ln.size, size)
      row = ln.split(/\t\|\t?/)
      child = row[0].to_i
      parent = row[1].to_i
      @ncbi_taxonomy_names[child][1] = row[2]
      @ncbi_taxonomy_names[child][2] = parent unless parent == child
    end
    cli&.say
  end
end
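The dump files parsed above use the NCBI *.dmp row format: fields delimited by "\t|\t" with a trailing "\t|", which the split regex /\t\|\t?/ handles in one pass. A quick sketch on a hand-written line shaped like names.dmp (the sample line is illustrative, not copied from a real dump):

```ruby
# Split an NCBI *.dmp row into its fields; the optional \t in the regex
# also consumes the trailing "\t|" terminator.
def parse_dmp_row(line)
  line.split(/\t\|\t?/)
end

# names.dmp layout: tax_id | name_txt | unique name | name class |
row = parse_dmp_row("9606\t|\tHomo sapiens\t|\t\t|\tscientific name\t|\n")
```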
Instance Method Details
#get_gtdb_taxonomy ⇒ Object
Get GTDB taxonomy as MiGA::Taxonomy
# File 'lib/miga/remote_dataset.rb', line 260

def get_gtdb_taxonomy
  gtdb_genome = metadata[:gtdb_assembly] or return

  doc = MiGA::Json.parse(
    MiGA::RemoteDataset.download(
      :gtdb, :genome, gtdb_genome, 'taxon-history'
    ),
    contents: true
  )
  lineage = { ns: 'gtdb' }
  lineage.merge!(doc.first) # Get only the latest available classification
  release = lineage.delete(:release)
  @metadata[:gtdb_release] = release
  lineage.transform_values! { |v| v.gsub(/^\S__/, '') }
  MiGA.DEBUG "Got lineage from #{release}: #{lineage}"
  MiGA::Taxonomy.new(lineage)
end
#get_metadata(metadata_def = {}) ⇒ Object
Get metadata from the remote location.
# File 'lib/miga/remote_dataset.rb', line 190

def get_metadata(metadata_def = {})
  metadata_def.each { |k, v| @metadata[k] = v }
  return @metadata if @metadata[:bypass_metadata]

  case universe
  when :ebi, :ncbi, :web
    # Get taxonomy
    @metadata[:tax] = get_ncbi_taxonomy
  when :gtdb
    # Get taxonomy
    @metadata[:tax] = get_gtdb_taxonomy
  when :seqcode
    # Taxonomy already defined
    # Copy IDs over to allow additional metadata linked
    @metadata[:ncbi_asm] = @metadata[:seqcode_asm]
    @metadata[:ncbi_nuccore] = @metadata[:seqcode_nuccore]
  end
  if @metadata[:get_ncbi_taxonomy]
    tax = get_ncbi_taxonomy
    tax&.add_alternative(@metadata[:tax].dup, false) if @metadata[:tax]
    @metadata[:tax] = tax
  end
  @metadata[:get_ncbi_taxonomy] = nil
  @metadata = get_type_status(@metadata)
end
#get_ncbi_taxid ⇒ Object
Get NCBI Taxonomy ID.
# File 'lib/miga/remote_dataset.rb', line 219

def get_ncbi_taxid
  send("get_ncbi_taxid_from_#{universe}")
end
#get_ncbi_taxonomy ⇒ Object
Get NCBI taxonomy as MiGA::Taxonomy
# File 'lib/miga/remote_dataset.rb', line 238

def get_ncbi_taxonomy
  tax_id = get_ncbi_taxid or return

  if self.class.ncbi_taxonomy_dump?
    return self.class.taxonomy_from_ncbi_dump(tax_id)
  end

  lineage = { ns: 'ncbi' }
  doc = MiGA::RemoteDataset.download(:ncbi, :taxonomy, tax_id, :xml)
  doc.scan(%r{<Taxon>(.*?)</Taxon>}m).map(&:first).each do |i|
    name = i.scan(%r{<ScientificName>(.*)</ScientificName>}).first.to_a.first
    rank = i.scan(%r{<Rank>(.*)</Rank>}).first.to_a.first
    rank = nil if rank.nil? || rank == 'no rank' || rank.empty?
    rank = 'dataset' if lineage.size == 1 && rank.nil?
    lineage[rank] = name unless rank.nil? || name.nil?
  end
  MiGA.DEBUG "Got lineage: #{lineage}"
  MiGA::Taxonomy.new(lineage)
end
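The XML handling above extracts rank/name pairs with plain regex scans rather than an XML parser. A self-contained illustration on a minimal, hand-written fragment shaped like an NCBI Taxonomy efetch response (the sample document is not a real server payload):

```ruby
# Minimal fragment mimicking the <Taxon> blocks of an efetch XML response.
xml = <<~DOC
  <TaxaSet><Taxon>
  <ScientificName>Escherichia coli</ScientificName>
  <Rank>species</Rank>
  </Taxon></TaxaSet>
DOC

# Same scan pattern as get_ncbi_taxonomy: capture each <Taxon> body with
# the multiline flag, then pull name and rank out of each body.
pairs = xml.scan(%r{<Taxon>(.*?)</Taxon>}m).map(&:first).map do |i|
  name = i.scan(%r{<ScientificName>(.*)</ScientificName>}).first.to_a.first
  rank = i.scan(%r{<Rank>(.*)</Rank>}).first.to_a.first
  [rank, name]
end
```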
#get_type_status(metadata) ⇒ Object
Get the type material status and return an (updated) metadata
hash.
# File 'lib/miga/remote_dataset.rb', line 226

def get_type_status(metadata)
  if metadata[:ncbi_asm]
    get_type_status_ncbi_asm(metadata)
  elsif metadata[:ncbi_nuccore]
    get_type_status_ncbi_nuccore(metadata)
  else
    metadata
  end
end
#ncbi_asm_json_doc ⇒ Object
Get the JSON document describing an NCBI assembly entry.
# File 'lib/miga/remote_dataset.rb', line 280

def ncbi_asm_json_doc
  return @_ncbi_asm_json_doc unless @_ncbi_asm_json_doc.nil?

  if db == :assembly && %i[ncbi gtdb seqcode].include?(universe)
    metadata[:ncbi_asm] ||= ids.first
  end
  return nil unless metadata[:ncbi_asm]

  ncbi_asm_id = self.class.ncbi_asm_acc2id(metadata[:ncbi_asm])
  txt = nil
  3.times do
    txt = self.class.download(:ncbi_summary, :assembly, ncbi_asm_id, :json)
    txt.empty? ? sleep(1) : break
  end
  doc = MiGA::Json.parse(txt, symbolize: false, contents: true)
  return if doc.nil? || doc['result'].nil? || doc['result'].empty?

  @_ncbi_asm_json_doc = doc['result'][doc['result']['uids'].first]
  url_dir = @_ncbi_asm_json_doc['ftppath_genbank']
  if url_dir
    metadata[:web_assembly_gz] ||=
      '%s/%s_genomic.fna.gz' % [url_dir, File.basename(url_dir)]
  end
  @_ncbi_asm_json_doc
end
#save_to(project, name = nil, is_ref = true, metadata_def = {}) ⇒ Object
Save dataset to the MiGA::Project project identified with name. is_ref indicates whether it should be a reference dataset, and metadata_def provides its metadata. If metadata_def includes metadata_only: true, no input data is downloaded.
# File 'lib/miga/remote_dataset.rb', line 154

def save_to(project, name = nil, is_ref = true, metadata_def = {})
  name ||= ids.join('_').miga_name
  project = MiGA::Project.new(project) if project.is_a? String
  MiGA::Dataset.exist?(project, name) and
    raise "Dataset #{name} exists in the project, aborting..."

  @metadata = get_metadata(metadata_def)
  udb = @@UNIVERSE[universe][:dbs][db]
  @metadata["#{universe}_#{db}"] = ids.join(',')
  unless @metadata[:metadata_only]
    respond_to?("save_#{udb[:stage]}_to", true) or
      raise "Unexpected error: Unsupported stage #{udb[:stage]} for #{db}."
    send "save_#{udb[:stage]}_to", project, name, udb
  end
  dataset = MiGA::Dataset.new(project, name, is_ref, @metadata)
  project.add_dataset(dataset.name)

  unless @metadata[:metadata_only]
    result = dataset.add_result(udb[:stage], true, is_clean: true)
    result.nil? and
      raise 'Empty dataset: seed result not added due to incomplete files.'
    result.clean!
    result.save
  end
  dataset
end
#update_metadata(dataset, metadata = {}) ⇒ Object
Updates the MiGA::Dataset dataset with the remotely available metadata, and optionally the Hash metadata.
# File 'lib/miga/remote_dataset.rb', line 182

def update_metadata(dataset, metadata = {})
  metadata = get_metadata(metadata)
  metadata.each { |k, v| dataset.metadata[k] = v }
  dataset.save
end