Module: OpenTox::Algorithm

Includes:
OpenTox
Included in:
Dataset, FeatureSelection, Fminer, Generic, Lazar, Substructure, Model::Lazar
Defined in:
lib/algorithm.rb,
lib/utils.rb

Overview

Wrapper for OpenTox Algorithms

Defined Under Namespace

Modules: Dataset, FeatureSelection, Neighbors, Similarity, Substructure
Classes: BBRC, Fminer, Generic, LAST, Lazar, StructuralClustering

Instance Attribute Summary

Attributes included from OpenTox

#metadata, #uri

Class Method Summary

Instance Method Summary

Methods included from OpenTox

#add_metadata, all, #delete, #initialize, #load_metadata, sign_in, text_to_html

Class Method Details

.effect(occurrences, db_instances) ⇒ Object

Effect calculation for classification. The elements of the two arrays are assumed to correspond pairwise, one entry per class.

Parameters:

  • occurrences (Array)

    Occurrences per class (as Enumerables).

  • db_instances (Array)

    Database instance counts per class.



# File 'lib/utils.rb', line 473

def self.effect(occurrences, db_instances)
  max=0
  max_value=0
  nr_o = self.sum_size(occurrences)
  nr_db = db_instances.to_scale.sum

  occurrences.each_with_index { |o,i| # fminer outputs occurrences sorted reverse by activity.
    actual = o.size.to_f/nr_o
    expected = db_instances[i].to_f/nr_db
    if actual > expected
      if ((actual - expected) / actual) > max_value
        max_value = (actual - expected) / actual # 'Schleppzeiger'
        max = i
      end
    end
  }
  max
end
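
A minimal usage sketch (hypothetical occurrence lists and counts, not from the library; assumes the library's Statsample extension for to_scale is loaded):

occurrences  = [ ["c1","c2","c3"], ["c4"] ]   # per-class occurrence lists (Enumerables)
db_instances = [ 10, 10 ]                     # per-class instance counts in the database
OpenTox::Algorithm.effect(occurrences, db_instances)  # => 0 (class 0 is over-represented)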

.gauss(x, sigma = 0.3) ⇒ Float

Gaussian kernel

Returns:

  • (Float)


# File 'lib/utils.rb', line 417

def self.gauss(x, sigma = 0.3) 
  d = 1.0 - x.to_f
  Math.exp(-(d*d)/(2*sigma*sigma))
end
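
Usage sketch (inputs are similarity values; results rounded):

OpenTox::Algorithm.gauss(1.0)        # => 1.0   (identical neighbor, full weight)
OpenTox::Algorithm.gauss(0.7)        # => ~0.61 (default sigma 0.3)
OpenTox::Algorithm.gauss(0.7, 0.15)  # => ~0.14 (smaller sigma penalizes dissimilarity harder)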

.get_cdk_descriptors(params) ⇒ Object

Calculate CDK physico-chemical descriptors via Ambit (DO NOT OVERLOAD the Ambit service). @param required: :compounds, :pc_type, :task, :step; optional: :descriptor. @return array of Ambit result URI pieces (1st: base URI, 2nd: SMILES feature, 3rd+: descriptor features), hash mapping SMILES to InChI, array of descriptor ids.



# File 'lib/utils.rb', line 289

def self.get_cdk_descriptors(params)

  ambit_result_uri = [] # 1st pos: base uri, then features
  smiles_to_inchi = {}
  task_weights = {"electronic"=> 4, "topological"=> 19, "constitutional"=> 12, "geometrical"=> 3, "hybrid"=> 2, "cpsa"=> 1 }
  task_weights.keys.each { |pc_type| task_weights.delete(pc_type) if (params[:pc_type] && (!params[:pc_type].split(",").include?(pc_type)))}
  task_sum = Float task_weights.values.sum
  task_weights.keys.each { |pc_type| task_weights[pc_type] /= task_sum }
  task_weights.keys.each { |pc_type| task_weights[pc_type] *= params[:step] }
  

  # extract wanted descriptors from config file and parameters
  pc_descriptors = YAML::load_file(@keysfile)

  ids = pc_descriptors.collect { |id, info| 
      "#{info[:pc_type]}:::#{id}" if info[:lib] == "cdk" && params[:pc_type].split(",").include?(info[:pc_type]) && (!params[:descriptor] || id == params[:descriptor])
  }.compact

  if ids.size > 0
    ids.sort!
    ids.collect! { |id| id.split(":::").last }

    # create dataset at Ambit
    begin
      params[:compounds].each do |n|
        cmpd = OpenTox::Compound.new(n)
        smiles_string = cmpd.to_smiles
        smiles_to_inchi[smiles_string] = URI.encode_www_form_component(cmpd.to_inchi)
      end
      smi_file = Tempfile.open(['pc_ambit', '.csv']) ; smi_file.puts( "SMILES\n" + smiles_to_inchi.keys.join("\n") ) ; smi_file.flush
      ambit_ds_uri = OpenTox::RestClientWrapper.post(@ambit_ds_service_uri, {:file => File.new(smi_file.path)}, {:content_type => "multipart/form-data", :accept => "text/uri-list"} )
      ambit_result_uri = [ ambit_ds_uri + "?" ] # 1st pos: base uri, then features
    rescue Exception => e
      LOGGER.debug "#{e.class}: #{e.message}"
      LOGGER.debug "Backtrace:\n\t#{e.backtrace.join("\n\t")}"
    ensure
      smi_file.close! if smi_file
    end
    # get SMILES feature URI
    ambit_smiles_uri = OpenTox::RestClientWrapper.get(
      ambit_ds_uri + "/features", 
      {:accept=> "text/uri-list"} 
    ).chomp
    ambit_result_uri << ("feature_uris[]=" + URI.encode_www_form_component(ambit_smiles_uri) + "&")
    # always calculate 3D (http://goo.gl/Tk81j), then get results
    OpenTox::RestClientWrapper.post(
      @ambit_mopac_model_uri, 
      {:dataset_uri => ambit_ds_uri}, 
      {:accept => "text/uri-list"} 
    ) 
    current_cat = ""
    ids.each_with_index do |id, i|
      old_cat = current_cat; current_cat = pc_descriptors[id][:pc_type]
      params[:task].progress(params[:task].metadata[OT.percentageCompleted] + task_weights[old_cat]) if params[:task] && old_cat != current_cat && old_cat != ""
      algorithm = Algorithm::Generic.new(@ambit_descriptor_algorithm_uri+id)
      result_uri = algorithm.run({:dataset_uri => ambit_ds_uri})
      ambit_result_uri << result_uri.split("?")[1] + "&"
      LOGGER.debug "Ambit (#{ids.size}): #{i+1}"
    end
    params[:task].progress(params[:task].metadata[OT.percentageCompleted] + task_weights[current_cat]) if params[:task]
    #LOGGER.debug "Ambit result: #{ambit_result_uri.join('')}"
  end

  [ ambit_result_uri, smiles_to_inchi, ids ]

end
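
A hypothetical invocation (the compound URI is a placeholder; the module's descriptor key file and the Ambit service URIs must be configured; progress reporting is skipped when no task is given):

ambit_result_uri, smiles_to_inchi, cdk_ids = OpenTox::Algorithm.get_cdk_descriptors(
  :compounds => [ "http://mywebservice/compound/1" ],  # placeholder compound URI
  :pc_type   => "constitutional,topological",
  :task      => nil,                                   # no progress updates
  :step      => 50
)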

.get_jl_descriptors(params) ⇒ Object

Calculate JOELib2 physico-chemical descriptors via Rjb. @param required: :compounds, :pc_type, :rjb; optional: :descriptor. @return CSV data (master) and array of descriptor ids.



# File 'lib/utils.rb', line 201

def self.get_jl_descriptors(params)

  master = nil
  s = params[:rjb]; raise "No Java environment" unless s

  # Load keys, enter CSV headers
  begin
    csvfile = Tempfile.open(['jl_descriptors-','.csv'])

    pc_descriptors = YAML::load_file(@keysfile)
    ids = pc_descriptors.collect{ |id, info| 
      id if info[:lib] == "joelib" && params[:pc_type].split(",").include?(info[:pc_type]) && (!params[:descriptor] || id == params[:descriptor])
    }.compact


    if ids.length > 0
      csvfile.puts((["SMILES"] + ids).join(","))

      # remember inchis
      inchis = params[:compounds].collect { |c_uri| 
        cmpd = OpenTox::Compound.new(c_uri)
        URI.encode_www_form_component(cmpd.to_inchi)
      }

      # Process compounds
      params[:compounds].each_with_index { |c_uri, c_idx| 
        cmpd = OpenTox::Compound.new(c_uri)
        inchi = cmpd.to_inchi
        sdf_data = cmpd.to_sdf

        infile = Tempfile.open(['jl_descriptors-in-','.sdf'])
        outfile_path = infile.path.gsub(/jl_descriptors-in/,"jl_descriptors-out")

        begin
          infile.puts sdf_data
          infile.flush
          s.new(infile.path, outfile_path) # runs joelib
                
          row = [inchis[c_idx]]
          ids.each_with_index do |k,i| # Fill row
            re = Regexp.new(k)
            open(outfile_path) do |f|
              f.each do |line|
                if @prev == k
                  entry = line.chomp
                  val = nil
                  if OpenTox::Algorithm.numeric?(entry)
                    val = Float(entry)
                    val = nil if val.nan?
                    val = nil if (val && val.infinite?)
                  end
                  row << val
                  break
                end
                @prev = line.gsub(/^.*types./,"").gsub(/count./,"").gsub(/>/,"").chomp if line =~ re
              end
            end
          end
          LOGGER.debug "Compound #{c_idx+1} (#{inchis.size}), #{row.size} entries"
          csvfile.puts(row.join(","))
          csvfile.flush

        rescue Exception => e
          LOGGER.debug "#{e.class}: #{e.message}"
          LOGGER.debug "Backtrace:\n\t#{e.backtrace.join("\n\t")}"
        ensure
          File.delete(infile.path.gsub(/\.sdf/,".numeric.sdf"))
          File.delete(outfile_path)
          infile.close!
        end
      }
      master = CSV::parse(File.open(csvfile.path, "rb").read)
    end

  rescue Exception => e
    LOGGER.debug "#{e.class}: #{e.message}"
    LOGGER.debug "Backtrace:\n\t#{e.backtrace.join("\n\t")}"
  ensure
    csvfile.close! if csvfile
  end

  [ master, ids ]

end
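
A hypothetical invocation (joelib_rjb is a placeholder for the Rjb-imported JOELib wrapper class that the caller must provide; the compound URI is a placeholder):

master, jl_ids = OpenTox::Algorithm.get_jl_descriptors(
  :compounds => [ "http://mywebservice/compound/1" ],
  :rjb       => joelib_rjb,   # Java bridge class; its new(infile, outfile) runs JOELib on an SDF file
  :pc_type   => "topological"
)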

.get_ob_descriptors(params) ⇒ Object

Calculate OpenBabel physico-chemical descriptors via the Ruby bindings. @param required: :compounds, :pc_type; optional: :descriptor. @return CSV data (master) and array of descriptor ids.



# File 'lib/utils.rb', line 131

def self.get_ob_descriptors(params)

  master = nil

  begin
    csvfile = Tempfile.open(['ob_descriptors-','.csv'])

    pc_descriptors = YAML::load_file(@keysfile)
    ids = pc_descriptors.collect{ |id, info| 
      id if info[:lib] == "openbabel" && params[:pc_type].split(",").include?(info[:pc_type]) && (!params[:descriptor] || id == params[:descriptor])
    }.compact

    if ids.length > 0
      csvfile.puts((["SMILES"] + ids).join(","))
      
      # remember inchis
      inchis = params[:compounds].collect { |c_uri| 
        URI.encode_www_form_component(OpenTox::Compound.new(c_uri).to_inchi)
      }

      # Process compounds
      obmol = OpenBabel::OBMol.new
      obconversion = OpenBabel::OBConversion.new
      obconversion.set_in_and_out_formats 'inchi', 'can'

      inchis.each_with_index { |inchi, c_idx| 
        row = [inchis[c_idx]]
        obconversion.read_string(obmol, URI.decode_www_form_component(inchi))
        ids.each { |name|
          if obmol.respond_to?(name.underscore)
            val = obmol.send(name.underscore) # call the OBMol accessor directly
          else
            if name != "nF" && name != "spinMult" && name != "nHal" && name != "logP"
              val = OpenBabel::OBDescriptor.find_type(name.underscore).predict(obmol)
            elsif name == "nF"
              val = OpenBabel::OBDescriptor.find_type("nf").predict(obmol)
            elsif name == "spinMult" || name == "nHal" || name == "logP"
              val = OpenBabel::OBDescriptor.find_type(name).predict(obmol)
            end
          end
          if OpenTox::Algorithm.numeric?(val)
            val = Float(val)
            val = nil if val.nan?
            val = nil if (val && val.infinite?)
          end
          row << val
        }
        LOGGER.debug "Compound #{c_idx+1} (#{inchis.size}), #{row.size} entries"
        csvfile.puts(row.join(","))
        csvfile.flush
      }
      master = CSV::parse(File.open(csvfile.path, "rb").read)
    end

  rescue Exception => e
    LOGGER.debug "#{e.class}: #{e.message}"
    LOGGER.debug "Backtrace:\n\t#{e.backtrace.join("\n\t")}"
  ensure
    csvfile.close!
  end

  [ master, ids ]

end
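
A hypothetical invocation (requires the OpenBabel Ruby bindings; the compound URI is a placeholder):

master, ob_ids = OpenTox::Algorithm.get_ob_descriptors(
  :compounds => [ "http://mywebservice/compound/1" ],
  :pc_type   => "constitutional"
)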

.isnull_or_singular?(array) ⇒ Boolean

For symbolic features

Parameters:

  • array (Array)

    Array to test; non-occurrence must be indicated by 0.

Returns:

  • (Boolean)

    Whether the feature is singular or non-occurring or present everywhere.



# File 'lib/utils.rb', line 426

def self.isnull_or_singular?(array)
  nr_zeroes = array.count(0)
  return (nr_zeroes == array.size) ||    # remove non-occurring feature
         (nr_zeroes == array.size-1) ||  # remove singular feature
         (nr_zeroes == 0)                # also remove feature present everywhere
end
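
Usage sketch:

OpenTox::Algorithm.isnull_or_singular?([0, 0, 0])  # => true  (feature never occurs)
OpenTox::Algorithm.isnull_or_singular?([0, 0, 1])  # => true  (feature occurs only once)
OpenTox::Algorithm.isnull_or_singular?([1, 2, 1])  # => true  (feature present everywhere)
OpenTox::Algorithm.isnull_or_singular?([0, 2, 1])  # => false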

.load_ds_csv(ambit_result_uri, smiles_to_inchi, single_ids, subjectid = nil) ⇒ Object

Load a dataset via CSV. @param Ambit result URI, piecewise (1st: base, 2nd: SMILES, 3rd+: features). @param hash with SMILES keys and InChI values. @param descriptor ids, one for each feature. @return CSV data, array of descriptor ids, array of sanitized Ambit feature ids.



# File 'lib/utils.rb', line 362

def self.load_ds_csv(ambit_result_uri, smiles_to_inchi, single_ids, subjectid=nil)
  
  master=nil
  ids=[]
  ambit_ids=[]

  if ambit_result_uri.size > 0
    (1...ambit_result_uri.size).collect { |idx|
      curr_uri = ambit_result_uri[0] + ambit_result_uri[idx]
      #LOGGER.debug "Requesting #{curr_uri}"
      csv_data = CSV.parse( OpenTox::RestClientWrapper.get(curr_uri, {:accept => "text/csv", :subjectid => subjectid}) )
      if csv_data[0] && csv_data[0].size>1
        if master.nil? # This is the smiles entry
          (1...csv_data.size).each{ |idx| csv_data[idx][1] = smiles_to_inchi[csv_data[idx][1]] }
          master = csv_data
          next
        else
          index_uri = csv_data[0].index("SMILES")
          csv_data.map {|i| i.delete_at(index_uri)} if index_uri #Removes additional SMILES information

          nr_cols = (csv_data[0].size)-1
          LOGGER.debug "Merging #{nr_cols} new columns"
          ids += Array.new(nr_cols, single_ids[idx-2])
          master.each {|row| nr_cols.times { row.push(nil) }  } # Adds empty columns to all rows
          csv_data.each do |row|
            temp = master.assoc(row[0]) # Finds the appropriate line in master
            ((-1*nr_cols)..-1).collect.each { |idx|
              temp[idx] = row[nr_cols+idx+1] if temp # Updates columns if line is found
            }
          end
        end
      end
    }

    index_uri = master[0].index("Compound")
    master.map {|i| i.delete_at(index_uri)}
    master[0].each {|cell| cell.chomp!(" ")}
    master[0][0] = "Compound" #"SMILES" 
    index_smi = master[0].index("SMILES")
    master.map {|i| i.delete_at(index_smi)} if index_smi
    master[0][0] = "SMILES" 
    ambit_ids=master[0].collect {|header| header.to_s.gsub(/[\/.\\\(\)\{\}\[\]]/,"_")}
    ambit_ids.shift
  end
   
  #LOGGER.debug "-------- AM: Writing to dumpfile"
  #File.open("/tmp/test.csv", 'w') {|f| f.write( master.collect {|r| r.join(",")}.join("\n") ) }
 
  [ master, ids, ambit_ids ]

end
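
A sketch of the expected input shapes (all URIs are placeholders; in practice the arguments come directly from get_cdk_descriptors, and cdk_ids is the descriptor id array it returns):

ambit_result_uri = [ "http://ambit.example/dataset/1?",         # base URI
                     "feature_uris[]=SMILES_FEATURE_URI&",      # URL-encoded SMILES feature
                     "feature_uris[]=DESCRIPTOR_FEATURE_URI&" ] # one entry per descriptor
smiles_to_inchi  = { "CCO" => URI.encode_www_form_component("InChI=1S/C2H6O/c1-3-2/h3H,1-2H3") }
master, ids, ambit_ids = OpenTox::Algorithm.load_ds_csv(ambit_result_uri, smiles_to_inchi, cdk_ids)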

.min_frequency(training_dataset, per_mil) ⇒ Object

Minimum frequency calculation. Returns the minimum substructure frequency (Integer) derived from the training dataset size and a per-mil value.

Parameters:

  • training_dataset (OpenTox::Dataset)

    Training dataset; its compound count scales the frequency.

  • per_mil (Integer)

    Per-mil value (suggested: 8-10 for BBRC, 50 for LAST).



# File 'lib/utils.rb', line 463

def self.min_frequency(training_dataset,per_mil)
  minfreq = per_mil * training_dataset.compounds.size.to_f / 1000.0 # AM sugg. 8-10 per mil for BBRC, 50 per mil for LAST
  minfreq = 2 unless minfreq > 2
  Integer(minfreq)
end
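
Worked example (training_dataset is assumed to hold 600 compounds):

OpenTox::Algorithm.min_frequency(training_dataset, 8)  # 8 * 600 / 1000.0 = 4.8, truncated => 4
OpenTox::Algorithm.min_frequency(training_dataset, 1)  # 0.6 is below the floor of 2       => 2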

.numeric?(value) ⇒ Boolean

Numeric value test. @param value Value to test.

Returns:

  • (Boolean)

    Whether value is a number



# File 'lib/utils.rb', line 437

def self.numeric?(value)
  true if Float(value) rescue false
end
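
Usage sketch:

OpenTox::Algorithm.numeric?("1.23e-4")  # => true
OpenTox::Algorithm.numeric?("12.5 mg")  # => false
OpenTox::Algorithm.numeric?(nil)        # => false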

.pc_descriptors(params) ⇒ Object

Calculate physico-chemical descriptors. @param required: :dataset_uri, :pc_type, :rjb, :task, :add_uri; optional: :descriptor, :lib, :subjectid. @return dataset URI.



# File 'lib/utils.rb', line 19

def self.pc_descriptors(params)

  ds = OpenTox::Dataset.find(params[:dataset_uri],params[:subjectid])
  compounds = ds.compounds.collect
  task_weights = {"joelib"=> 20, "openbabel"=> 1, "cdk"=> 50 }
  task_weights.keys.each { |step| task_weights.delete(step) if (params[:lib] && (!params[:lib].split(",").include?(step)))}
  task_weights["load"] = 10
  task_sum = Float task_weights.values.sum
  task_weights.keys.each { |step| task_weights[step] /= task_sum }
  task_weights.keys.each { |step| task_weights[step] = (task_weights[step]*100).floor }
  
  jl_master=nil
  cdk_master=nil
  ob_master=nil


  # # # openbabel (via ruby bindings)
  if !params[:lib] || params[:lib].split(",").include?("openbabel")
    ob_master, ob_ids = get_ob_descriptors( { :compounds => compounds, :pc_type => params[:pc_type], :descriptor => params[:descriptor] } ) 
    params[:task].progress(params[:task].metadata[OT.percentageCompleted] + task_weights["openbabel"]) if params[:task]
  end


  # # # joelib (via rjb)
  if !params[:lib] || params[:lib].split(",").include?("joelib")
    jl_master, jl_ids = get_jl_descriptors( { :compounds => compounds, :rjb => params[:rjb], :pc_type => params[:pc_type], :descriptor => params[:descriptor] } ) 
    params[:task].progress(params[:task].metadata[OT.percentageCompleted] + task_weights["joelib"]) if params[:task]
  end


  # # # cdk (via REST)
  if !params[:lib] || params[:lib].split(",").include?("cdk")
    ambit_result_uri, smiles_to_inchi, cdk_ids = get_cdk_descriptors( { :compounds => compounds, :pc_type => params[:pc_type], :task => params[:task], :step => task_weights["cdk"], :descriptor => params[:descriptor] } )
    #LOGGER.debug "Ambit result uri for #{params.inspect}: '#{ambit_result_uri.to_yaml}'"
    cdk_master, cdk_ids, ambit_ids = load_ds_csv(ambit_result_uri, smiles_to_inchi, cdk_ids )
    params[:task].progress(params[:task].metadata[OT.percentageCompleted] + task_weights["load"]) if params[:task]
  end

  # # # fuse CSVs ("master" structures)
  if jl_master && cdk_master
    nr_cols = (jl_master[0].size)-1
    LOGGER.debug "Merging #{nr_cols} new columns"
    cdk_master.each {|row| nr_cols.times { row.push(nil) }  }
    jl_master.each do |row|
      temp = cdk_master.assoc(row[0]) # Finds the appropriate line in master
      ((-1*nr_cols)..-1).collect.each { |idx|
        temp[idx] = row[nr_cols+idx+1] if temp # Updates columns if line is found
      }
    end
    master = cdk_master
  else # either jl_master or cdk_master nil
    master = jl_master || cdk_master
  end

  if ob_master && master
    nr_cols = (ob_master[0].size)-1
    LOGGER.debug "Merging #{nr_cols} new columns"
    master.each {|row| nr_cols.times { row.push(nil) }  } # Adds empty columns to all rows
    ob_master.each do |row|
      temp = master.assoc(row[0]) # Finds the appropriate line in master
      ((-1*nr_cols)..-1).collect.each { |idx|
        temp[idx] = row[nr_cols+idx+1] if temp # Updates columns if line is found
      }
    end
  else # either ob_master or master nil
    master = ob_master || master
  end

  if master

    ds = OpenTox::Dataset.find( 
      OpenTox::RestClientWrapper.post(
        File.join(CONFIG[:services]["opentox-dataset"]), master.collect { |row| row.join(",") }.join("\n"), {:content_type => "text/csv", :subjectid => params[:subjectid]}
      ),params[:subjectid]
    ) 

    # # # add feature metadata
    pc_descriptors = YAML::load_file(@keysfile)
    ambit_ids && ambit_ids.each_with_index { |id,idx|
      raise "Feature not found" if ! ds.features[File.join(ds.uri, "feature", id.to_s)]
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{DC.description => "#{pc_descriptors[cdk_ids[idx]][:name]} [#{pc_descriptors[cdk_ids[idx]][:pc_type]}, #{pc_descriptors[cdk_ids[idx]][:lib]}]"})
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{DC.creator => @ambit_descriptor_algorithm_uri + cdk_ids[idx]})
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{OT.hasSource => params[:dataset_uri]})
    }
    ob_ids && ob_ids.each { |id|
      raise "Feature not found" if ! ds.features[File.join(ds.uri, "feature", id.to_s)]
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{DC.description => "#{pc_descriptors[id][:name]} [#{pc_descriptors[id][:pc_type]}, #{pc_descriptors[id][:lib]}]"})
      creator_uri = ds.uri.gsub(/\/dataset\/.*/, "/algorithm/pc")
      creator_uri += "/#{id}" if params[:add_uri]
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{DC.creator => creator_uri})
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{OT.hasSource => params[:dataset_uri]})
    }
    jl_ids && jl_ids.each { |id|
      raise "Feature not found" if ! ds.features[File.join(ds.uri, "feature", id.to_s)]
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{DC.description => "#{pc_descriptors[id][:name]} [#{pc_descriptors[id][:pc_type]}, #{pc_descriptors[id][:lib]}]"})
      creator_uri = ds.uri.gsub(/\/dataset\/.*/, "/algorithm/pc")
      creator_uri += "/#{id}" if params[:add_uri]
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{DC.creator => creator_uri})
      ds.add_feature_metadata(File.join(ds.uri, "feature", id.to_s),{OT.hasSource => params[:dataset_uri]})
    }

    ds.save(params[:subjectid])
  else
    raise OpenTox::BadRequestError.new "No descriptors matching your criteria found."
  end

end
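
A hypothetical invocation (all URIs are placeholders; a running OpenTox dataset service and the module's descriptor key file are required; :rjb may be nil when "joelib" is not requested):

dataset_uri = OpenTox::Algorithm.pc_descriptors(
  :dataset_uri => "http://mywebservice/dataset/1",
  :pc_type     => "constitutional",
  :lib         => "openbabel",
  :rjb         => nil,
  :task        => nil,     # progress updates are skipped without a task
  :add_uri     => true
)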

.sum_size(array) ⇒ Integer

Sum of the sizes of an array's elements (for arrays of Enumerables).

Parameters:

  • array (Array)

    Array of Enumerables whose sizes are summed.

Returns:

  • (Integer)

    Sum of the sizes of the elements.



# File 'lib/utils.rb', line 453

def self.sum_size(array)
  sum=0
  array.each { |e| sum += e.size }
  return sum
end
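
Usage sketch:

OpenTox::Algorithm.sum_size([ [1, 2], [3], [] ])  # => 3 (2 + 1 + 0 elements)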

.zero_variance?(array) ⇒ Boolean

For symbolic features

Parameters:

  • array (Array)

    Array to test; non-occurrence must be indicated by 0.

Returns:

  • (Boolean)

    Whether the feature has variance zero.



# File 'lib/utils.rb', line 445

def self.zero_variance?(array)
  return array.uniq.size == 1
end
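
Usage sketch:

OpenTox::Algorithm.zero_variance?([1, 1, 1])  # => true
OpenTox::Algorithm.zero_variance?([0, 1, 1])  # => false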

Instance Method Details

#run(params = nil, waiting_task = nil) ⇒ String

Execute the algorithm with the given parameters; consult the OpenTox API and web service documentation for acceptable parameters.

Parameters:

  • params (optional, Hash) (defaults to: nil)

    Algorithm parameters

  • waiting_task (optional, OpenTox::Task) (defaults to: nil)

    (can be an OpenTox::Subtask as well); progress is updated accordingly

Returns:

  • (String)

    URI of new resource (dataset, model, …)



# File 'lib/algorithm.rb', line 22

def run(params=nil, waiting_task=nil)
  LOGGER.info "Running algorithm '"+@uri.to_s+"' with params: "+params.inspect
  RestClientWrapper.post(@uri, params, {:accept => 'text/uri-list'}, waiting_task).to_s
end
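
A hypothetical invocation (the algorithm and dataset URIs are placeholders):

algorithm  = OpenTox::Algorithm::Generic.new("http://mywebservice/algorithm/fminer/bbrc")
result_uri = algorithm.run(:dataset_uri => "http://mywebservice/dataset/1")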

#to_rdfxml ⇒ application/rdf+xml

Get OWL-DL representation in RDF/XML format

Returns:

  • (application/rdf+xml)

    RDF/XML representation



# File 'lib/algorithm.rb', line 29

def to_rdfxml
  s = Serializer::Owl.new
  s.add_algorithm(@uri,@metadata)
  s.to_rdfxml
end