Module: Faust2Ruby
Defined in:
lib/faust2ruby.rb,
lib/faust2ruby/ast.rb,
lib/faust2ruby/lexer.rb,
lib/faust2ruby/parser.rb,
lib/faust2ruby/version.rb,
lib/faust2ruby/ir_builder.rb,
lib/faust2ruby/library_mapper.rb,
lib/faust2ruby/ruby_generator.rb
Defined Under Namespace

Modules: AST, LibraryMapper
Classes: Error, IRBuilder, Lexer, ParseError, Parser, RubyGenerator
Constant Summary

VERSION = "0.1.0"
Class Method Summary

- .convert_file(input_path, output_path = nil, **options) ⇒ Object
  Convert a Faust file to Ruby DSL.
- .parse(source) ⇒ AST::Program
  Parse Faust source and return the AST.
- .to_ruby(source, **options) ⇒ String
  Convert Faust source code to Ruby DSL code.
- .tokenize(source) ⇒ Array<Lexer::Token>
  Tokenize Faust source and return the tokens.
Class Method Details
.convert_file(input_path, output_path = nil, **options) ⇒ Object
Convert a Faust file to Ruby DSL. Writes the result to output_path when given; otherwise returns the generated Ruby code.

# File 'lib/faust2ruby.rb', line 72
def self.convert_file(input_path, output_path = nil, **options)
  source = File.read(input_path)
  ruby_code = to_ruby(source, **options)
  if output_path
    File.write(output_path, ruby_code)
  else
    ruby_code
  end
end
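The read-transform-write-or-return shape of convert_file can be sketched without the gem installed. The `to_ruby_stub` below is a hypothetical stand-in for Faust2Ruby.to_ruby (here it just upcases the source); only the control flow mirrors the real method.

```ruby
require "tempfile"

# Hypothetical stand-in for Faust2Ruby.to_ruby; the real gem is assumed
# to be unavailable here, so the "conversion" is just String#upcase.
def to_ruby_stub(source)
  source.upcase
end

# Mirrors convert_file's pattern: read the input, transform it, then
# either write to output_path (returning File.write's byte count) or
# return the generated code directly.
def convert_file_stub(input_path, output_path = nil)
  source = File.read(input_path)
  ruby_code = to_ruby_stub(source)
  if output_path
    File.write(output_path, ruby_code)
  else
    ruby_code
  end
end

input = Tempfile.new("dsp")
input.write("process = _;")
input.close

puts convert_file_stub(input.path)  # prints "PROCESS = _;"
```

Because the no-output-path branch returns the generated string, the method doubles as an in-memory converter, which is convenient in tests and REPL sessions.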
.parse(source) ⇒ AST::Program
Parse Faust source and return the AST.
# File 'lib/faust2ruby.rb', line 45
def self.parse(source)
  parser = Parser.new(source)
  program = parser.parse

  unless parser.errors.empty?
    raise ParseError, "Parse errors:\n#{parser.errors.join("\n")}"
  end

  program
end
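The error-handling pattern above (the parser accumulates problems in an errors array, and the caller raises one ParseError summarizing all of them) can be shown with a self-contained toy. The class names and "grammar" here are illustrative stand-ins, not the gem's real internals.

```ruby
# Toy error type standing in for Faust2Ruby::ParseError.
class ToyParseError < StandardError; end

# A parser that records errors instead of raising on the first one,
# so a single run can report every problem at once.
class ToyParser
  attr_reader :errors

  def initialize(source)
    @source = source
    @errors = []
  end

  def parse
    @errors << "empty source" if @source.strip.empty?
    @errors << "missing ';'" unless @source.include?(";")
    @source.split(";").map(&:strip).reject(&:empty?)
  end
end

# Mirrors Faust2Ruby.parse: parse first, then raise once with all
# collected errors joined into a single message.
def parse!(source)
  parser = ToyParser.new(source)
  program = parser.parse
  unless parser.errors.empty?
    raise ToyParseError, "Parse errors:\n#{parser.errors.join("\n")}"
  end
  program
end

p parse!("process = _;")  # => ["process = _"]
```

Collecting errors before raising lets users fix a whole batch of syntax problems per run instead of one at a time.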
.to_ruby(source, **options) ⇒ String
Convert Faust source code to Ruby DSL code.
# File 'lib/faust2ruby.rb', line 28
def self.to_ruby(source, **options)
  parser = Parser.new(source)
  program = parser.parse

  unless parser.errors.empty?
    raise ParseError, "Parse errors:\n#{parser.errors.join("\n")}"
  end

  generator = RubyGenerator.new(**options)
  generator.generate(program)
end
.tokenize(source) ⇒ Array<Lexer::Token>
Tokenize Faust source and return tokens.
# File 'lib/faust2ruby.rb', line 61
def self.tokenize(source)
  lexer = Lexer.new(source)
  lexer.tokenize
end
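What a `tokenize` call returns can be sketched with Ruby's standard StringScanner. The token types and lexing rules below are assumptions for illustration, not the gem's actual Lexer; the real Lexer::Token class will differ.

```ruby
require "strscan"

# Illustrative token record; the real Lexer::Token is assumed to differ.
Token = Struct.new(:type, :value)

# A rough sketch of tokenizing Faust-like source into an array of
# (type, value) tokens, skipping whitespace between them.
def tokenize_stub(source)
  scanner = StringScanner.new(source)
  tokens = []
  until scanner.eos?
    if scanner.scan(/\s+/)
      next  # ignore whitespace
    elsif (word = scanner.scan(/[A-Za-z_][A-Za-z0-9_]*/))
      tokens << Token.new(:ident, word)
    elsif (num = scanner.scan(/\d+(\.\d+)?/))
      tokens << Token.new(:number, num)
    elsif (op = scanner.scan(%r{[=;:,()*+\-/<>~]}))
      tokens << Token.new(:op, op)
    else
      raise "unexpected character: #{scanner.getch.inspect}"
    end
  end
  tokens
end

tokenize_stub("process = _ , 0.5;").each { |t| p t }
```

Exposing tokenize separately from parse, as the gem does, is handy for debugging lexer behavior on a snippet before involving the parser.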