herbertkoelman
2013-01-11 00:43:24 +01:00
20 changed files with 19819 additions and 156 deletions

View File

@@ -14,9 +14,12 @@ For disambiguating between files with common extensions, we use a [bayesian clas
In the actual GitHub app we deal with `Grit::Blob` objects. For testing, there is a simple `FileBlob` API.
```ruby
Linguist::FileBlob.new("lib/linguist.rb").language.name #=> "Ruby"
Linguist::FileBlob.new("bin/linguist").language.name #=> "Ruby"
```
See [lib/linguist/language.rb](https://github.com/github/linguist/blob/master/lib/linguist/language.rb) and [lib/linguist/languages.yml](https://github.com/github/linguist/blob/master/lib/linguist/languages.yml).
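In outline, the filename-extension stage of that detection can be sketched in plain Ruby. The `LANGUAGES` table here is a tiny hypothetical stand-in for languages.yml, not Linguist's actual code:

```ruby
# Minimal sketch of extension-based language detection. Real detection
# also consults filenames, shebangs and the bayesian classifier; the
# table below is illustrative only.
LANGUAGES = {
  ".rb" => "Ruby",
  ".py" => "Python",
  ".js" => "JavaScript",
}

def detect_language(filename)
  # File.extname returns ".rb" for "lib/linguist.rb", "" if no extension
  LANGUAGES[File.extname(filename)]
end

detect_language("lib/linguist.rb") # => "Ruby"
```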
@@ -24,7 +27,7 @@ See [lib/linguist/language.rb](https://github.com/github/linguist/blob/master/li
The actual syntax highlighting is handled by our Pygments wrapper, [pygments.rb](https://github.com/tmm1/pygments.rb). It also provides a [Lexer abstraction](https://github.com/tmm1/pygments.rb/blob/master/lib/pygments/lexer.rb) that determines which highlighter should be used on a file.
We typically run on a prerelease version of Pygments, [pygments.rb](https://github.com/tmm1/pygments.rb), to get early access to new lexers. The [languages.yml](https://github.com/github/linguist/blob/master/lib/linguist/languages.yml) file is a dump of the lexers we have available on our server.
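The fallback behaviour can be pictured with a toy helper (purely illustrative, not pygments.rb's actual API; `"Text only"` mirrors the `lexer:` field used throughout languages.yml):

```ruby
# Toy sketch: prefer a lexer matching the detected language, otherwise
# fall back to a plain-text lexer, as languages.yml does with
# "lexer: Text only".
def lexer_for(language, available)
  available.include?(language) ? language : "Text only"
end

lexer_for("Racket", ["Racket", "Scheme"])  # => "Racket"
lexer_for("Omgrofl", ["Racket", "Scheme"]) # => "Text only"
```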
### Stats
@@ -32,10 +35,11 @@ The Language Graph you see on every repository is built by aggregating the langu
The repository stats API can be used on a directory:
```ruby
project = Linguist::Repository.from_directory(".")
project.language.name #=> "Ruby"
project.languages #=> { "Ruby" => 0.98, "Shell" => 0.02 }
```
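The percentages above are each language's share of the total bytes of code; a simplified aggregation might look like this (hypothetical sizes, not Linguist's implementation):

```ruby
# Turn per-language byte counts into the fractions reported by
# Repository#languages. The sample sizes are made up for illustration.
def language_percentages(sizes)
  total = sizes.values.sum.to_f
  sizes.transform_values { |bytes| (bytes / total).round(2) }
end

language_percentages("Ruby" => 9800, "Shell" => 200)
# => { "Ruby" => 0.98, "Shell" => 0.02 }
```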
These stats are also printed out by the binary. Try running `linguist` on itself:
@@ -46,7 +50,9 @@ These stats are also printed out by the binary. Try running `linguist` on itself
Checking other code into your git repo is a common practice. But this often inflates your project's language stats and may even cause your project to be labeled as another language. We are able to identify some of these files and directories and exclude them.
```ruby
Linguist::FileBlob.new("vendor/plugins/foo.rb").vendored? # => true
```
See [Linguist::BlobHelper#vendored?](https://github.com/github/linguist/blob/master/lib/linguist/blob_helper.rb) and [lib/linguist/vendor.yml](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml).
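vendor.yml is a list of regular expressions matched against the file path. The idea can be sketched with two patterns taken from that file (the sketch itself is illustrative, not Linguist's code):

```ruby
# Sketch of the vendored? check: a path is vendored when any pattern
# matches. "vendor/" and "^debian/" are real vendor.yml entries; the
# method itself is a simplification.
VENDOR_PATTERNS = [
  %r{vendor/},
  %r{^debian/},
]

def vendored?(path)
  VENDOR_PATTERNS.any? { |re| re.match?(path) }
end

vendored?("vendor/plugins/foo.rb") # => true
```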
@@ -54,7 +60,9 @@ See [Linguist::BlobHelper#vendored?](https://github.com/github/linguist/blob/mas
Not all plain text files are true source files. Generated files like minified js and compiled CoffeeScript can be detected and excluded from language stats. As an extra bonus, these files are suppressed in Diffs.
```ruby
Linguist::FileBlob.new("underscore.min.js").generated? # => true
```
See [Linguist::BlobHelper#generated?](https://github.com/github/linguist/blob/master/lib/linguist/blob_helper.rb).
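One of the simpler heuristics is line length: minified JavaScript crams everything onto a handful of very long lines. A sketch of the idea (the 110-character threshold is an assumption for illustration, not necessarily Linguist's exact rule):

```ruby
# Guess whether JS is minified by its average line length. The
# threshold and method are illustrative; the real checks live in
# Linguist::BlobHelper#generated?.
def minified_js?(data)
  lines = data.split("\n")
  return false if lines.empty?
  (data.length / lines.length) > 110
end

minified_js?("var a=1;" * 200)           # => true
minified_js?("var a = 1;\nvar b = 2;\n") # => false
```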
@@ -80,6 +88,6 @@ Almost all bug fixes or new language additions should come with some additional
### Testing
Sometimes getting the tests running can be too much work, especially if you don't have much Ruby experience. It's okay, be lazy and let our build bot [Travis](http://travis-ci.org/#!/github/linguist) run the tests for you. Just open a pull request and the bot will start cranking away.
Here's our current build status, which is hopefully green: [![Build Status](https://secure.travis-ci.org/github/linguist.png?branch=master)](http://travis-ci.org/github/linguist)

View File

@@ -11,7 +11,7 @@ Gem::Specification.new do |s|
  s.add_dependency 'charlock_holmes', '~> 0.6.6'
  s.add_dependency 'escape_utils', '~> 0.2.3'
  s.add_dependency 'mime-types', '~> 1.19'
  s.add_dependency 'pygments.rb', '>= 0.3.0'
  s.add_dependency 'pygments.rb', '~> 0.3.7'
  s.add_development_dependency 'mocha'
  s.add_development_dependency 'json'
  s.add_development_dependency 'rake'
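The pygments.rb constraint above changes from an open-ended `>= 0.3.0` to the pessimistic `~> 0.3.7`, which admits 0.3.x patch releases but not 0.4.0. RubyGems' own `Gem::Requirement` shows the difference:

```ruby
# '~> 0.3.7' means '>= 0.3.7 and < 0.4.0'.
req = Gem::Requirement.new("~> 0.3.7")
req.satisfied_by?(Gem::Version.new("0.3.9")) # => true
req.satisfied_by?(Gem::Version.new("0.4.0")) # => false
```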

View File

@@ -40,7 +40,6 @@ ASP:
  - .ascx
  - .ashx
  - .asmx
  - .asp
  - .aspx
  - .axd
@@ -52,15 +51,12 @@ ActionScript:
  aliases:
  - as3
  primary_extension: .as
  extensions:
  - .as
Ada:
  type: programming
  color: "#02f88c"
  primary_extension: .adb
  extensions:
  - .adb
  - .ads
ApacheConf:
@@ -85,8 +81,6 @@ Arc:
  color: "#ca2afe"
  lexer: Text only
  primary_extension: .arc
  extensions:
  - .arc
Arduino:
  type: programming
@@ -102,14 +96,10 @@ Assembly:
  aliases:
  - nasm
  primary_extension: .asm
  extensions:
  - .asm
Augeas:
  type: programming
  primary_extension: .aug
  extensions:
  - .aug
AutoHotkey:
  type: programming
@@ -127,37 +117,27 @@ Batchfile:
  - bat
  primary_extension: .bat
  extensions:
  - .bat
  - .cmd
Befunge:
  primary_extension: .befunge
  extensions:
  - .befunge
BlitzMax:
  primary_extension: .bmx
  extensions:
  - .bmx
Boo:
  type: programming
  color: "#d4bec1"
  primary_extension: .boo
  extensions:
  - .boo
Brainfuck:
  primary_extension: .b
  extensions:
  - .b
  - .bf
Bro:
  type: programming
  primary_extension: .bro
  extensions:
  - .bro
C:
  type: programming
@@ -174,8 +154,6 @@ C#:
  aliases:
  - csharp
  primary_extension: .cs
  extensions:
  - .cs
C++:
  type: programming
@@ -199,8 +177,6 @@ C-ObjDump:
  type: data
  lexer: c-objdump
  primary_extension: .c-objdump
  extensions:
  - .c-objdump
C2hs Haskell:
  type: programming
@@ -209,13 +185,10 @@ C2hs Haskell:
  aliases:
  - c2hs
  primary_extension: .chs
  extensions:
  - .chs
CMake:
  primary_extension: .cmake
  extensions:
  - .cmake
  - .cmake.in
  filenames:
  - CMakeLists.txt
@@ -223,8 +196,6 @@ CMake:
CSS:
  ace_mode: css
  primary_extension: .css
  extensions:
  - .css
Ceylon:
  type: programming
@@ -234,8 +205,6 @@ Ceylon:
ChucK:
  lexer: Java
  primary_extension: .ck
  extensions:
  - .ck
Clojure:
  type: programming
@@ -243,7 +212,6 @@ Clojure:
  color: "#db5855"
  primary_extension: .clj
  extensions:
  - .clj
  - .cljs
CoffeeScript:
@@ -270,7 +238,6 @@ ColdFusion:
  primary_extension: .cfm
  extensions:
  - .cfc
  - .cfm
Common Lisp:
  type: programming
@@ -279,7 +246,6 @@ Common Lisp:
  - lisp
  primary_extension: .lisp
  extensions:
  - .lisp
  - .lsp
  - .ny
@@ -292,15 +258,12 @@ Cpp-ObjDump:
  lexer: cpp-objdump
  primary_extension: .cppobjdump
  extensions:
  - .cppobjdump
  - .c++objdump
  - .cxx-objdump
Cucumber:
  lexer: Gherkin
  primary_extension: .feature
  extensions:
  - .feature
Cython:
  type: programming
@@ -309,22 +272,18 @@ Cython:
  extensions:
  - .pxd
  - .pxi
  - .pyx
D:
  type: programming
  color: "#fcd46d"
  primary_extension: .d
  extensions:
  - .d
  - .di
D-ObjDump:
  type: data
  lexer: d-objdump
  primary_extension: .d-objdump
  extensions:
  - .d-objdump
Darcs Patch:
  search_term: dpatch
@@ -332,7 +291,6 @@ Darcs Patch:
  - dpatch
  primary_extension: .darcspatch
  extensions:
  - .darcspatch
  - .dpatch
Dart:
@@ -345,7 +303,6 @@ Delphi:
  primary_extension: .pas
  extensions:
  - .lpr
  - .pas
DCPU-16 ASM:
  type: programming
@@ -353,29 +310,22 @@ DCPU-16 ASM:
  primary_extension: .dasm16
  extensions:
  - .dasm
  - .dasm16
  aliases:
  - dasm16
Diff:
  primary_extension: .diff
  extensions:
  - .diff
Dylan:
  type: programming
  color: "#3ebc27"
  primary_extension: .dylan
  extensions:
  - .dylan
Ecere Projects:
  type: data
  group: JavaScript
  lexer: JSON
  primary_extension: .epj
  extensions:
  - .epj
Ecl:
  type: programming
@@ -390,15 +340,12 @@ Eiffel:
  lexer: Text only
  color: "#946d57"
  primary_extension: .e
  extensions:
  - .e
Elixir:
  type: programming
  color: "#6e4a7e"
  primary_extension: .ex
  extensions:
  - .ex
  - .exs
Elm:
@@ -423,7 +370,6 @@ Erlang:
  color: "#949e0e"
  primary_extension: .erl
  extensions:
  - .erl
  - .hrl
F#:
@@ -433,7 +379,6 @@ F#:
  search_term: ocaml
  primary_extension: .fs
  extensions:
  - .fs
  - .fsi
  - .fsx
@@ -455,7 +400,6 @@ FORTRAN:
  - .f03
  - .f08
  - .f77
  - .f90
  - .f95
  - .for
  - .fpp
@@ -464,8 +408,6 @@ Factor:
  type: programming
  color: "#636746"
  primary_extension: .factor
  extensions:
  - .factor
Fancy:
  type: programming
@@ -473,7 +415,6 @@ Fancy:
  primary_extension: .fy
  extensions:
  - .fancypack
  - .fy
  filenames:
  - Fakefile
@@ -481,8 +422,15 @@ Fantom:
  type: programming
  color: "#dbded5"
  primary_extension: .fan
Forth:
  type: programming
  primary_extension: .fth
  color: "#341708"
  lexer: Text only
  extensions:
  - .fan
  - .forth
  - .fth
GAS:
  type: programming
@@ -493,22 +441,16 @@ GAS:
Genshi:
  primary_extension: .kid
  extensions:
  - .kid
Gentoo Ebuild:
  group: Shell
  lexer: Bash
  primary_extension: .ebuild
  extensions:
  - .ebuild
Gentoo Eclass:
  group: Shell
  lexer: Bash
  primary_extension: .eclass
  extensions:
  - .eclass
Gettext Catalog:
  search_term: pot
@@ -517,15 +459,12 @@ Gettext Catalog:
  - pot
  primary_extension: .po
  extensions:
  - .po
  - .pot
Go:
  type: programming
  color: "#8d04eb"
  primary_extension: .go
  extensions:
  - .go
Gosu:
  type: programming
@@ -542,7 +481,6 @@ Groff:
  - '.5'
  - '.6'
  - '.7'
  - .man
Groovy:
  type: programming
@@ -565,7 +503,6 @@ HTML:
  primary_extension: .html
  extensions:
  - .htm
  - .html
  - .xhtml
HTML+Django:
@@ -584,15 +521,12 @@ HTML+ERB:
  - erb
  primary_extension: .erb
  extensions:
  - .erb
  - .html.erb
HTML+PHP:
  type: markup
  group: HTML
  primary_extension: .phtml
  extensions:
  - .phtml
HTTP:
  type: data
@@ -613,7 +547,6 @@ Haskell:
  color: "#29b544"
  primary_extension: .hs
  extensions:
  - .hs
  - .hsc
Haxe:
@@ -647,8 +580,6 @@ Io:
  type: programming
  color: "#a9188d"
  primary_extension: .io
  extensions:
  - .io
Ioke:
  type: programming
@@ -677,8 +608,6 @@ Java Server Pages:
  aliases:
  - jsp
  primary_extension: .jsp
  extensions:
  - .jsp
JavaScript:
  type: programming
@@ -715,15 +644,31 @@ Kotlin:
LLVM:
  primary_extension: .ll
Lasso:
  type: programming
  lexer: Lasso
  ace_mode: lasso
  color: "#2584c3"
  primary_extension: .lasso
  extensions:
  - .ll
  - .inc
  - .las
  - .lasso9
  - .ldml
Less:
  type: markup
  group: CSS
  lexer: CSS
  ace_mode: less
  primary_extension: .less
LilyPond:
  lexer: Text only
  primary_extension: .ly
  extensions:
  - .ily
  - .ly
Literate Haskell:
  type: programming
@@ -732,8 +677,18 @@ Literate Haskell:
  aliases:
  - lhs
  primary_extension: .lhs
LiveScript:
  type: programming
  ace_mode: ls
  color: "#499886"
  aliases:
  - ls
  primary_extension: .ls
  extensions:
  - .lhs
  - ._ls
  filenames:
  - Slakefile
Logtalk:
  type: programming
@@ -745,8 +700,8 @@ Lua:
  color: "#fa1fa1"
  primary_extension: .lua
  extensions:
  - .lua
  - .nse
  - .pd_lua
Makefile:
  aliases:
@@ -763,7 +718,6 @@ Makefile:
Mako:
  primary_extension: .mako
  extensions:
  - .mako
  - .mao
Markdown:
@@ -782,8 +736,6 @@ Matlab:
  type: programming
  color: "#bb92ac"
  primary_extension: .matlab
  extensions:
  - .matlab
Max:
  type: programming
@@ -813,13 +765,13 @@ Mirah:
Moocode:
  lexer: MOOCode
  primary_extension: .moo
  extensions:
  - .moo
MoonScript:
  type: programming
  primary_extension: .moon
Myghty:
  primary_extension: .myt
  extensions:
  - .myt
Nemerle:
  type: programming
@@ -852,7 +804,6 @@ NumPy:
  group: Python
  primary_extension: .numpy
  extensions:
  - .numpy
  - .numpyw
  - .numsc
@@ -870,8 +821,6 @@ ObjDump:
  type: data
  lexer: objdump
  primary_extension: .objdump
  extensions:
  - .objdump
Objective-C:
  type: programming
@@ -890,9 +839,16 @@ Objective-J:
  - obj-j
  primary_extension: .j
  extensions:
  - .j
  - .sj
Omgrofl:
  type: programming
  primary_extension: .omgrofl
  color: "#cabbff"
  lexer: Text only
  extensions:
  - .omgrofl
Opa:
  type: programming
  primary_extension: .opa
@@ -975,7 +931,6 @@ Prolog:
  primary_extension: .prolog
  extensions:
  - .pro
  - .prolog
Puppet:
  type: programming
@@ -991,8 +946,6 @@ Pure Data:
  color: "#91de79"
  lexer: Text only
  primary_extension: .pd
  extensions:
  - .pd
Python:
  type: programming
@@ -1012,27 +965,21 @@ Python traceback:
  lexer: Python Traceback
  searchable: false
  primary_extension: .pytb
  extensions:
  - .pytb
R:
  type: programming
  color: "#198ce7"
  lexer: S
  primary_extension: .r
  extensions:
  - .r
RHTML:
  type: markup
  group: HTML
  primary_extension: .rhtml
  extensions:
  - .rhtml
Racket:
  type: programming
  lexer: Scheme
  lexer: Racket
  color: "#ae17ff"
  primary_extension: .rkt
  extensions:
@@ -1044,8 +991,6 @@ Raw token data:
  aliases:
  - raw
  primary_extension: .raw
  extensions:
  - .raw
Rebol:
  type: programming
@@ -1055,12 +1000,9 @@ Rebol:
  extensions:
  - .r2
  - .r3
  - .rebol
Redcode:
  primary_extension: .cw
  extensions:
  - .cw
Ruby:
  type: programming
@@ -1109,16 +1051,12 @@ SQL:
  ace_mode: sql
  searchable: false
  primary_extension: .sql
  extensions:
  - .sql
Sage:
  type: programming
  lexer: Python
  group: Python
  primary_extension: .sage
  extensions:
  - .sage
Sass:
  type: markup
@@ -1136,7 +1074,6 @@ Scheme:
  color: "#1e4aec"
  primary_extension: .scm
  extensions:
  - .scm
  - .sls
  - .ss
@@ -1149,8 +1086,6 @@ Self:
  color: "#0579aa"
  lexer: Text only
  primary_extension: .self
  extensions:
  - .self
Shell:
  type: programming
@@ -1167,13 +1102,9 @@ Smalltalk:
  type: programming
  color: "#596706"
  primary_extension: .st
  extensions:
  - .st
Smarty:
  primary_extension: .tpl
  extensions:
  - .tpl
Standard ML:
  type: programming
@@ -1192,8 +1123,6 @@ Tcl:
  type: programming
  color: "#e4cc98"
  primary_extension: .tcl
  extensions:
  - .tcl
Tcsh:
  type: programming
@@ -1201,7 +1130,6 @@ Tcsh:
  primary_extension: .tcsh
  extensions:
  - .csh
  - .tcsh
TeX:
  type: markup
@@ -1215,7 +1143,6 @@ TeX:
  - .ins
  - .ltx
  - .sty
  - .tex
  - .toc
Tea:
@@ -1228,8 +1155,6 @@ Textile:
  ace_mode: textile
  wrap: true
  primary_extension: .textile
  extensions:
  - .textile
Turing:
  type: programming
@@ -1244,23 +1169,18 @@ Twig:
  group: PHP
  lexer: HTML+Django/Jinja
  primary_extension: .twig
  extensions:
  - .twig
VHDL:
  type: programming
  lexer: vhdl
  color: "#543978"
  primary_extension: .vhdl
  extensions:
  - .vhdl
Vala:
  type: programming
  color: "#ee7d06"
  primary_extension: .vala
  extensions:
  - .vala
  - .vapi
Verilog:
@@ -1276,8 +1196,6 @@ VimL:
  aliases:
  - vim
  primary_extension: .vim
  extensions:
  - .vim
  filenames:
  - vimrc
  - gvimrc
@@ -1290,7 +1208,6 @@ Visual Basic:
  extensions:
  - .bas
  - .frx
  - .vb
  - .vba
  - .vbs
@@ -1304,13 +1221,17 @@ XML:
  - wsdl
  primary_extension: .xml
  extensions:
  - .ccxml
  - .glade
  - .grxml
  - .kml
  - .mxml
  - .plist
  - .rdf
  - .rss
  - .scxml
  - .svg
  - .vxml
  - .wsdl
  - .wxi
  - .wxl
@@ -1331,14 +1252,11 @@ XQuery:
  primary_extension: .xquery
  extensions:
  - .xq
  - .xquery
  - .xqy
XS:
  lexer: C
  primary_extension: .xs
  extensions:
  - .xs
XSLT:
  type: markup
@@ -1352,29 +1270,23 @@ YAML:
  primary_extension: .yml
  extensions:
  - .yaml
  - .yml
eC:
  type: programming
  search_term: ec
  primary_extension: .ec
  extensions:
  - .ec
  - .eh
mupad:
  lexer: MuPAD
  primary_extension: .mu
  extensions:
  - .mu
ooc:
  type: programming
  lexer: Ooc
  color: "#b0b77e"
  primary_extension: .ooc
  extensions:
  - .ooc
reStructuredText:
  type: markup
@@ -1384,5 +1296,4 @@ reStructuredText:
  - rst
  primary_extension: .rst
  extensions:
  - .rst
  - .rest

View File

@@ -27,6 +27,8 @@
# Vendored depedencies
- vendor/
# Debian packaging
- ^debian/
## Commonly Bundled JavaScript frameworks ##

View File

@@ -0,0 +1,5 @@
: HELLO ( -- )
." Hello Forth (forth)!" ;
HELLO

View File

@@ -0,0 +1,5 @@
: HELLO ( -- )
." Hello Forth (fth)!" ;
HELLO

samples/Lasso/database.inc: new file, 1351 lines (diff suppressed because it is too large)

samples/Lasso/json.lasso: new file, 301 lines
View File

@@ -0,0 +1,301 @@
<?LassoScript
//
// JSON Encoding and Decoding
//
// Copyright 2007-2012 LassoSoft Inc.
//
// <http://json.org/>
// <http://json-rpc.org/>
// <http://www.ietf.org/rfc/rfc4627.txt?number=4627>
// This tag is now incorporated in Lasso 8.6.0.1
//
If: (Lasso_TagExists: 'Encode_JSON') == False;
Define_Tag: 'JSON', -Namespace='Encode_', -Required='value', -Optional='options';
Local: 'escapes' = Map('\\' = '\\', '"' = '"', '\r' = 'r', '\n' = 'n', '\t' = 't', '\f' = 'f', '\b' = 'b');
Local: 'output' = '';
Local: 'newoptions' = (Array: -Internal);
If: !(Local_Defined: 'options') || (#options->(IsA: 'array') == False);
Local: 'options' = (Array);
/If;
If: (#options >> -UseNative) || (Params >> -UseNative);
#newoptions->(Insert: -UseNative);
/If;
If: (#options >> -NoNative) || (Params >> -NoNative);
#newoptions->(Insert: -NoNative);
/If;
If: (#options !>> -UseNative) && ((#value->(IsA: 'set')) || (#value->(IsA: 'list')) || (#value->(IsA: 'queue')) || (#value->(IsA: 'priorityqueue')) || (#value->(IsA: 'stack')));
#output += (Encode_JSON: Array->(insertfrom: #value->iterator) &, -Options=#newoptions);
Else: (#options !>> -UseNative) && (#value->(IsA: 'pair'));
#output += (Encode_JSON: (Array: #value->First, #value->Second));
Else: (#options !>> -Internal) && (#value->(Isa: 'array') == False) && (#value->(IsA: 'map') == False);
#output += '[' + (Encode_JSON: #value, -Options=#newoptions) + ']';
Else: (#value->(IsA: 'literal'));
#output += #value;
Else: (#value->(IsA: 'string'));
#output += '"';
Loop: (#value->Length);
Local('character' = #value->(Get: Loop_Count));
#output->(Append:
(Match_RegExp('[\\x{0020}-\\x{21}\\x{23}-\\x{5b}\\x{5d}-\\x{10fff}]') == #character) ? #character |
'\\' + (#escapes->(Contains: #character) ? #escapes->(Find: #character) | 'u' + String(Encode_Hex(#character))->PadLeading(4, '0')&)
);
/Loop;
#output += '"';
Else: (#value->(IsA: 'integer')) || (#value->(IsA: 'decimal')) || (#value->(IsA: 'boolean'));
#output += (String: #value);
Else: (#value->(IsA: 'null'));
#output += 'null';
Else: (#value->(IsA: 'date'));
If: #value->gmt;
#output += '"' + #value->(format: '%QT%TZ') + '"';
Else;
#output += '"' + #value->(format: '%QT%T') + '"';
/If;
Else: (#value->(IsA: 'array'));
#output += '[';
Iterate: #value, (Local: 'temp');
#output += (Encode_JSON: #temp, -Options=#newoptions);
If: #value->Size != Loop_Count;
#output += ', ';
/If;
/Iterate;
#output += ']';
Else: (#value->(IsA: 'object'));
#output += '{';
Iterate: #value, (Local: 'temp');
#output += #temp->First + ': ' + (Encode_JSON: #temp->Second, -Options=#newoptions);
If: (#value->Size != Loop_Count);
#output += ', ';
/If;
/Iterate;
#output += '}';
Else: (#value->(IsA: 'map'));
#output += '{';
Iterate: #value, (Local: 'temp');
#output += (Encode_JSON: #temp->First, -Options=#newoptions) + ': ' + (Encode_JSON: #temp->Second, -Options=#newoptions);
If: (#value->Size != Loop_Count);
#output += ', ';
/If;
/Iterate;
#output += '}';
Else: (#value->(IsA: 'client_ip')) || (#value->(IsA: 'client_address'));
#output += (Encode_JSON: (String: #value), -Options=#newoptions);
Else: (#options !>> -UseNative) && (#value->(IsA: 'set')) || (#value->(IsA: 'list')) || (#value->(IsA: 'queue')) || (#value->(IsA: 'priorityqueue')) || (#value->(IsA: 'stack'));
#output += (Encode_JSON: Array->(insertfrom: #value->iterator) &, -Options=#newoptions);
Else: (#options !>> -NoNative);
#output += (Encode_JSON: (Map: '__jsonclass__'=(Array:'deserialize',(Array:'<LassoNativeType>' + #value->Serialize + '</LassoNativeType>'))));
/If;
Return: @#output;
/Define_Tag;
/If;
If: (Lasso_TagExists: 'Decode_JSON') == False;
Define_Tag: 'JSON', -Namespace='Decode_', -Required='value';
(#value == '') ? Return: Null;
Define_Tag: 'consume_string', -Required='ibytes';
Local: 'unescapes' = (map: 34 = '"', 92 = '\\', 98 = '\b', 102 = '\f', 110 = '\n', 114 = '\r', 116 = '\t');
Local: 'temp' = 0, 'obytes' = Bytes;
While: ((#temp := #ibytes->export8bits) != 34); // '"'
If: (#temp === 92); // '\'
#temp = #ibytes->export8bits;
If: (#temp === 117); // 'u'
#obytes->(ImportString: (Decode_Hex: (String: #ibytes->(GetRange: #ibytes->Position + 1, 4)))->(ExportString: 'UTF-16'), 'UTF-8');
#ibytes->(SetPosition: #ibytes->Position + 4);
Else;
If: (#unescapes->(Contains: #temp));
#obytes->(ImportString: #unescapes->(Find: #temp), 'UTF-8');
Else;
#obytes->(Import8Bits: #temp);
/If;
/If;
Else;
#obytes->(Import8Bits: #temp);
/If;
/While;
Local('output' = #obytes->(ExportString: 'UTF-8'));
If: #output->(BeginsWith: '<LassoNativeType>') && #output->(EndsWith: '</LassoNativeType>');
Local: 'temp' = #output - '<LassoNativeType>' - '</LassoNativeType>';
Local: 'output' = null;
Protect;
#output->(Deserialize: #temp);
/Protect;
Else: (Valid_Date: #output, -Format='%QT%TZ');
Local: 'output' = (Date: #output, -Format='%QT%TZ');
Else: (Valid_Date: #output, -Format='%QT%T');
Local: 'output' = (Date: #output, -Format='%QT%T');
/If;
Return: @#output;
/Define_Tag;
Define_Tag: 'consume_token', -Required='ibytes', -required='temp';
Local: 'obytes' = bytes->(import8bits: #temp) &;
local: 'delimit' = (array: 9, 10, 13, 32, 44, 58, 93, 125); // \t\r\n ,:]}
While: (#delimit !>> (#temp := #ibytes->export8bits));
#obytes->(import8bits: #temp);
/While;
Local: 'output' = (String: #obytes);
If: (#output == 'true') || (#output == 'false');
Return: (Boolean: #output);
Else: (#output == 'null');
Return: Null;
Else: (String_IsNumeric: #output);
Return: (#output >> '.') ? (Decimal: #output) | (Integer: #output);
/If;
Return: @#output;
/Define_Tag;
Define_Tag: 'consume_array', -Required='ibytes';
Local: 'output' = array;
local: 'delimit' = (array: 9, 10, 13, 32, 44); // \t\r\n ,
local: 'temp' = 0;
While: ((#temp := #ibytes->export8bits) != 93); // ]
If: (#delimit >> #temp);
// Discard whitespace
Else: (#temp == 34); // "
#output->(insert: (consume_string: @#ibytes));
Else: (#temp == 91); // [
#output->(insert: (consume_array: @#ibytes));
Else: (#temp == 123); // {
#output->(insert: (consume_object: @#ibytes));
Else;
#output->(insert: (consume_token: @#ibytes, @#temp));
(#temp == 93) ? Loop_Abort;
/If;
/While;
Return: @#output;
/Define_Tag;
Define_Tag: 'consume_object', -Required='ibytes';
Local: 'output' = map;
local: 'delimit' = (array: 9, 10, 13, 32, 44); // \t\r\n ,
local: 'temp' = 0;
local: 'key' = null;
local: 'val' = null;
While: ((#temp := #ibytes->export8bits) != 125); // }
If: (#delimit >> #temp);
// Discard whitespace
Else: (#key !== null) && (#temp == 34); // "
#output->(insert: #key = (consume_string: @#ibytes));
#key = null;
Else: (#key !== null) && (#temp == 91); // [
#output->(insert: #key = (consume_array: @#ibytes));
#key = null;
Else: (#key !== null) && (#temp == 123); // {
#output->(insert: #key = (consume_object: @#ibytes));
#key = null;
Else: (#key !== null);
#output->(insert: #key = (consume_token: @#ibytes, @#temp));
(#temp == 125) ? Loop_abort;
#key = null;
Else;
#key = (consume_string: @#ibytes);
while(#delimit >> (#temp := #ibytes->export8bits));
/while;
#temp != 58 ? Loop_Abort;
/If;
/While;
If: (#output >> '__jsonclass__') && (#output->(Find: '__jsonclass__')->(isa: 'array')) && (#output->(Find: '__jsonclass__')->size >= 2) && (#output->(Find: '__jsonclass__')->First == 'deserialize');
Return: #output->(find: '__jsonclass__')->Second->First;
Else: (#output >> 'native') && (#output >> 'comment') && (#output->(find: 'comment') == 'http://www.lassosoft.com/json');
Return: #output->(find: 'native');
/If;
Return: @#output;
/Define_Tag;
Local: 'ibytes' = (bytes: #value);
Local: 'start' = 1;
#ibytes->removeLeading(BOM_UTF8);
Local: 'temp' = #ibytes->export8bits;
If: (#temp == 91); // [
Local: 'output' = (consume_array: @#ibytes);
Return: @#output;
Else: (#temp == 123); // {
Local: 'output' = (consume_object: @#ibytes);
Return: @#output;
/If;
/Define_Tag;
/If;
If: (Lasso_TagExists: 'Literal') == False;
Define_Type: 'Literal', 'String';
/Define_Type;
/If;
If: (Lasso_TagExists: 'Object') == False;
Define_Type: 'Object', 'Map';
/Define_Type;
/If;
If: (Lasso_TagExists: 'JSON_RPCCall') == False;
Define_Tag: 'RPCCall', -Namespace='JSON_',
-Required='method',
-Optional='params',
-Optional='id',
-Optional='host';
!(Local_Defined: 'host') ? Local: 'host' = 'http://localhost/lassoapps.8/rpc/rpc.lasso';
!(Local_Defined: 'id') ? Local: 'id' = Lasso_UniqueID;
Local: 'request' = (Map: 'method' = #method, 'params' = #params, 'id' = #id);
Local: 'request' = (Encode_JSON: #request);
Local: 'result' = (Include_URL: #host, -PostParams=#request);
Local: 'result' = (Decode_JSON: #result);
Return: @#result;
/Define_Tag;
/If;
If: (Lasso_TagExists: 'JSON_Records') == False;
Define_Tag: 'JSON_Records',
-Optional='KeyField',
-Optional='ReturnField',
-Optional='ExcludeField',
-Optional='Fields';
Local: '_fields' = (Local_Defined: 'fields') && #fields->(IsA: 'array') ? #fields | Field_Names;
Fail_If: #_fields->size == 0, -1, 'No fields found for [JSON_Records]';
Local: '_keyfield' = (Local: 'keyfield');
If: #_fields !>> #_keyfield;
Local: '_keyfield' = (KeyField_Name);
If: #_fields !>> #_keyfield;
Local: '_keyfield' = 'ID';
If: #_fields !>> #_keyfield;
Local: '_keyfield' = #_fields->First;
/If;
/If;
/If;
Local: '_index' = #_fields->(FindPosition: #_keyfield)->First;
Local: '_return' = (Local_Defined: 'returnfield') ? (Params->(Find: -ReturnField)->(ForEach: {Params->First = Params->First->Second; Return: True}) &) | @#_fields;
Local: '_exclude' = (Local_Defined: 'excludefield') ? (Params->(Find: -ExcludeField)->(ForEach: {Params->First = Params->First->Second; Return: True}) &) | Array;
Local: '_records' = Array;
Iterate: Records_Array, (Local: '_record');
Local: '_temp' = Map;
Iterate: #_fields, (Local: '_field');
((#_return >> #_field) && (#_exclude !>> #_field)) ? #_temp->Insert(#_field = #_record->(Get: Loop_Count));
/Iterate;
#_records->Insert(#_temp);
/Iterate;
Local: '_output' = (Encode_JSON: (Object: 'error_msg'=Error_Msg, 'error_code'=Error_Code, 'found_count'=Found_Count, 'keyfield'=#_keyfield, 'rows'=#_records));
Return: @#_output;
/Define_Tag;
/If;
?>

samples/Lasso/json.lasso9 Normal file

@@ -0,0 +1,213 @@
/**
trait_json_serialize
Objects with this trait will be assumed to convert to json data
when its ->asString method is called
*/
define trait_json_serialize => trait {
require asString()
}
define json_serialize(e::bytes)::string => ('"' + (string(#e)->Replace(`\`, `\\`) & Replace('\"', '\\"') & Replace('\r', '\\r') & Replace('\n', '\\n') & Replace('\t', '\\t') & Replace('\f', '\\f') & Replace('\b', '\\b') &) + '"')
define json_serialize(e::string)::string => ('"' + (string(#e)->Replace(`\`, `\\`) & Replace('\"', '\\"') & Replace('\r', '\\r') & Replace('\n', '\\n') & Replace('\t', '\\t') & Replace('\f', '\\f') & Replace('\b', '\\b') &) + '"')
define json_serialize(e::json_literal)::string => (#e->asstring)
define json_serialize(e::integer)::string => (#e->asstring)
define json_serialize(e::decimal)::string => (#e->asstring)
define json_serialize(e::boolean)::string => (#e->asstring)
define json_serialize(e::null)::string => ('null')
define json_serialize(e::date)::string => ('"' + #e->format(#e->gmt ? '%QT%TZ' | '%Q%T') + '"')
/*
define json_serialize(e::array)::string => {
local(output) = '';
local(delimit) = '';
#e->foreach => { #output += #delimit + json_serialize(#1); #delimit = ', '; }
return('[' + #output + ']');
}
define json_serialize(e::staticarray)::string => {
local(output) = '';
local(delimit) = '';
#e->foreach => { #output += #delimit + json_serialize(#1); #delimit = ', '; }
return('[' + #output + ']');
}
*/
define json_serialize(e::trait_forEach)::string => {
local(output) = '';
local(delimit) = '';
#e->foreach => { #output += #delimit + json_serialize(#1); #delimit = ', '; }
return('[' + #output + ']');
}
define json_serialize(e::map)::string => {
local(output = with pr in #e->eachPair
select json_serialize(#pr->first->asString) + ': ' + json_serialize(#pr->second))
return '{' + #output->join(',') + '}'
}
define json_serialize(e::json_object)::string => {
local(output) = '';
local(delimit) = '';
#e->foreachpair => { #output += #delimit + #1->first + ': ' + json_serialize(#1->second); #delimit = ', '; }
return('{' + #output + '}');
}
define json_serialize(e::trait_json_serialize) => #e->asString
define json_serialize(e::any)::string => json_serialize('<LassoNativeType>' + #e->serialize + '</LassoNativeType>')
// Bil Corry fixes for decoding json
define json_consume_string(ibytes::bytes) => {
local(obytes) = bytes;
local(temp) = 0;
while((#temp := #ibytes->export8bits) != 34);
#obytes->import8bits(#temp);
(#temp == 92) ? #obytes->import8bits(#ibytes->export8bits); // Escape \
/while;
local(output = string(#obytes)->unescape)
//Replace('\\"', '\"') & Replace('\\r', '\r') & Replace('\\n', '\n') & Replace('\\t', '\t') & Replace('\\f', '\f') & Replace('\\b', '\b') &;
if(#output->BeginsWith('<LassoNativeType>') && #output->EndsWith('</LassoNativeType>'));
Protect;
return serialization_reader(xml(#output - '<LassoNativeType>' - '</LassoNativeType>'))->read
/Protect;
else( (#output->size == 16 or #output->size == 15) and regexp(`\d{8}T\d{6}Z?`, '', #output)->matches)
return date(#output, -Format=#output->size == 16?`yyyyMMdd'T'HHmmssZ`|`yyyyMMdd'T'HHmmss`)
/if
return #output
}
// Bil Corry fix + Ke fix
define json_consume_token(ibytes::bytes, temp::integer) => {
local(obytes = bytes->import8bits(#temp) &,
delimit = array(9, 10, 13, 32, 44, 58, 93, 125)) // \t\r\n ,:]}
while(#delimit !>> (#temp := #ibytes->export8bits))
#obytes->import8bits(#temp)
/while
#temp == 125? // }
#ibytes->marker -= 1
//============================================================================
// Is also end of token if end of array[]
#temp == 93? // ]
#ibytes->marker -= 1
//............................................................................
local(output = string(#obytes))
#output == 'true'?
return true
#output == 'false'?
return false
#output == 'null'?
return null
string_IsNumeric(#output)?
return (#output >> '.')? decimal(#output) | integer(#output)
return #output
}
// Bil Corry fix
define json_consume_array(ibytes::bytes)::array => {
Local(output) = array;
local(delimit) = array( 9, 10, 13, 32, 44); // \t\r\n ,
local(temp) = 0;
While((#temp := #ibytes->export8bits) != 93); // ]
If(#delimit >> #temp);
// Discard whitespace
Else(#temp == 34); // "
#output->insert(json_consume_string(#ibytes));
Else(#temp == 91); // [
#output->insert(json_consume_array(#ibytes));
Else(#temp == 123); // {
#output->insert(json_consume_object(#ibytes));
Else;
#output->insert(json_consume_token(#ibytes, #temp));
(#temp == 93) ? Loop_Abort;
/If;
/While;
Return(#output);
}
// Bil Corry fix
define json_consume_object(ibytes::bytes)::map => {
Local('output' = map,
'delimit' = array( 9, 10, 13, 32, 44), // \t\r\n ,
'temp' = 0,
'key' = null,
'val' = null);
While((#temp := #ibytes->export8bits) != 125); // }
If(#delimit >> #temp);
// Discard whitespace
Else((#key !== null) && (#temp == 34)); // "
#output->insert(#key = json_consume_string(#ibytes));
#key = null;
Else((#key !== null) && (#temp == 91)); // [
#output->insert(#key = json_consume_array(#ibytes));
#key = null;
Else((#key !== null) && (#temp == 123)); // {
#output->insert(#key = json_consume_object(#ibytes));
#key = null;
Else((#key !== null));
#output->insert(#key = json_consume_token(#ibytes, #temp));
#key = null;
Else;
#key = json_consume_string(#ibytes);
while(#delimit >> (#temp := #ibytes->export8bits));
/while;
#temp != 58 ? Loop_Abort;
/If;
/While;
If((#output >> '__jsonclass__') && (#output->Find('__jsonclass__')->isa('array')) && (#output->Find('__jsonclass__')->size >= 2) && (#output->Find('__jsonclass__')->First == 'deserialize'));
Return(#output->find('__jsonclass__')->Second->First);
Else((#output >> 'native') && (#output >> 'comment') && (#output->find('comment') == 'http://www.lassosoft.com/json'));
Return(#output->find('native'));
/If;
Return(#output);
}
// Bil Corry fix + Ke fix
define json_deserialize(ibytes::bytes)::any => {
#ibytes->removeLeading(bom_utf8);
//============================================================================
// Reset marker on provided bytes
#ibytes->marker = 0
//............................................................................
Local(temp) = #ibytes->export8bits;
If(#temp == 91); // [
Return(json_consume_array(#ibytes));
Else(#temp == 123); // {
Return(json_consume_object(#ibytes));
else(#temp == 34) // "
return json_consume_string(#ibytes)
/If;
}
define json_deserialize(s::string) => json_deserialize(bytes(#s))
/**! json_literal - This is a subclass of String used for JSON encoding.
A json_literal works exactly like a string, but will be inserted directly
rather than being encoded into JSON. This allows JavaScript elements
like functions to be inserted into JSON objects. This is most useful
when the JSON object will be used within a JavaScript on the local page.
[Map: 'fn'=Literal('function(){ ... }')] => {'fn': function(){ ... }}
**/
define json_literal => type {
parent string
}
/**! json_object - This is a subclass of Map used for JSON encoding.
An object works exactly like a map, but when it is encoded into JSON all
of the keys will be inserted literally. This makes it easy to create a
JavaScript object without extraneous quote marks.
Object('name'='value') => {name: "value"}
**/
define json_object => type {
parent map
public onCreate(...) => ..onCreate(:#rest or (:))
}
define json_rpccall(method::string, params=map, id='', host='') => {
#id == '' ? #id = Lasso_UniqueID;
#host == '' ? #host = 'http://localhost/lassoapps.8/rpc/rpc.lasso';
Return(Decode_JSON(Include_URL(#host, -PostParams=Encode_JSON(Map('method' = #method, 'params' = #params, 'id' = #id)))));
}

samples/Lasso/knop.las Normal file
File diff suppressed because it is too large

samples/Lasso/knop.ldml Normal file
File diff suppressed because it is too large

samples/Less/screen.less Normal file

@@ -0,0 +1,14 @@
@blue: #3bbfce;
@margin: 16px;
.content-navigation {
border-color: @blue;
color:
darken(@blue, 9%);
}
.border {
padding: @margin / 2;
margin: @margin / 2;
border-color: @blue;
}


@@ -0,0 +1,35 @@
a = -> 1
const b = --> 2
var c = ~> 3
d = ~~> 10_000_000km * 500ms
e = (a) -> (b) ~> (c) --> (d, e) ~~> 5
dashes-identifiers = ->
a - a
b -- c
1-1 1- -1
a- a
a -a
underscores_i$d = ->
/regexp1/ and //regexp2//g
'strings' and "strings" and \strings
([2 til 10] or [1 to 50])
|> map (* 2)
|> filter (> 5)
|> fold (+)
class Class extends Anc-est-or
(args) ->
copy = (from, to, callback) -->
error, data <- read file
return callback error if error?
error <~ write file, data
return callback error if error?
callback()
->
~>
~~>
-->
# Comment
/* Comment */


@@ -0,0 +1,20 @@
-- A simple counting object that increments an internal counter whenever it receives a bang at its first inlet, or changes to whatever number it receives at its second inlet.
local HelloCounter = pd.Class:new():register("h-counter")
function HelloCounter:initialize(sel, atoms)
self.inlets = 2
self.outlets = 1
self.num = 0
return true
end
function HelloCounter:in_1_bang()
self:outlet(1, "float", {self.num})
self.num = self.num + 1
end
function HelloCounter:in_2_float(f)
self.num = f
end


@@ -0,0 +1,43 @@
local FileListParser = pd.Class:new():register("vidya-file-list-parser")
function FileListParser:initialize(sel, atoms)
-- 1. Base filename
-- 2. File extension
-- 3. Number of files in batch
self.inlets = 3
-- 1. To [list trim]-[binfile]
-- 2. To [vidya-file-modder]'s filename variables
-- 3. Sends a bang to [vidya-file-modder], triggering the object's mechanisms
self.outlets = 3
-- File extension
self.extension = "jpg"
-- Number of the last file in the batch
self.batchlimit = 0
return true
end
function FileListParser:in_1_symbol(s)
for i = 0, self.batchlimit do
self:outlet(2, "list", {s, i})
self:outlet(1, "read", {s .. i .. "." .. self.extension})
self:outlet(1, "info", {})
self:outlet(3, "bang", {})
end
end
function FileListParser:in_2_list(d)
self.extension = d[1]
end
function FileListParser:in_3_float(f)
self.batchlimit = f
end


@@ -0,0 +1,137 @@
local FileModder = pd.Class:new():register("vidya-file-modder")
function FileModder:initialize(sel, atoms)
-- 1. Object-triggering bang
-- 2. Incoming single data bytes from [binfile]
-- 3. Total bytes in file, from [route buflength]
-- 4. Glitch type
-- 5. Glitch point
-- 6. Number of times to glitch a file
-- 7. Toggle for a randomized number of glitches within the bounds of (6)
-- 8. Active filename
self.inlets = 8
-- 1. To [binfile] inlet - bang(get next byte), clear(clear the buffer), FLOAT(write a byte to buffer), write(write to file)
self.outlets = 1
-- Currently active file's namedata
self.filedata = {
"default-filename",
0,
}
-- Glitch type (pattern, random, or splice)
self.glitchtype = "random"
-- Minimum glitch point in image data
self.glitchpoint = 500
-- Number of times to repeat random glitches on a given file
self.randrepeat = 1
-- Toggles whether the number of repeating glitches should be random, within the bounds of 1 to self.randrepeat
self.randtoggle = "concrete"
-- Hold all bytes, which are converted to ints in the 0-255 range
self.bytebuffer = {}
-- Buffer length of currently active file
self.buflength = 0
return true
end
function FileModder:in_1_bang()
for i = 1, self.buflength do
self:outlet(1, "bang", {})
end
self:outlet(1, "clear", {})
if self.glitchtype == "pattern" then
local plen = math.random(2, 1000)
local patbuffer = {}
for i = 1, plen do
table.insert(patbuffer, math.random(1, 254))
end
for i = self.glitchpoint, self.buflength do
self.bytebuffer[i] = patbuffer[((i - 1) % #patbuffer) + 1]
end
elseif self.glitchtype == "random" then
local randlimit = 0
if self.randtoggle == "random" then
randlimit = math.random(1, self.randrepeat)
else
randlimit = self.randrepeat
end
for i = 1, randlimit do
self.bytebuffer[math.random(self.glitchpoint, self.buflength)] = math.random(1, 244)
end
elseif self.glitchtype == "splice" then
local sloc = math.random(self.glitchpoint, self.buflength)
local schunksize = math.random(1, self.buflength - sloc)
local splicebuffer = {}
for i = 1, schunksize do
table.insert(splicebuffer, table.remove(self.bytebuffer, sloc))
end
local insertpoint = math.random(self.glitchpoint, #self.bytebuffer)
for _, v in ipairs(splicebuffer) do
table.insert(self.bytebuffer, insertpoint, v)
end
end
for _, v in ipairs(self.bytebuffer) do
self:outlet(1, "float", {v})
end
local outname = self.filedata[1] .. "-glitch" .. self.filedata[2] .. ".jpeg"
self:outlet(1, "write", {outname})
pd.post("New glitched image: " .. outname)
self:outlet(1, "clear", {})
self.bytebuffer = {}
end
function FileModder:in_2_float(f)
table.insert(self.bytebuffer, f)
end
function FileModder:in_3_list(f)
self.buflength = f[1] + 1 -- Shift from 0-indexed to 1-indexed
end
function FileModder:in_4_list(d)
self.glitchtype = d[1]
end
function FileModder:in_5_float(f)
self.glitchpoint = f
end
function FileModder:in_6_float(f)
self.randrepeat = f
end
function FileModder:in_7_list(d)
self.randtoggle = d[1]
end
function FileModder:in_8_list(d)
self.filedata = {d[1], d[2]}
end


@@ -0,0 +1,903 @@
types = require "moonscript.types"
util = require "moonscript.util"
data = require "moonscript.data"
import reversed, unpack from util
import ntype, mtype, build, smart_node, is_slice, value_is_singular from types
import insert from table
import NameProxy, LocalName from require "moonscript.transform.names"
destructure = require "moonscript.transform.destructure"
local implicitly_return
class Run
new: (@fn) =>
self[1] = "run"
call: (state) =>
self.fn state
-- transform the last stm in a list of stms
-- will puke on group
apply_to_last = (stms, fn) ->
-- find last (real) exp
last_exp_id = 0
for i = #stms, 1, -1
stm = stms[i]
if stm and mtype(stm) != Run
last_exp_id = i
break
return for i, stm in ipairs stms
if i == last_exp_id
fn stm
else
stm
-- is a body a single expression/statement
is_singular = (body) ->
return false if #body != 1
if "group" == ntype body
is_singular body[2]
else
true
find_assigns = (body, out={}) ->
for thing in *body
switch thing[1]
when "group"
find_assigns thing[2], out
when "assign"
table.insert out, thing[2] -- extract names
out
hoist_declarations = (body) ->
assigns = {}
-- hoist the plain old assigns
for names in *find_assigns body
for name in *names
table.insert assigns, name if type(name) == "string"
-- insert after runs
idx = 1
while mtype(body[idx]) == Run do idx += 1
table.insert body, idx, {"declare", assigns}
expand_elseif_assign = (ifstm) ->
for i = 4, #ifstm
case = ifstm[i]
if ntype(case) == "elseif" and ntype(case[2]) == "assign"
split = { unpack ifstm, 1, i - 1 }
insert split, {
"else", {
{"if", case[2], case[3], unpack ifstm, i + 1}
}
}
return split
ifstm
constructor_name = "new"
with_continue_listener = (body) ->
continue_name = nil
{
Run =>
@listen "continue", ->
unless continue_name
continue_name = NameProxy"continue"
@put_name continue_name
continue_name
build.group body
Run =>
return unless continue_name
@put_name continue_name, nil
@splice (lines) -> {
{"assign", {continue_name}, {"false"}}
{"repeat", "true", {
lines
{"assign", {continue_name}, {"true"}}
}}
{"if", {"not", continue_name}, {
{"break"}
}}
}
}
class Transformer
new: (@transformers) =>
@seen_nodes = setmetatable {}, __mode: "k"
transform: (scope, node, ...) =>
return node if @seen_nodes[node]
@seen_nodes[node] = true
while true
transformer = @transformers[ntype node]
res = if transformer
transformer(scope, node, ...) or node
else
node
return node if res == node
node = res
node
bind: (scope) =>
(...) -> @transform scope, ...
__call: (...) => @transform ...
can_transform: (node) =>
@transformers[ntype node] != nil
construct_comprehension = (inner, clauses) ->
current_stms = inner
for _, clause in reversed clauses
t = clause[1]
current_stms = if t == "for"
_, names, iter = unpack clause
{"foreach", names, {iter}, current_stms}
elseif t == "when"
_, cond = unpack clause
{"if", cond, current_stms}
else
error "Unknown comprehension clause: "..t
current_stms = {current_stms}
current_stms[1]
Statement = Transformer {
root_stms: (body) =>
apply_to_last body, implicitly_return @
assign: (node) =>
names, values = unpack node, 2
-- bubble cascading assigns
transformed = if #values == 1
value = values[1]
t = ntype value
if t == "decorated"
value = @transform.statement value
t = ntype value
if types.cascading[t]
ret = (stm) ->
if types.is_value stm
{"assign", names, {stm}}
else
stm
build.group {
{"declare", names}
@transform.statement value, ret, node
}
node = transformed or node
if destructure.has_destructure names
return destructure.split_assign node
node
continue: (node) =>
continue_name = @send "continue"
error "continue must be inside of a loop" unless continue_name
build.group {
build.assign_one continue_name, "true"
{"break"}
}
export: (node) =>
-- assign values if they are included
if #node > 2
if node[2] == "class"
cls = smart_node node[3]
build.group {
{"export", {cls.name}}
cls
}
else
build.group {
node
build.assign {
names: node[2]
values: node[3]
}
}
else
nil
update: (node) =>
_, name, op, exp = unpack node
op_final = op\match "^(.+)=$"
error "Unknown op: "..op if not op_final
exp = {"parens", exp} unless value_is_singular exp
build.assign_one name, {"exp", name, op_final, exp}
import: (node) =>
_, names, source = unpack node
stubs = for name in *names
if type(name) == "table"
name
else
{"dot", name}
real_names = for name in *names
type(name) == "table" and name[2] or name
if type(source) == "string"
build.assign {
names: real_names
values: [build.chain { base: source, stub} for stub in *stubs]
}
else
source_name = NameProxy "table"
build.group {
{"declare", real_names}
build["do"] {
build.assign_one source_name, source
build.assign {
names: real_names
values: [build.chain { base: source_name, stub} for stub in *stubs]
}
}
}
comprehension: (node, action) =>
_, exp, clauses = unpack node
action = action or (exp) -> {exp}
construct_comprehension action(exp), clauses
do: (node, ret) =>
node[2] = apply_to_last node[2], ret if ret
node
decorated: (node) =>
stm, dec = unpack node, 2
wrapped = switch dec[1]
when "if"
cond, fail = unpack dec, 2
fail = { "else", { fail } } if fail
{ "if", cond, { stm }, fail }
when "unless"
{ "unless", dec[2], { stm } }
when "comprehension"
{ "comprehension", stm, dec[2] }
else
error "Unknown decorator " .. dec[1]
if ntype(stm) == "assign"
wrapped = build.group {
build.declare names: [name for name in *stm[2] when type(name) == "string"]
wrapped
}
wrapped
unless: (node) =>
{ "if", {"not", {"parens", node[2]}}, unpack node, 3 }
if: (node, ret) =>
-- expand assign in cond
if ntype(node[2]) == "assign"
_, assign, body = unpack node
if destructure.has_destructure assign[2]
name = NameProxy "des"
body = {
destructure.build_assign assign[2][1], name
build.group node[3]
}
return build.do {
build.assign_one name, assign[3][1]
{"if", name, body, unpack node, 4}
}
else
name = assign[2][1]
return build["do"] {
assign
{"if", name, unpack node, 3}
}
node = expand_elseif_assign node
-- apply cascading return decorator
if ret
smart_node node
-- mutate all the bodies
node['then'] = apply_to_last node['then'], ret
for i = 4, #node
case = node[i]
body_idx = #node[i]
case[body_idx] = apply_to_last case[body_idx], ret
node
with: (node, ret) =>
_, exp, block = unpack node
scope_name = NameProxy "with"
named_assign = if ntype(exp) == "assign"
names, values = unpack exp, 2
assign_name = names[1]
exp = values[1]
values[1] = scope_name
{"assign", names, values}
build.do {
Run => @set "scope_var", scope_name
build.assign_one scope_name, exp
build.group { named_assign }
build.group block
if ret
ret scope_name
}
foreach: (node) =>
smart_node node
source = unpack node.iter
destructures = {}
node.names = for i, name in ipairs node.names
if ntype(name) == "table"
with proxy = NameProxy "des"
insert destructures, destructure.build_assign name, proxy
else
name
if next destructures
insert destructures, build.group node.body
node.body = destructures
if ntype(source) == "unpack"
list = source[2]
index_name = NameProxy "index"
list_name = NameProxy "list"
slice_var = nil
bounds = if is_slice list
slice = list[#list]
table.remove list
table.remove slice, 1
slice[2] = if slice[2] and slice[2] != ""
max_tmp_name = NameProxy "max"
slice_var = build.assign_one max_tmp_name, slice[2]
{"exp", max_tmp_name, "<", 0
"and", {"length", list_name}, "+", max_tmp_name
"or", max_tmp_name }
else
{"length", list_name}
slice
else
{1, {"length", list_name}}
return build.group {
build.assign_one list_name, list
slice_var
build["for"] {
name: index_name
bounds: bounds
body: {
{"assign", node.names, {list_name\index index_name}}
build.group node.body
}
}
}
node.body = with_continue_listener node.body
while: (node) =>
smart_node node
node.body = with_continue_listener node.body
for: (node) =>
smart_node node
node.body = with_continue_listener node.body
switch: (node, ret) =>
_, exp, conds = unpack node
exp_name = NameProxy "exp"
-- convert switch conds into if statement conds
convert_cond = (cond) ->
t, case_exps, body = unpack cond
out = {}
insert out, t == "case" and "elseif" or "else"
if t != "else"
cond_exp = {}
for i, case in ipairs case_exps
if i == 1
insert cond_exp, "exp"
else
insert cond_exp, "or"
case = {"parens", case} unless value_is_singular case
insert cond_exp, {"exp", case, "==", exp_name}
insert out, cond_exp
else
body = case_exps
if ret
body = apply_to_last body, ret
insert out, body
out
first = true
if_stm = {"if"}
for cond in *conds
if_cond = convert_cond cond
if first
first = false
insert if_stm, if_cond[2]
insert if_stm, if_cond[3]
else
insert if_stm, if_cond
build.group {
build.assign_one exp_name, exp
if_stm
}
class: (node, ret, parent_assign) =>
_, name, parent_val, body = unpack node
-- split apart properties and statements
statements = {}
properties = {}
for item in *body
switch item[1]
when "stm"
insert statements, item[2]
when "props"
for tuple in *item[2,]
if ntype(tuple[1]) == "self"
insert statements, build.assign_one unpack tuple
else
insert properties, tuple
-- find constructor
constructor = nil
properties = for tuple in *properties
key = tuple[1]
if key[1] == "key_literal" and key[2] == constructor_name
constructor = tuple[2]
nil
else
tuple
parent_cls_name = NameProxy "parent"
base_name = NameProxy "base"
self_name = NameProxy "self"
cls_name = NameProxy "class"
if not constructor
constructor = build.fndef {
args: {{"..."}}
arrow: "fat"
body: {
build["if"] {
cond: parent_cls_name
then: {
build.chain { base: "super", {"call", {"..."}} }
}
}
}
}
else
smart_node constructor
constructor.arrow = "fat"
real_name = name or parent_assign and parent_assign[2][1]
real_name = switch ntype real_name
when "chain"
last = real_name[#real_name]
switch ntype last
when "dot"
{"string", '"', last[2]}
when "index"
last[2]
else
"nil"
when "nil"
"nil"
else
{"string", '"', real_name}
cls = build.table {
{"__init", constructor}
{"__base", base_name}
{"__name", real_name} -- "quote the string"
{"__parent", parent_cls_name}
}
-- look up a name in the class object
class_lookup = build["if"] {
cond: {"exp", "val", "==", "nil", "and", parent_cls_name}
then: {
parent_cls_name\index"name"
}
}
insert class_lookup, {"else", {"val"}}
cls_mt = build.table {
{"__index", build.fndef {
args: {{"cls"}, {"name"}}
body: {
build.assign_one LocalName"val", build.chain {
base: "rawget", {"call", {base_name, "name"}}
}
class_lookup
}
}}
{"__call", build.fndef {
args: {{"cls"}, {"..."}}
body: {
build.assign_one self_name, build.chain {
base: "setmetatable"
{"call", {"{}", base_name}}
}
build.chain {
base: "cls.__init"
{"call", {self_name, "..."}}
}
self_name
}
}}
}
cls = build.chain {
base: "setmetatable"
{"call", {cls, cls_mt}}
}
value = nil
with build
out_body = {
Run =>
-- make sure we don't assign the class to a local inside the do
@put_name name if name
@set "super", (block, chain) ->
if chain
slice = [item for item in *chain[3,]]
new_chain = {"chain", parent_cls_name}
head = slice[1]
if head == nil
return parent_cls_name
switch head[1]
-- calling super, inject calling name and self into chain
when "call"
calling_name = block\get"current_block"
slice[1] = {"call", {"self", unpack head[2]}}
if ntype(calling_name) == "key_literal"
insert new_chain, {"dot", calling_name[2]}
else
insert new_chain, {"index", calling_name}
-- colon call on super, replace class with self as first arg
when "colon"
call = head[3]
insert new_chain, {"dot", head[2]}
slice[1] = { "call", { "self", unpack call[2] } }
insert new_chain, item for item in *slice
new_chain
else
parent_cls_name
.assign_one parent_cls_name, parent_val == "" and "nil" or parent_val
.assign_one base_name, {"table", properties}
.assign_one base_name\chain"__index", base_name
.if {
cond: parent_cls_name
then: {
.chain {
base: "setmetatable"
{"call", {
base_name,
.chain { base: parent_cls_name, {"dot", "__base"}}
}}
}
}
}
.assign_one cls_name, cls
.assign_one base_name\chain"__class", cls_name
.group if #statements > 0 then {
.assign_one LocalName"self", cls_name
.group statements
}
-- run the inherited callback
.if {
cond: {"exp",
parent_cls_name, "and", parent_cls_name\chain "__inherited"
}
then: {
parent_cls_name\chain "__inherited", {"call", {
parent_cls_name, cls_name
}}
}
}
.group if name then {
.assign_one name, cls_name
}
if ret
ret cls_name
}
hoist_declarations out_body
value = .group {
.group if ntype(name) == "value" then {
.declare names: {name}
}
.do out_body
}
value
}
class Accumulator
body_idx: { for: 4, while: 3, foreach: 4 }
new: =>
@accum_name = NameProxy "accum"
@value_name = NameProxy "value"
@len_name = NameProxy "len"
-- wraps node and mutates body
convert: (node) =>
index = @body_idx[ntype node]
node[index] = @mutate_body node[index]
@wrap node
-- wrap the node into a block_exp
wrap: (node) =>
build.block_exp {
build.assign_one @accum_name, build.table!
build.assign_one @len_name, 0
node
@accum_name
}
-- mutates the body of a loop construct to save last value into accumulator
-- can optionally skip nil results
mutate_body: (body, skip_nil=true) =>
val = if not skip_nil and is_singular body
with body[1]
body = {}
else
body = apply_to_last body, (n) ->
if types.is_value n
build.assign_one @value_name, n
else
-- just ignore it
build.group {
{"declare", {@value_name}}
n
}
@value_name
update = {
{"update", @len_name, "+=", 1}
build.assign_one @accum_name\index(@len_name), val
}
if skip_nil
table.insert body, build["if"] {
cond: {"exp", @value_name, "!=", "nil"}
then: update
}
else
table.insert body, build.group update
body
default_accumulator = (node) =>
Accumulator!\convert node
implicitly_return = (scope) ->
is_top = true
fn = (stm) ->
t = ntype stm
-- expand decorated
if t == "decorated"
stm = scope.transform.statement stm
t = ntype stm
if types.cascading[t]
is_top = false
scope.transform.statement stm, fn
elseif types.manual_return[t] or not types.is_value stm
-- remove blank return statement
if is_top and t == "return" and stm[2] == ""
nil
else
stm
else
if t == "comprehension" and not types.comprehension_has_value stm
stm
else
{"return", stm}
fn
Value = Transformer {
for: default_accumulator
while: default_accumulator
foreach: default_accumulator
do: (node) =>
build.block_exp node[2]
decorated: (node) =>
@transform.statement node
class: (node) =>
build.block_exp { node }
string: (node) =>
delim = node[2]
convert_part = (part) ->
if type(part) == "string" or part == nil
{"string", delim, part or ""}
else
build.chain { base: "tostring", {"call", {part[2]}} }
-- reduced to single item
if #node <= 3
return if type(node[3]) == "string"
node
else
convert_part node[3]
e = {"exp", convert_part node[3]}
for i=4, #node
insert e, ".."
insert e, convert_part node[i]
e
comprehension: (node) =>
a = Accumulator!
node = @transform.statement node, (exp) ->
a\mutate_body {exp}, false
a\wrap node
tblcomprehension: (node) =>
_, explist, clauses = unpack node
key_exp, value_exp = unpack explist
accum = NameProxy "tbl"
inner = if value_exp
dest = build.chain { base: accum, {"index", key_exp} }
{ build.assign_one dest, value_exp }
else
-- If we only have single expression then
-- unpack the result into key and value
key_name, val_name = NameProxy"key", NameProxy"val"
dest = build.chain { base: accum, {"index", key_name} }
{
build.assign names: {key_name, val_name}, values: {key_exp}
build.assign_one dest, val_name
}
build.block_exp {
build.assign_one accum, build.table!
construct_comprehension inner, clauses
accum
}
fndef: (node) =>
smart_node node
node.body = apply_to_last node.body, implicitly_return self
node.body = {
Run => @listen "varargs", -> -- capture event
unpack node.body
}
node
if: (node) => build.block_exp { node }
unless: (node) => build.block_exp { node }
with: (node) => build.block_exp { node }
switch: (node) =>
build.block_exp { node }
-- pull out colon chain
chain: (node) =>
stub = node[#node]
-- escape lua keywords used in dot accessors
for i=3,#node
part = node[i]
if ntype(part) == "dot" and data.lua_keywords[part[2]]
node[i] = { "index", {"string", '"', part[2]} }
if ntype(node[2]) == "string"
-- add parens if callee is raw string
node[2] = {"parens", node[2] }
elseif type(stub) == "table" and stub[1] == "colon_stub"
-- convert colon stub into code
table.remove node, #node
base_name = NameProxy "base"
fn_name = NameProxy "fn"
is_super = node[2] == "super"
@transform.value build.block_exp {
build.assign {
names: {base_name}
values: {node}
}
build.assign {
names: {fn_name}
values: {
build.chain { base: base_name, {"dot", stub[2]} }
}
}
build.fndef {
args: {{"..."}}
body: {
build.chain {
base: fn_name, {"call", {is_super and "self" or base_name, "..."}}
}
}
}
}
block_exp: (node) =>
_, body = unpack node
fn = nil
arg_list = {}
fn = smart_node build.fndef body: {
Run =>
@listen "varargs", ->
insert arg_list, "..."
insert fn.args, {"..."}
@unlisten "varargs"
unpack body
}
build.chain { base: {"parens", fn}, {"call", arg_list} }
}
{ :Statement, :Value, :Run }


@@ -0,0 +1,28 @@
lol iz 71
wtf lol iz liek 71
lmao lol
brb
w00t Hello, World!
rofl lol
lol iz 101
rofl lol
lol iz 108
rofl lol
rofl lol
lool iz 111
rofl lool
loool iz 44
rofl loool
loool iz 32
rofl loool
loool iz 87
rofl loool
rofl lool
lool iz 114
rofl lool
rofl lol
lol iz 100
rofl lol
lol iz 33
rofl lol
stfu


@@ -197,6 +197,9 @@ class TestBlob < Test::Unit::TestCase
 assert blob("deps/http_parser/http_parser.c").vendored?
 assert blob("deps/v8/src/v8.h").vendored?
+# Debian packaging
+assert blob("debian/cron.d").vendored?
 # Prototype
 assert !blob("public/javascripts/application.js").vendored?
 assert blob("public/javascripts/prototype.js").vendored?


@@ -45,7 +45,7 @@ class TestLanguage < Test::Unit::TestCase
 assert_equal Lexer['S'], Language['R'].lexer
 assert_equal Lexer['Scheme'], Language['Emacs Lisp'].lexer
 assert_equal Lexer['Scheme'], Language['Nu'].lexer
-assert_equal Lexer['Scheme'], Language['Racket'].lexer
+assert_equal Lexer['Racket'], Language['Racket'].lexer
 assert_equal Lexer['Scheme'], Language['Scheme'].lexer
 assert_equal Lexer['Standard ML'], Language['Standard ML'].lexer
 assert_equal Lexer['TeX'], Language['TeX'].lexer