Compare commits

..

123 Commits

Author SHA1 Message Date
Colin Seymour
7c17b1f10c Bump to v5.0.11 (#3642) 2017-05-25 16:12:34 +01:00
Paul Chaignon
d490fc303f Support for CWeb language (#3592)
Move .w file extension for CWeb to its own entry.
2017-05-25 09:22:40 +01:00
Michael Hadley
20fdac95f6 Add Closure Templates (#3634)
* Add Closure Templates to languages.yml

* Run script/add-grammar

* Add sample

* Run script/set-language-ids

* Add codemirror_mime_type
2017-05-25 09:15:32 +01:00
Colin Seymour
234ee8b6d2 Update location of Reason grammar (#3639) 2017-05-25 08:59:02 +01:00
Ross Kirsling
58ab593a64 Switch Dart grammars (Sublime → Atom). (#3633) 2017-05-20 17:41:46 +01:00
John Gardner
ec1f6a4cd6 Add ".nr" as a Roff file extension (#3630) 2017-05-18 03:03:47 +10:00
Colin Seymour
3eea8212f4 Revert "Use Textmate's HAML grammar" (#3629)
* Revert "Use Textmate's HAML grammar (#3627)"

This reverts commit a1e09ae3e6.

* Add back missing grammar sources
2017-05-16 15:58:39 +01:00
Vicent Martí
a1e09ae3e6 Use Textmate's HAML grammar (#3627)
* Use Textmate's HAML grammar

* Whitelist license
2017-05-16 12:46:04 +02:00
Robert Koeninger
c1f76c26e5 Add Shen grammar to vendor/README.md (#3626)
* Added sublime-shen as submodule

* Specified tm_scope in languages.yml

* Imported copy of license

* Added Shen grammar repo to vendor/README.md
2017-05-16 08:12:45 +01:00
Robert Koeninger
0983f62e02 Add syntax grammar for Shen language (#3625)
* Added sublime-shen as submodule

* Specified tm_scope in languages.yml

* Imported copy of license
2017-05-15 15:06:09 +01:00
Samuel Gunadi
190e54c020 Add comp, tesc, and tese as GLSL extensions (#3614)
* Add comp, tesc, and tese as GLSL file extensions

* Add GLSL compute shader sample

* Add GLSL tessellation control shader sample

* Add GLSL tessellation evaluation shader sample

* Remove .comp from GLSL extensions

We have to be sure that most of the .comp files on GitHub are indeed GLSL compute shaders.

* Remove GLSL compute shader sample
2017-05-15 09:05:07 +01:00
Lucas Bajolet
ded651159d Add Pep8 Assembly language (#2070)
Pep/8 is a toy assembly language used in some universities for teaching
the basics of assembly and low-level programming.

Signed-off-by: Lucas Bajolet <lucas.bajolet@gmail.com>
2017-05-15 09:02:06 +01:00
Serghei Iakovlev
acbab53198 Update Zephir links (#3608) 2017-05-10 15:56:21 +01:00
Simen Bekkhus
fba4babdcd Don't show npm lockfiles by default (#3611) 2017-05-10 15:55:16 +01:00
Colin Seymour
eb6a213921 Revert "Revert "Switch the PHP grammar to the upstream repo (#3575)"" (#3616)
* Revert "Revert "Switch the PHP grammar to the upstream repo (#3575)" (#3603)"

This reverts commit e93f41f097.
2017-05-10 15:53:15 +01:00
Colin Seymour
5e2c79e950 Bump version to v5.0.10 (#3604) 2017-05-05 18:49:35 +01:00
Colin Seymour
e93f41f097 Revert "Switch the PHP grammar to the upstream repo (#3575)" (#3603)
* Revert "Switch the PHP grammar to the upstream repo (#3575)"

Manually reverting this as it breaks PHP syntax highlighting on
github.com.

* Update submodule ref
2017-05-05 17:11:29 +01:00
Colin Seymour
994bc1f135 Release v5.0.9 (#3597)
* Update all grammars

* Update atom-language-clean grammar to match

* Don't update reason grammar

There seems to be a problem with the 1.3.5 release: the conversion isn't producing a reason entry, so it doesn't match what's in grammar.yml.

* Bump version to 5.0.9

* Update grammars

* Don't update javascript grammar

The current grammar has a known issue and is pending the fix in https://github.com/atom/language-javascript/pull/497
2017-05-03 14:49:26 +01:00
John Gardner
44f03e64c1 Merge heuristics for disambiguating ".t" files (#3587)
References: github/linguist#3546
2017-04-29 11:15:39 +02:00
Jacob Elder
4166f2e89d Clarify support for generated code (#3588)
* Clarify support for generated code

* Incorporate feedback

* TIL about how .gitattributes matching works
2017-04-28 16:20:22 -07:00
John Gardner
1a8f19c6f2 Fix numbering of ordered lists (#3586) 2017-04-28 14:02:38 -07:00
Santiago M. Mola
c0e242358a Fix heuristics after rename (#3556)
* fix Roff detection in heuristics

This affects extensions .l, .ms, .n and .rno.

Groff was renamed to Roff in 673aeb32b9851cc58429c4b598c876292aaf70c7,
but the heuristic was not updated.

* replace FORTRAN with Fortran

It was already renamed in most places in 4fd8fce08574809aa58e9771e2a9da5d135127be;
heuristics.rb was missed, though.

* fix casing of GCC Machine Description
2017-04-26 15:31:36 -07:00
thesave
eb38c8dcf8 [Add Language] Jolie (#3574)
* added support for Jolie language

* added support for Jolie language

* added samples for Jolie
2017-04-26 11:04:25 -07:00
Trent Schafer
f146b4afbd New extension support for PL/SQL language (#2735)
* Add additional PL/SQL file extensions

* Add PL/SQL samples for .ddl and .prc

* Fix sort order of PL/SQL extensions

* Restore vendor/grammars/assembly.

* Restore `pls` as primary PL/SQL extension

* Add tpb to go with tps
2017-04-26 11:03:01 -07:00
Nicolas Garnier
db15d0f5d2 Added MJML as an XML extension (#3582) 2017-04-26 19:24:57 +10:00
Michael Grafnetter
e6d57c771d Add .admx and .adml as extensions for XML (#3580)
* Add .admx and .adml as extensions for XML

* Fixed the order of extensions
2017-04-24 09:55:22 -07:00
Nathan Phillip Brink
eef0335c5f Clarify description of implicit alias. (#3578)
* Clarify description of implicit alias.

I was trying to look up the alias to use for DNS Zone. From the docs
the alias I should use would be dns zone, but in reality it is dns-zone.
This change updates the comments to describe how to derive the
implicit name of a given alias.

* Further clarify description of implicit alias.

@pchaigno requested replacing the Ruby with English.
2017-04-24 09:54:37 -07:00
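[Editor's note] The implicit-alias derivation this commit documents can be sketched in Ruby. This is a hypothetical helper, assuming the alias is simply the language name lowercased with spaces replaced by hyphens, as the "dns zone" → "dns-zone" example suggests; it is not Linguist's actual code.

```ruby
# Hypothetical sketch: derive a language's implicit alias by
# lowercasing the name and replacing whitespace with hyphens.
def implicit_alias(name)
  name.downcase.gsub(/\s/, '-')
end

puts implicit_alias("DNS Zone")  # prints "dns-zone"
```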
Christoph Pojer
461c27c066 Revert "Added Jest snapshot test files as generated src (#3572)" (#3579)
This reverts commit f38d6bd124.
2017-04-22 14:20:54 +02:00
Matvei Stefarov
59d67d6743 Treat vstemplate and vsixmanifest as XML (#3517) 2017-04-22 09:25:50 +01:00
Sandy Armstrong
7aeeb82d3d Treat Xamarin .workbook files as markdown (#3500)
* Treat Xamarin .workbook files as markdown

Xamarin Workbook files are interactive coding documents for C#, serialized as
markdown files. They include a YAML front matter header block with some
metadata. Interactive code cells are included as `csharp` fenced code blocks.

An example can be found here:
https://github.com/xamarin/Workbooks/blob/master/csharp/csharp6/csharp6.workbook

Treated as markdown, it would appear like so:
https://gist.github.com/sandyarmstrong/e331dfeaf89cbce89043a1c31faa1297

* Add .workbook sample

Source: https://github.com/xamarin/Workbooks/blob/master/csharp/csharp6/csharp6.workbook
2017-04-20 15:29:17 +01:00
Christophe Coevoet
c98ca20076 Switch the PHP grammar to the upstream repo (#3575)
* Switch the PHP grammar to the upstream repo

* Update all URLs pointing to the PHP grammar bundle
2017-04-20 14:40:44 +01:00
Paul Chaignon
4e0b5f02aa Fix usage line in binary (#3564)
Linguist cannot work on any directory; it needs to be a Git
repository.
2017-04-20 10:18:03 +01:00
Tim Jones
8da7cb805e Add .cginc extension to HLSL language (#3491)
* Add .cginc extension to HLSL language

* Move extension to correct position

* Add representative sample .cginc file
2017-04-20 09:48:48 +01:00
Dorian
e5e81a8560 Add .irbc and Rakefile to matching ruby filenames (#3457) 2017-04-20 09:41:31 +01:00
Tim Jones
dd53fa1585 Add ShaderLab language (#3490)
* Add ShaderLab language

* Update HLSL and ShaderLab grammars to latest version

* Add .shader extension back to GLSL language

* Add sample GLSL .shader files

Note that these are copies of existing GLSL samples, renamed to have
the .shader extension.
2017-04-20 09:04:08 +01:00
Daniel F Moisset
354a8f079a Add suport for python typeshed (.pyi) extension (#3548) 2017-04-20 09:01:41 +01:00
Hank Brekke
f38d6bd124 Added Jest snapshot test files as generated src (#3572) 2017-04-20 08:58:39 +01:00
Santiago M. Mola
e80b92e407 Fix heuristic for Unix Assembly with .ms extension (#3550) 2017-04-06 22:01:42 +10:00
Martin Nowak
fa6ae1116f better heuristic distinction of .d files (#3145)
* fix benchmark

- require json for Hash.to_json

* better heuristic distinction of .d files

- properly recognize dtrace probes
- recognize \ in Makefile paths
- recognize single line `file.ext : dep.ext` make targets
- recognize D module, import, function, and unittest declarations
- add more representative D samples

D changed from 31.2% to 28.1%
DTrace changed from 33.5% to 32.5%
Makefile changed from 35.3% to 39.4%

See
https://gist.github.com/MartinNowak/fda24fdef64f2dbb05c5a5ceabf22bd3
for the scraper used to get a test corpus.
2017-03-30 18:25:53 +01:00
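[Editor's note] The disambiguation rules listed in this commit can be illustrated with stand-in regexes. These are not the actual heuristics.rb patterns, just a sketch of the three signals the message describes: D declarations, Makefile targets, and DTrace probes.

```ruby
# Illustrative stand-ins for the .d disambiguation signals above.
D_RE        = /^\s*(module|import)\s+[\w.]+\s*;/  # D module/import declarations
MAKEFILE_RE = /^[\w.\\\/-]+\s*:\s*[\w.\\\/-]+/    # `file.ext : dep.ext` make targets
DTRACE_RE   = /^\s*(provider|probe)\b/            # DTrace probe definitions

puts "module foo.bar;" =~ D_RE ? "looks like D" : "not D"
```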
Yuki Izumi
b7e27a9f58 .pod disambiguation heuristic fix (#3541)
Look for any line starting with "=\w+", rather than only lines consisting
entirely of it; otherwise we miss e.g. "=head1 HEADING".
2017-03-27 14:10:17 +11:00
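[Editor's note] The fix can be demonstrated with a small Ruby sketch: in Ruby, `^` anchors at every line start, so the pattern below finds "=head1 HEADING" mid-file even though the line contains more than "=head1". The regex is illustrative, not the exact heuristics.rb rule.

```ruby
# Match "=<word>" at the start of any line, not only lines that
# consist solely of "=<word>" (illustrative Pod-detection regex).
POD_RE = /^=\w+/

sample = "package Foo;\n=head1 HEADING\n..."
puts sample =~ POD_RE ? "Pod" : "not Pod"  # prints "Pod"
```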
Javier Honduvilla Coto
69ba4c5586 Update the Instrumenter doc ... (#3530)
... with an instance of the given `Instrumenter` instead of the class itself.
2017-03-23 06:11:45 +01:00
Rafer Hazen
c39d7fd6e8 Add data-engineering staff to maintainers list (#3533) 2017-03-22 07:06:58 -06:00
Yuki Izumi
44ed47cea1 Release v5.0.8 (#3535) 2017-03-22 16:41:36 +11:00
Yuki Izumi
de51cb08d2 Add .mdwn for Markdown (#3534) 2017-03-22 16:28:59 +11:00
Ronald Wampler
3dd2d08190 Add .mdown as an extension for Markdown (#3525)
* Add `.mdown` as an extension for Markdown

* Add `.mdown` sample
2017-03-22 16:14:54 +11:00
Yuki Izumi
3b625e1954 Release v5.0.7 (#3524)
* grammar update
* Release v5.0.7
2017-03-20 14:13:04 +11:00
Yuki Izumi
5c6f690b97 Prefer Markdown over GCC Machine Description (#3523)
* Add minimal Markdown sample
* Heuristic defaults to Markdown on no match
* Allow Linguist to detect empty blobs
2017-03-20 13:07:54 +11:00
Michael Rawlings
3bbfc907f3 [Add Language] Marko (#3519)
* add marko

* update marko
2017-03-17 09:46:20 +00:00
Colin Seymour
053b8bca97 GitHub.com now uses gitattributes overrides for syntax highlighting (#3518)
See https://github.com/github/linguist/issues/1792#issuecomment-286379822 for more details.
2017-03-15 22:42:08 -07:00
Yves Siegrist
7fb3db6203 Add .eye files to be used as ruby (#3509)
Usually files that are used for [eye](https://github.com/kostya/eye) have the file extension `.eye`.
An eye definition file always contains Ruby code.
2017-03-13 17:22:56 -07:00
Liav Turkia
ba09394f85 Added a demos folder and updated regexes (#3512)
I made the regexes case-insensitive. My repositories have both a Docs and a Demos folder, which wouldn't have been matched before. Now they are.
2017-03-13 17:20:36 -07:00
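[Editor's note] A hedged sketch of what the case-insensitive folder match looks like; the actual pattern in Linguist's vendor/documentation lists may differ, and `DOC_RE` here is purely illustrative.

```ruby
# Case-insensitive documentation/demo path check, so "Docs/" and
# "Demos/" match as well as "docs/" and "demos/" (illustrative only).
DOC_RE = %r{(^|/)(docs?|demos?)/}i

%w[docs/index.md Docs/index.md Demos/app.js src/main.c].each do |path|
  puts "#{path}: #{path =~ DOC_RE ? 'documentation' : 'code'}"
end
```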
Paul Chaignon
c59c88f16e Update grammar whitelist (#3510)
* Remove a few hashes for grammars with BSD licenses

There was an error in Licensee v8.8.2, which caused it to not
recognize some BSD licenses. v8.8.3 fixes it.

* Update submodules

Remove 2 grammars from the whitelist because their licenses were
added to a LICENSE file with a proper format (one that Licensee
detects).

MagicPython now supports all scopes that were previously supported
by language-python.
2017-03-13 17:19:06 -07:00
Brandon Black
8a6e74799a Merge branch 'master' of https://github.com/github/linguist 2017-03-13 17:13:48 -07:00
Brandon Black
4268769d2e adjusting travis config 2017-03-13 17:13:24 -07:00
NN
6601864084 Add wixproj as XML (#3511)
* Add wixproj as XML

WiX uses wixproj for projects.

* Add wixproj sample
2017-03-13 17:01:58 -07:00
Paul Chaignon
d57aa37fb7 Grammar for OpenSCAD from Textmate bundle (#3502) 2017-03-13 17:00:27 -07:00
Karl Pettersson
e72347fd98 Add alias for pandoc (#3493) 2017-03-13 16:59:35 -07:00
Brandon Black
1b429ea46b updating rubies 2017-03-10 00:00:19 -08:00
Paul Chaignon
9468ad4947 Fix grammar hashes (#3504)
* Update Licensee hashes for grammar licenses

Licensee v8.8 changed the way licenses are normalized, thus changing hashes for
some grammars

* Update Licensee

Prevent automatic updates to major releases
2017-03-09 23:57:35 -08:00
Nate Whetsell
733ef63193 Add Jison (#3488) 2017-02-22 00:24:50 -08:00
Brandon Black
9ca6a5841e Release v5.0.6 (#3489)
* grammar update

* bumping linguist version

* fixes for grammar updates
2017-02-21 23:13:15 -08:00
Brandon Black
41ace5fba0 using fork for php.tmbundle since updates are broken 2017-02-21 17:13:55 -08:00
Alex Arslan
cc4295b3b3 Update URL for Julia TextMate repo (#3487) 2017-02-21 17:05:59 -08:00
doug tangren
1e4ce80fd9 add support for detecting bazel WORKSPACE files (#3459)
* add support for detecting bazel WORKSPACE files

* Update languages.yml
2017-02-21 16:48:44 -08:00
Brandon Black
74a71fd90d fixing merge conflict 2017-02-21 16:28:34 -08:00
TingPing
9b08318456 Add Meson language (#3463) 2017-02-21 16:24:58 -08:00
Tim Jones
fa5b6b03dc Add grammar for HLSL (High Level Shading Language) (#3469) 2017-02-21 16:05:25 -08:00
Garen Torikian
cb59296fe0 Like ^docs?, sometimes one sample is enough (#3485) 2017-02-20 10:29:30 -08:00
Eloy Durán
f1be771611 Disambiguate TypeScript with tsx extension. (#3464)
Using the technique as discussed in #2761.
2017-02-20 10:17:18 +00:00
Alex Louden
b66fcb2529 Improve Terraform (.tf) / HCL (.hcl) syntax highlighting (#3392)
* Add Terraform grammar, and change .tf and .hcl files from using Ruby to Terraform sublime syntax

* Expand Terraform sample to demonstrate more language features

* Revert terraform sample change

* Add terraform sample - Dokku AWS deploy

* Updated to latest Terraform

* Update terraform string interpolation

* Update terraform to latest
2017-02-20 10:09:59 +00:00
Brandon Black
f7fe1fee66 Release v5.0.5 -- part Deux (#3479)
* bumping to v5.0.5

* relaxing rugged version requirement
2017-02-15 21:29:04 -08:00
Brandon Black
94367cc460 Update LICENSE 2017-02-15 14:11:37 -08:00
Phineas
72bec1fddc Update LICENSE Copyright Date to 2017 (#3476) 2017-02-15 14:11:13 -08:00
Brandon Black
4e2eba4ef8 Revert "Release v5.0.5" (#3477) 2017-02-15 12:48:45 -08:00
Colin Seymour
10457b6639 Release v5.0.5 (#3473)
Release v5.0.5

* Update submodules

* Update grammars

* Bump version to 5.0.5

* Relax dependency on rugged

It's probably not wise to depend on a beta version just yet.

* revert php.tmbundle grammar update

One of the changes in 3ed4837b43...010cc1c22c leads to breakage in snippet highlighting on github.com
2017-02-15 11:12:53 +00:00
Paul Chaignon
d58cbc68a6 Support for the P4 language
P4 is a language to describe the processing pipeline of network devices
2017-02-15 06:53:46 +01:00
Colin Seymour
01de40faaa Return early in Classifier.classify if no languages supplied (#3471)
* Return early if no languages supplied

There's no need to tokenise the data when attempting to classify with no candidate languages, as nothing will be scored anyway.

* Add test for empty languages array
2017-02-13 18:22:54 +00:00
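[Editor's note] The early return can be sketched with a simplified, hypothetical classifier; the tokeniser and scoring below are stand-ins, not Linguist's actual implementation.

```ruby
# Guard clause sketch: skip tokenisation entirely when there are
# no candidate languages to score.
def classify(data, languages)
  return [] if languages.empty?  # nothing to score; don't tokenise
  tokens = data.split(/\s+/)     # stand-in for the real tokeniser
  languages.map { |lang| [lang, tokens.size] }  # stand-in scoring
end

p classify("puts 'hi'", [])  # => []
```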
Paul Chaignon
62d285fce6 Fix head commit for TXL grammar (#3470) 2017-02-13 14:35:38 +01:00
Stefan Stölzle
0056095e8c Add .lkml to LookML (#3454)
* Add .lkml to LookML

* Limit .lkml to .view.lkml and .model.lkml

* Add lkml samples

* Fix extension order
2017-02-03 11:50:30 +01:00
Lars Brinkhoff
d6dc3a3991 Accomodate Markdown lines which begin with '>'. (#3452) 2017-02-02 11:58:52 -08:00
Greg Zimmerman
b524461b7c Add PowerShell color. Matches default console color for PowerShell. (#3448) 2017-02-02 11:13:01 -08:00
fix-fix
76d41697aa Use official HTML primary color (#3447)
Use primary color of HTML5 logo as defined on Logo FAQ page
2017-02-02 11:09:55 -08:00
John Gardner
32147b629e Register "Emakefile" as an Erlang filename (#3443) 2017-02-02 11:09:07 -08:00
John Gardner
e7b5e25bf8 Add support for regular expression data (#3441) 2017-02-02 11:08:20 -08:00
Brandon Black
d761658f8b Release v5.0.4 (#3445)
* bumping to v5.0.3

* updating rugged
2017-01-31 15:08:52 -08:00
Brandon Black
3719214aba fixing null reference in yarn.lock check (#3444) 2017-01-31 14:45:22 -08:00
Paul Chaignon
47b109be36 Improve heuristic for Modula-2 (#3434)
Module names can contain dots
2017-01-24 15:10:46 -08:00
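[Editor's note] An illustrative Ruby sketch of such a rule — a MODULE declaration whose name may contain dots. The regex is a stand-in, not the actual heuristics.rb pattern.

```ruby
# Modula-2 MODULE declaration where the module name may contain dots,
# e.g. "IMPLEMENTATION MODULE Sys.Io;" (illustrative regex only).
MODULA2_RE = /^\s*(?:IMPLEMENTATION\s+)?MODULE\s+[\w.]+\s*;/i

puts MODULA2_RE.match?("IMPLEMENTATION MODULE Sys.Io;") ? "Modula-2" : "other"
```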
Javier Honduvilla Coto
1ec4db97c2 Be able to configure the number of threads for git submodules update (#3438)
* make the number of submodule update threads configurable

* change thread number to be a script argument
2017-01-24 09:55:31 -08:00
Brandon Black
9fe5fe0de2 v5.0.3 Release (#3436)
* bumping to v5.0.3

* updating grammars
2017-01-23 20:09:26 -08:00
sunderls
b36ea7ac9d Add yarn (#3432)
* add yarn.lock

* fix comment

* remove yarn test

* add test

* fix test

* try fix again

* try 3rd time

* check filename and firstline for yarn lockfile
2017-01-23 10:58:53 -08:00
Yuki Izumi
625b06c30d Use textmate/diff.tmbundle again (#3429)
Upstream textmate/diff.tmbundle#6 was merged.
2017-01-17 17:02:10 +11:00
Brandon Black
28bce533b2 Release v5.0.2 (#3427)
* updated grammars

* bumping version

* adding .gem files to gitignore
2017-01-11 16:08:31 -08:00
John Gardner
93ec1922cb Swap grammar used for CSS highlighting (#3426)
* Swap grammar used for CSS highlighting

* Whitelist license of Atom's CSS grammar

* Explicitly declare grammar as MIT-licensed

Source: https://github.com/atom/language-css/blob/5d4af/package.json#L14
2017-01-11 16:16:25 +11:00
Yuki Izumi
5d09fb67dd Allow for split(",") returning nil (#3424) 2017-01-10 11:44:24 +11:00
Arfon Smith
93dcb61742 Removing references to @arfon (#3423) 2017-01-09 11:54:55 -08:00
Brandon Black
3a03594685 bumping to v5.0.1 (#3421) 2017-01-08 23:30:56 -08:00
Brandon Black
5ce2c254f9 purging vestigial search term references 2017-01-08 23:08:16 -08:00
Brandon Black
d7814c4899 fixing incompatibility with latest rugged 2017-01-08 22:59:00 -08:00
Brandon Black
50c08bf29e updating grammars 2017-01-08 22:29:01 -08:00
Yuki Izumi
34928baee6 Use https://github.com/textmate/diff.tmbundle/pull/6 (#3420) 2017-01-08 21:53:14 -08:00
USAMI Kenta
27bb41aa4d Add Cask file as Emacs Lisp (#3416)
* Add Cask file as Emacs Lisp

* Replace Cask file (original is zonuexe/composer.el)
2017-01-04 11:26:20 -08:00
Brandon Black
1415f4b52d fixing grammar file pedantry 2017-01-03 17:23:56 -08:00
Arie Kurniawan
ae8ffcad22 change http to https
Fix links that use the insecure http protocol where the site
already supports the more secure https protocol.
2017-01-03 17:19:58 -08:00
Brandon Black
f43633bf10 fixing license for atom-language-rust 2017-01-03 17:02:25 -08:00
Brandon Black
a604de9846 replacing atom grammar due to ST2 compatibility change 2017-01-03 16:46:02 -08:00
Brandon Black
3e224e0039 updating grammars 2017-01-03 16:33:46 -08:00
Christopher Gilbert
15b04f86c3 Amended license file for SublimeEthereum language grammar 2017-01-03 16:13:27 -08:00
Christopher Gilbert
42af436c20 Added grammar for Solidity programming language 2017-01-03 16:13:27 -08:00
Christopher Gilbert
2b08c66f0b Added grammar for Solidity programming language 2017-01-03 16:13:27 -08:00
Zach Brock
f98ab593fb Detect Javascript files generated by Protocol Buffers. 2017-01-03 16:07:26 -08:00
Brandon Black
f951ec07de bin help cleanup 2017-01-03 15:52:22 -08:00
Vadim Markovtsev
e9ac71590f Add --json option to bin/linguist 2017-01-03 14:56:12 -08:00
Al Thomas
210cd19876 Add Genie programming language (#3396)
* Add Genie programming language

Genie was introduced in 2008 as part of the GNOME project:
https://wiki.gnome.org/Projects/Genie

It is a programming language that uses the Vala compiler to
produce native binaries. It has good bindings to C libraries
especially those that are part of the GObject world such as
Gtk+3 and GStreamer

* Change color for Genie so tests pass
2017-01-03 14:15:17 -08:00
Mate Lorincz
f473c555ac Update vendor.yml (#3382)
Exclude Realm and RealmSwift frameworks from language statistics.
2017-01-03 14:10:49 -08:00
Nate Whetsell
48e4394d87 Add Jison-generated JavaScript to generated files (#3393)
* Fix typos

* Add Jison-generated JavaScript to generated files
2017-01-03 14:08:29 -08:00
meganemura
e1ce88920d Fix add-grammar and convert-grammars (#3354)
* Skip removed grammar submodule

* Clean up old grammar from grammars and grammars.yml

* Clean up unused grammar license

Run `script/licensed`.
This change was missing in 12f9295 of #3350.

* Clean up license files when we replace grammar

Update license files by running `script/licensed`.
Since we replace grammar, the old grammar license must be removed
and new grammar license must be added.
2017-01-03 13:57:46 -08:00
Paul Chaignon
675cee1d72 Improve the .pl Prolog heuristic rule (#3409)
:- can be directly at the start of a line
2017-01-03 13:54:47 -08:00
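[Editor's note] The relaxed rule can be illustrated as follows; the regex is a stand-in for the real heuristic, showing only that a `:-` directive may now begin a line directly.

```ruby
# A ":-" Prolog directive may start a line directly, with no
# preceding predicate text (illustrative regex, not the real rule).
PROLOG_RE = /^\s*:-/

puts ":- module(foo, []).".match?(PROLOG_RE) ? "Prolog" : "Perl"  # prints "Prolog"
```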
yutannihilation
1c4baf6dc2 ignore roxygen2-generated files (#3373) 2017-01-03 13:31:04 -08:00
Samantha McVey
8f2820e9cc Add XCompose language and highlighter (#3402)
* Add XCompose language and highlighter

* XCompose fix some errors in the Travis build

* Remove xmodmap files for XCompose

Most xmodmap files aren't XCompose, and there aren't enough xmodmap files
that are XCompose to be worth adding to heuristics.

* Remove some extensions/filenames from XCompose

* Rename and move sample to correct folder and filename

That we have added in languages.yml

* Use generated language id
2017-01-03 13:29:00 -08:00
Danila Malyutin
04c268e535 Add mysql extension for sql scripts (#3413) 2017-01-03 11:47:19 -08:00
Paul Chaignon
ec749b3f8d Remove the Hy grammar (#3411)
The grammar repository was recently deleted
2017-01-03 11:38:44 -08:00
Samantha McVey
08b63e7033 Change repository for Perl 6 grammar 2016-12-24 00:58:21 +01:00
Darin Morrison
7867b946b9 Add support for Reason (#3336) 2016-12-22 17:03:18 -08:00
238 changed files with 22759 additions and 313 deletions

.gitignore

@@ -1,3 +1,4 @@
+*.gem
 /Gemfile.lock
 .bundle/
 .idea

.gitmodules

@@ -67,9 +67,6 @@
 [submodule "vendor/grammars/language-javascript"]
 	path = vendor/grammars/language-javascript
 	url = https://github.com/atom/language-javascript
-[submodule "vendor/grammars/language-python"]
-	path = vendor/grammars/language-python
-	url = https://github.com/atom/language-python
 [submodule "vendor/grammars/language-shellscript"]
 	path = vendor/grammars/language-shellscript
 	url = https://github.com/atom/language-shellscript
@@ -115,9 +112,6 @@
 [submodule "vendor/grammars/fancy-tmbundle"]
 	path = vendor/grammars/fancy-tmbundle
 	url = https://github.com/fancy-lang/fancy-tmbundle
-[submodule "vendor/grammars/dart-sublime-bundle"]
-	path = vendor/grammars/dart-sublime-bundle
-	url = https://github.com/guillermooo/dart-sublime-bundle
 [submodule "vendor/grammars/sublimetext-cuda-cpp"]
 	path = vendor/grammars/sublimetext-cuda-cpp
 	url = https://github.com/harrism/sublimetext-cuda-cpp
@@ -130,9 +124,6 @@
 [submodule "vendor/grammars/Sublime-Text-2-OpenEdge-ABL"]
 	path = vendor/grammars/Sublime-Text-2-OpenEdge-ABL
 	url = https://github.com/jfairbank/Sublime-Text-2-OpenEdge-ABL
-[submodule "vendor/grammars/sublime-rust"]
-	path = vendor/grammars/sublime-rust
-	url = https://github.com/jhasse/sublime-rust
 [submodule "vendor/grammars/sublime-befunge"]
 	path = vendor/grammars/sublime-befunge
 	url = https://github.com/johanasplund/sublime-befunge
@@ -180,7 +171,7 @@
 	url = https://github.com/mokus0/Agda.tmbundle
 [submodule "vendor/grammars/Julia.tmbundle"]
 	path = vendor/grammars/Julia.tmbundle
-	url = https://github.com/nanoant/Julia.tmbundle
+	url = https://github.com/JuliaEditorSupport/Julia.tmbundle
 [submodule "vendor/grammars/ooc.tmbundle"]
 	path = vendor/grammars/ooc.tmbundle
 	url = https://github.com/nilium/ooc.tmbundle
@@ -247,9 +238,6 @@
 [submodule "vendor/grammars/cpp-qt.tmbundle"]
 	path = vendor/grammars/cpp-qt.tmbundle
 	url = https://github.com/textmate/cpp-qt.tmbundle
-[submodule "vendor/grammars/css.tmbundle"]
-	path = vendor/grammars/css.tmbundle
-	url = https://github.com/textmate/css.tmbundle
 [submodule "vendor/grammars/d.tmbundle"]
 	path = vendor/grammars/d.tmbundle
 	url = https://github.com/textmate/d.tmbundle
@@ -396,7 +384,7 @@
 	url = https://github.com/textmate/c.tmbundle
 [submodule "vendor/grammars/zephir-sublime"]
 	path = vendor/grammars/zephir-sublime
-	url = https://github.com/vmg/zephir-sublime
+	url = https://github.com/phalcon/zephir-sublime
 [submodule "vendor/grammars/llvm.tmbundle"]
 	path = vendor/grammars/llvm.tmbundle
 	url = https://github.com/whitequark/llvm.tmbundle
@@ -443,9 +431,6 @@
 [submodule "vendor/grammars/Sublime-Nit"]
 	path = vendor/grammars/Sublime-Nit
 	url = https://github.com/R4PaSs/Sublime-Nit
-[submodule "vendor/grammars/language-hy"]
-	path = vendor/grammars/language-hy
-	url = https://github.com/rwtolbert/language-hy
 [submodule "vendor/grammars/Racket"]
 	path = vendor/grammars/Racket
 	url = https://github.com/soegaard/racket-highlight-for-github
@@ -623,9 +608,6 @@
 [submodule "vendor/grammars/language-yang"]
 	path = vendor/grammars/language-yang
 	url = https://github.com/DzonyKalafut/language-yang.git
-[submodule "vendor/grammars/perl6fe"]
-	path = vendor/grammars/perl6fe
-	url = https://github.com/MadcapJake/language-perl6fe.git
 [submodule "vendor/grammars/language-less"]
 	path = vendor/grammars/language-less
 	url = https://github.com/atom/language-less.git
@@ -809,4 +791,60 @@
 [submodule "vendor/grammars/rascal-syntax-highlighting"]
 	path = vendor/grammars/rascal-syntax-highlighting
 	url = https://github.com/usethesource/rascal-syntax-highlighting
+[submodule "vendor/grammars/atom-language-perl6"]
+	path = vendor/grammars/atom-language-perl6
+	url = https://github.com/perl6/atom-language-perl6
+[submodule "vendor/grammars/reason"]
+	path = vendor/grammars/reason
+	url = https://github.com/chenglou/sublime-reason
+[submodule "vendor/grammars/language-xcompose"]
+	path = vendor/grammars/language-xcompose
+	url = https://github.com/samcv/language-xcompose
+[submodule "vendor/grammars/SublimeEthereum"]
+	path = vendor/grammars/SublimeEthereum
+	url = https://github.com/davidhq/SublimeEthereum.git
+[submodule "vendor/grammars/atom-language-rust"]
+	path = vendor/grammars/atom-language-rust
+	url = https://github.com/zargony/atom-language-rust
+[submodule "vendor/grammars/language-css"]
+	path = vendor/grammars/language-css
+	url = https://github.com/atom/language-css
+[submodule "vendor/grammars/language-regexp"]
+	path = vendor/grammars/language-regexp
+	url = https://github.com/Alhadis/language-regexp
+[submodule "vendor/grammars/Terraform.tmLanguage"]
+	path = vendor/grammars/Terraform.tmLanguage
+	url = https://github.com/alexlouden/Terraform.tmLanguage
+[submodule "vendor/grammars/shaders-tmLanguage"]
+	path = vendor/grammars/shaders-tmLanguage
+	url = https://github.com/tgjones/shaders-tmLanguage
+[submodule "vendor/grammars/language-meson"]
+	path = vendor/grammars/language-meson
+	url = https://github.com/TingPing/language-meson
+[submodule "vendor/grammars/atom-language-p4"]
+	path = vendor/grammars/atom-language-p4
+	url = https://github.com/TakeshiTseng/atom-language-p4
+[submodule "vendor/grammars/language-jison"]
+	path = vendor/grammars/language-jison
+	url = https://github.com/cdibbs/language-jison
+[submodule "vendor/grammars/openscad.tmbundle"]
+	path = vendor/grammars/openscad.tmbundle
+	url = https://github.com/tbuser/openscad.tmbundle
+[submodule "vendor/grammars/marko-tmbundle"]
+	path = vendor/grammars/marko-tmbundle
+	url = https://github.com/marko-js/marko-tmbundle
+[submodule "vendor/grammars/language-jolie"]
+	path = vendor/grammars/language-jolie
+	url = https://github.com/fmontesi/language-jolie
+[submodule "vendor/grammars/sublime-shen"]
+	path = vendor/grammars/sublime-shen
+	url = https://github.com/rkoeninger/sublime-shen
+[submodule "vendor/grammars/Sublime-Pep8"]
+	path = vendor/grammars/Sublime-Pep8
+	url = https://github.com/R4PaSs/Sublime-Pep8
+[submodule "vendor/grammars/dartlang"]
+	path = vendor/grammars/dartlang
+	url = https://github.com/dart-atom/dartlang
+[submodule "vendor/grammars/language-closure-templates"]
+	path = vendor/grammars/language-closure-templates
+	url = https://github.com/mthadley/language-closure-templates

.travis.yml

@@ -1,20 +1,33 @@
language: ruby
sudo: false
addons:
apt:
packages:
- libicu-dev
- libicu48
before_install: script/travis/before_install
script:
- bundle exec rake
- script/licensed verify
rvm:
- 2.0.0
- 2.1
- 2.2
- 2.3.3
- 2.4.0
matrix:
allow_failures:
- rvm: 2.4.0
notifications:
disabled: true
git:
submodules: false
depth: 3
cache: bundler

CONTRIBUTING.md

@@ -10,15 +10,15 @@ We try only to add new extensions once they have some usage on GitHub. In most c
To add support for a new extension:
0. Add your extension to the language entry in [`languages.yml`][languages], keeping the extensions in alphabetical order.
0. Add at least one sample for your extension to the [samples directory][samples] in the correct subdirectory.
0. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
1. Add your extension to the language entry in [`languages.yml`][languages], keeping the extensions in alphabetical order.
1. Add at least one sample for your extension to the [samples directory][samples] in the correct subdirectory.
1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
In addition, if this extension is already listed in [`languages.yml`][languages] then sometimes a few more steps will need to be taken:
0. Make sure that example `.yourextension` files are present in the [samples directory][samples] for each language that uses `.yourextension`.
0. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.yourextension` files. (ping **@arfon** or **@bkeepers** to help with this) to ensure we're not misclassifying files.
0. If the Bayesian classifier does a bad job with the sample `.yourextension` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
1. Make sure that example `.yourextension` files are present in the [samples directory][samples] for each language that uses `.yourextension`.
1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.yourextension` files. (ping **@bkeepers** to help with this) to ensure we're not misclassifying files.
1. If the Bayesian classifier does a bad job with the sample `.yourextension` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
## Adding a language
@@ -27,17 +27,17 @@ We try only to add languages once they have some usage on GitHub. In most cases
To add support for a new language:
0. Add an entry for your language to [`languages.yml`][languages]. Omit the `language_id` field for now.
0. Add a grammar for your language: `script/add-grammar https://github.com/JaneSmith/MyGrammar`. Please only add grammars that have [one of these licenses][licenses].
0. Add samples for your language to the [samples directory][samples] in the correct subdirectory.
0. Add a `language_id` for your language using `script/set-language-ids`. **You should only ever need to run `script/set-language-ids --update`. Anything other than this risks breaking GitHub search :cry:**
0. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
1. Add an entry for your language to [`languages.yml`][languages]. Omit the `language_id` field for now.
1. Add a grammar for your language: `script/add-grammar https://github.com/JaneSmith/MyGrammar`. Please only add grammars that have [one of these licenses][licenses].
1. Add samples for your language to the [samples directory][samples] in the correct subdirectory.
1. Add a `language_id` for your language using `script/set-language-ids`. **You should only ever need to run `script/set-language-ids --update`. Anything other than this risks breaking GitHub search :cry:**
1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
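For step one, a hypothetical `languages.yml` entry might look like the sketch below. The field names follow entries elsewhere in this diff; the language name, color, extension, and scope are placeholders, and `language_id` is omitted because `script/set-language-ids` generates it later.

```yaml
FooLang:                 # hypothetical language name
  type: programming
  color: "#123456"       # placeholder color
  extensions:
  - ".foo"
  tm_scope: source.foo   # scope provided by the grammar added in step two
  ace_mode: text
```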
In addition, if your new language defines an extension that's already listed in [`languages.yml`][languages] (such as `.foo`) then sometimes a few more steps will need to be taken:
0. Make sure that example `.foo` files are present in the [samples directory][samples] for each language that uses `.foo`.
0. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.foo` files. (ping **@arfon** or **@bkeepers** to help with this) to ensure we're not misclassifying files.
0. If the Bayesian classifier does a bad job with the sample `.foo` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
1. Make sure that example `.foo` files are present in the [samples directory][samples] for each language that uses `.foo`.
1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.foo` files to ensure we're not misclassifying files (ping **@bkeepers** for help with this).
1. If the Bayesian classifier does a bad job with the sample `.foo` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
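A heuristic is essentially an ordered set of regex checks over a file's contents. A minimal standalone sketch, with hypothetical `.foo` patterns and language names that are not taken from the real `heuristics.rb` rules:

```ruby
# Minimal sketch of a content heuristic: pick a language for an
# ambiguous ".foo" extension by matching telltale patterns in order.
# Patterns and language names here are hypothetical.
def guess_foo_language(data)
  if /^module\s+\w+/.match(data)
    "FooLang"
  elsif /^#include\s/.match(data)
    "BarLang"
  end
end
```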
Remember, the goal here is to try and avoid false positives!
@@ -80,13 +80,14 @@ Here's our current build status: [![Build Status](https://api.travis-ci.org/gith
Linguist is maintained with :heart: by:
- **@Alhadis**
- **@arfon**
- **@brandonblack** (GitHub staff)
- **@BenEddy** (GitHub staff)
- **@Caged** (GitHub staff)
- **@grantr** (GitHub staff)
- **@larsbrinkhoff**
- **@lildude** (GitHub staff)
- **@lizzhale** (GitHub staff)
- **@mikemcquaid** (GitHub staff)
- **@pchaigno**
- **@rafer** (GitHub staff)
- **@shreyasjoshis** (GitHub staff)
As Linguist is a production dependency for GitHub we have a couple of workflow restrictions:
@@ -97,21 +98,21 @@ As Linguist is a production dependency for GitHub we have a couple of workflow r
If you are the current maintainer of this gem:
0. Create a branch for the release: `git checkout -b cut-release-vxx.xx.xx`
0. Make sure your local dependencies are up to date: `script/bootstrap`
0. If grammar submodules have not been updated recently, update them: `git submodule update --remote && git commit -a`
0. Ensure that samples are updated: `bundle exec rake samples`
0. Ensure that tests are green: `bundle exec rake test`
0. Bump gem version in `lib/linguist/version.rb`, [like this](https://github.com/github/linguist/commit/8d2ea90a5ba3b2fe6e1508b7155aa4632eea2985).
0. Make a PR to github/linguist, [like this](https://github.com/github/linguist/pull/1238).
0. Build a local gem: `bundle exec rake build_gem`
0. Test the gem:
0. Bump the Gemfile and Gemfile.lock versions for an app which relies on this gem
0. Install the new gem locally
0. Test behavior locally, branch deploy, whatever needs to happen
0. Merge github/linguist PR
0. Tag and push: `git tag vx.xx.xx; git push --tags`
0. Push to rubygems.org -- `gem push github-linguist-3.0.0.gem`
1. Create a branch for the release: `git checkout -b cut-release-vxx.xx.xx`
1. Make sure your local dependencies are up to date: `script/bootstrap`
1. If grammar submodules have not been updated recently, update them: `git submodule update --remote && git commit -a`
1. Ensure that samples are updated: `bundle exec rake samples`
1. Ensure that tests are green: `bundle exec rake test`
1. Bump gem version in `lib/linguist/version.rb`, [like this](https://github.com/github/linguist/commit/8d2ea90a5ba3b2fe6e1508b7155aa4632eea2985).
1. Make a PR to github/linguist, [like this](https://github.com/github/linguist/pull/1238).
1. Build a local gem: `bundle exec rake build_gem`
1. Test the gem:
1. Bump the Gemfile and Gemfile.lock versions for an app which relies on this gem
1. Install the new gem locally
1. Test behavior locally, branch deploy, whatever needs to happen
1. Merge github/linguist PR
1. Tag and push: `git tag vx.xx.xx; git push --tags`
1. Push to rubygems.org -- `gem push github-linguist-3.0.0.gem`
[grammars]: /grammars.yml
[languages]: /lib/linguist/languages.yml

View File

@@ -1,4 +1,4 @@
Copyright (c) 2011-2016 GitHub, Inc.
Copyright (c) 2017 GitHub, Inc.
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation

View File

@@ -15,10 +15,10 @@ See [Troubleshooting](#troubleshooting) and [`CONTRIBUTING.md`](/CONTRIBUTING.md
The Language stats bar displays language percentages for the files in the repository. The percentages are calculated based on the bytes of code for each language as reported by the [List Languages](https://developer.github.com/v3/repos/#list-languages) API. If the bar is reporting a language that you don't expect:
0. Click on the name of the language in the stats bar to see a list of the files that are identified as that language.
0. If you see files that you didn't write, consider moving the files into one of the [paths for vendored code](/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
0. If the files are being misclassified, search for [open issues][issues] to see if anyone else has already reported the issue. Any information you can add, especially links to public repositories, is helpful.
0. If there are no reported issues of this misclassification, [open an issue][new-issue] and include a link to the repository or a sample of the code that is being misclassified.
1. Click on the name of the language in the stats bar to see a list of the files that are identified as that language.
1. If you see files that you didn't write, consider moving the files into one of the [paths for vendored code](/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
1. If the files are being misclassified, search for [open issues][issues] to see if anyone else has already reported the issue. Any information you can add, especially links to public repositories, is helpful.
1. If there are no reported issues of this misclassification, [open an issue][new-issue] and include a link to the repository or a sample of the code that is being misclassified.
### There's a problem with the syntax highlighting of a file
@@ -32,13 +32,15 @@ Linguist supports a number of different custom overrides strategies for language
### Using gitattributes
Add a `.gitattributes` file to your project and use standard git-style path matchers for the files you want to override to set `linguist-documentation`, `linguist-language`, and `linguist-vendored`. `.gitattributes` will be used to determine language statistics, but will not be used to syntax highlight files. To manually set syntax highlighting, use [Vim or Emacs modelines](#using-emacs-or-vim-modelines).
Add a `.gitattributes` file to your project and use standard git-style path matchers for the files you want to override to set `linguist-documentation`, `linguist-language`, `linguist-vendored`, and `linguist-generated`. `.gitattributes` will be used to determine language statistics and to syntax-highlight files. You can also manually set syntax highlighting using [Vim or Emacs modelines](#using-emacs-or-vim-modelines).
```
$ cat .gitattributes
*.rb linguist-language=Java
```
#### Vendored code
Checking code you didn't write, such as JavaScript libraries, into your git repo is a common practice, but this often inflates your project's language stats and may even cause your project to be labeled as another language. By default, Linguist treats all of the paths defined in [lib/linguist/vendor.yml](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml) as vendored and therefore doesn't include them in the language statistics for a repository.
Use the `linguist-vendored` attribute to vendor or un-vendor paths.
@@ -49,6 +51,8 @@ special-vendored-path/* linguist-vendored
jquery.js linguist-vendored=false
```
#### Documentation
Just like vendored files, Linguist excludes documentation files from your project's language stats. [lib/linguist/documentation.yml](lib/linguist/documentation.yml) lists common documentation paths and excludes them from the language statistics for your repository.
Use the `linguist-documentation` attribute to mark or unmark paths as documentation.
@@ -59,19 +63,18 @@ project-docs/* linguist-documentation
docs/formatter.rb linguist-documentation=false
```
#### Generated file detection
#### Generated code
Not all plain text files are true source files. Generated files like minified JavaScript and compiled CoffeeScript can be detected and excluded from language stats. As an added bonus, unlike vendored and documentation files, these files are suppressed in diffs.
```ruby
Linguist::FileBlob.new("underscore.min.js").generated? # => true
```
See [Linguist::Generated#generated?](https://github.com/github/linguist/blob/master/lib/linguist/generated.rb).
```
$ cat .gitattributes
Api.elm linguist-generated=true
```
### Using Emacs or Vim modelines
Alternatively, you can use Vim or Emacs style modelines to set the language for a single file. Modelines can be placed anywhere within a file and are respected when determining how to syntax-highlight a file on GitHub.com
If you do not want to use `.gitattributes` to override the syntax highlighting used on GitHub.com, you can use Vim or Emacs style modelines to set the language for a single file. Modelines can be placed anywhere within a file and are respected when determining how to syntax-highlight a file on GitHub.com.
##### Vim
```

View File

@@ -4,6 +4,7 @@ require 'rake/testtask'
require 'yaml'
require 'yajl'
require 'open-uri'
require 'json'
task :default => :test

View File

@@ -1,5 +1,7 @@
#!/usr/bin/env ruby
$LOAD_PATH[0, 0] = File.join(File.dirname(__FILE__), '..', 'lib')
require 'linguist'
require 'rugged'
require 'optparse'
@@ -102,10 +104,16 @@ def git_linguist(args)
commit = nil
parser = OptionParser.new do |opts|
opts.banner = "Usage: git-linguist [OPTIONS] stats|breakdown|dump-cache|clear|disable"
opts.banner = <<-HELP
Linguist v#{Linguist::VERSION}
Detect language type and determine language breakdown for a given Git repository.
Usage:
git-linguist [OPTIONS] stats|breakdown|dump-cache|clear|disable
HELP
opts.on("-f", "--force", "Force a full rescan") { incremental = false }
opts.on("--commit=COMMIT", "Commit to index") { |v| commit = v}
opts.on("-c", "--commit=COMMIT", "Commit to index") { |v| commit = v}
end
parser.parse!(args)

View File

@@ -1,29 +1,37 @@
#!/usr/bin/env ruby
# linguist — detect language type for a file, or, given a directory, determine language breakdown
# usage: linguist <path> [<--breakdown>]
#
$LOAD_PATH[0, 0] = File.join(File.dirname(__FILE__), '..', 'lib')
require 'linguist'
require 'rugged'
require 'json'
require 'optparse'
path = ARGV[0] || Dir.pwd
# special case if not given a directory but still given the --breakdown option
# special case if not given a directory
# but still given the --breakdown or --json options
if path == "--breakdown"
path = Dir.pwd
breakdown = true
elsif path == "--json"
path = Dir.pwd
json_breakdown = true
end
ARGV.shift
breakdown = true if ARGV[0] == "--breakdown"
json_breakdown = true if ARGV[0] == "--json"
if File.directory?(path)
rugged = Rugged::Repository.new(path)
repo = Linguist::Repository.new(rugged, rugged.head.target_id)
repo.languages.sort_by { |_, size| size }.reverse.each do |language, size|
percentage = ((size / repo.size.to_f) * 100)
percentage = sprintf '%.2f' % percentage
puts "%-7s %s" % ["#{percentage}%", language]
if !json_breakdown
repo.languages.sort_by { |_, size| size }.reverse.each do |language, size|
percentage = ((size / repo.size.to_f) * 100)
percentage = sprintf '%.2f' % percentage
puts "%-7s %s" % ["#{percentage}%", language]
end
end
if breakdown
puts
@@ -35,6 +43,8 @@ if File.directory?(path)
end
puts
end
elsif json_breakdown
puts JSON.dump(repo.breakdown_by_file)
end
elsif File.file?(path)
blob = Linguist::FileBlob.new(path, Dir.pwd)
@@ -63,5 +73,12 @@ elsif File.file?(path)
puts " appears to be a vendored file"
end
else
abort "usage: linguist <path>"
abort <<-HELP
Linguist v#{Linguist::VERSION}
Detect language type for a file, or, given a repository, determine language breakdown.
Usage: linguist <path>
linguist <path> [--breakdown] [--json]
linguist [--breakdown] [--json]
HELP
end

View File

@@ -16,7 +16,7 @@ Gem::Specification.new do |s|
s.add_dependency 'charlock_holmes', '~> 0.7.3'
s.add_dependency 'escape_utils', '~> 1.1.0'
s.add_dependency 'mime-types', '>= 1.19'
s.add_dependency 'rugged', '>= 0.23.0b'
s.add_dependency 'rugged', '>= 0.25.1'
s.add_development_dependency 'minitest', '>= 5.0'
s.add_development_dependency 'mocha'
@@ -26,5 +26,5 @@ Gem::Specification.new do |s|
s.add_development_dependency 'yajl-ruby'
s.add_development_dependency 'color-proximity', '~> 0.2.1'
s.add_development_dependency 'licensed'
s.add_development_dependency 'licensee', '>= 8.6.0'
s.add_development_dependency 'licensee', '~> 8.8.0'
end

View File

@@ -1,9 +1,9 @@
---
http://svn.edgewall.org/repos/genshi/contrib/textmate/Genshi.tmbundle/Syntaxes/Markup%20Template%20%28XML%29.tmLanguage:
- text.xml.genshi
https://bitbucket.org/Clams/sublimesystemverilog/get/default.tar.gz:
- source.systemverilog
- source.ucfconstraints
https://svn.edgewall.org/repos/genshi/contrib/textmate/Genshi.tmbundle/Syntaxes/Markup%20Template%20%28XML%29.tmLanguage:
- text.xml.genshi
vendor/grammars/ABNF.tmbundle:
- source.abnf
vendor/grammars/Agda.tmbundle:
@@ -56,6 +56,8 @@ vendor/grammars/MQL5-sublime:
vendor/grammars/MagicPython:
- source.python
- source.regexp.python
- text.python.console
- text.python.traceback
vendor/grammars/Modelica:
- source.modelica
vendor/grammars/NSIS:
@@ -98,6 +100,8 @@ vendor/grammars/Sublime-Modula-2:
- source.modula2
vendor/grammars/Sublime-Nit:
- source.nit
vendor/grammars/Sublime-Pep8/:
- source.pep8
vendor/grammars/Sublime-QML:
- source.qml
vendor/grammars/Sublime-REBOL:
@@ -113,7 +117,9 @@ vendor/grammars/SublimeBrainfuck:
- source.bf
vendor/grammars/SublimeClarion:
- source.clarion
vendor/grammars/SublimeGDB:
vendor/grammars/SublimeEthereum:
- source.solidity
vendor/grammars/SublimeGDB/:
- source.disasm
- source.gdb
- source.gdb.session
@@ -128,6 +134,8 @@ vendor/grammars/TLA:
- source.tla
vendor/grammars/TXL:
- source.txl
vendor/grammars/Terraform.tmLanguage:
- source.terraform
vendor/grammars/Textmate-Gosu-Bundle:
- source.gosu.2
vendor/grammars/UrWeb-Language-Definition:
@@ -178,8 +186,18 @@ vendor/grammars/atom-language-1c-bsl:
- source.sdbl
vendor/grammars/atom-language-clean:
- source.clean
- text.restructuredtext.clean
vendor/grammars/atom-language-p4:
- source.p4
vendor/grammars/atom-language-perl6:
- source.meta-info
- source.perl6fe
- source.quoting.perl6fe
- source.regexp.perl6fe
vendor/grammars/atom-language-purescript:
- source.purescript
vendor/grammars/atom-language-rust:
- source.rust
vendor/grammars/atom-language-srt:
- text.srt
vendor/grammars/atom-language-stan:
@@ -211,7 +229,6 @@ vendor/grammars/capnproto.tmbundle:
vendor/grammars/carto-atom:
- source.css.mss
vendor/grammars/ceylon-sublimetext:
- module.ceylon
- source.ceylon
vendor/grammars/chapel-tmbundle:
- source.chapel
@@ -225,8 +242,6 @@ vendor/grammars/cpp-qt.tmbundle:
- source.qmake
vendor/grammars/creole:
- text.html.creole
vendor/grammars/css.tmbundle:
- source.css
vendor/grammars/cucumber-tmbundle:
- source.ruby.rspec.cucumber.steps
- text.gherkin.feature
@@ -234,11 +249,9 @@ vendor/grammars/cython:
- source.cython
vendor/grammars/d.tmbundle:
- source.d
vendor/grammars/dart-sublime-bundle:
vendor/grammars/dartlang:
- source.dart
- source.pubspec
- text.dart-analysis-output
- text.dart-doccomments
- source.yaml-ext
vendor/grammars/desktop.tmbundle:
- source.desktop
vendor/grammars/diff.tmbundle:
@@ -345,6 +358,8 @@ vendor/grammars/language-click:
- source.click
vendor/grammars/language-clojure:
- source.clojure
vendor/grammars/language-closure-templates:
- text.html.soy
vendor/grammars/language-coffee-script:
- source.coffee
- source.litcoffee
@@ -360,6 +375,8 @@ vendor/grammars/language-csound:
- source.csound
- source.csound-document
- source.csound-score
vendor/grammars/language-css:
- source.css
vendor/grammars/language-emacs-lisp:
- source.emacs.lisp
vendor/grammars/language-fontforge:
@@ -384,14 +401,19 @@ vendor/grammars/language-haskell:
- source.haskell
- source.hsc2hs
- text.tex.latex.haskell
vendor/grammars/language-hy:
- source.hy
vendor/grammars/language-inform7:
- source.inform7
vendor/grammars/language-javascript:
- source.js
- source.js.regexp
- source.js.regexp.replacement
- source.jsdoc
vendor/grammars/language-jison:
- source.jison
- source.jisonlex
- source.jisonlex-injection
vendor/grammars/language-jolie:
- source.jolie
vendor/grammars/language-jsoniq:
- source.jq
- source.xq
@@ -399,20 +421,24 @@ vendor/grammars/language-less:
- source.css.less
vendor/grammars/language-maxscript:
- source.maxscript
vendor/grammars/language-meson:
- source.meson
vendor/grammars/language-ncl:
- source.ncl
vendor/grammars/language-ninja:
- source.ninja
vendor/grammars/language-povray:
- source.pov-ray sdl
vendor/grammars/language-python:
- text.python.console
- text.python.traceback
vendor/grammars/language-regexp:
- source.regexp
- source.regexp.extended
vendor/grammars/language-renpy:
- source.renpy
vendor/grammars/language-restructuredtext:
- text.restructuredtext
vendor/grammars/language-roff:
- source.ditroff
- source.ditroff.desc
- source.ideal
- source.pic
- text.roff
@@ -436,6 +462,8 @@ vendor/grammars/language-wavefront:
- source.wavefront.obj
vendor/grammars/language-xbase:
- source.harbour
vendor/grammars/language-xcompose:
- config.xcompose
vendor/grammars/language-yaml:
- source.yaml
vendor/grammars/language-yang:
@@ -465,6 +493,8 @@ vendor/grammars/make.tmbundle:
- source.makefile
vendor/grammars/mako-tmbundle:
- text.html.mako
vendor/grammars/marko-tmbundle:
- text.marko
vendor/grammars/mathematica-tmbundle:
- source.mathematica
vendor/grammars/matlab.tmbundle:
@@ -502,6 +532,8 @@ vendor/grammars/ooc.tmbundle:
- source.ooc
vendor/grammars/opa.tmbundle:
- source.opa
vendor/grammars/openscad.tmbundle:
- source.scad
vendor/grammars/oz-tmbundle/Syntaxes/Oz.tmLanguage:
- source.oz
vendor/grammars/parrot:
@@ -513,10 +545,6 @@ vendor/grammars/pawn-sublime-language:
vendor/grammars/perl.tmbundle:
- source.perl
- source.perl.6
vendor/grammars/perl6fe:
- source.meta-info
- source.perl6fe
- source.regexp.perl6fe
vendor/grammars/php-smarty.tmbundle:
- text.html.smarty
vendor/grammars/php.tmbundle:
@@ -541,6 +569,8 @@ vendor/grammars/r.tmbundle:
- text.tex.latex.rd
vendor/grammars/rascal-syntax-highlighting:
- source.rascal
vendor/grammars/reason:
- source.reason
vendor/grammars/ruby-slim.tmbundle:
- text.slim
vendor/grammars/ruby.tmbundle:
@@ -560,6 +590,9 @@ vendor/grammars/scilab.tmbundle:
- source.scilab
vendor/grammars/secondlife-lsl:
- source.lsl
vendor/grammars/shaders-tmLanguage:
- source.hlsl
- source.shaderlab
vendor/grammars/smali-sublime:
- source.smali
vendor/grammars/smalltalk-tmbundle:
@@ -608,8 +641,8 @@ vendor/grammars/sublime-rexx:
- source.rexx
vendor/grammars/sublime-robot-plugin:
- text.robot
vendor/grammars/sublime-rust:
- source.rust
vendor/grammars/sublime-shen:
- source.shen
vendor/grammars/sublime-spintools:
- source.regexp.spin
- source.spin

View File

@@ -15,9 +15,9 @@ class << Linguist
# see Linguist::LazyBlob and Linguist::FileBlob for examples
#
# Returns Language or nil.
def detect(blob)
def detect(blob, allow_empty: false)
# Bail early if the blob is binary or empty.
return nil if blob.likely_binary? || blob.binary? || blob.empty?
return nil if blob.likely_binary? || blob.binary? || (!allow_empty && blob.empty?)
Linguist.instrument("linguist.detection", :blob => blob) do
# Call each strategy until one candidate is returned.
@@ -74,7 +74,7 @@ class << Linguist
# end
# end
#
# Linguist.instrumenter = CustomInstrumenter
# Linguist.instrumenter = CustomInstrumenter.new
#
# The instrumenter must conform to the `ActiveSupport::Notifications`
# interface, which defines `#instrument` and accepts:

View File

@@ -95,7 +95,7 @@ module Linguist
# Returns sorted Array of result pairs. Each pair contains the
# String language name and a Float score.
def classify(tokens, languages)
return [] if tokens.nil?
return [] if tokens.nil? || languages.empty?
tokens = Tokenizer.tokenize(tokens) if tokens.is_a?(String)
scores = {}

View File

@@ -9,11 +9,12 @@
## Documentation directories ##
- ^docs?/
- ^[Dd]ocs?/
- (^|/)[Dd]ocumentation/
- (^|/)javadoc/
- ^man/
- (^|/)[Jj]avadoc/
- ^[Mm]an/
- ^[Ee]xamples/
- ^[Dd]emos?/
## Documentation files ##
@@ -27,4 +28,4 @@
- (^|/)[Rr]eadme(\.|$)
# Samples folders
- ^[Ss]amples/
- ^[Ss]amples?/

View File

@@ -3,7 +3,7 @@ module Linguist
# Public: Is the blob a generated file?
#
# name - String filename
# data - String blob data. A block also maybe passed in for lazy
# data - String blob data. A block also may be passed in for lazy
# loading. This behavior is deprecated and you should always
# pass in a String.
#
@@ -57,7 +57,7 @@ module Linguist
composer_lock? ||
node_modules? ||
go_vendor? ||
npm_shrinkwrap? ||
npm_shrinkwrap_or_package_lock? ||
godeps? ||
generated_by_zephir? ||
minified_files? ||
@@ -70,6 +70,7 @@ module Linguist
compiled_cython_file? ||
generated_go? ||
generated_protocol_buffer? ||
generated_javascript_protocol_buffer? ||
generated_apache_thrift? ||
generated_jni_header? ||
vcr_cassette? ||
@@ -77,7 +78,10 @@ module Linguist
generated_unity3d_meta? ||
generated_racc? ||
generated_jflex? ||
generated_grammarkit?
generated_grammarkit? ||
generated_roxygen2? ||
generated_jison? ||
generated_yarn_lock?
end
# Internal: Is the blob an Xcode file?
@@ -275,6 +279,17 @@ module Linguist
return lines[0].include?("Generated by the protocol buffer compiler. DO NOT EDIT!")
end
# Internal: Is the blob a JavaScript source file generated by the
# Protocol Buffer compiler?
#
# Returns true or false.
def generated_javascript_protocol_buffer?
return false unless extname == ".js"
return false unless lines.count > 6
return lines[5].include?("GENERATED CODE -- DO NOT EDIT!")
end
APACHE_THRIFT_EXTENSIONS = ['.rb', '.py', '.go', '.js', '.m', '.java', '.h', '.cc', '.cpp', '.php']
# Internal: Is the blob generated by Apache Thrift compiler?
@@ -311,11 +326,11 @@ module Linguist
!!name.match(/vendor\/((?!-)[-0-9A-Za-z]+(?<!-)\.)+(com|edu|gov|in|me|net|org|fm|io)/)
end
# Internal: Is the blob a generated npm shrinkwrap file.
# Internal: Is the blob a generated npm shrinkwrap or package lock file?
#
# Returns true or false.
def npm_shrinkwrap?
!!name.match(/npm-shrinkwrap\.json/)
def npm_shrinkwrap_or_package_lock?
name.match(/npm-shrinkwrap\.json/) || name.match(/package-lock\.json/)
end
# Internal: Is the blob part of Godeps/,
@@ -333,7 +348,7 @@ module Linguist
!!name.match(/composer\.lock/)
end
# Internal: Is the blob a generated by Zephir
# Internal: Is the blob generated by Zephir?
#
# Returns true or false.
def generated_by_zephir?
@@ -433,5 +448,46 @@ module Linguist
return false unless lines.count > 1
return lines[0].start_with?("// This is a generated file. Not intended for manual editing.")
end
# Internal: Is this a roxygen2-generated file?
#
# A roxygen2-generated file typically contains:
# % Generated by roxygen2: do not edit by hand
# on the first line.
#
# Returns true or false.
def generated_roxygen2?
return false unless extname == '.Rd'
return false unless lines.count > 1
return lines[0].include?("% Generated by roxygen2: do not edit by hand")
end
# Internal: Is this a Jison-generated file?
#
# Jison-generated parsers typically contain:
# /* parser generated by jison
# on the first line.
#
# Jison-generated lexers typically contain:
# /* generated by jison-lex
# on the first line.
#
# Returns true or false.
def generated_jison?
return false unless extname == '.js'
return false unless lines.count > 1
return lines[0].start_with?("/* parser generated by jison ") ||
lines[0].start_with?("/* generated by jison-lex ")
end
# Internal: Is the blob a generated yarn lockfile?
#
# Returns true or false.
def generated_yarn_lock?
return false unless name.match(/yarn\.lock/)
return false unless lines.count > 0
return lines[0].include?("# THIS IS AN AUTOGENERATED FILE")
end
end
end
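The new yarn-lock predicate can be exercised standalone. A sketch with `name` and `lines` passed explicitly instead of coming from the Blob helpers; the logic mirrors `generated_yarn_lock?` above:

```ruby
# Standalone version of generated_yarn_lock? above: a generated yarn
# lockfile is named yarn.lock and starts with an autogenerated marker.
def generated_yarn_lock?(name, lines)
  return false unless name =~ /yarn\.lock/
  return false if lines.empty?
  lines[0].include?("# THIS IS AN AUTOGENERATED FILE")
end
```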

View File

@@ -125,11 +125,18 @@ module Linguist
end
disambiguate ".d" do |data|
if /^module /.match(data)
# see http://dlang.org/spec/grammar
# ModuleDeclaration | ImportDeclaration | FuncDeclaration | unittest
if /^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;|\w+\s+\w+\s*\(.*\)(?:\(.*\))?\s*{[^}]*}|unittest\s*(?:\(.*\))?\s*{[^}]*}/.match(data)
Language["D"]
elsif /^((dtrace:::)?BEGIN|provider |#pragma (D (option|attributes)|ident)\s)/.match(data)
# see http://dtrace.org/guide/chp-prog.html, http://dtrace.org/guide/chp-profile.html, http://dtrace.org/guide/chp-opt.html
elsif /^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+|(tick|profile)-\w+\s+{[^}]*}|#pragma\s+D\s+(option|attributes|depends_on)\s|#pragma\s+ident\s)/.match(data)
Language["DTrace"]
elsif /(\/.*:( .* \\)$| : \\$|^ : |: \\$)/.match(data)
# path/target : dependency \
# target : \
# : dependency
# path/file.ext1 : some/path/../file.ext2
elsif /([\/\\].*:\s+.*\s\\$|: \\$|^ : |^[\w\s\/\\.]+\w+\.\w+\s*:\s+[\w\s\/\\.]+\w+\.\w+)/.match(data)
Language["Makefile"]
end
end
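The updated `.d` patterns can be tried in isolation. A quick sketch using an abridged subset of the alternations from the regexes above (`disambiguate` itself needs the full Heuristics machinery):

```ruby
# Abridged regexes from the updated ".d" disambiguation, tried in order:
# D module/import declarations first, then common DTrace constructs.
D_RX      = /^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;/
DTRACE_RX = /^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+)/

def guess_dot_d(data)
  if D_RX.match(data)
    "D"
  elsif DTRACE_RX.match(data)
    "DTrace"
  end
end
```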
@@ -158,7 +165,7 @@ module Linguist
elsif data.include?("flowop")
Language["Filebench WML"]
elsif fortran_rx.match(data)
Language["FORTRAN"]
Language["Fortran"]
end
end
@@ -166,7 +173,7 @@ module Linguist
if /^: /.match(data)
Language["Forth"]
elsif fortran_rx.match(data)
Language["FORTRAN"]
Language["Fortran"]
end
end
@@ -219,7 +226,7 @@ module Linguist
elsif /^(%[%{}]xs|<.*>)/.match(data)
Language["Lex"]
elsif /^\.[a-z][a-z](\s|$)/i.match(data)
Language["Groff"]
Language["Roff"]
elsif /^\((de|class|rel|code|data|must)\s/.match(data)
Language["PicoLisp"]
end
@@ -260,10 +267,12 @@ module Linguist
end
disambiguate ".md" do |data|
if /(^[-a-z0-9=#!\*\[|])|<\//i.match(data) || data.empty?
if /(^[-a-z0-9=#!\*\[|>])|<\//i.match(data) || data.empty?
Language["Markdown"]
elsif /^(;;|\(define_)/.match(data)
Language["GCC machine description"]
Language["GCC Machine Description"]
else
Language["Markdown"]
end
end
@@ -278,7 +287,7 @@ module Linguist
disambiguate ".mod" do |data|
if data.include?('<!ENTITY ')
Language["XML"]
elsif /MODULE\s\w+\s*;/i.match(data) || /^\s*END \w+;$/i.match(data)
elsif /^\s*MODULE [\w\.]+;/i.match(data) || /^\s*END [\w\.]+;/i.match(data)
Language["Modula-2"]
else
[Language["Linux Kernel Module"], Language["AMPL"]]
@@ -287,9 +296,9 @@ module Linguist
disambiguate ".ms" do |data|
if /^[.'][a-z][a-z](\s|$)/i.match(data)
Language["Groff"]
Language["Roff"]
elsif /(?<!\S)\.(include|globa?l)\s/.match(data) || /(?<!\/\*)(\A|\n)\s*\.[A-Za-z]/.match(data.gsub(/"([^\\"]|\\.)*"|'([^\\']|\\.)*'|\\\s*(?:--.*)?\n/, ""))
Language["GAS"]
Language["Unix Assembly"]
else
Language["MAXScript"]
end
@@ -297,7 +306,7 @@ module Linguist
disambiguate ".n" do |data|
if /^[.']/.match(data)
Language["Groff"]
Language["Roff"]
elsif /^(module|namespace|using)\s/.match(data)
Language["Nemerle"]
end
@@ -326,7 +335,7 @@ module Linguist
end
disambiguate ".pl" do |data|
if /^[^#]+:-/.match(data)
if /^[^#]*:-/.match(data)
Language["Prolog"]
elsif /use strict|use\s+v?5\./.match(data)
Language["Perl"]
@@ -335,16 +344,16 @@ module Linguist
end
end
disambiguate ".pm", ".t" do |data|
if /use strict|use\s+v?5\./.match(data)
Language["Perl"]
elsif /^(use v6|(my )?class|module)/.match(data)
disambiguate ".pm" do |data|
if /^\s*(?:use\s+v6\s*;|(?:\bmy\s+)?class|module)\b/.match(data)
Language["Perl6"]
elsif /\buse\s+(?:strict\b|v?5\.)/.match(data)
Language["Perl"]
end
end
disambiguate ".pod" do |data|
if /^=\w+$/.match(data)
if /^=\w+\b/.match(data)
Language["Pod"]
else
Language["Perl"]
@@ -383,7 +392,7 @@ module Linguist
if /^\.!|^\.end lit(?:eral)?\b/i.match(data)
Language["RUNOFF"]
elsif /^\.\\" /.match(data)
Language["Groff"]
Language["Roff"]
end
end
@@ -434,10 +443,12 @@ module Linguist
end
disambiguate ".t" do |data|
if /^\s*%|^\s*var\s+\w+\s*:\s*\w+/.match(data)
if /^\s*%[ \t]+|^\s*var\s+\w+\s*:=\s*\w+/.match(data)
Language["Turing"]
elsif /^\s*use\s+v6\s*;/.match(data)
elsif /^\s*(?:use\s+v6\s*;|\bmodule\b|\b(?:my\s+)?class\b)/.match(data)
Language["Perl6"]
elsif /\buse\s+(?:strict\b|v?5\.)/.match(data)
Language["Perl"]
end
end
@@ -465,5 +476,13 @@ module Linguist
Language["Scilab"]
end
end
disambiguate ".tsx" do |data|
if /^\s*(import.+(from\s+|require\()['"]react|\/\/\/\s*<reference\s)/.match(data)
Language["TypeScript"]
elsif /^\s*<\?xml\s+version/i.match(data)
Language["XML"]
end
end
end
end

View File

@@ -215,7 +215,14 @@ module Linguist
# Returns the Language or nil if none was found.
def self.[](name)
return nil if name.to_s.empty?
name && (@index[name.downcase] || @index[name.split(',').first.downcase])
lang = @index[name.downcase]
return lang if lang
name = name.split(',').first
return nil if name.to_s.empty?
@index[name.downcase]
end
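The new two-step lookup can be mimicked with a plain hash. A sketch in which `INDEX` stands in for Linguist's real alias index (hypothetical contents):

```ruby
# Sketch of Language.[] above: try the exact (downcased) name first,
# then fall back to the portion before the first comma.
INDEX = { "c" => "C", "c++" => "C++" }  # hypothetical index contents

def lookup(name)
  return nil if name.to_s.empty?
  lang = INDEX[name.downcase]
  return lang if lang

  name = name.split(',').first
  return nil if name.to_s.empty?
  INDEX[name.downcase]
end
```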
# Public: A List of popular languages
@@ -265,7 +272,7 @@ module Linguist
@color = attributes[:color]
# Set aliases
@aliases = [default_alias_name] + (attributes[:aliases] || [])
@aliases = [default_alias] + (attributes[:aliases] || [])
# Load the TextMate scope name or try to guess one
@tm_scope = attributes[:tm_scope] || begin
@@ -283,9 +290,6 @@ module Linguist
@codemirror_mime_type = attributes[:codemirror_mime_type]
@wrap = attributes[:wrap] || false
# Set legacy search term
@search_term = attributes[:search_term] || default_alias_name
# Set the language_id
@language_id = attributes[:language_id]
@@ -437,12 +441,13 @@ module Linguist
EscapeUtils.escape_url(name).gsub('+', '%20')
end
# Internal: Get default alias name
# Public: Get default alias name
#
# Returns the alias name String
def default_alias_name
def default_alias
name.downcase.gsub(/\s/, '-')
end
alias_method :default_alias_name, :default_alias
# Public: Get Language group
#
@@ -557,7 +562,6 @@ module Linguist
:wrap => options['wrap'],
:group_name => options['group'],
:searchable => options.fetch('searchable', true),
:search_term => options['search_term'],
:language_id => options['language_id'],
:extensions => Array(options['extensions']),
:interpreters => options['interpreters'].sort,

View File

@@ -455,7 +455,6 @@ C:
- ".cats"
- ".h"
- ".idc"
- ".w"
interpreters:
- tcc
ace_mode: c_cpp
@@ -499,6 +498,7 @@ C++:
- ".inc"
- ".inl"
- ".ipp"
- ".re"
- ".tcc"
- ".tpp"
language_id: 43
@@ -588,6 +588,13 @@ CSV:
extensions:
- ".csv"
language_id: 51
CWeb:
type: programming
extensions:
- ".w"
tm_scope: none
ace_mode: text
language_id: 657332628
Cap'n Proto:
type: programming
tm_scope: source.capnp
@@ -686,6 +693,18 @@ Clojure:
filenames:
- riemann.config
language_id: 62
Closure Templates:
type: markup
group: HTML
ace_mode: soy_template
codemirror_mode: soy
codemirror_mime_type: text/x-soy
aliases:
- soy
extensions:
- ".soy"
tm_scope: text.html.soy
language_id: 357046146
CoffeeScript:
type: programming
tm_scope: source.coffee
@@ -1120,6 +1139,7 @@ Emacs Lisp:
- ".gnus"
- ".spacemacs"
- ".viper"
- Cask
- Project.ede
- _emacs
- abbrev_defs
@@ -1154,6 +1174,7 @@ Erlang:
- ".xrl"
- ".yrl"
filenames:
- Emakefile
- rebar.config
- rebar.config.lock
- rebar.lock
@@ -1356,6 +1377,8 @@ GLSL:
- ".glslv"
- ".gshader"
- ".shader"
- ".tesc"
- ".tese"
- ".vert"
- ".vrx"
- ".vsh"
@@ -1384,6 +1407,14 @@ Game Maker Language:
codemirror_mode: clike
codemirror_mime_type: text/x-c++src
language_id: 125
Genie:
type: programming
ace_mode: text
extensions:
- ".gs"
color: "#fb855d"
tm_scope: none
language_id: 792408528
Genshi:
type: programming
extensions:
@@ -1577,17 +1608,18 @@ HCL:
ace_mode: ruby
codemirror_mode: ruby
codemirror_mime_type: text/x-ruby
tm_scope: source.ruby
tm_scope: source.terraform
language_id: 144
HLSL:
type: programming
extensions:
- ".hlsl"
- ".cginc"
- ".fx"
- ".fxh"
- ".hlsli"
ace_mode: text
-tm_scope: none
+tm_scope: source.hlsl
language_id: 145
HTML:
type: markup
@@ -1595,7 +1627,7 @@ HTML:
ace_mode: html
codemirror_mode: htmlmixed
codemirror_mime_type: text/html
-color: "#e44b23"
+color: "#e34c26"
aliases:
- xhtml
extensions:
@@ -1754,7 +1786,7 @@ Hy:
- ".hy"
aliases:
- hylang
-tm_scope: source.hy
+tm_scope: none
language_id: 159
HyPhy:
type: programming
@@ -2012,6 +2044,33 @@ JavaScript:
interpreters:
- node
language_id: 183
Jison:
type: programming
group: Yacc
extensions:
- ".jison"
tm_scope: source.jison
ace_mode: text
language_id: 284531423
Jison Lex:
type: programming
group: Lex
extensions:
- ".jisonlex"
tm_scope: source.jisonlex
ace_mode: text
language_id: 406395330
Jolie:
type: programming
extensions:
- ".ol"
- ".iol"
interpreters:
- jolie
color: "#843179"
ace_mode: text
tm_scope: source.jolie
language_id: 998078858
Julia:
type: programming
extensions:
@@ -2281,6 +2340,8 @@ LookML:
color: "#652B81"
extensions:
- ".lookml"
- ".model.lkml"
- ".view.lkml"
tm_scope: source.yaml
language_id: 211
LoomScript:
@@ -2427,6 +2488,8 @@ Mako:
language_id: 221
Markdown:
type: prose
aliases:
- pandoc
ace_mode: markdown
codemirror_mode: gfm
codemirror_mime_type: text/x-gfm
@@ -2434,12 +2497,27 @@ Markdown:
extensions:
- ".md"
- ".markdown"
- ".mdown"
- ".mdwn"
- ".mkd"
- ".mkdn"
- ".mkdown"
- ".ron"
- ".workbook"
tm_scope: source.gfm
language_id: 222
Marko:
group: HTML
type: markup
tm_scope: text.marko
extensions:
- ".marko"
aliases:
- markojs
ace_mode: text
codemirror_mode: htmlmixed
codemirror_mime_type: text/html
language_id: 932782397
Mask:
type: markup
color: "#f97732"
@@ -2524,6 +2602,15 @@ Mercury:
- ".moo"
tm_scope: source.mercury
language_id: 229
Meson:
type: programming
color: "#007800"
filenames:
- meson.build
- meson_options.txt
tm_scope: source.meson
ace_mode: text
language_id: 799141244
Metal:
type: programming
color: "#8f14e9"
@@ -2892,7 +2979,7 @@ OpenSCAD:
type: programming
extensions:
- ".scad"
-tm_scope: none
+tm_scope: source.scad
ace_mode: scad
language_id: 266
OpenType Feature File:
@@ -2939,6 +3026,14 @@ Oz:
codemirror_mode: oz
codemirror_mime_type: text/x-oz
language_id: 270
P4:
type: programming
color: "#7055b5"
extensions:
- ".p4"
tm_scope: source.p4
ace_mode: text
language_id: 348895984
PAWN:
type: programming
color: "#dbb284"
@@ -2984,12 +3079,21 @@ PLSQL:
color: "#dad8d8"
extensions:
- ".pls"
- ".bdy"
- ".ddl"
- ".fnc"
- ".pck"
- ".pkb"
- ".pks"
- ".plb"
- ".plsql"
- ".prc"
- ".spc"
- ".sql"
- ".tpb"
- ".tps"
- ".trg"
- ".vw"
language_id: 273
PLpgSQL:
type: programming
@@ -3075,6 +3179,14 @@ Pascal:
codemirror_mode: pascal
codemirror_mime_type: text/x-pascal
language_id: 281
Pep8:
type: programming
color: "#C76F5B"
extensions:
- ".pep"
ace_mode: text
tm_scope: source.pep8
language_id: 840372442
Perl:
type: programming
tm_scope: source.perl
@@ -3216,6 +3328,7 @@ PowerBuilder:
language_id: 292
PowerShell:
type: programming
color: "#012456"
ace_mode: powershell
codemirror_mode: powershell
codemirror_mime_type: application/x-powershell
@@ -3343,6 +3456,7 @@ Python:
- ".lmi"
- ".py3"
- ".pyde"
- ".pyi"
- ".pyp"
- ".pyt"
- ".pyw"
@@ -3358,6 +3472,7 @@ Python:
- SConscript
- SConstruct
- Snakefile
- WORKSPACE
- wscript
interpreters:
- python
@@ -3548,6 +3663,19 @@ Raw token data:
tm_scope: none
ace_mode: text
language_id: 318
Reason:
type: programming
group: OCaml
ace_mode: rust
codemirror_mode: rust
codemirror_mime_type: text/x-rustsrc
extensions:
- ".re"
- ".rei"
interpreters:
- ocaml
tm_scope: source.reason
language_id: 869538413
Rebol:
type: programming
color: "#358a5b"
@@ -3578,6 +3706,17 @@ Redcode:
tm_scope: none
ace_mode: text
language_id: 321
Regular Expression:
type: data
extensions:
- ".regexp"
- ".regex"
aliases:
- regexp
- regex
ace_mode: text
tm_scope: source.regexp
language_id: 363378884
Ren'Py:
type: programming
aliases:
@@ -3628,6 +3767,7 @@ Roff:
- ".me"
- ".ms"
- ".n"
- ".nr"
- ".rno"
- ".roff"
- ".tmac"
@@ -3666,10 +3806,10 @@ Ruby:
extensions:
- ".rb"
- ".builder"
- ".eye"
- ".fcgi"
- ".gemspec"
- ".god"
- ".irbrc"
- ".jbuilder"
- ".mspec"
- ".pluginspec"
@@ -3691,6 +3831,7 @@ Ruby:
- jruby
- rbx
filenames:
- ".irbrc"
- ".pryrc"
- Appraisals
- Berksfile
@@ -3706,6 +3847,7 @@ Ruby:
- Mavenfile
- Podfile
- Puppetfile
- Rakefile
- Snapfile
- Thorfile
- Vagrantfile
@@ -3790,6 +3932,7 @@ SQL:
- ".cql"
- ".ddl"
- ".inc"
- ".mysql"
- ".prc"
- ".tab"
- ".udf"
@@ -3923,6 +4066,13 @@ Self:
tm_scope: none
ace_mode: text
language_id: 345
ShaderLab:
type: programming
extensions:
- ".shader"
ace_mode: text
tm_scope: source.shaderlab
language_id: 664257356
Shell:
type: programming
color: "#89e051"
@@ -3976,7 +4126,7 @@ Shen:
color: "#120F14"
extensions:
- ".shen"
-tm_scope: none
+tm_scope: source.shen
ace_mode: text
language_id: 348
Slash:
@@ -4274,12 +4424,16 @@ Text:
- ".no"
filenames:
- COPYING
- COPYRIGHT.regex
- FONTLOG
- INSTALL
- INSTALL.mysql
- LICENSE
- LICENSE.mysql
- NEWS
- README.1ST
- README.me
- README.mysql
- click.me
- delete.me
- keep.me
@@ -4566,6 +4720,15 @@ XC:
codemirror_mode: clike
codemirror_mime_type: text/x-csrc
language_id: 398
XCompose:
type: data
filenames:
- ".XCompose"
- XCompose
- xcompose
tm_scope: config.xcompose
ace_mode: text
language_id: 225167241
XML:
type: data
ace_mode: xml
@@ -4577,6 +4740,8 @@ XML:
- wsdl
extensions:
- ".xml"
- ".adml"
- ".admx"
- ".ant"
- ".axml"
- ".builds"
@@ -4604,6 +4769,7 @@ XML:
- ".kml"
- ".launch"
- ".mdpolicy"
- ".mjml"
- ".mm"
- ".mod"
- ".mxml"
@@ -4642,8 +4808,11 @@ XML:
- ".ux"
- ".vbproj"
- ".vcxproj"
- ".vsixmanifest"
- ".vssettings"
- ".vstemplate"
- ".vxml"
- ".wixproj"
- ".wsdl"
- ".wsf"
- ".wxi"
@@ -4759,6 +4928,7 @@ YAML:
- ".syntax"
- ".yaml"
- ".yaml-tmlanguage"
- ".yml.mysql"
filenames:
- ".clang-format"
ace_mode: yaml

View File

@@ -238,6 +238,12 @@
# BuddyBuild
- BuddyBuildSDK.framework/
# Realm
- Realm.framework
# RealmSwift
- RealmSwift.framework
# git config files
- gitattributes$
- gitignore$

View File

@@ -1,3 +1,3 @@
module Linguist
-VERSION = "5.0.0"
+VERSION = "5.0.11"
end

View File

@@ -0,0 +1,46 @@
#include <iostream>
#define YYCTYPE unsigned char
#define YYCURSOR cursor
#define YYLIMIT cursor
#define YYMARKER marker
#define YYFILL(n)
bool scan(const char *text)
{
YYCTYPE *start = (YYCTYPE *)text;
YYCTYPE *cursor = (YYCTYPE *)text;
YYCTYPE *marker = (YYCTYPE *)text;
next:
YYCTYPE *token = cursor;
/*!re2c
'(This file must be converted with BinHex 4.0)'
{
if (token == start || *(token - 1) == '\n')
return true; else goto next;
}
[\001-\377]
{ goto next; }
[\000]
{ return false; }
*/
return false;
}
#define do_scan(str, expect) \
res = scan(str) == expect ? 0 : 1; \
std::cerr << str << "\t-\t" << (res ? "fail" : "ok") << std::endl; \
result += res
/*!max:re2c */
int main(int,void**)
{
int res, result = 0;
do_scan("(This file must be converted with BinHex 4.0)", 1);
do_scan("x(This file must be converted with BinHex 4.0)", 0);
do_scan("(This file must be converted with BinHex 4.0)x", 1);
do_scan("x(This file must be converted with BinHex 4.0)x", 0);
return result;
}

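For readers unfamiliar with re2c: the generated scanner above answers a single question, namely whether the BinHex banner appears at the start of a line. A minimal Python equivalent (function name is ours) is:

```python
MAGIC = "(This file must be converted with BinHex 4.0)"

def has_binhex_banner(text: str) -> bool:
    # True when the banner sits at the very start of the input or directly
    # after a newline -- the `token == start || *(token - 1) == '\n'`
    # check in the scanner above.
    idx = text.find(MAGIC)
    while idx != -1:
        if idx == 0 or text[idx - 1] == "\n":
            return True
        idx = text.find(MAGIC, idx + 1)
    return False
```

The four `do_scan` cases in `main` correspond exactly to assertions one would write against this function.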
samples/C++/cnokw.re (new file, 239 lines)

@@ -0,0 +1,239 @@
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#define ADDEQ 257
#define ANDAND 258
#define ANDEQ 259
#define ARRAY 260
#define ASM 261
#define AUTO 262
#define BREAK 263
#define CASE 264
#define CHAR 265
#define CONST 266
#define CONTINUE 267
#define DECR 268
#define DEFAULT 269
#define DEREF 270
#define DIVEQ 271
#define DO 272
#define DOUBLE 273
#define ELLIPSIS 274
#define ELSE 275
#define ENUM 276
#define EQL 277
#define EXTERN 278
#define FCON 279
#define FLOAT 280
#define FOR 281
#define FUNCTION 282
#define GEQ 283
#define GOTO 284
#define ICON 285
#define ID 286
#define IF 287
#define INCR 288
#define INT 289
#define LEQ 290
#define LONG 291
#define LSHIFT 292
#define LSHIFTEQ 293
#define MODEQ 294
#define MULEQ 295
#define NEQ 296
#define OREQ 297
#define OROR 298
#define POINTER 299
#define REGISTER 300
#define RETURN 301
#define RSHIFT 302
#define RSHIFTEQ 303
#define SCON 304
#define SHORT 305
#define SIGNED 306
#define SIZEOF 307
#define STATIC 308
#define STRUCT 309
#define SUBEQ 310
#define SWITCH 311
#define TYPEDEF 312
#define UNION 313
#define UNSIGNED 314
#define VOID 315
#define VOLATILE 316
#define WHILE 317
#define XOREQ 318
#define EOI 319
typedef unsigned int uint;
typedef unsigned char uchar;
#define BSIZE 8192
#define YYCTYPE uchar
#define YYCURSOR cursor
#define YYLIMIT s->lim
#define YYMARKER s->ptr
#define YYFILL(n) {cursor = fill(s, cursor);}
#define RET(i) {s->cur = cursor; return i;}
typedef struct Scanner {
int fd;
uchar *bot, *tok, *ptr, *cur, *pos, *lim, *top, *eof;
uint line;
} Scanner;
uchar *fill(Scanner *s, uchar *cursor){
if(!s->eof){
uint cnt = s->tok - s->bot;
if(cnt){
memcpy(s->bot, s->tok, s->lim - s->tok);
s->tok = s->bot;
s->ptr -= cnt;
cursor -= cnt;
s->pos -= cnt;
s->lim -= cnt;
}
if((s->top - s->lim) < BSIZE){
uchar *buf = (uchar*) malloc(((s->lim - s->bot) + BSIZE)*sizeof(uchar));
memcpy(buf, s->tok, s->lim - s->tok);
s->tok = buf;
s->ptr = &buf[s->ptr - s->bot];
cursor = &buf[cursor - s->bot];
s->pos = &buf[s->pos - s->bot];
s->lim = &buf[s->lim - s->bot];
s->top = &s->lim[BSIZE];
free(s->bot);
s->bot = buf;
}
if((cnt = read(s->fd, (char*) s->lim, BSIZE)) != BSIZE){
s->eof = &s->lim[cnt]; *(s->eof)++ = '\n';
}
s->lim += cnt;
}
return cursor;
}
int scan(Scanner *s){
uchar *cursor = s->cur;
std:
s->tok = cursor;
/*!re2c
any = [\000-\377];
O = [0-7];
D = [0-9];
L = [a-zA-Z_];
H = [a-fA-F0-9];
E = [Ee] [+-]? D+;
FS = [fFlL];
IS = [uUlL]*;
ESC = [\\] ([abfnrtv?'"\\] | "x" H+ | O+);
*/
/*!re2c
"/*" { goto comment; }
L (L|D)* { RET(ID); }
("0" [xX] H+ IS?) | ("0" D+ IS?) | (D+ IS?) |
(['] (ESC|any\[\n\\'])* ['])
{ RET(ICON); }
(D+ E FS?) | (D* "." D+ E? FS?) | (D+ "." D* E? FS?)
{ RET(FCON); }
(["] (ESC|any\[\n\\"])* ["])
{ RET(SCON); }
"..." { RET(ELLIPSIS); }
">>=" { RET(RSHIFTEQ); }
"<<=" { RET(LSHIFTEQ); }
"+=" { RET(ADDEQ); }
"-=" { RET(SUBEQ); }
"*=" { RET(MULEQ); }
"/=" { RET(DIVEQ); }
"%=" { RET(MODEQ); }
"&=" { RET(ANDEQ); }
"^=" { RET(XOREQ); }
"|=" { RET(OREQ); }
">>" { RET(RSHIFT); }
"<<" { RET(LSHIFT); }
"++" { RET(INCR); }
"--" { RET(DECR); }
"->" { RET(DEREF); }
"&&" { RET(ANDAND); }
"||" { RET(OROR); }
"<=" { RET(LEQ); }
">=" { RET(GEQ); }
"==" { RET(EQL); }
"!=" { RET(NEQ); }
";" { RET(';'); }
"{" { RET('{'); }
"}" { RET('}'); }
"," { RET(','); }
":" { RET(':'); }
"=" { RET('='); }
"(" { RET('('); }
")" { RET(')'); }
"[" { RET('['); }
"]" { RET(']'); }
"." { RET('.'); }
"&" { RET('&'); }
"!" { RET('!'); }
"~" { RET('~'); }
"-" { RET('-'); }
"+" { RET('+'); }
"*" { RET('*'); }
"/" { RET('/'); }
"%" { RET('%'); }
"<" { RET('<'); }
">" { RET('>'); }
"^" { RET('^'); }
"|" { RET('|'); }
"?" { RET('?'); }
[ \t\v\f]+ { goto std; }
"\n"
{
if(cursor == s->eof) RET(EOI);
s->pos = cursor; s->line++;
goto std;
}
any
{
printf("unexpected character: %c\n", *s->tok);
goto std;
}
*/
comment:
/*!re2c
"*/" { goto std; }
"\n"
{
if(cursor == s->eof) RET(EOI);
s->tok = s->pos = cursor; s->line++;
goto comment;
}
any { goto comment; }
*/
}
main(){
Scanner in;
int t;
memset((char*) &in, 0, sizeof(in));
in.fd = 0;
while((t = scan(&in)) != EOI){
/*
printf("%d\t%.*s\n", t, in.cur - in.tok, in.tok);
printf("%d\n", t);
*/
}
close(in.fd);
}

samples/C++/cvsignore.re (new file, 63 lines)

@@ -0,0 +1,63 @@
#define YYFILL(n) if (cursor >= limit) break;
#define YYCTYPE char
#define YYCURSOR cursor
#define YYLIMIT limit
#define YYMARKER marker
/*!re2c
any = (.|"\n");
value = (":" (.\"$")+)?;
cvsdat = "Date";
cvsid = "Id";
cvslog = "Log";
cvsrev = "Revision";
cvssrc = "Source";
*/
#define APPEND(text) \
append(output, outsize, text, sizeof(text) - sizeof(YYCTYPE))
inline void append(YYCTYPE *output, size_t & outsize, const YYCTYPE * text, size_t len)
{
memcpy(output + outsize, text, len);
outsize += (len / sizeof(YYCTYPE));
}
void scan(YYCTYPE *pText, size_t *pSize, int *pbChanged)
{
// rule
// scan lines
// find $ in lines
// compact $<keyword>: .. $ to $<keyword>$
YYCTYPE *output;
const YYCTYPE *cursor, *limit, *marker;
cursor = marker = output = *pText;
size_t insize = *pSize;
size_t outsize = 0;
limit = cursor + insize;
while(1) {
loop:
/*!re2c
"$" cvsdat value "$" { APPEND(L"$" L"Date$"); goto loop; }
"$" cvsid value "$" { APPEND(L"$" L"Id$"); goto loop; }
"$" cvslog value "$" { APPEND(L"$" L"Log$"); goto loop; }
"$" cvsrev value "$" { APPEND(L"$" L"Revision$"); goto loop; }
"$" cvssrc value "$" { APPEND(L"$" L"Source$"); goto loop; }
any { output[outsize++] = cursor[-1]; if (cursor >= limit) break; goto loop; }
*/
}
output[outsize] = '\0';
// set the new size
*pSize = outsize;
*pbChanged = (insize == outsize) ? 0 : 1;
}

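The cvsignore.re scanner above collapses expanded CVS keywords back to their bare form. A rough Python equivalent (function and pattern names are ours) using the same keyword list:

```python
import re

# Keyword list matches the cvsdat/cvsid/cvslog/cvsrev/cvssrc definitions
# in the scanner above.
KEYWORDS = ("Date", "Id", "Log", "Revision", "Source")
_EXPANDED = re.compile(r"\$(%s)(:[^$\n]+)?\$" % "|".join(KEYWORDS))

def compact_keywords(text: str) -> str:
    # Rewrite "$Id: ... $" (and friends) to the unexpanded "$Id$" form,
    # leaving unrelated dollar signs untouched.
    return _EXPANDED.sub(lambda m: "$%s$" % m.group(1), text)
```

Unlike the re2c version, this sketch does not report whether anything changed; comparing input and output lengths would recover the `*pbChanged` flag.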
samples/C++/simple.re (new file, 13 lines)

@@ -0,0 +1,13 @@
#define NULL ((char*) 0)
char *scan(char *p){
char *q;
#define YYCTYPE char
#define YYCURSOR p
#define YYLIMIT p
#define YYMARKER q
#define YYFILL(n)
/*!re2c
[0-9]+ {return YYCURSOR;}
[\000-\377] {return NULL;}
*/
}

samples/CWeb/sat-life.w (new file, 404 lines)

@@ -0,0 +1,404 @@
\datethis
@*Intro. This program generates clauses for the transition relation
from time $t$ to time $t+1$ in Conway's Game of Life, assuming that
all of the potentially live cells at time $t$ belong to a pattern
that's specified in |stdin|. The pattern is defined by one or more
lines representing rows of cells, where each line has `\..' in a
cell that's guaranteed to be dead at time~$t$, otherwise it has `\.*'.
The time is specified separately as a command-line parameter.
The Boolean variable for cell $(x,y)$ at time $t$ is named by its
so-called ``xty code,'' namely by the decimal value of~$x$, followed
by a code letter for~$t$, followed by the decimal value of~$y$. For
example, if $x=10$ and $y=11$ and $t=0$, the variable that indicates
liveness of the cell is \.{10a11}; and the corresponding variable
for $t=1$ is \.{10b11}.
Up to 19 auxiliary variables are used together with each xty code,
in order to construct clauses that define the successor state.
The names of these variables are obtained by appending one of
the following two-character combinations to the xty code:
\.{A2}, \.{A3}, \.{A4},
\.{B1}, \.{B2}, \.{B3}, \.{B4},
\.{C1}, \.{C2}, \.{C3}, \.{C4},
\.{D1}, \.{D2},
\.{E1}, \.{E2},
\.{F1}, \.{F2},
\.{G1}, \.{G2}.
These variables are derived from the Bailleux--Boufkhad method
of encoding cardinality constraints:
The auxiliary variable \.{A$k$} stands for the condition
``at least $k$ of the eight neighbors are alive.'' Similarly,
\.{B$k$} stands for ``at least $k$ of the first four neighbors
are alive,'' and \.{C$k$} accounts for the other four neighbors.
Codes \.D, \.E, \.F, and~\.G refer to pairs of neighbors.
Thus, for instance, \.{10a11C2} means that at least two of the
last four neighbors of cell $(10,11)$ are alive.
Those auxiliary variables receive values by means of up to 77 clauses per cell.
For example, if $u$ and~$v$ are the neighbors of cell~$z$ that correspond
to a pairing of type~\.D, there are six clauses
$$\bar u d_1,\quad
\bar v d_1,\quad
\bar u\bar v d_2,\quad
u v\bar d_1,\quad
u\bar d_2,\quad
v\bar d_2.$$
The sixteen clauses
$$\displaylines{\hfill
\bar d_1b_1,\quad
\bar e_1b_1,\quad
\bar d_2b_2,\quad
\bar d_1\bar e_1b_2,\quad
\bar e_2b_2,\quad
\bar d_2\bar e_1b_3,\quad
\bar d_1\bar e_2b_3,\quad
\bar d_2\bar e_2b_4,
\hfill\cr\hfill
d_1e_1\bar b_1,\quad
d_1e_2\bar b_2,\quad
d_2e_1\bar b_2,\quad
d_1\bar b_3,\quad
d_2e_2\bar b_3,\quad
e_1\bar b_3,\quad
d_2\bar b_4,\quad
e_2\bar b_4
\hfill}$$
define $b$ variables from $d$'s and $e$'s; and another sixteen
define $c$'s from $f$'s and $g$'s in the same fashion.
A similar set of 21 clauses will define the $a$'s from the $b$'s and $c$'s.
Once the $a$'s are defined, thus essentially counting the
live neighbors of cell $z$, the next
state~$z'$ is defined by five further clauses
$$\bar a_4\bar z',\quad
a_2\bar z',\quad
a_3z\bar z',\quad
\bar a_3a_4z',\quad
\bar a_2a_4\bar zz'.$$
For example, the last of these states that $z'$ will be true
(i.e., that cell $z$ will be alive at time $t+1$) if
$z$ is alive at time~$t$ and has $\ge2$ live neighbors
but not $\ge4$.
Nearby cells can share auxiliary variables, according to a tricky scheme that
is worked out below. In consequence, the actual number of auxiliary variables
and clauses per cell is reduced from 19 and $77+5$ to 13 and $57+5$,
respectively, except at the boundaries.
@ So here's the overall outline of the program.
@d maxx 50 /* maximum number of lines in the pattern supplied by |stdin| */
@d maxy 50 /* maximum number of columns per line in |stdin| */
@c
#include <stdio.h>
#include <stdlib.h>
char p[maxx+2][maxy+2]; /* is cell $(x,y)$ potentially alive? */
char have_b[maxx+2][maxy+2]; /* did we already generate $b(x,y)$? */
char have_d[maxx+2][maxy+2]; /* did we already generate $d(x,y)$? */
char have_e[maxx+2][maxy+4]; /* did we already generate $e(x,y)$? */
char have_f[maxx+4][maxy+2]; /* did we already generate $f(x-2,y)$? */
int tt; /* time as given on the command line */
int xmax,ymax; /* the number of rows and columns in the input pattern */
int xmin=maxx,ymin=maxy; /* limits in the other direction */
char timecode[]="abcdefghijklmnopqrstuvwxyz"@|
"ABCDEFGHIJKLMNOPQRSTUVWXYZ"@|
"!\"#$%&'()*+,-./:;<=>?@@[\\]^_`{|}~"; /* codes for $0\le t\le83$ */
@q$@>
char buf[maxy+2]; /* input buffer */
unsigned int clause[4]; /* clauses are assembled here */
int clauseptr; /* this many literals are in the current clause */
@<Subroutines@>@;
main(int argc,char*argv[]) {
register int j,k,x,y;
@<Process the command line@>;
@<Input the pattern@>;
for (x=xmin-1;x<=xmax+1;x++) for (y=ymin-1;y<=ymax+1;y++) {
@<If cell $(x,y)$ is obviously dead at time $t+1$, |continue|@>;
a(x,y);
zprime(x,y);
}
}
@ @<Process the command line@>=
if (argc!=2 || sscanf(argv[1],"%d",&tt)!=1) {
fprintf(stderr,"Usage: %s t\n",argv[0]);
exit(-1);
}
if (tt<0 || tt>82) {
fprintf(stderr,"The time should be between 0 and 82 (not %d)!\n",tt);
exit(-2);
}
@ @<Input the pattern@>=
for (x=1;;x++) {
if (!fgets(buf,maxy+2,stdin)) break;
if (x>maxx) {
fprintf(stderr,"Sorry, the pattern should have at most %d rows!\n",maxx);
exit(-3);
}
for (y=1;buf[y-1]!='\n';y++) {
if (y>maxy) {
fprintf(stderr,"Sorry, the pattern should have at most %d columns!\n",
maxy);
exit(-4);
}
if (buf[y-1]=='*') {
p[x][y]=1;
if (y>ymax) ymax=y;
if (y<ymin) ymin=y;
if (x>xmax) xmax=x;
if (x<xmin) xmin=x;
}@+else if (buf[y-1]!='.') {
fprintf(stderr,"Unexpected character `%c' found in the pattern!\n",
buf[y-1]);
exit(-5);
}
}
}
@ @d pp(xx,yy) ((xx)>=0 && (yy)>=0? p[xx][yy]: 0)
@<If cell $(x,y)$ is obviously dead at time $t+1$, |continue|@>=
if (pp(x-1,y-1)+pp(x-1,y)+pp(x-1,y+1)+
pp(x,y-1)+p[x][y]+p[x][y+1]+
pp(x+1,y-1)+p[x+1][y]+p[x+1][y+1]<3) continue;
@ Clauses are assembled in the |clause| array (surprise), where we
put encoded literals.
The code for a literal is an unsigned 32-bit quantity, where the leading
bit is 1 if the literal should be complemented. The next three bits
specify the type of the literal (0 thru 7 for plain and \.A--\.G);
the next three bits specify an integer~$k$; and the next bit is zero.
That leaves room for two 12-bit fields, which specify $x$ and $y$.
Type 0 literals have $k=0$ for the ordinary xty code. However, the
value $k=1$ indicates that the time code should be for $t+1$ instead of~$t$.
And $k=2$ denotes a special ``tautology'' literal, which is always true.
If the tautology literal is complemented, we omit it from the clause;
otherwise we omit the entire clause.
Finally, $k=7$ denotes an auxiliary literal, used to avoid
clauses of length~4.
Here's a subroutine that outputs the current clause and resets
the |clause| array.
@d taut (2<<25)
@d sign (1U<<31)
@<Sub...@>=
void outclause(void) {
register int c,k,x,y,p;
for (p=0;p<clauseptr;p++)
if (clause[p]==taut) goto done;
for (p=0;p<clauseptr;p++) if (clause[p]!=taut+sign) {
if (clause[p]>>31) printf(" ~");@+else printf(" ");
c=(clause[p]>>28)&0x7;
k=(clause[p]>>25)&0x7;
x=(clause[p]>>12)&0xfff;
y=clause[p]&0xfff;
if (c) printf("%d%c%d%c%d",
x,timecode[tt],y,c+'@@',k);
else if (k==7) printf("%d%c%dx",
x,timecode[tt],y);
else printf("%d%c%d",
x,timecode[tt+k],y);
}
printf("\n");
done: clauseptr=0;
}
@ And here's another, which puts a type-0 literal into |clause|.
@<Sub...@>=
void applit(int x,int y,int bar,int k) {
if (k==0 && (x<xmin || x>xmax || y<ymin || y>ymax || p[x][y]==0))
clause[clauseptr++]=(bar? 0: sign)+taut;
else clause[clauseptr++]=(bar? sign:0)+(k<<25)+(x<<12)+y;
}
@ The |d| and |e| subroutines are called for only one-fourth
of all cell addresses $(x,y)$. Indeed, one can show that
$x$ is always odd, and that $y\bmod4<2$.
Therefore we remember if we've seen $(x,y)$ before.
Slight trick: If |yy| is not in range, we avoid generating the
clause $\bar d_k$ twice.
@d newlit(x,y,c,k) clause[clauseptr++]=((c)<<28)+((k)<<25)+((x)<<12)+(y)
@d newcomplit(x,y,c,k)
clause[clauseptr++]=sign+((c)<<28)+((k)<<25)+((x)<<12)+(y)
@<Sub...@>=
void d(int x,int y) {
register x1=x-1,x2=x,yy=y+1;
if (have_d[x][y]!=tt+1) {
applit(x1,yy,1,0),newlit(x,y,4,1),outclause();
applit(x2,yy,1,0),newlit(x,y,4,1),outclause();
applit(x1,yy,1,0),applit(x2,yy,1,0),newlit(x,y,4,2),outclause();
applit(x1,yy,0,0),applit(x2,yy,0,0),newcomplit(x,y,4,1),outclause();
applit(x1,yy,0,0),newcomplit(x,y,4,2),outclause();
if (yy>=ymin && yy<=ymax)
applit(x2,yy,0,0),newcomplit(x,y,4,2),outclause();
have_d[x][y]=tt+1;
}
}
@#
void e(int x,int y) {
register x1=x-1,x2=x,yy=y-1;
if (have_e[x][y]!=tt+1) {
applit(x1,yy,1,0),newlit(x,y,5,1),outclause();
applit(x2,yy,1,0),newlit(x,y,5,1),outclause();
applit(x1,yy,1,0),applit(x2,yy,1,0),newlit(x,y,5,2),outclause();
applit(x1,yy,0,0),applit(x2,yy,0,0),newcomplit(x,y,5,1),outclause();
applit(x1,yy,0,0),newcomplit(x,y,5,2),outclause();
if (yy>=ymin && yy<=ymax)
applit(x2,yy,0,0),newcomplit(x,y,5,2),outclause();
have_e[x][y]=tt+1;
}
}
@ The |f| subroutine can't be shared quite so often. But we
do save a factor of~2, because $x+y$ is always even.
@<Sub...@>=
void f(int x,int y) {
register xx=x-1,y1=y,y2=y+1;
if (have_f[x][y]!=tt+1) {
applit(xx,y1,1,0),newlit(x,y,6,1),outclause();
applit(xx,y2,1,0),newlit(x,y,6,1),outclause();
applit(xx,y1,1,0),applit(xx,y2,1,0),newlit(x,y,6,2),outclause();
applit(xx,y1,0,0),applit(xx,y2,0,0),newcomplit(x,y,6,1),outclause();
applit(xx,y1,0,0),newcomplit(x,y,6,2),outclause();
if (xx>=xmin && xx<=xmax)
applit(xx,y2,0,0),newcomplit(x,y,6,2),outclause();
have_f[x][y]=tt+1;
}
}
@ The |g| subroutine cleans up the dregs, by somewhat tediously
locating the two neighbors that weren't handled by |d|, |e|, or~|f|.
No sharing is possible here.
@<Sub...@>=
void g(int x,int y) {
register x1,x2,y1,y2;
if (x&1) x1=x-1,y1=y,x2=x+1,y2=y^1;
else x1=x+1,y1=y,x2=x-1,y2=y-1+((y&1)<<1);
applit(x1,y1,1,0),newlit(x,y,7,1),outclause();
applit(x2,y2,1,0),newlit(x,y,7,1),outclause();
applit(x1,y1,1,0),applit(x2,y2,1,0),newlit(x,y,7,2),outclause();
applit(x1,y1,0,0),applit(x2,y2,0,0),newcomplit(x,y,7,1),outclause();
applit(x1,y1,0,0),newcomplit(x,y,7,2),outclause();
applit(x2,y2,0,0),newcomplit(x,y,7,2),outclause();
}
@ Fortunately the |b| subroutine {\it can\/} be shared (since |x| is always
odd), thus saving half of the sixteen clauses generated.
@<Sub...@>=
void b(int x,int y) {
register j,k,xx=x,y1=y-(y&2),y2=y+(y&2);
if (have_b[x][y]!=tt+1) {
d(xx,y1);
e(xx,y2);
for (j=0;j<3;j++) for (k=0;k<3;k++) if (j+k) {
if (j) newcomplit(xx,y1,4,j); /* $\bar d_j$ */
if (k) newcomplit(xx,y2,5,k); /* $\bar e_k$ */
newlit(x,y,2,j+k); /* $b_{j+k}$ */
outclause();
if (j) newlit(xx,y1,4,3-j); /* $d_{3-j}$ */
if (k) newlit(xx,y2,5,3-k); /* $e_{3-k}$ */
newcomplit(x,y,2,5-j-k); /* $\bar b_{5-j-k}$ */
outclause();
}
have_b[x][y]=tt+1;
}
}
@ The (unshared) |c| subroutine handles the other four neighbors,
by working with |f| and |g| instead of |d| and~|e|.
If |y=0|, the overlap rules set |y1=-1|, which can be problematic.
I've decided to avoid this case by omitting |f| when it is
guaranteed to be zero.
@<Sub...@>=
void c(int x,int y) {
register j,k,x1,y1;
if (x&1) x1=x+2,y1=(y-1)|1;
else x1=x,y1=y&-2;
g(x,y);
if (x1-1<xmin || x1-1>xmax || y1+1<ymin || y1>ymax)
@<Set |c| equal to |g|@>@;
else {
f(x1,y1);
for (j=0;j<3;j++) for (k=0;k<3;k++) if (j+k) {
if (j) newcomplit(x1,y1,6,j); /* $\bar f_j$ */
if (k) newcomplit(x,y,7,k); /* $\bar g_k$ */
newlit(x,y,3,j+k); /* $c_{j+k}$ */
outclause();
if (j) newlit(x1,y1,6,3-j); /* $f_{3-j}$ */
if (k) newlit(x,y,7,3-k); /* $g_{3-k}$ */
newcomplit(x,y,3,5-j-k); /* $\bar c_{5-j-k}$ */
outclause();
}
}
}
@ @<Set |c| equal to |g|@>=
{
for (k=1;k<3;k++) {
newcomplit(x,y,7,k),newlit(x,y,3,k),outclause(); /* $\bar g_k\lor c_k$ */
newlit(x,y,7,k),newcomplit(x,y,3,k),outclause(); /* $g_k\lor\bar c_k$ */
}
newcomplit(x,y,3,3),outclause(); /* $\bar c_3$ */
newcomplit(x,y,3,4),outclause(); /* $\bar c_4$ */
}
@ Totals over all eight neighbors are then deduced by the |a|
subroutine.
@<Sub...@>=
void a(int x,int y) {
register j,k,xx=x|1;
b(xx,y);
c(x,y);
for (j=0;j<5;j++) for (k=0;k<5;k++) if (j+k>1 && j+k<5) {
if (j) newcomplit(xx,y,2,j); /* $\bar b_j$ */
if (k) newcomplit(x,y,3,k); /* $\bar c_k$ */
newlit(x,y,1,j+k); /* $a_{j+k}$ */
outclause();
}
for (j=0;j<5;j++) for (k=0;k<5;k++) if (j+k>2 && j+k<6 && j*k) {
if (j) newlit(xx,y,2,j); /* $b_j$ */
if (k) newlit(x,y,3,k); /* $c_k$ */
newcomplit(x,y,1,j+k-1); /* $\bar a_{j+k-1}$ */
outclause();
}
}
@ Finally, as mentioned at the beginning, $z'$ is determined
from $z$, $a_2$, $a_3$, and $a_4$.
I actually generate six clauses, not five, in order to stick to
{\mc 3SAT}.
@<Sub...@>=
void zprime(int x,int y) {
newcomplit(x,y,1,4),applit(x,y,1,1),outclause(); /* $\bar a_4\bar z'$ */
newlit(x,y,1,2),applit(x,y,1,1),outclause(); /* $a_2\bar z'$ */
newlit(x,y,1,3),applit(x,y,0,0),applit(x,y,1,1),outclause();
/* $a_3z\bar z'$ */
newcomplit(x,y,1,3),newlit(x,y,1,4),applit(x,y,0,1),outclause();
/* $\bar a_3a_4z'$ */
applit(x,y,0,7),newcomplit(x,y,1,2),newlit(x,y,1,4),outclause();
/* $x\bar a_2a_4$ */
applit(x,y,1,7),applit(x,y,1,0),applit(x,y,0,1),outclause();
/* $\bar x\bar zz'$ */
}
@*Index.

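The sixteen merge clauses displayed in the sat-life.w intro above (and emitted by its `b` subroutine) can be generated and brute-force checked with a short sketch; the variable naming scheme here is ours:

```python
from itertools import product

def merge_clauses(d, e, b):
    """Clauses defining b[k] ("at least k of four cells alive") from the
    pair counters d[1..2] and e[1..2], following the CWeb `b` subroutine.
    A literal is a (variable, polarity) pair; polarity False = negated."""
    clauses = []
    for j in range(3):
        for k in range(3):
            if j + k == 0:
                continue
            # ~d_j \/ ~e_k \/ b_{j+k}: lower bounds propagate upward
            lo = [(b[j + k], True)]
            if j: lo.append((d[j], False))
            if k: lo.append((e[k], False))
            clauses.append(lo)
            # d_{3-j} \/ e_{3-k} \/ ~b_{5-j-k}: upper bounds propagate back
            hi = [(b[5 - j - k], False)]
            if j: hi.append((d[3 - j], True))
            if k: hi.append((e[3 - k], True))
            clauses.append(hi)
    return clauses

# Brute-force check: for every state of the four cells, the intended
# counter values satisfy all sixteen clauses.
d = {1: "d1", 2: "d2"}; e = {1: "e1", 2: "e2"}
b = {k: "b%d" % k for k in range(1, 5)}
cls = merge_clauses(d, e, b)
assert len(cls) == 16
for u1, u2, v1, v2 in product([False, True], repeat=4):
    val = {"d1": u1 or u2, "d2": u1 and u2,
           "e1": v1 or v2, "e2": v1 and v2}
    n = sum([u1, u2, v1, v2])
    for k in range(1, 5):
        val["b%d" % k] = n >= k
    assert all(any(val[v] == pol for v, pol in c) for c in cls)
```

The same shape, with ranges widened, yields the clauses that define the `a` variables from `b` and `c` in the program's final counting layer.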
View File

@@ -0,0 +1,24 @@
{namespace Example}
/**
* Example
*/
{template .foo}
{@param count: string}
{@param? name: int}
{if isNonnull($name)}
<h1>{$name}</h1>
{/if}
<div class="content">
{switch count}
{case 0}
{call Empty.view}
{param count: $count /}
{/call}
{default}
<h2>Wow, so many!</h2>
{/switch}
</div>
{/template}

samples/D/aa.d (new file, 440 lines)

@@ -0,0 +1,440 @@
/**
* Implementation of associative arrays.
*
* Copyright: Martin Nowak 2015 -.
* License: $(LINK2 http://www.boost.org/LICENSE_1_0.txt, Boost License 1.0)
* Authors: Martin Nowak
*/
module core.aa;
import core.memory : GC;
private
{
// grow threshold
enum GROW_NUM = 4;
enum GROW_DEN = 5;
// shrink threshold
enum SHRINK_NUM = 1;
enum SHRINK_DEN = 8;
// grow factor
enum GROW_FAC = 4;
// growing the AA multiplies its size by GROW_FAC, so the shrink threshold
// smaller than half the grow threshold to have a hysteresis
static assert(GROW_FAC * SHRINK_NUM * GROW_DEN < GROW_NUM * SHRINK_DEN);
// initial load factor (for literals), mean of both thresholds
enum INIT_NUM = (GROW_DEN * SHRINK_NUM + GROW_NUM * SHRINK_DEN) / 2;
enum INIT_DEN = SHRINK_DEN * GROW_DEN;
// magic hash constants to distinguish empty, deleted, and filled buckets
enum HASH_EMPTY = 0;
enum HASH_DELETED = 0x1;
enum HASH_FILLED_MARK = size_t(1) << 8 * size_t.sizeof - 1;
}
enum INIT_NUM_BUCKETS = 8;
struct AA(Key, Val)
{
this(size_t sz)
{
impl = new Impl(nextpow2(sz));
}
@property bool empty() const pure nothrow @safe @nogc
{
return !length;
}
@property size_t length() const pure nothrow @safe @nogc
{
return impl is null ? 0 : impl.length;
}
void opIndexAssign(Val val, in Key key)
{
// lazily alloc implementation
if (impl is null)
impl = new Impl(INIT_NUM_BUCKETS);
// get hash and bucket for key
immutable hash = calcHash(key);
// found a value => assignment
if (auto p = impl.findSlotLookup(hash, key))
{
p.entry.val = val;
return;
}
auto p = findSlotInsert(hash);
if (p.deleted)
--deleted;
// check load factor and possibly grow
else if (++used * GROW_DEN > dim * GROW_NUM)
{
grow();
p = findSlotInsert(hash);
assert(p.empty);
}
// update search cache and allocate entry
firstUsed = min(firstUsed, cast(uint)(p - buckets.ptr));
p.hash = hash;
p.entry = new Impl.Entry(key, val); // TODO: move
return;
}
ref inout(Val) opIndex(in Key key) inout @trusted
{
auto p = opIn_r(key);
assert(p !is null);
return *p;
}
inout(Val)* opIn_r(in Key key) inout @trusted
{
if (empty)
return null;
immutable hash = calcHash(key);
if (auto p = findSlotLookup(hash, key))
return &p.entry.val;
return null;
}
bool remove(in Key key)
{
if (empty)
return false;
immutable hash = calcHash(key);
if (auto p = findSlotLookup(hash, key))
{
// clear entry
p.hash = HASH_DELETED;
p.entry = null;
++deleted;
if (length * SHRINK_DEN < dim * SHRINK_NUM)
shrink();
return true;
}
return false;
}
Val get(in Key key, lazy Val val)
{
auto p = opIn_r(key);
return p is null ? val : *p;
}
ref Val getOrSet(in Key key, lazy Val val)
{
// lazily alloc implementation
if (impl is null)
impl = new Impl(INIT_NUM_BUCKETS);
// get hash and bucket for key
immutable hash = calcHash(key);
// found a value => assignment
if (auto p = impl.findSlotLookup(hash, key))
return p.entry.val;
auto p = findSlotInsert(hash);
if (p.deleted)
--deleted;
// check load factor and possibly grow
else if (++used * GROW_DEN > dim * GROW_NUM)
{
grow();
p = findSlotInsert(hash);
assert(p.empty);
}
// update search cache and allocate entry
firstUsed = min(firstUsed, cast(uint)(p - buckets.ptr));
p.hash = hash;
p.entry = new Impl.Entry(key, val);
return p.entry.val;
}
/**
Convert the AA to the type of the builtin language AA.
*/
Val[Key] toBuiltinAA() pure nothrow
{
return cast(Val[Key]) _aaFromCoreAA(impl, rtInterface);
}
private:
private this(inout(Impl)* impl) inout
{
this.impl = impl;
}
ref Val getLValue(in Key key)
{
// lazily alloc implementation
if (impl is null)
impl = new Impl(INIT_NUM_BUCKETS);
// get hash and bucket for key
immutable hash = calcHash(key);
// found a value => assignment
if (auto p = impl.findSlotLookup(hash, key))
return p.entry.val;
auto p = findSlotInsert(hash);
if (p.deleted)
--deleted;
// check load factor and possibly grow
else if (++used * GROW_DEN > dim * GROW_NUM)
{
grow();
p = findSlotInsert(hash);
assert(p.empty);
}
// update search cache and allocate entry
firstUsed = min(firstUsed, cast(uint)(p - buckets.ptr));
p.hash = hash;
p.entry = new Impl.Entry(key); // TODO: move
return p.entry.val;
}
static struct Impl
{
this(size_t sz)
{
buckets = allocBuckets(sz);
}
@property size_t length() const pure nothrow @nogc
{
assert(used >= deleted);
return used - deleted;
}
@property size_t dim() const pure nothrow @nogc
{
return buckets.length;
}
@property size_t mask() const pure nothrow @nogc
{
return dim - 1;
}
// find the first slot to insert a value with hash
inout(Bucket)* findSlotInsert(size_t hash) inout pure nothrow @nogc
{
for (size_t i = hash & mask, j = 1;; ++j)
{
if (!buckets[i].filled)
return &buckets[i];
i = (i + j) & mask;
}
}
// lookup a key
inout(Bucket)* findSlotLookup(size_t hash, in Key key) inout
{
for (size_t i = hash & mask, j = 1;; ++j)
{
if (buckets[i].hash == hash && key == buckets[i].entry.key)
return &buckets[i];
else if (buckets[i].empty)
return null;
i = (i + j) & mask;
}
}
void grow()
{
// If there are so many deleted entries, that growing would push us
// below the shrink threshold, we just purge deleted entries instead.
if (length * SHRINK_DEN < GROW_FAC * dim * SHRINK_NUM)
resize(dim);
else
resize(GROW_FAC * dim);
}
void shrink()
{
if (dim > INIT_NUM_BUCKETS)
resize(dim / GROW_FAC);
}
void resize(size_t ndim) pure nothrow
{
auto obuckets = buckets;
buckets = allocBuckets(ndim);
foreach (ref b; obuckets)
if (b.filled)
*findSlotInsert(b.hash) = b;
firstUsed = 0;
used -= deleted;
deleted = 0;
GC.free(obuckets.ptr); // safe to free b/c impossible to reference
}
static struct Entry
{
Key key;
Val val;
}
static struct Bucket
{
size_t hash;
Entry* entry;
@property bool empty() const
{
return hash == HASH_EMPTY;
}
@property bool deleted() const
{
return hash == HASH_DELETED;
}
@property bool filled() const
{
return cast(ptrdiff_t) hash < 0;
}
}
Bucket[] allocBuckets(size_t dim) @trusted pure nothrow
{
enum attr = GC.BlkAttr.NO_INTERIOR;
immutable sz = dim * Bucket.sizeof;
return (cast(Bucket*) GC.calloc(sz, attr))[0 .. dim];
}
Bucket[] buckets;
uint used;
uint deleted;
uint firstUsed;
}
RTInterface* rtInterface()() pure nothrow @nogc
{
static size_t aaLen(in void* pimpl) pure nothrow @nogc
{
auto aa = const(AA)(cast(const(Impl)*) pimpl);
return aa.length;
}
static void* aaGetY(void** pimpl, in void* pkey)
{
auto aa = AA(cast(Impl*)*pimpl);
auto res = &aa.getLValue(*cast(const(Key*)) pkey);
*pimpl = aa.impl; // might have changed
return res;
}
static inout(void)* aaInX(inout void* pimpl, in void* pkey)
{
auto aa = inout(AA)(cast(inout(Impl)*) pimpl);
return aa.opIn_r(*cast(const(Key*)) pkey);
}
static bool aaDelX(void* pimpl, in void* pkey)
{
auto aa = AA(cast(Impl*) pimpl);
return aa.remove(*cast(const(Key*)) pkey);
}
static immutable vtbl = RTInterface(&aaLen, &aaGetY, &aaInX, &aaDelX);
return cast(RTInterface*)&vtbl;
}
static size_t calcHash(in ref Key key)
{
return hashOf(key) | HASH_FILLED_MARK;
}
Impl* impl;
alias impl this;
}
package extern (C) void* _aaFromCoreAA(void* impl, RTInterface* rtIntf) pure nothrow;
private:
struct RTInterface
{
alias AA = void*;
size_t function(in AA aa) pure nothrow @nogc len;
void* function(AA* aa, in void* pkey) getY;
inout(void)* function(inout AA aa, in void* pkey) inX;
bool function(AA aa, in void* pkey) delX;
}
unittest
{
AA!(int, int) aa;
assert(aa.length == 0);
aa[0] = 1;
assert(aa.length == 1 && aa[0] == 1);
aa[1] = 2;
assert(aa.length == 2 && aa[1] == 2);
import core.stdc.stdio;
int[int] rtaa = aa.toBuiltinAA();
assert(rtaa.length == 2);
puts("length");
assert(rtaa[0] == 1);
assert(rtaa[1] == 2);
rtaa[2] = 3;
assert(aa[2] == 3);
}
unittest
{
auto aa = AA!(int, int)(3);
aa[0] = 0;
aa[1] = 1;
aa[2] = 2;
assert(aa.length == 3);
}
//==============================================================================
// Helper functions
//------------------------------------------------------------------------------
size_t nextpow2(in size_t n) pure nothrow @nogc
{
import core.bitop : bsr;
if (n < 2)
return 1;
return size_t(1) << (bsr(n - 1) + 1);
}
pure nothrow @nogc unittest
{
// 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
foreach (const n, const pow2; [1, 1, 2, 4, 4, 8, 8, 8, 8, 16])
assert(nextpow2(n) == pow2);
}
T min(T)(T a, T b) pure nothrow @nogc
{
return a < b ? a : b;
}
T max(T)(T a, T b) pure nothrow @nogc
{
return b < a ? a : b;
}
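
The AA sample above implements an open-addressed hash table over a power-of-two bucket array, with a triangular-number probe sequence (`i = (i + j) & mask`, `j` incrementing) and tombstone-based deletion. A minimal sketch of the same scheme in Python — `ToyAA` is a hypothetical class, not the D runtime's API, and the grow/shrink logic is omitted for brevity:

```python
class ToyAA:
    """Toy open-addressing hash map mirroring the D AA sample: power-of-two
    table, triangular-number probing (i, i+1, i+3, i+6, ...), tombstones."""

    EMPTY = object()
    DELETED = object()  # tombstone, like HASH_DELETED in the D code

    def __init__(self, nbuckets=8):
        assert nbuckets & (nbuckets - 1) == 0, "bucket count must be a power of two"
        self.slots = [self.EMPTY] * nbuckets

    def _probe(self, key):
        # With a power-of-two table this step sequence eventually visits
        # every slot, just like findSlotInsert/findSlotLookup above.
        mask = len(self.slots) - 1
        i, j = hash(key) & mask, 1
        while True:
            yield i
            i = (i + j) & mask
            j += 1

    def put(self, key, val):
        for i in self._probe(key):
            s = self.slots[i]
            # Unlike the D version we never reuse tombstones, which keeps the
            # sketch correct without tracking a separate insert slot.
            if s is self.EMPTY or (s is not self.DELETED and s[0] == key):
                self.slots[i] = (key, val)
                return

    def get(self, key):
        for i in self._probe(key):
            s = self.slots[i]
            if s is self.EMPTY:
                raise KeyError(key)
            if s is not self.DELETED and s[0] == key:
                return s[1]

    def remove(self, key):
        for i in self._probe(key):
            s = self.slots[i]
            if s is self.EMPTY:
                return False
            if s is not self.DELETED and s[0] == key:
                self.slots[i] = self.DELETED
                return True
```

The real implementation additionally tracks `used`/`deleted` counts so that `grow()` can decide between purging tombstones and doubling the table.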

samples/D/arrayops.d Normal file
@@ -0,0 +1,187 @@
/**
* Benchmark for array ops.
*
* Copyright: Copyright Martin Nowak 2016 -.
* License: $(LINK2 http://www.boost.org/LICENSE_1_0.txt, Boost License 1.0)
* Authors: Martin Nowak
*/
import core.cpuid, std.algorithm, std.datetime, std.meta, std.stdio, std.string,
std.range;
float[6] getLatencies(T, string op)()
{
enum N = (64 * (1 << 6) + 64) * T.sizeof;
auto a = Array!T(N), b = Array!T(N), c = Array!T(N);
float[6] latencies = float.max;
foreach (i, ref latency; latencies)
{
auto len = 1 << i;
foreach (_; 1 .. 32)
{
a[] = 24;
b[] = 4;
c[] = 2;
auto sw = StopWatch(AutoStart.yes);
foreach (off; size_t(0) .. size_t(64))
{
off = off * len + off;
enum op = op.replace("const", "2").replace("a",
"a[off .. off + len]").replace("b",
"b[off .. off + len]").replace("c", "c[off .. off + len]");
mixin(op ~ ";");
}
latency = min(latency, sw.peek.nsecs);
}
}
float[6] res = latencies[] / 1024;
return res;
}
float[4] getThroughput(T, string op)()
{
enum N = (40 * 1024 * 1024 + 64 * T.sizeof) / T.sizeof;
auto a = Array!T(N), b = Array!T(N), c = Array!T(N);
float[4] latencies = float.max;
size_t[4] lengths = [
8 * 1024 / T.sizeof, 32 * 1024 / T.sizeof, 512 * 1024 / T.sizeof, 32 * 1024 * 1024 / T
.sizeof
];
foreach (i, ref latency; latencies)
{
auto len = lengths[i] / 64;
foreach (_; 1 .. 4)
{
a[] = 24;
b[] = 4;
c[] = 2;
auto sw = StopWatch(AutoStart.yes);
foreach (off; size_t(0) .. size_t(64))
{
off = off * len + off;
enum op = op.replace("const", "2").replace("a",
"a[off .. off + len]").replace("b",
"b[off .. off + len]").replace("c", "c[off .. off + len]");
mixin(op ~ ";");
}
immutable nsecs = sw.peek.nsecs;
runMasked({latency = min(latency, nsecs);});
}
}
float[4] throughputs = void;
runMasked({throughputs = T.sizeof * lengths[] / latencies[];});
return throughputs;
}
string[] genOps()
{
string[] ops;
foreach (op1; ["+", "-", "*", "/"])
{
ops ~= "a " ~ op1 ~ "= b";
ops ~= "a " ~ op1 ~ "= const";
foreach (op2; ["+", "-", "*", "/"])
{
ops ~= "a " ~ op1 ~ "= b " ~ op2 ~ " c";
ops ~= "a " ~ op1 ~ "= b " ~ op2 ~ " const";
}
}
return ops;
}
void runOp(string op)()
{
foreach (T; AliasSeq!(ubyte, ushort, uint, ulong, byte, short, int, long, float,
double))
writefln("%s, %s, %(%.2f, %), %(%s, %)", T.stringof, op,
getLatencies!(T, op), getThroughput!(T, op));
}
struct Array(T)
{
import core.stdc.stdlib : free, malloc;
this(size_t n)
{
ary = (cast(T*) malloc(T.sizeof * n))[0 .. n];
}
~this()
{
free(ary.ptr);
}
T[] ary;
alias ary this;
}
version (X86)
version = SSE;
else version (X86_64)
version = SSE;
else
static assert(0, "unimplemented");
version (SSE)
{
uint mxcsr()
{
uint ret = void;
asm
{
stmxcsr ret;
}
return ret;
}
void mxcsr(uint val)
{
asm
{
ldmxcsr val;
}
}
// http://softpixel.com/~cwright/programming/simd/sse.php
enum FPU_EXCEPTION_MASKS = 1 << 12 | 1 << 11 | 1 << 10 | 1 << 9 | 1 << 8 | 1 << 7;
enum FPU_EXCEPTION_FLAGS = 1 << 5 | 1 << 4 | 1 << 3 | 1 << 2 | 1 << 1 | 1 << 0;
void maskFPUExceptions()
{
mxcsr = mxcsr | FPU_EXCEPTION_MASKS;
}
void unmaskFPUExceptions()
{
mxcsr = mxcsr & ~FPU_EXCEPTION_MASKS;
}
uint FPUExceptionFlags()
{
return mxcsr & FPU_EXCEPTION_FLAGS;
}
void clearFPUExceptionFlags()
{
mxcsr = mxcsr & ~FPU_EXCEPTION_FLAGS;
}
}
void runMasked(scope void delegate() dg)
{
assert(FPUExceptionFlags == 0);
maskFPUExceptions;
dg();
clearFPUExceptionFlags;
unmaskFPUExceptions;
}
void main()
{
unmaskFPUExceptions;
writefln("type, op, %(latency%s, %), %-(throughput%s, %)", iota(6)
.map!(i => 1 << i), ["8KB", "32KB", "512KB", "32MB"]);
foreach (op; mixin("AliasSeq!(%(%s, %))".format(genOps)))
runOp!op;
maskFPUExceptions;
}
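
In the benchmark above, `genOps()` builds the operation strings (`a += b`, `a -= b * c`, …) and `mixin()` splices them into the timing loop at compile time. A rough Python analogue of the same string-built-ops idea, using `exec` in place of D's compile-time mixin (`gen_ops` and `run_op` are hypothetical helpers, not part of the sample):

```python
def gen_ops():
    """Build the same 40 op strings as genOps() above. The D code emits the
    placeholder 'const' and substitutes it later; here we write 2 directly."""
    ops = []
    for op1 in "+-*/":
        ops.append(f"a {op1}= b")
        ops.append(f"a {op1}= 2")
        for op2 in "+-*/":
            ops.append(f"a {op1}= b {op2} c")
            ops.append(f"a {op1}= b {op2} 2")
    return ops

def run_op(op, a=24.0, b=4.0, c=2.0):
    """Evaluate one op string; exec() stands in for D's mixin()."""
    env = {"a": a, "b": b, "c": c}
    exec(op, env)
    return env["a"]
```

The difference, of course, is that the D mixin is resolved at compile time, so each generated op benchmarks real array-op codegen rather than an interpreter.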

samples/D/function.d Normal file
@@ -0,0 +1,3 @@
void foo()
{
}

samples/D/hello_world.d Normal file
@@ -0,0 +1,6 @@
import std.stdio;
void main()
{
writeln("Hello World");
}

samples/D/template.d Normal file
@@ -0,0 +1,7 @@
template Fib(size_t N)
{
static if (N < 2)
enum Fib = size_t(1);
else
enum Fib = Fib!(N - 2) + Fib!(N - 1);
}

@@ -0,0 +1,3 @@
void bar(T)(T t)
{
}

samples/D/unittest1.d Normal file
@@ -0,0 +1,3 @@
unittest
{
}

samples/D/unittest2.d Normal file
@@ -0,0 +1,3 @@
unittest("optional name")
{
}

@@ -0,0 +1,9 @@
(package "composer" "0.0.7" "Interface to PHP Composer")
(source "melpa" "https://melpa.org/packages/")
(package-file "composer.el")
(depends-on "f")
(depends-on "s")
(depends-on "request")
(depends-on "seq")

@@ -0,0 +1,7 @@
{"src/*", [
report,
verbose,
{i, "include"},
{outdir, "ebin"},
debug_info
]}.

samples/GLSL/SyLens.shader Normal file
@@ -0,0 +1,161 @@
#version 120
/*
Original Lens Distortion Algorithm from SSontech (Syntheyes)
http://www.ssontech.com/content/lensalg.htm
r2 is radius squared.
r2 = image_aspect*image_aspect*u*u + v*v
f = 1 + r2*(k + kcube*sqrt(r2))
u' = f*u
v' = f*v
*/
// Controls
uniform float kCoeff, kCube, uShift, vShift;
uniform float chroma_red, chroma_green, chroma_blue;
uniform bool apply_disto;
// Uniform inputs
uniform sampler2D input1;
uniform float adsk_input1_w, adsk_input1_h, adsk_input1_aspect, adsk_input1_frameratio;
uniform float adsk_result_w, adsk_result_h;
float distortion_f(float r) {
float f = 1 + (r*r)*(kCoeff + kCube * r);
return f;
}
float inverse_f(float r)
{
// Build a lookup table on the radius, as a fixed-size table.
// We will use a vec3 since we will store the multiplied number in the Z coordinate.
// So to recap: x will be the radius, y will be the f(x) distortion, and Z will be x * y;
vec3[48] lut;
// Since our LUT is shader-global, check if it has already been computed
// Flame has no overflow bbox so we can safely max out at the image edge, plus some cushion
float max_r = sqrt((adsk_input1_frameratio * adsk_input1_frameratio) + 1) + 0.1;
float incr = max_r / 48;
float lut_r = 0;
float f;
for(int i=0; i < 48; i++) {
f = distortion_f(lut_r);
lut[i] = vec3(lut_r, f, lut_r * f);
lut_r += incr;
}
float t;
// Now find the neighbouring elements
// only iterate to 46 since we will need
// 47 as i+1
for(int i=0; i < 47; i++) {
if(lut[i].z < r && lut[i+1].z > r) {
// BAM! our value is between these two segments
// get the T interpolant and mix
t = (r - lut[i].z) / (lut[i+1].z - lut[i].z);
return mix(lut[i].y, lut[i+1].y, t );
}
}
// fall back to the last computed f if r lies beyond the LUT range
return f;
}
float aberrate(float f, float chroma)
{
return f + (f * chroma);
}
vec3 chromaticize_and_invert(float f)
{
vec3 rgb_f = vec3(aberrate(f, chroma_red), aberrate(f, chroma_green), aberrate(f, chroma_blue));
// We need to DIVIDE by F when we redistort, and x / y == x * (1 / y)
if(apply_disto) {
rgb_f = 1.0 / rgb_f;
}
return rgb_f;
}
void main(void)
{
vec2 px, uv;
float f = 1;
float r = 1;
px = gl_FragCoord.xy;
// Make sure we are still centered
px.x -= (adsk_result_w - adsk_input1_w) / 2;
px.y -= (adsk_result_h - adsk_input1_h) / 2;
// Push the destination coordinates into the [0..1] range
uv.x = px.x / adsk_input1_w;
uv.y = px.y / adsk_input1_h;
// And to Syntheyes UV which are [1..-1] on both X and Y
uv.x = (uv.x *2 ) - 1;
uv.y = (uv.y *2 ) - 1;
// Add UV shifts
uv.x += uShift;
uv.y += vShift;
// Make the X value the aspect value, so that the X coordinates go to [-aspect..aspect]
uv.x = uv.x * adsk_input1_frameratio;
// Compute the radius
r = sqrt(uv.x*uv.x + uv.y*uv.y);
// If we are redistorting, account for the oversize plate in the input, assume that
// the input aspect is the same
if(apply_disto) {
r = r / (float(adsk_input1_w) / float(adsk_result_w));
}
// Apply or remove disto, per channel honoring chromatic aberration
if(apply_disto) {
f = inverse_f(r);
} else {
f = distortion_f(r);
}
vec2[3] rgb_uvs = vec2[](uv, uv, uv);
// Compute distortions per component
vec3 rgb_f = chromaticize_and_invert(f);
// Apply the disto coefficients, per component
rgb_uvs[0] = rgb_uvs[0] * rgb_f.rr;
rgb_uvs[1] = rgb_uvs[1] * rgb_f.gg;
rgb_uvs[2] = rgb_uvs[2] * rgb_f.bb;
// Convert all the UVs back to the texture space, per color component
for(int i=0; i < 3; i++) {
uv = rgb_uvs[i];
// Back from [-aspect..aspect] to [-1..1]
uv.x = uv.x / adsk_input1_frameratio;
// Remove UV shifts
uv.x -= uShift;
uv.y -= vShift;
// Back to OGL UV
uv.x = (uv.x + 1) / 2;
uv.y = (uv.y + 1) / 2;
rgb_uvs[i] = uv;
}
// Sample the input plate, per component
vec4 sampled;
sampled.r = texture2D(input1, rgb_uvs[0]).r;
sampled.g = texture2D(input1, rgb_uvs[1]).g;
sampled.b = texture2D(input1, rgb_uvs[2]).b;
// and assign to the output
gl_FragColor.rgba = vec4(sampled.rgb, 1.0 );
}
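
The SSontech distortion model quoted in the shader's header comment (`f = 1 + r2*(k + kcube*sqrt(r2))`, `u' = f*u`) can be checked outside GLSL. A small Python sketch — the coefficient values here are made-up examples, and the inverse uses fixed-point iteration rather than the shader's lookup table:

```python
import math

def distortion_f(r, k, kcube):
    """f = 1 + r^2 * (k + kcube * r), per the Syntheyes formula above."""
    return 1.0 + (r * r) * (k + kcube * r)

def distort(u, v, aspect, k, kcube):
    """Apply the distortion to Syntheyes-style UVs in [-1, 1]."""
    r = math.sqrt(aspect * aspect * u * u + v * v)
    f = distortion_f(r, k, kcube)
    return f * u, f * v

def undistort(u, v, aspect, k, kcube, iters=20):
    """Invert the distortion by fixed-point iteration; converges for the
    modest k values typical of lens solves (the shader uses a LUT instead)."""
    ud, vd = u, v
    for _ in range(iters):
        r = math.sqrt(aspect * aspect * ud * ud + vd * vd)
        f = distortion_f(r, k, kcube)
        ud, vd = u / f, v / f
    return ud, vd
```

A round trip `undistort(*distort(u, v, ...), ...)` should recover the input to within floating-point tolerance, which is exactly what `inverse_f` approximates with its 48-entry table.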

@@ -0,0 +1,630 @@
//// High quality (Some browsers may freeze or crash)
//#define HIGHQUALITY
//// Medium quality (Should be fine on all systems, works on Intel HD2000 on Win7 but quite slow)
//#define MEDIUMQUALITY
//// Defaults
//#define REFLECTIONS
#define SHADOWS
//#define GRASS
//#define SMALL_WAVES
#define RAGGED_LEAVES
//#define DETAILED_NOISE
//#define LIGHT_AA // 2 sample SSAA
//#define HEAVY_AA // 2x2 RG SSAA
//#define TONEMAP
//// Configurations
#ifdef MEDIUMQUALITY
#define SHADOWS
#define SMALL_WAVES
#define RAGGED_LEAVES
#define TONEMAP
#endif
#ifdef HIGHQUALITY
#define REFLECTIONS
#define SHADOWS
//#define GRASS
#define SMALL_WAVES
#define RAGGED_LEAVES
#define DETAILED_NOISE
#define LIGHT_AA
#define TONEMAP
#endif
// Constants
const float eps = 1e-5;
const float PI = 3.14159265359;
const vec3 sunDir = vec3(0.79057,-0.47434, 0.0);
const vec3 skyCol = vec3(0.3, 0.5, 0.8);
const vec3 sandCol = vec3(0.9, 0.8, 0.5);
const vec3 treeCol = vec3(0.8, 0.65, 0.3);
const vec3 grassCol = vec3(0.4, 0.5, 0.18);
const vec3 leavesCol = vec3(0.3, 0.6, 0.2);
const vec3 leavesPos = vec3(-5.1,13.4, 0.0);
#ifdef TONEMAP
const vec3 sunCol = vec3(1.8, 1.7, 1.6);
#else
const vec3 sunCol = vec3(0.9, 0.85, 0.8);
#endif
const float exposure = 1.1; // Only used when tonemapping
// Description : Array and textureless GLSL 2D/3D/4D simplex
// noise functions.
// Author : Ian McEwan, Ashima Arts.
// License : Copyright (C) 2011 Ashima Arts. All rights reserved.
// Distributed under the MIT License. See LICENSE file.
// https://github.com/ashima/webgl-noise
vec3 mod289(vec3 x) {
return x - floor(x * (1.0 / 289.0)) * 289.0;
}
vec4 mod289(vec4 x) {
return x - floor(x * (1.0 / 289.0)) * 289.0;
}
vec4 permute(vec4 x) {
return mod289(((x*34.0)+1.0)*x);
}
vec4 taylorInvSqrt(vec4 r) {
return 1.79284291400159 - 0.85373472095314 * r;
}
float snoise(vec3 v) {
const vec2 C = vec2(1.0/6.0, 1.0/3.0) ;
const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
// First corner
vec3 i = floor(v + dot(v, C.yyy) );
vec3 x0 = v - i + dot(i, C.xxx) ;
// Other corners
vec3 g = step(x0.yzx, x0.xyz);
vec3 l = 1.0 - g;
vec3 i1 = min( g.xyz, l.zxy );
vec3 i2 = max( g.xyz, l.zxy );
// x0 = x0 - 0.0 + 0.0 * C.xxx;
// x1 = x0 - i1 + 1.0 * C.xxx;
// x2 = x0 - i2 + 2.0 * C.xxx;
// x3 = x0 - 1.0 + 3.0 * C.xxx;
vec3 x1 = x0 - i1 + C.xxx;
vec3 x2 = x0 - i2 + C.yyy; // 2.0*C.x = 1/3 = C.y
vec3 x3 = x0 - D.yyy; // -1.0+3.0*C.x = -0.5 = -D.y
// Permutations
i = mod289(i);
vec4 p = permute( permute( permute(
i.z + vec4(0.0, i1.z, i2.z, 1.0 ))
+ i.y + vec4(0.0, i1.y, i2.y, 1.0 ))
+ i.x + vec4(0.0, i1.x, i2.x, 1.0 ));
// Gradients: 7x7 points over a square, mapped onto an octahedron.
// The ring size 17*17 = 289 is close to a multiple of 49 (49*6 = 294)
float n_ = 0.142857142857; // 1.0/7.0
vec3 ns = n_ * D.wyz - D.xzx;
vec4 j = p - 49.0 * floor(p * ns.z * ns.z); // mod(p,7*7)
vec4 x_ = floor(j * ns.z);
vec4 y_ = floor(j - 7.0 * x_ ); // mod(j,N)
vec4 x = x_ *ns.x + ns.yyyy;
vec4 y = y_ *ns.x + ns.yyyy;
vec4 h = 1.0 - abs(x) - abs(y);
vec4 b0 = vec4( x.xy, y.xy );
vec4 b1 = vec4( x.zw, y.zw );
//vec4 s0 = vec4(lessThan(b0,0.0))*2.0 - 1.0;
//vec4 s1 = vec4(lessThan(b1,0.0))*2.0 - 1.0;
vec4 s0 = floor(b0)*2.0 + 1.0;
vec4 s1 = floor(b1)*2.0 + 1.0;
vec4 sh = -step(h, vec4(0.0));
vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy ;
vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww ;
vec3 p0 = vec3(a0.xy,h.x);
vec3 p1 = vec3(a0.zw,h.y);
vec3 p2 = vec3(a1.xy,h.z);
vec3 p3 = vec3(a1.zw,h.w);
//Normalise gradients
vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3)));
p0 *= norm.x;
p1 *= norm.y;
p2 *= norm.z;
p3 *= norm.w;
// Mix final noise value
vec4 m = max(0.6 - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0);
m = m * m;
return 42.0 * dot( m*m, vec4( dot(p0,x0), dot(p1,x1),
dot(p2,x2), dot(p3,x3) ) );
}
// Main
float fbm(vec3 p)
{
float final = snoise(p);
p *= 1.94; final += snoise(p) * 0.5;
#ifdef DETAILED_NOISE
p *= 3.75; final += snoise(p) * 0.25;
return final / 1.75;
#else
return final / 1.5;
#endif
}
float waterHeight(vec3 p)
{
float d = length(p.xz);
float h = sin(d * 1.5 + iGlobalTime * 3.0) * 12.0 / d; // Island waves
#ifdef SMALL_WAVES
h += fbm(p*0.5); // Other waves
#endif
return h;
}
vec3 bump(vec3 pos, vec3 rayDir)
{
float s = 2.0;
// Fade out waves to reduce aliasing
float dist = dot(pos, rayDir);
s *= dist < 2.0 ? 1.0 : 1.4142 / sqrt(dist);
// Calculate normal from heightmap
vec2 e = vec2(1e-2, 0.0);
vec3 p = vec3(pos.x, iGlobalTime*0.5, pos.z)*0.7;
float m = waterHeight(p)*s;
return normalize(vec3(
waterHeight(p+e.xyy)*s-m,
1.0,
waterHeight(p+e.yxy)*s-m
));
}
// Ray intersections
vec4 intersectSphere(vec3 rpos, vec3 rdir, vec3 pos, float rad)
{
vec3 op = pos - rpos;
float b = dot(op, rdir);
float det = b*b - dot(op, op) + rad*rad;
if (det > 0.0)
{
det = sqrt(det);
float t = b - det;
if (t > eps)
return vec4(-normalize(rpos+rdir*t-pos), t);
}
return vec4(0.0);
}
vec4 intersectCylinder(vec3 rpos, vec3 rdir, vec3 pos, float rad)
{
vec3 op = pos - rpos;
vec2 rdir2 = normalize(rdir.yz);
float b = dot(op.yz, rdir2);
float det = b*b - dot(op.yz, op.yz) + rad*rad;
if (det > 0.0)
{
det = sqrt(det);
float t = b - det;
if (t > eps)
return vec4(-normalize(rpos.yz+rdir2*t-pos.yz), 0.0, t);
t = b + det;
if (t > eps)
return vec4(-normalize(rpos.yz+rdir2*t-pos.yz), 0.0, t);
}
return vec4(0.0);
}
vec4 intersectPlane(vec3 rayPos, vec3 rayDir, vec3 n, float d)
{
float t = -(dot(rayPos, n) + d) / dot(rayDir, n);
return vec4(n * sign(dot(rayDir, n)), t);
}
// Helper functions
vec3 rotate(vec3 p, float theta)
{
float c = cos(theta), s = sin(theta);
return vec3(p.x * c + p.z * s, p.y,
p.z * c - p.x * s);
}
float impulse(float k, float x) // by iq
{
float h = k*x;
return h * exp(1.0 - h);
}
// Raymarched parts of scene
float grass(vec3 pos)
{
float h = length(pos - vec3(0.0, -7.0, 0.0)) - 8.0;
if (h > 2.0) return h; // Optimization (Avoid noise if too far away)
return h + snoise(pos * 3.0) * 0.1 + pos.y * 0.9;
}
float tree(vec3 pos)
{
pos.y -= 0.5;
float s = sin(pos.y*0.03);
float c = cos(pos.y*0.03);
mat2 m = mat2(c, -s, s, c);
vec3 p = vec3(m*pos.xy, pos.z);
float width = 1.0 - pos.y * 0.02 - clamp(sin(pos.y * 8.0) * 0.1, 0.05, 0.1);
return max(length(p.xz) - width, pos.y - 12.5);
}
vec2 scene(vec3 pos)
{
float vtree = tree(pos);
#ifdef GRASS
float vgrass = grass(pos);
float v = min(vtree, vgrass);
#else
float v = vtree;
#endif
return vec2(v, v == vtree ? 2.0 : 1.0);
}
vec3 normal(vec3 pos)
{
vec2 eps = vec2(1e-3, 0.0);
float h = scene(pos).x;
return normalize(vec3(
scene(pos-eps.xyy).x-h,
scene(pos-eps.yxy).x-h,
scene(pos-eps.yyx).x-h
));
}
float plantsShadow(vec3 rayPos, vec3 rayDir)
{
// Soft shadow taken from iq
float k = 6.0;
float t = 0.0;
float s = 1.0;
for (int i = 0; i < 30; i++)
{
vec3 pos = rayPos+rayDir*t;
vec2 res = scene(pos);
if (res.x < 0.001) return 0.0;
s = min(s, k*res.x/t);
t += max(res.x, 0.01);
}
return s*s*(3.0 - 2.0*s);
}
// Ray-traced parts of scene
vec4 intersectWater(vec3 rayPos, vec3 rayDir)
{
float h = sin(20.5 + iGlobalTime * 2.0) * 0.03;
float t = -(rayPos.y + 2.5 + h) / rayDir.y;
return vec4(0.0, 1.0, 0.0, t);
}
vec4 intersectSand(vec3 rayPos, vec3 rayDir)
{
return intersectSphere(rayPos, rayDir, vec3(0.0,-24.1,0.0), 24.1);
}
vec4 intersectTreasure(vec3 rayPos, vec3 rayDir)
{
return vec4(0.0);
}
vec4 intersectLeaf(vec3 rayPos, vec3 rayDir, float openAmount)
{
vec3 dir = normalize(vec3(0.0, 1.0, openAmount));
float offset = 0.0;
vec4 res = intersectPlane(rayPos, rayDir, dir, 0.0);
vec3 pos = rayPos+rayDir*res.w;
#ifdef RAGGED_LEAVES
offset = snoise(pos*0.8) * 0.3;
#endif
if (pos.y > 0.0 || length(pos * vec3(0.9, 2.0, 1.0)) > 4.0 - offset) res.w = 0.0;
vec4 res2 = intersectPlane(rayPos, rayDir, vec3(dir.xy, -dir.z), 0.0);
pos = rayPos+rayDir*res2.w;
#ifdef RAGGED_LEAVES
offset = snoise(pos*0.8) * 0.3;
#endif
if (pos.y > 0.0 || length(pos * vec3(0.9, 2.0, 1.0)) > 4.0 - offset) res2.w = 0.0;
if (res2.w > 0.0 && res2.w < res.w || res.w <= 0.0)
res = res2;
return res;
}
vec4 leaves(vec3 rayPos, vec3 rayDir)
{
float t = 1e20;
vec3 n = vec3(0.0);
rayPos -= leavesPos;
float sway = impulse(15.0, fract(iGlobalTime / PI * 0.125));
float upDownSway = sway * -sin(iGlobalTime) * 0.06;
float openAmount = sway * max(-cos(iGlobalTime) * 0.4, 0.0);
float angleOffset = -0.1;
for (float k = 0.0; k < 6.2; k += 0.75)
{
// Left-right
float alpha = k + (k - PI) * sway * 0.015;
vec3 p = rotate(rayPos, alpha);
vec3 d = rotate(rayDir, alpha);
// Up-down
angleOffset *= -1.0;
float theta = -0.4 +
angleOffset +
cos(k) * 0.35 +
upDownSway +
sin(iGlobalTime+k*10.0) * 0.03 * (sway + 0.2);
p = rotate(p.xzy, theta).xzy;
d = rotate(d.xzy, theta).xzy;
// Shift
p -= vec3(5.4, 0.0, 0.0);
// Intersect individual leaf
vec4 res = intersectLeaf(p, d, 1.0+openAmount);
if (res.w > 0.0 && res.w < t)
{
t = res.w;
n = res.xyz;
}
}
return vec4(n, t);
}
// Lighting
float shadow(vec3 rayPos, vec3 rayDir)
{
float s = 1.0;
// Intersect sand
//vec4 resSand = intersectSand(rayPos, rayDir);
//if (resSand.w > 0.0) return 0.0;
// Intersect plants
s = min(s, plantsShadow(rayPos, rayDir));
if (s < 0.0001) return 0.0;
// Intersect leaves
vec4 resLeaves = leaves(rayPos, rayDir);
if (resLeaves.w > 0.0 && resLeaves.w < 1e7) return 0.0;
return s;
}
vec3 light(vec3 p, vec3 n)
{
float s = 1.0;
#ifdef SHADOWS
s = shadow(p-sunDir*0.01, -sunDir);
#endif
vec3 col = sunCol * min(max(dot(n, sunDir), 0.0), s);
col += skyCol * (-n.y * 0.5 + 0.5) * 0.3;
return col;
}
vec3 lightLeaves(vec3 p, vec3 n)
{
float s = 1.0;
#ifdef SHADOWS
s = shadow(p-sunDir*0.01, -sunDir);
#endif
float ao = min(length(p - leavesPos) * 0.1, 1.0);
float ns = dot(n, sunDir);
float d = sqrt(max(ns, 0.0));
vec3 col = sunCol * min(d, s);
col += sunCol * max(-ns, 0.0) * vec3(0.3, 0.3, 0.1) * ao;
col += skyCol * (-n.y * 0.5 + 0.5) * 0.3 * ao;
return col;
}
vec3 sky(vec3 n)
{
return skyCol * (1.0 - n.y * 0.8);
}
// Ray-marching
vec4 plants(vec3 rayPos, vec3 rayDir)
{
float t = 0.0;
for (int i = 0; i < 40; i++)
{
vec3 pos = rayPos+rayDir*t;
vec2 res = scene(pos);
float h = res.x;
if (h < 0.001)
{
vec3 col = res.y == 2.0 ? treeCol : grassCol;
float uvFact = res.y == 2.0 ? 1.0 : 10.0;
vec3 n = normal(pos);
vec2 uv = vec2(n.x, pos.y * 0.5) * 0.2 * uvFact;
vec3 tex = texture2D(iChannel0, uv).rgb * 0.6 + 0.4;
float ao = min(length(pos - leavesPos) * 0.1, 1.0);
return vec4(col * light(pos, n) * ao * tex, t);
}
t += h;
}
return vec4(sky(rayDir), 1e8);
}
// Final combination
vec3 traceReflection(vec3 rayPos, vec3 rayDir)
{
vec3 col = vec3(0.0);
float t = 1e20;
// Intersect plants
vec4 resPlants = plants(rayPos, rayDir);
if (resPlants.w > 0.0 && resPlants.w < t)
{
t = resPlants.w;
col = resPlants.xyz;
}
// Intersect leaves
vec4 resLeaves = leaves(rayPos, rayDir);
if (resLeaves.w > 0.0 && resLeaves.w < t)
{
vec3 pos = rayPos + rayDir * resLeaves.w;
vec2 uv = (pos.xz - leavesPos.xz) * 0.3;
float tex = texture2D(iChannel0, uv).r * 0.6 + 0.5;
t = resLeaves.w;
col = leavesCol * lightLeaves(pos, resLeaves.xyz) * tex;
}
if (t > 1e7) return sky(rayDir);
return col;
}
vec3 trace(vec3 rayPos, vec3 rayDir)
{
vec3 col = vec3(0.0);
float t = 1e20;
// Intersect sand
vec4 resSand = intersectSand(rayPos, rayDir);
if (resSand.w > 0.0)
{
vec3 pos = rayPos + rayDir * resSand.w;
t = resSand.w;
col = sandCol * light(pos, resSand.xyz);
}
// Intersect treasure chest
vec4 resTreasure = intersectTreasure(rayPos, rayDir);
if (resTreasure.w > 0.0 && resTreasure.w < t)
{
vec3 pos = rayPos + rayDir * resTreasure.w;
t = resTreasure.w;
col = leavesCol * light(pos, resTreasure.xyz);
}
// Intersect leaves
vec4 resLeaves = leaves(rayPos, rayDir);
if (resLeaves.w > 0.0 && resLeaves.w < t)
{
vec3 pos = rayPos + rayDir * resLeaves.w;
vec2 uv = (pos.xz - leavesPos.xz) * 0.3;
float tex = texture2D(iChannel0, uv).r * 0.6 + 0.5;
t = resLeaves.w;
col = leavesCol * lightLeaves(pos, resLeaves.xyz) * tex;
}
// Intersect plants
vec4 resPlants = plants(rayPos, rayDir);
if (resPlants.w > 0.0 && resPlants.w < t)
{
t = resPlants.w;
col = resPlants.xyz;
}
// Intersect water
vec4 resWater = intersectWater(rayPos, rayDir);
if (resWater.w > 0.0 && resWater.w < t)
{
vec3 pos = rayPos + rayDir * resWater.w;
float dist = t - resWater.w;
vec3 n = bump(pos, rayDir);
float ct = -min(dot(n,rayDir), 0.0);
float fresnel = 0.9 - 0.9 * pow(1.0 - ct, 5.0);
vec3 trans = col * exp(-dist * vec3(1.0, 0.7, 0.4) * 3.0);
vec3 reflDir = normalize(reflect(rayDir, n));
vec3 refl = sky(reflDir);
#ifdef REFLECTIONS
if (dot(pos, rayDir) < -2.0)
refl = traceReflection(pos, reflDir).rgb;
#endif
t = resWater.w;
col = mix(refl, trans, fresnel);
}
if (t > 1e7) return sky(rayDir);
return col;
}
// Ray-generation
vec3 camera(vec2 px)
{
vec2 rd = (px / iResolution.yy - vec2(iResolution.x/iResolution.y*0.5-0.5, 0.0)) * 2.0 - 1.0;
float t = sin(iGlobalTime * 0.1) * 0.2;
vec3 rayDir = normalize(vec3(rd.x, rd.y, 1.0));
vec3 rayPos = vec3(0.0, 3.0, -18.0);
return trace(rayPos, rayDir);
}
void main(void)
{
#ifdef HEAVY_AA
vec3 col = camera(gl_FragCoord.xy+vec2(0.0,0.5))*0.25;
col += camera(gl_FragCoord.xy+vec2(0.25,0.0))*0.25;
col += camera(gl_FragCoord.xy+vec2(0.5,0.75))*0.25;
col += camera(gl_FragCoord.xy+vec2(0.75,0.25))*0.25;
#else
vec3 col = camera(gl_FragCoord.xy);
#ifdef LIGHT_AA
col = col * 0.5 + camera(gl_FragCoord.xy+vec2(0.5,0.5))*0.5;
#endif
#endif
#ifdef TONEMAP
// Optimized Haarm-Peter Duikers curve
vec3 x = max(vec3(0.0),col*exposure-0.004);
col = (x*(6.2*x+.5))/(x*(6.2*x+1.7)+0.06);
#else
col = pow(col, vec3(0.4545));
#endif
gl_FragColor = vec4(col, 1.0);
}

@@ -0,0 +1,98 @@
/**
* The MIT License (MIT)
*
* Copyright (c) 2016 Sascha Willems
*
* Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
*/
#version 450
#extension GL_ARB_separate_shader_objects : enable
#extension GL_ARB_shading_language_420pack : enable
// PN patch data
struct PnPatch
{
float b210;
float b120;
float b021;
float b012;
float b102;
float b201;
float b111;
float n110;
float n011;
float n101;
};
// tessellation levels
layout (binding = 0) uniform UBO
{
float tessLevel;
} ubo;
layout(vertices=3) out;
layout(location = 0) in vec3 inNormal[];
layout(location = 1) in vec2 inUV[];
layout(location = 0) out vec3 outNormal[3];
layout(location = 3) out vec2 outUV[3];
layout(location = 6) out PnPatch outPatch[3];
float wij(int i, int j)
{
return dot(gl_in[j].gl_Position.xyz - gl_in[i].gl_Position.xyz, inNormal[i]);
}
float vij(int i, int j)
{
vec3 Pj_minus_Pi = gl_in[j].gl_Position.xyz
- gl_in[i].gl_Position.xyz;
vec3 Ni_plus_Nj = inNormal[i]+inNormal[j];
return 2.0*dot(Pj_minus_Pi, Ni_plus_Nj)/dot(Pj_minus_Pi, Pj_minus_Pi);
}
void main()
{
// get data
gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
outNormal[gl_InvocationID] = inNormal[gl_InvocationID];
outUV[gl_InvocationID] = inUV[gl_InvocationID];
// set base
float P0 = gl_in[0].gl_Position[gl_InvocationID];
float P1 = gl_in[1].gl_Position[gl_InvocationID];
float P2 = gl_in[2].gl_Position[gl_InvocationID];
float N0 = inNormal[0][gl_InvocationID];
float N1 = inNormal[1][gl_InvocationID];
float N2 = inNormal[2][gl_InvocationID];
// compute control points
outPatch[gl_InvocationID].b210 = (2.0*P0 + P1 - wij(0,1)*N0)/3.0;
outPatch[gl_InvocationID].b120 = (2.0*P1 + P0 - wij(1,0)*N1)/3.0;
outPatch[gl_InvocationID].b021 = (2.0*P1 + P2 - wij(1,2)*N1)/3.0;
outPatch[gl_InvocationID].b012 = (2.0*P2 + P1 - wij(2,1)*N2)/3.0;
outPatch[gl_InvocationID].b102 = (2.0*P2 + P0 - wij(2,0)*N2)/3.0;
outPatch[gl_InvocationID].b201 = (2.0*P0 + P2 - wij(0,2)*N0)/3.0;
float E = ( outPatch[gl_InvocationID].b210
+ outPatch[gl_InvocationID].b120
+ outPatch[gl_InvocationID].b021
+ outPatch[gl_InvocationID].b012
+ outPatch[gl_InvocationID].b102
+ outPatch[gl_InvocationID].b201 ) / 6.0;
float V = (P0 + P1 + P2)/3.0;
outPatch[gl_InvocationID].b111 = E + (E - V)*0.5;
outPatch[gl_InvocationID].n110 = N0+N1-vij(0,1)*(P1-P0);
outPatch[gl_InvocationID].n011 = N1+N2-vij(1,2)*(P2-P1);
outPatch[gl_InvocationID].n101 = N2+N0-vij(2,0)*(P0-P2);
// set tess levels
gl_TessLevelOuter[gl_InvocationID] = ubo.tessLevel;
gl_TessLevelInner[0] = ubo.tessLevel;
}

@@ -0,0 +1,103 @@
/**
* The MIT License (MIT)
*
* Copyright (c) 2016 Sascha Willems
*
* Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
*/
#version 450
#extension GL_ARB_separate_shader_objects : enable
#extension GL_ARB_shading_language_420pack : enable
// PN patch data
struct PnPatch
{
float b210;
float b120;
float b021;
float b012;
float b102;
float b201;
float b111;
float n110;
float n011;
float n101;
};
layout (binding = 1) uniform UBO
{
mat4 projection;
mat4 model;
float tessAlpha;
} ubo;
layout(triangles, fractional_odd_spacing, ccw) in;
layout(location = 0) in vec3 iNormal[];
layout(location = 3) in vec2 iTexCoord[];
layout(location = 6) in PnPatch iPnPatch[];
layout(location = 0) out vec3 oNormal;
layout(location = 1) out vec2 oTexCoord;
#define uvw gl_TessCoord
void main()
{
vec3 uvwSquared = uvw * uvw;
vec3 uvwCubed = uvwSquared * uvw;
// extract control points
vec3 b210 = vec3(iPnPatch[0].b210, iPnPatch[1].b210, iPnPatch[2].b210);
vec3 b120 = vec3(iPnPatch[0].b120, iPnPatch[1].b120, iPnPatch[2].b120);
vec3 b021 = vec3(iPnPatch[0].b021, iPnPatch[1].b021, iPnPatch[2].b021);
vec3 b012 = vec3(iPnPatch[0].b012, iPnPatch[1].b012, iPnPatch[2].b012);
vec3 b102 = vec3(iPnPatch[0].b102, iPnPatch[1].b102, iPnPatch[2].b102);
vec3 b201 = vec3(iPnPatch[0].b201, iPnPatch[1].b201, iPnPatch[2].b201);
vec3 b111 = vec3(iPnPatch[0].b111, iPnPatch[1].b111, iPnPatch[2].b111);
// extract control normals
vec3 n110 = normalize(vec3(iPnPatch[0].n110, iPnPatch[1].n110, iPnPatch[2].n110));
vec3 n011 = normalize(vec3(iPnPatch[0].n011, iPnPatch[1].n011, iPnPatch[2].n011));
vec3 n101 = normalize(vec3(iPnPatch[0].n101, iPnPatch[1].n101, iPnPatch[2].n101));
// compute texcoords
oTexCoord = gl_TessCoord[2]*iTexCoord[0] + gl_TessCoord[0]*iTexCoord[1] + gl_TessCoord[1]*iTexCoord[2];
// normal
// Barycentric normal
vec3 barNormal = gl_TessCoord[2]*iNormal[0] + gl_TessCoord[0]*iNormal[1] + gl_TessCoord[1]*iNormal[2];
vec3 pnNormal = iNormal[0]*uvwSquared[2] + iNormal[1]*uvwSquared[0] + iNormal[2]*uvwSquared[1]
+ n110*uvw[2]*uvw[0] + n011*uvw[0]*uvw[1]+ n101*uvw[2]*uvw[1];
oNormal = ubo.tessAlpha*pnNormal + (1.0-ubo.tessAlpha) * barNormal;
// compute interpolated pos
vec3 barPos = gl_TessCoord[2]*gl_in[0].gl_Position.xyz
+ gl_TessCoord[0]*gl_in[1].gl_Position.xyz
+ gl_TessCoord[1]*gl_in[2].gl_Position.xyz;
// save some computations
uvwSquared *= 3.0;
// compute PN position
vec3 pnPos = gl_in[0].gl_Position.xyz*uvwCubed[2]
+ gl_in[1].gl_Position.xyz*uvwCubed[0]
+ gl_in[2].gl_Position.xyz*uvwCubed[1]
+ b210*uvwSquared[2]*uvw[0]
+ b120*uvwSquared[0]*uvw[2]
+ b201*uvwSquared[2]*uvw[1]
+ b021*uvwSquared[0]*uvw[1]
+ b102*uvwSquared[1]*uvw[2]
+ b012*uvwSquared[1]*uvw[0]
+ b111*6.0*uvw[0]*uvw[1]*uvw[2];
// final position and normal
vec3 finalPos = (1.0-ubo.tessAlpha)*barPos + ubo.tessAlpha*pnPos;
gl_Position = ubo.projection * ubo.model * vec4(finalPos,1.0);
}
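The tessellation evaluation shader above blends a flat (barycentric) position with the curved PN-triangle position using `ubo.tessAlpha` as the mix factor. A minimal sketch of that final blend in plain JavaScript (function and variable names are illustrative, not part of the sample):

```javascript
// Linear blend between the flat position and the PN-triangle position,
// matching the shader's last lines:
//   finalPos = (1 - alpha) * barPos + alpha * pnPos
function blendPN(barPos, pnPos, alpha) {
  return barPos.map((b, i) => (1.0 - alpha) * b + alpha * pnPos[i]);
}

// alpha = 0 keeps the flat triangle; alpha = 1 uses the full PN surface.
console.log(blendPN([0, 0, 0], [1, 2, 4], 0.5)); // → [0.5, 1, 2]
```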

12
samples/Genie/Class.gs Normal file


@@ -0,0 +1,12 @@
init
new Demo( "Demonstration class" ).run()
class Demo
_message:string = ""
construct ( message:string = "Optional argument - no message passed in constructor" )
_message = message
def run()
print( _message )
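The Genie sample's point is a constructor whose single argument is optional, falling back to a default message. The same pattern in JavaScript, for comparison (a sketch, not part of the sample set):

```javascript
class Demo {
  // Default parameter mirrors Genie's optional constructor argument.
  constructor(message = 'Optional argument - no message passed in constructor') {
    this.message = message;
  }
  run() {
    return this.message;
  }
}

console.log(new Demo('Demonstration class').run()); // → "Demonstration class"
```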

2
samples/Genie/Hello.gs Normal file

@@ -0,0 +1,2 @@
init
print( "Hello, World!" )

135
samples/HCL/main.tf Normal file

@@ -0,0 +1,135 @@
resource "aws_security_group" "elb_sec_group" {
description = "Allow traffic from the internet to ELB port 80"
vpc_id = "${var.vpc_id}"
ingress {
from_port = 80
to_port = 80
protocol = "tcp"
cidr_blocks = ["${split(",", var.allowed_cidr_blocks)}"]
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
}
resource "aws_security_group" "dokku_allow_ssh_from_internal" {
description = "Allow git access over ssh from the private subnet"
vpc_id = "${var.vpc_id}"
ingress {
from_port = 22
to_port = 22
protocol = "tcp"
cidr_blocks = ["${var.private_subnet_cidr}"]
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
}
resource "aws_security_group" "allow_from_elb_to_instance" {
description = "Allow traffic from the ELB to the private instance"
vpc_id = "${var.vpc_id}"
ingress {
security_groups = ["${aws_security_group.elb_sec_group.id}"]
from_port = 80
to_port = 80
protocol = "tcp"
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
}
resource "aws_instance" "dokku" {
ami = "ami-47a23a30"
instance_type = "${var.instance_type}"
associate_public_ip_address = false
key_name = "${var.key_name}"
subnet_id = "${var.private_subnet_id}"
vpc_security_group_ids = [
"${var.bastion_sec_group_id}",
"${aws_security_group.allow_from_elb_to_instance.id}",
"${aws_security_group.dokku_allow_ssh_from_internal.id}"
]
tags {
Name = "${var.name}"
}
connection {
user = "ubuntu"
private_key = "${var.private_key}"
bastion_host = "${var.bastion_host}"
bastion_port = "${var.bastion_port}"
bastion_user = "${var.bastion_user}"
bastion_private_key = "${var.bastion_private_key}"
}
provisioner "file" {
source = "${path.module}/../scripts/install-dokku.sh"
destination = "/home/ubuntu/install-dokku.sh"
}
provisioner "remote-exec" {
inline = [
"chmod +x /home/ubuntu/install-dokku.sh",
"HOSTNAME=${var.hostname} /home/ubuntu/install-dokku.sh"
]
}
}
resource "aws_elb" "elb_dokku" {
name = "elb-dokku-${var.name}"
subnets = ["${var.public_subnet_id}"]
security_groups = ["${aws_security_group.elb_sec_group.id}"]
listener {
instance_port = 80
instance_protocol = "http"
lb_port = 80
lb_protocol = "http"
}
health_check {
healthy_threshold = 2
unhealthy_threshold = 2
timeout = 3
target = "HTTP:80/"
interval = 30
}
instances = ["${aws_instance.dokku.id}"]
cross_zone_load_balancing = false
idle_timeout = 400
tags {
Name = "elb-dokku-${var.name}"
}
}
resource "aws_route53_record" "dokku-deploy" {
zone_id = "${var.zone_id}"
name = "deploy.${var.hostname}"
type = "A"
ttl = "300"
records = ["${aws_instance.dokku.private_ip}"]
}
resource "aws_route53_record" "dokku-wildcard" {
zone_id = "${var.zone_id}"
name = "*.${var.hostname}"
type = "CNAME"
ttl = "300"
records = ["${aws_elb.elb_dokku.dns_name}"]
}

89
samples/HLSL/bloom.cginc Normal file

@@ -0,0 +1,89 @@
// From https://github.com/Unity-Technologies/PostProcessing/blob/master/PostProcessing/Resources/Shaders/Bloom.cginc
// Licensed under the MIT license
#ifndef __BLOOM__
#define __BLOOM__
#include "Common.cginc"
// Brightness function
half Brightness(half3 c)
{
return Max3(c);
}
// 3-tap median filter
half3 Median(half3 a, half3 b, half3 c)
{
return a + b + c - min(min(a, b), c) - max(max(a, b), c);
}
// Downsample with a 4x4 box filter
half3 DownsampleFilter(sampler2D tex, float2 uv, float2 texelSize)
{
float4 d = texelSize.xyxy * float4(-1.0, -1.0, 1.0, 1.0);
half3 s;
s = DecodeHDR(tex2D(tex, uv + d.xy));
s += DecodeHDR(tex2D(tex, uv + d.zy));
s += DecodeHDR(tex2D(tex, uv + d.xw));
s += DecodeHDR(tex2D(tex, uv + d.zw));
return s * (1.0 / 4.0);
}
// Downsample with a 4x4 box filter + anti-flicker filter
half3 DownsampleAntiFlickerFilter(sampler2D tex, float2 uv, float2 texelSize)
{
float4 d = texelSize.xyxy * float4(-1.0, -1.0, 1.0, 1.0);
half3 s1 = DecodeHDR(tex2D(tex, uv + d.xy));
half3 s2 = DecodeHDR(tex2D(tex, uv + d.zy));
half3 s3 = DecodeHDR(tex2D(tex, uv + d.xw));
half3 s4 = DecodeHDR(tex2D(tex, uv + d.zw));
// Karis's luma weighted average (using brightness instead of luma)
half s1w = 1.0 / (Brightness(s1) + 1.0);
half s2w = 1.0 / (Brightness(s2) + 1.0);
half s3w = 1.0 / (Brightness(s3) + 1.0);
half s4w = 1.0 / (Brightness(s4) + 1.0);
half one_div_wsum = 1.0 / (s1w + s2w + s3w + s4w);
return (s1 * s1w + s2 * s2w + s3 * s3w + s4 * s4w) * one_div_wsum;
}
half3 UpsampleFilter(sampler2D tex, float2 uv, float2 texelSize, float sampleScale)
{
#if MOBILE_OR_CONSOLE
// 4-tap bilinear upsampler
float4 d = texelSize.xyxy * float4(-1.0, -1.0, 1.0, 1.0) * (sampleScale * 0.5);
half3 s;
s = DecodeHDR(tex2D(tex, uv + d.xy));
s += DecodeHDR(tex2D(tex, uv + d.zy));
s += DecodeHDR(tex2D(tex, uv + d.xw));
s += DecodeHDR(tex2D(tex, uv + d.zw));
return s * (1.0 / 4.0);
#else
// 9-tap bilinear upsampler (tent filter)
float4 d = texelSize.xyxy * float4(1.0, 1.0, -1.0, 0.0) * sampleScale;
half3 s;
s = DecodeHDR(tex2D(tex, uv - d.xy));
s += DecodeHDR(tex2D(tex, uv - d.wy)) * 2.0;
s += DecodeHDR(tex2D(tex, uv - d.zy));
s += DecodeHDR(tex2D(tex, uv + d.zw)) * 2.0;
s += DecodeHDR(tex2D(tex, uv)) * 4.0;
s += DecodeHDR(tex2D(tex, uv + d.xw)) * 2.0;
s += DecodeHDR(tex2D(tex, uv + d.zy));
s += DecodeHDR(tex2D(tex, uv + d.wy)) * 2.0;
s += DecodeHDR(tex2D(tex, uv + d.xy));
return s * (1.0 / 16.0);
#endif
}
#endif // __BLOOM__
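`DownsampleAntiFlickerFilter` above uses Karis's weighted average: each tap is weighted by `1 / (brightness + 1)`, so one very bright sample cannot dominate the mean and cause flickering "fireflies". A scalar sketch of that weighting in JavaScript (names are illustrative):

```javascript
// Weighted average of tap brightnesses with weight 1 / (b + 1):
// bright outliers are down-weighted, suppressing flicker.
function antiFlickerAverage(taps) {
  const weights = taps.map(b => 1.0 / (b + 1.0));
  const wsum = weights.reduce((a, w) => a + w, 0);
  return taps.reduce((acc, b, i) => acc + b * weights[i], 0) / wsum;
}

const plainAverage = taps => taps.reduce((a, b) => a + b, 0) / taps.length;

// A single 100x outlier pulls the plain mean to 25.75, while the
// weighted mean stays near the other taps (≈ 1.65):
console.log(plainAverage([1, 1, 1, 100]));        // → 25.75
console.log(antiFlickerAverage([1, 1, 1, 100]));  // ≈ 1.65
```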


@@ -0,0 +1,923 @@
/* generated by jison-lex 0.3.4-159 */
var ccalcLex = (function () {
// See also:
// http://stackoverflow.com/questions/1382107/whats-a-good-way-to-extend-error-in-javascript/#35881508
// but we keep the prototype.constructor and prototype.name assignment lines too for compatibility
// with userland code which might access the derived class in a 'classic' way.
function JisonLexerError(msg, hash) {
Object.defineProperty(this, 'name', {
enumerable: false,
writable: false,
value: 'JisonLexerError'
});
if (msg == null) msg = '???';
Object.defineProperty(this, 'message', {
enumerable: false,
writable: true,
value: msg
});
this.hash = hash;
var stacktrace;
if (hash && hash.exception instanceof Error) {
var ex2 = hash.exception;
this.message = ex2.message || msg;
stacktrace = ex2.stack;
}
if (!stacktrace) {
if (Error.hasOwnProperty('captureStackTrace')) { // V8
Error.captureStackTrace(this, this.constructor);
} else {
stacktrace = (new Error(msg)).stack;
}
}
if (stacktrace) {
Object.defineProperty(this, 'stack', {
enumerable: false,
writable: false,
value: stacktrace
});
}
}
if (typeof Object.setPrototypeOf === 'function') {
Object.setPrototypeOf(JisonLexerError.prototype, Error.prototype);
} else {
JisonLexerError.prototype = Object.create(Error.prototype);
}
JisonLexerError.prototype.constructor = JisonLexerError;
JisonLexerError.prototype.name = 'JisonLexerError';
var lexer = {
EOF: 1,
ERROR: 2,
// JisonLexerError: JisonLexerError, // <-- injected by the code generator
// options: {}, // <-- injected by the code generator
// yy: ..., // <-- injected by setInput()
__currentRuleSet__: null, // <-- internal rule set cache for the current lexer state
__error_infos: [], // INTERNAL USE ONLY: the set of lexErrorInfo objects created since the last cleanup
__decompressed: false, // INTERNAL USE ONLY: mark whether the lexer instance has been 'unfolded' completely and is now ready for use
done: false, // INTERNAL USE ONLY
_backtrack: false, // INTERNAL USE ONLY
_input: '', // INTERNAL USE ONLY
_more: false, // INTERNAL USE ONLY
_signaled_error_token: false, // INTERNAL USE ONLY
conditionStack: [], // INTERNAL USE ONLY; managed via `pushState()`, `popState()`, `topState()` and `stateStackSize()`
match: '', // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: tracks input which has been matched so far for the lexer token under construction. `match` is identical to `yytext` except that this one still contains the matched input string after `lexer.performAction()` has been invoked, where userland code MAY have changed/replaced the `yytext` value entirely!
matched: '', // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: tracks entire input which has been matched so far
matches: false, // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: tracks RE match result for last (successful) match attempt
yytext: '', // ADVANCED USE ONLY: tracks input which has been matched so far for the lexer token under construction; this value is transferred to the parser as the 'token value' when the parser consumes the lexer token produced through a call to the `lex()` API.
offset: 0, // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: tracks the 'cursor position' in the input string, i.e. the number of characters matched so far
yyleng: 0, // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: length of matched input for the token under construction (`yytext`)
yylineno: 0, // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: 'line number' at which the token under construction is located
yylloc: null, // READ-ONLY EXTERNAL ACCESS - ADVANCED USE ONLY: tracks location info (lines + columns) for the token under construction
// INTERNAL USE: construct a suitable error info hash object instance for `parseError`.
constructLexErrorInfo: function lexer_constructLexErrorInfo(msg, recoverable) {
var pei = {
errStr: msg,
recoverable: !!recoverable,
text: this.match, // This one MAY be empty; userland code should use the `upcomingInput` API to obtain more text which follows the 'lexer cursor position'...
token: null,
line: this.yylineno,
loc: this.yylloc,
yy: this.yy,
lexer: this,
// and make sure the error info doesn't stay due to potential
// ref cycle via userland code manipulations.
// These would otherwise all be memory leak opportunities!
//
// Note that only array and object references are nuked as those
// constitute the set of elements which can produce a cyclic ref.
// The rest of the members is kept intact as they are harmless.
destroy: function destructLexErrorInfo() {
// remove cyclic references added to error info:
// info.yy = null;
// info.lexer = null;
// ...
var rec = !!this.recoverable;
for (var key in this) {
if (this.hasOwnProperty(key) && typeof key === 'object') {
this[key] = undefined;
}
}
this.recoverable = rec;
}
};
// track this instance so we can `destroy()` it once we deem it superfluous and ready for garbage collection!
this.__error_infos.push(pei);
return pei;
},
parseError: function lexer_parseError(str, hash) {
if (this.yy.parser && typeof this.yy.parser.parseError === 'function') {
return this.yy.parser.parseError(str, hash) || this.ERROR;
} else if (typeof this.yy.parseError === 'function') {
return this.yy.parseError.call(this, str, hash) || this.ERROR;
} else {
throw new this.JisonLexerError(str);
}
},
// final cleanup function for when we have completed lexing the input;
// make it an API so that external code can use this one once userland
// code has decided it's time to destroy any lingering lexer error
// hash object instances and the like: this function helps to clean
// up these constructs, which *may* carry cyclic references which would
// otherwise prevent the instances from being properly and timely
// garbage-collected, i.e. this function helps prevent memory leaks!
cleanupAfterLex: function lexer_cleanupAfterLex(do_not_nuke_errorinfos) {
var rv;
// prevent lingering circular references from causing memory leaks:
this.setInput('', {});
// nuke the error hash info instances created during this run.
// Userland code must COPY any data/references
// in the error hash instance(s) it is more permanently interested in.
if (!do_not_nuke_errorinfos) {
for (var i = this.__error_infos.length - 1; i >= 0; i--) {
var el = this.__error_infos[i];
if (el && typeof el.destroy === 'function') {
el.destroy();
}
}
this.__error_infos.length = 0;
}
return this;
},
// clear the lexer token context; intended for internal use only
clear: function lexer_clear() {
this.yytext = '';
this.yyleng = 0;
this.match = '';
this.matches = false;
this._more = false;
this._backtrack = false;
},
// resets the lexer, sets new input
setInput: function lexer_setInput(input, yy) {
this.yy = yy || this.yy || {};
// also check if we've fully initialized the lexer instance,
// including expansion work to be done to go from a loaded
// lexer to a usable lexer:
if (!this.__decompressed) {
// step 1: decompress the regex list:
var rules = this.rules;
for (var i = 0, len = rules.length; i < len; i++) {
var rule_re = rules[i];
// compression: is the RE an xref to another RE slot in the rules[] table?
if (typeof rule_re === 'number') {
rules[i] = rules[rule_re];
}
}
// step 2: unfold the conditions[] set to make these ready for use:
var conditions = this.conditions;
for (var k in conditions) {
var spec = conditions[k];
var rule_ids = spec.rules;
var len = rule_ids.length;
var rule_regexes = new Array(len + 1); // slot 0 is unused; we use a 1-based index approach here to keep the hottest code in `lexer_next()` fast and simple!
var rule_new_ids = new Array(len + 1);
if (this.rules_prefix1) {
var rule_prefixes = new Array(65536);
var first_catch_all_index = 0;
for (var i = 0; i < len; i++) {
var idx = rule_ids[i];
var rule_re = rules[idx];
rule_regexes[i + 1] = rule_re;
rule_new_ids[i + 1] = idx;
var prefix = this.rules_prefix1[idx];
// compression: is the PREFIX-STRING an xref to another PREFIX-STRING slot in the rules_prefix1[] table?
if (typeof prefix === 'number') {
prefix = this.rules_prefix1[prefix];
}
// init the prefix lookup table: first come, first serve...
if (!prefix) {
if (!first_catch_all_index) {
first_catch_all_index = i + 1;
}
} else {
for (var j = 0, pfxlen = prefix.length; j < pfxlen; j++) {
var pfxch = prefix.charCodeAt(j);
// first come, first serve:
if (!rule_prefixes[pfxch]) {
rule_prefixes[pfxch] = i + 1;
}
}
}
}
// if no catch-all prefix has been encountered yet, it means all
// rules have limited prefix sets and it MAY be that particular
// input characters won't be recognized by any rule in this
// condition state.
//
// To speed up their discovery at run-time while keeping the
// remainder of the lexer kernel code very simple (and fast),
// we point these to an 'illegal' rule set index *beyond*
// the end of the rule set.
if (!first_catch_all_index) {
first_catch_all_index = len + 1;
}
for (var i = 0; i < 65536; i++) {
if (!rule_prefixes[i]) {
rule_prefixes[i] = first_catch_all_index;
}
}
spec.__dispatch_lut = rule_prefixes;
} else {
for (var i = 0; i < len; i++) {
var idx = rule_ids[i];
var rule_re = rules[idx];
rule_regexes[i + 1] = rule_re;
rule_new_ids[i + 1] = idx;
}
}
spec.rules = rule_new_ids;
spec.__rule_regexes = rule_regexes;
spec.__rule_count = len;
}
this.__decompressed = true;
}
this._input = input || '';
this.clear();
this._signaled_error_token = false;
this.done = false;
this.yylineno = 0;
this.matched = '';
this.conditionStack = ['INITIAL'];
this.__currentRuleSet__ = null;
this.yylloc = {
first_line: 1,
first_column: 0,
last_line: 1,
last_column: 0
};
if (this.options.ranges) {
this.yylloc.range = [0, 0];
}
this.offset = 0;
return this;
},
// consumes and returns one char from the input
input: function lexer_input() {
if (!this._input) {
this.done = true;
return null;
}
var ch = this._input[0];
this.yytext += ch;
this.yyleng++;
this.offset++;
this.match += ch;
this.matched += ch;
// Count the linenumber up when we hit the LF (or a stand-alone CR).
// On CRLF, the linenumber is incremented when you fetch the CR or the CRLF combo
// and we advance immediately past the LF as well, returning both together as if
// it was all a single 'character' only.
var slice_len = 1;
var lines = false;
if (ch === '\n') {
lines = true;
} else if (ch === '\r') {
lines = true;
var ch2 = this._input[1];
if (ch2 === '\n') {
slice_len++;
ch += ch2;
this.yytext += ch2;
this.yyleng++;
this.offset++;
this.match += ch2;
this.matched += ch2;
if (this.options.ranges) {
this.yylloc.range[1]++;
}
}
}
if (lines) {
this.yylineno++;
this.yylloc.last_line++;
} else {
this.yylloc.last_column++;
}
if (this.options.ranges) {
this.yylloc.range[1]++;
}
this._input = this._input.slice(slice_len);
return ch;
},
// unshifts one char (or a string) into the input
unput: function lexer_unput(ch) {
var len = ch.length;
var lines = ch.split(/(?:\r\n?|\n)/g);
this._input = ch + this._input;
this.yytext = this.yytext.substr(0, this.yytext.length - len);
//this.yyleng -= len;
this.offset -= len;
var oldLines = this.match.split(/(?:\r\n?|\n)/g);
this.match = this.match.substr(0, this.match.length - len);
this.matched = this.matched.substr(0, this.matched.length - len);
if (lines.length - 1) {
this.yylineno -= lines.length - 1;
}
this.yylloc.last_line = this.yylineno + 1;
this.yylloc.last_column = (lines ?
(lines.length === oldLines.length ? this.yylloc.first_column : 0)
+ oldLines[oldLines.length - lines.length].length - lines[0].length :
this.yylloc.first_column - len);
if (this.options.ranges) {
this.yylloc.range[1] = this.yylloc.range[0] + this.yyleng - len;
}
this.yyleng = this.yytext.length;
this.done = false;
return this;
},
// When called from action, caches matched text and appends it on next action
more: function lexer_more() {
this._more = true;
return this;
},
// When called from action, signals the lexer that this rule fails to match the input, so the next matching rule (regex) should be tested instead.
reject: function lexer_reject() {
if (this.options.backtrack_lexer) {
this._backtrack = true;
} else {
// when the parseError() call returns, we MUST ensure that the error is registered.
// We accomplish this by signaling an 'error' token to be produced for the current
// .lex() run.
var p = this.constructLexErrorInfo('Lexical error on line ' + (this.yylineno + 1) + '. You can only invoke reject() in the lexer when the lexer is of the backtracking persuasion (options.backtrack_lexer = true).\n' + this.showPosition(), false);
this._signaled_error_token = (this.parseError(p.errStr, p) || this.ERROR);
}
return this;
},
// retain first n characters of the match
less: function lexer_less(n) {
return this.unput(this.match.slice(n));
},
// return (part of the) already matched input, i.e. for error messages.
// Limit the returned string length to `maxSize` (default: 20).
// Limit the returned string to the `maxLines` number of lines of input (default: 1).
// Negative limit values equal *unlimited*.
pastInput: function lexer_pastInput(maxSize, maxLines) {
var past = this.matched.substring(0, this.matched.length - this.match.length);
if (maxSize < 0)
maxSize = past.length;
else if (!maxSize)
maxSize = 20;
if (maxLines < 0)
maxLines = past.length; // can't ever have more input lines than this!
else if (!maxLines)
maxLines = 1;
// `substr` anticipation: treat \r\n as a single character and take a little
// more than necessary so that we can still properly check against maxSize
// after we've transformed and limited the newLines in here:
past = past.substr(-maxSize * 2 - 2);
// now that we have a significantly reduced string to process, transform the newlines
// and chop them, then limit them:
var a = past.replace(/\r\n|\r/g, '\n').split('\n');
a = a.slice(-maxLines);
past = a.join('\n');
// When, after limiting to maxLines, we still have too much to return,
// do add an ellipsis prefix...
if (past.length > maxSize) {
past = '...' + past.substr(-maxSize);
}
return past;
},
// return (part of the) upcoming input, i.e. for error messages.
// Limit the returned string length to `maxSize` (default: 20).
// Limit the returned string to the `maxLines` number of lines of input (default: 1).
// Negative limit values equal *unlimited*.
upcomingInput: function lexer_upcomingInput(maxSize, maxLines) {
var next = this.match;
if (maxSize < 0)
maxSize = next.length + this._input.length;
else if (!maxSize)
maxSize = 20;
if (maxLines < 0)
maxLines = maxSize; // can't ever have more input lines than this!
else if (!maxLines)
maxLines = 1;
// `substring` anticipation: treat \r\n as a single character and take a little
// more than necessary so that we can still properly check against maxSize
// after we've transformed and limited the newLines in here:
if (next.length < maxSize * 2 + 2) {
next += this._input.substring(0, maxSize * 2 + 2); // substring is faster on Chrome/V8
}
// now that we have a significantly reduced string to process, transform the newlines
// and chop them, then limit them:
var a = next.replace(/\r\n|\r/g, '\n').split('\n');
a = a.slice(0, maxLines);
next = a.join('\n');
// When, after limiting to maxLines, we still have too much to return,
// do add an ellipsis postfix...
if (next.length > maxSize) {
next = next.substring(0, maxSize) + '...';
}
return next;
},
// return a string which displays the character position where the lexing error occurred, i.e. for error messages
showPosition: function lexer_showPosition(maxPrefix, maxPostfix) {
var pre = this.pastInput(maxPrefix).replace(/\s/g, ' ');
var c = new Array(pre.length + 1).join('-');
return pre + this.upcomingInput(maxPostfix).replace(/\s/g, ' ') + '\n' + c + '^';
},
// helper function, used to produce a human readable description as a string, given
// the input `yylloc` location object.
// Set `display_range_too` to TRUE to include the string character index position(s)
// in the description if the `yylloc.range` is available.
describeYYLLOC: function lexer_describe_yylloc(yylloc, display_range_too) {
var l1 = yylloc.first_line;
var l2 = yylloc.last_line;
var o1 = yylloc.first_column;
var o2 = yylloc.last_column - 1;
var dl = l2 - l1;
var d_o = (dl === 0 ? o2 - o1 : 1000);
var rv;
if (dl === 0) {
rv = 'line ' + l1 + ', ';
if (d_o === 0) {
rv += 'column ' + o1;
} else {
rv += 'columns ' + o1 + ' .. ' + o2;
}
} else {
rv = 'lines ' + l1 + '(column ' + o1 + ') .. ' + l2 + '(column ' + o2 + ')';
}
if (yylloc.range && display_range_too) {
var r1 = yylloc.range[0];
var r2 = yylloc.range[1] - 1;
if (r2 === r1) {
rv += ' {String Offset: ' + r1 + '}';
} else {
rv += ' {String Offset range: ' + r1 + ' .. ' + r2 + '}';
}
}
return rv;
// return JSON.stringify(yylloc);
},
// test the lexed token: return FALSE when not a match, otherwise return token.
//
// `match` is supposed to be an array coming out of a regex match, i.e. `match[0]`
// contains the actually matched text string.
//
// Also move the input cursor forward and update the match collectors:
// - yytext
// - yyleng
// - match
// - matches
// - yylloc
// - offset
test_match: function lexer_test_match(match, indexed_rule) {
var token,
lines,
backup,
match_str;
if (this.options.backtrack_lexer) {
// save context
backup = {
yylineno: this.yylineno,
yylloc: {
first_line: this.yylloc.first_line,
last_line: this.yylloc.last_line,
first_column: this.yylloc.first_column,
last_column: this.yylloc.last_column
},
yytext: this.yytext,
match: this.match,
matches: this.matches,
matched: this.matched,
yyleng: this.yyleng,
offset: this.offset,
_more: this._more,
_input: this._input,
yy: this.yy,
conditionStack: this.conditionStack.slice(0),
done: this.done
};
if (this.options.ranges) {
backup.yylloc.range = this.yylloc.range.slice(0);
}
}
match_str = match[0];
lines = match_str.match(/(?:\r\n?|\n).*/g);
if (lines) {
this.yylineno += lines.length;
}
this.yylloc = {
first_line: this.yylloc.last_line,
last_line: this.yylineno + 1,
first_column: this.yylloc.last_column,
last_column: lines ?
lines[lines.length - 1].length - lines[lines.length - 1].match(/\r?\n?/)[0].length :
this.yylloc.last_column + match_str.length
};
this.yytext += match_str;
this.match += match_str;
this.matches = match;
this.yyleng = this.yytext.length;
if (this.options.ranges) {
this.yylloc.range = [this.offset, this.offset + this.yyleng];
}
// previous lex rules MAY have invoked the `more()` API rather than producing a token:
// those rules will already have moved this `offset` forward matching their match lengths,
// hence we must only add our own match length now:
this.offset += match_str.length;
this._more = false;
this._backtrack = false;
this._input = this._input.slice(match_str.length);
this.matched += match_str;
// calling this method:
//
// function lexer__performAction(yy, yy_, $avoiding_name_collisions, YY_START) {...}
token = this.performAction.call(this, this.yy, this, indexed_rule, this.conditionStack[this.conditionStack.length - 1] /* = YY_START */);
// otherwise, when the action codes are all simple return token statements:
//token = this.simpleCaseActionClusters[indexed_rule];
if (this.done && this._input) {
this.done = false;
}
if (token) {
return token;
} else if (this._backtrack) {
// recover context
for (var k in backup) {
this[k] = backup[k];
}
this.__currentRuleSet__ = null;
return false; // rule action called reject() implying the next rule should be tested instead.
} else if (this._signaled_error_token) {
// produce one 'error' token as .parseError() in reject() did not guarantee a failure signal by throwing an exception!
token = this._signaled_error_token;
this._signaled_error_token = false;
return token;
}
return false;
},
// return next match in input
next: function lexer_next() {
if (this.done) {
this.clear();
return this.EOF;
}
if (!this._input) {
this.done = true;
}
var token,
match,
tempMatch,
index;
if (!this._more) {
this.clear();
}
var spec = this.__currentRuleSet__;
if (!spec) {
// Update the ruleset cache as we apparently encountered a state change or just started lexing.
// The cache is set up for fast lookup -- we assume a lexer will switch states much less often than it will
// invoke the `lex()` token-producing API and related APIs, hence caching the set for direct access helps
// speed up those activities a tiny bit.
spec = this.__currentRuleSet__ = this._currentRules();
}
var rule_ids = spec.rules;
// var dispatch = spec.__dispatch_lut;
var regexes = spec.__rule_regexes;
var len = spec.__rule_count;
// var c0 = this._input[0];
// Note: the arrays are 1-based, while `len` itself is a valid index,
// hence the non-standard less-or-equal check in the next loop condition!
//
// `dispatch` is a lookup table which lists the *first* rule which matches the 1-char *prefix* of the rule-to-match.
// By using that array as a jumpstart, we can cut down on the otherwise O(n*m) behaviour of this lexer, down to
// O(n) ideally, where:
//
// - N is the number of input particles -- which is not precisely characters
// as we progress on a per-regex-match basis rather than on a per-character basis
//
// - M is the number of rules (regexes) to test in the active condition state.
//
for (var i = 1 /* (dispatch[c0] || 1) */ ; i <= len; i++) {
tempMatch = this._input.match(regexes[i]);
if (tempMatch && (!match || tempMatch[0].length > match[0].length)) {
match = tempMatch;
index = i;
if (this.options.backtrack_lexer) {
token = this.test_match(tempMatch, rule_ids[i]);
if (token !== false) {
return token;
} else if (this._backtrack) {
match = undefined;
continue; // rule action called reject() implying a rule MISmatch.
} else {
// else: this is a lexer rule which consumes input without producing a token (e.g. whitespace)
return false;
}
} else if (!this.options.flex) {
break;
}
}
}
if (match) {
token = this.test_match(match, rule_ids[index]);
if (token !== false) {
return token;
}
// else: this is a lexer rule which consumes input without producing a token (e.g. whitespace)
return false;
}
if (this._input === '') {
this.done = true;
return this.EOF;
} else {
var p = this.constructLexErrorInfo('Lexical error on line ' + (this.yylineno + 1) + '. Unrecognized text.\n' + this.showPosition(), this.options.lexer_errors_are_recoverable);
token = (this.parseError(p.errStr, p) || this.ERROR);
if (token === this.ERROR) {
// we can try to recover from a lexer error that parseError() did not 'recover' for us, by moving forward at least one character at a time:
if (!this.match.length) {
this.input();
}
}
return token;
}
},
// return next match that has a token
lex: function lexer_lex() {
var r;
// allow the PRE/POST handlers set/modify the return token for maximum flexibility of the generated lexer:
if (typeof this.options.pre_lex === 'function') {
r = this.options.pre_lex.call(this);
}
while (!r) {
r = this.next();
}
if (typeof this.options.post_lex === 'function') {
// (also account for a userdef function which does not return any value: keep the token as is)
r = this.options.post_lex.call(this, r) || r;
}
return r;
},
// backwards compatible alias for `pushState()`;
// the latter is symmetrical with `popState()` and we advise to use
// those APIs in any modern lexer code, rather than `begin()`.
begin: function lexer_begin(condition) {
return this.pushState(condition);
},
// activates a new lexer condition state (pushes the new lexer condition state onto the condition stack)
pushState: function lexer_pushState(condition) {
this.conditionStack.push(condition);
this.__currentRuleSet__ = null;
return this;
},
// pop the previously active lexer condition state off the condition stack
popState: function lexer_popState() {
var n = this.conditionStack.length - 1;
if (n > 0) {
this.__currentRuleSet__ = null;
return this.conditionStack.pop();
} else {
return this.conditionStack[0];
}
},
// return the currently active lexer condition state; when an index argument is provided it produces the N-th previous condition state, if available
topState: function lexer_topState(n) {
n = this.conditionStack.length - 1 - Math.abs(n || 0);
if (n >= 0) {
return this.conditionStack[n];
} else {
return 'INITIAL';
}
},
// (internal) determine the lexer rule set which is active for the currently active lexer condition state
_currentRules: function lexer__currentRules() {
if (this.conditionStack.length && this.conditionStack[this.conditionStack.length - 1]) {
return this.conditions[this.conditionStack[this.conditionStack.length - 1]];
} else {
return this.conditions['INITIAL'];
}
},
// return the number of states currently on the stack
stateStackSize: function lexer_stateStackSize() {
return this.conditionStack.length;
},
options: {},
JisonLexerError: JisonLexerError,
performAction: function lexer__performAction(yy, yy_, $avoiding_name_collisions, YY_START) {
var YYSTATE = YY_START;
switch($avoiding_name_collisions) {
case 0 :
/*! Conditions:: INITIAL */
/*! Rule:: [ \t\r\n]+ */
/* eat up whitespace */
BeginToken(yy_.yytext);
break;
case 1 :
/*! Conditions:: INITIAL */
/*! Rule:: {DIGIT}+ */
BeginToken(yy_.yytext);
yylval.value = atof(yy_.yytext);
return VALUE;
break;
case 2 :
/*! Conditions:: INITIAL */
/*! Rule:: {DIGIT}+\.{DIGIT}* */
BeginToken(yy_.yytext);
yylval.value = atof(yy_.yytext);
return VALUE;
break;
case 3 :
/*! Conditions:: INITIAL */
/*! Rule:: {DIGIT}+[eE]["+""-"]?{DIGIT}* */
BeginToken(yy_.yytext);
yylval.value = atof(yy_.yytext);
return VALUE;
break;
case 4 :
/*! Conditions:: INITIAL */
/*! Rule:: {DIGIT}+\.{DIGIT}*[eE]["+""-"]?{DIGIT}* */
BeginToken(yy_.yytext);
yylval.value = atof(yy_.yytext);
return VALUE;
break;
case 5 :
/*! Conditions:: INITIAL */
/*! Rule:: {ID} */
BeginToken(yy_.yytext);
yylval.string = malloc(strlen(yy_.yytext)+1);
strcpy(yylval.string, yy_.yytext);
return IDENTIFIER;
break;
case 6 :
/*! Conditions:: INITIAL */
/*! Rule:: \+ */
BeginToken(yy_.yytext); return ADD;
break;
case 7 :
/*! Conditions:: INITIAL */
/*! Rule:: - */
BeginToken(yy_.yytext); return SUB;
break;
case 8 :
/*! Conditions:: INITIAL */
/*! Rule:: \* */
BeginToken(yy_.yytext); return MULT;
break;
case 9 :
/*! Conditions:: INITIAL */
/*! Rule:: \/ */
BeginToken(yy_.yytext); return DIV;
break;
case 10 :
/*! Conditions:: INITIAL */
/*! Rule:: \( */
BeginToken(yy_.yytext); return LBRACE;
break;
case 11 :
/*! Conditions:: INITIAL */
/*! Rule:: \) */
BeginToken(yy_.yytext); return RBRACE;
break;
case 12 :
/*! Conditions:: INITIAL */
/*! Rule:: ; */
BeginToken(yy_.yytext); return SEMICOLON;
break;
case 13 :
/*! Conditions:: INITIAL */
/*! Rule:: = */
BeginToken(yy_.yytext); return ASSIGN;
break;
case 14 :
/*! Conditions:: INITIAL */
/*! Rule:: . */
BeginToken(yy_.yytext);
return yy_.yytext[0];
break;
default:
return this.simpleCaseActionClusters[$avoiding_name_collisions];
}
},
simpleCaseActionClusters: {
},
rules: [
/^(?:[ \t\r\n]+)/,
/^(?:(\d)+)/,
/^(?:(\d)+\.(\d)*)/,
/^(?:(\d)+[Ee]["+]?(\d)*)/,
/^(?:(\d)+\.(\d)*[Ee]["+]?(\d)*)/,
/^(?:([^\W\d]\w*))/,
/^(?:\+)/,
/^(?:-)/,
/^(?:\*)/,
/^(?:\/)/,
/^(?:\()/,
/^(?:\))/,
/^(?:;)/,
/^(?:=)/,
/^(?:.)/
],
conditions: {
"INITIAL": {
rules: [
0,
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14
],
inclusive: true
}
}
};
/*--------------------------------------------------------------------
* lex.l
*------------------------------------------------------------------*/;
return lexer;
})();

(File diff suppressed because it is too large.)


@@ -0,0 +1,31 @@
/**
* @fileoverview
* @enhanceable
* @public
*/
// GENERATED CODE -- DO NOT EDIT!
goog.provide('proto.google.protobuf.Timestamp');
goog.require('jspb.Message');
/**
* Generated by JsPbCodeGenerator.
* @param {Array=} opt_data Optional initial data array, typically from a
* server response, or constructed directly in Javascript. The array is used
* in place and becomes part of the constructed object. It is not cloned.
* If no data is provided, the constructed object will be empty, but still
* valid.
* @extends {jspb.Message}
* @constructor
*/
proto.google.protobuf.Timestamp = function(opt_data) {
jspb.Message.initialize(this, opt_data, 0, -1, null, null);
};
goog.inherits(proto.google.protobuf.Timestamp, jspb.Message);
if (goog.DEBUG && !COMPILED) {
proto.google.protobuf.Timestamp.displayName = 'proto.google.protobuf.Timestamp';
}
// Remainder elided


@@ -0,0 +1,39 @@
digit [0-9]
id [a-zA-Z][a-zA-Z0-9]*
%%
"//".* /* ignore comment */
"main" return 'MAIN';
"class" return 'CLASS';
"extends" return 'EXTENDS';
"nat" return 'NATTYPE';
"if" return 'IF';
"else" return 'ELSE';
"for" return 'FOR';
"printNat" return 'PRINTNAT';
"readNat" return 'READNAT';
"this" return 'THIS';
"new" return 'NEW';
"var" return 'VAR';
"null" return 'NUL';
{digit}+ return 'NATLITERAL';
{id} return 'ID';
"==" return 'EQUALITY';
"=" return 'ASSIGN';
"+" return 'PLUS';
"-" return 'MINUS';
"*" return 'TIMES';
">" return 'GREATER';
"||" return 'OR';
"!" return 'NOT';
"." return 'DOT';
"{" return 'LBRACE';
"}" return 'RBRACE';
"(" return 'LPAREN';
")" return 'RPAREN';
";" return 'SEMICOLON';
\s+ /* skip whitespace */
"." throw 'Illegal character';
<<EOF>> return 'ENDOFFILE';


@@ -0,0 +1,29 @@
%%
\n+ {yy.freshLine = true;}
\s+ {yy.freshLine = false;}
"y{"[^}]*"}" {yytext = yytext.substr(2, yyleng - 3); return 'ACTION';}
[a-zA-Z_][a-zA-Z0-9_-]* {return 'NAME';}
'"'([^"]|'\"')*'"' {return 'STRING_LIT';}
"'"([^']|"\'")*"'" {return 'STRING_LIT';}
"|" {return '|';}
"["("\]"|[^\]])*"]" {return 'ANY_GROUP_REGEX';}
"(" {return '(';}
")" {return ')';}
"+" {return '+';}
"*" {return '*';}
"?" {return '?';}
"^" {return '^';}
"/" {return '/';}
"\\"[a-zA-Z0] {return 'ESCAPE_CHAR';}
"$" {return '$';}
"<<EOF>>" {return '$';}
"." {return '.';}
"%%" {return '%%';}
"{"\d+(","\s?\d+|",")?"}" {return 'RANGE_REGEX';}
/"{" %{if (yy.freshLine) { this.input('{'); return '{'; } else { this.unput('y'); }%}
"}" %{return '}';%}
"%{"(.|\n)*?"}%" {yytext = yytext.substr(2, yyleng - 4); return 'ACTION';}
. {/* ignore bad characters */}
<<EOF>> {return 'EOF';}

samples/Jison/ansic.jison Normal file

@@ -0,0 +1,418 @@
%token IDENTIFIER CONSTANT STRING_LITERAL SIZEOF
%token PTR_OP INC_OP DEC_OP LEFT_OP RIGHT_OP LE_OP GE_OP EQ_OP NE_OP
%token AND_OP OR_OP MUL_ASSIGN DIV_ASSIGN MOD_ASSIGN ADD_ASSIGN
%token SUB_ASSIGN LEFT_ASSIGN RIGHT_ASSIGN AND_ASSIGN
%token XOR_ASSIGN OR_ASSIGN TYPE_NAME
%token TYPEDEF EXTERN STATIC AUTO REGISTER
%token CHAR SHORT INT LONG SIGNED UNSIGNED FLOAT DOUBLE CONST VOLATILE VOID
%token STRUCT UNION ENUM ELLIPSIS
%token CASE DEFAULT IF ELSE SWITCH WHILE DO FOR GOTO CONTINUE BREAK RETURN
%nonassoc IF_WITHOUT_ELSE
%nonassoc ELSE
%start translation_unit
%%
primary_expression
: IDENTIFIER
| CONSTANT
| STRING_LITERAL
| '(' expression ')'
;
postfix_expression
: primary_expression
| postfix_expression '[' expression ']'
| postfix_expression '(' ')'
| postfix_expression '(' argument_expression_list ')'
| postfix_expression '.' IDENTIFIER
| postfix_expression PTR_OP IDENTIFIER
| postfix_expression INC_OP
| postfix_expression DEC_OP
;
argument_expression_list
: assignment_expression
| argument_expression_list ',' assignment_expression
;
unary_expression
: postfix_expression
| INC_OP unary_expression
| DEC_OP unary_expression
| unary_operator cast_expression
| SIZEOF unary_expression
| SIZEOF '(' type_name ')'
;
unary_operator
: '&'
| '*'
| '+'
| '-'
| '~'
| '!'
;
cast_expression
: unary_expression
| '(' type_name ')' cast_expression
;
multiplicative_expression
: cast_expression
| multiplicative_expression '*' cast_expression
| multiplicative_expression '/' cast_expression
| multiplicative_expression '%' cast_expression
;
additive_expression
: multiplicative_expression
| additive_expression '+' multiplicative_expression
| additive_expression '-' multiplicative_expression
;
shift_expression
: additive_expression
| shift_expression LEFT_OP additive_expression
| shift_expression RIGHT_OP additive_expression
;
relational_expression
: shift_expression
| relational_expression '<' shift_expression
| relational_expression '>' shift_expression
| relational_expression LE_OP shift_expression
| relational_expression GE_OP shift_expression
;
equality_expression
: relational_expression
| equality_expression EQ_OP relational_expression
| equality_expression NE_OP relational_expression
;
and_expression
: equality_expression
| and_expression '&' equality_expression
;
exclusive_or_expression
: and_expression
| exclusive_or_expression '^' and_expression
;
inclusive_or_expression
: exclusive_or_expression
| inclusive_or_expression '|' exclusive_or_expression
;
logical_and_expression
: inclusive_or_expression
| logical_and_expression AND_OP inclusive_or_expression
;
logical_or_expression
: logical_and_expression
| logical_or_expression OR_OP logical_and_expression
;
conditional_expression
: logical_or_expression
| logical_or_expression '?' expression ':' conditional_expression
;
assignment_expression
: conditional_expression
| unary_expression assignment_operator assignment_expression
;
assignment_operator
: '='
| MUL_ASSIGN
| DIV_ASSIGN
| MOD_ASSIGN
| ADD_ASSIGN
| SUB_ASSIGN
| LEFT_ASSIGN
| RIGHT_ASSIGN
| AND_ASSIGN
| XOR_ASSIGN
| OR_ASSIGN
;
expression
: assignment_expression
| expression ',' assignment_expression
;
constant_expression
: conditional_expression
;
declaration
: declaration_specifiers ';'
| declaration_specifiers init_declarator_list ';'
;
declaration_specifiers
: storage_class_specifier
| storage_class_specifier declaration_specifiers
| type_specifier
| type_specifier declaration_specifiers
| type_qualifier
| type_qualifier declaration_specifiers
;
init_declarator_list
: init_declarator
| init_declarator_list ',' init_declarator
;
init_declarator
: declarator
| declarator '=' initializer
;
storage_class_specifier
: TYPEDEF
| EXTERN
| STATIC
| AUTO
| REGISTER
;
type_specifier
: VOID
| CHAR
| SHORT
| INT
| LONG
| FLOAT
| DOUBLE
| SIGNED
| UNSIGNED
| struct_or_union_specifier
| enum_specifier
| TYPE_NAME
;
struct_or_union_specifier
: struct_or_union IDENTIFIER '{' struct_declaration_list '}'
| struct_or_union '{' struct_declaration_list '}'
| struct_or_union IDENTIFIER
;
struct_or_union
: STRUCT
| UNION
;
struct_declaration_list
: struct_declaration
| struct_declaration_list struct_declaration
;
struct_declaration
: specifier_qualifier_list struct_declarator_list ';'
;
specifier_qualifier_list
: type_specifier specifier_qualifier_list
| type_specifier
| type_qualifier specifier_qualifier_list
| type_qualifier
;
struct_declarator_list
: struct_declarator
| struct_declarator_list ',' struct_declarator
;
struct_declarator
: declarator
| ':' constant_expression
| declarator ':' constant_expression
;
enum_specifier
: ENUM '{' enumerator_list '}'
| ENUM IDENTIFIER '{' enumerator_list '}'
| ENUM IDENTIFIER
;
enumerator_list
: enumerator
| enumerator_list ',' enumerator
;
enumerator
: IDENTIFIER
| IDENTIFIER '=' constant_expression
;
type_qualifier
: CONST
| VOLATILE
;
declarator
: pointer direct_declarator
| direct_declarator
;
direct_declarator
: IDENTIFIER
| '(' declarator ')'
| direct_declarator '[' constant_expression ']'
| direct_declarator '[' ']'
| direct_declarator '(' parameter_type_list ')'
| direct_declarator '(' identifier_list ')'
| direct_declarator '(' ')'
;
pointer
: '*'
| '*' type_qualifier_list
| '*' pointer
| '*' type_qualifier_list pointer
;
type_qualifier_list
: type_qualifier
| type_qualifier_list type_qualifier
;
parameter_type_list
: parameter_list
| parameter_list ',' ELLIPSIS
;
parameter_list
: parameter_declaration
| parameter_list ',' parameter_declaration
;
parameter_declaration
: declaration_specifiers declarator
| declaration_specifiers abstract_declarator
| declaration_specifiers
;
identifier_list
: IDENTIFIER
| identifier_list ',' IDENTIFIER
;
type_name
: specifier_qualifier_list
| specifier_qualifier_list abstract_declarator
;
abstract_declarator
: pointer
| direct_abstract_declarator
| pointer direct_abstract_declarator
;
direct_abstract_declarator
: '(' abstract_declarator ')'
| '[' ']'
| '[' constant_expression ']'
| direct_abstract_declarator '[' ']'
| direct_abstract_declarator '[' constant_expression ']'
| '(' ')'
| '(' parameter_type_list ')'
| direct_abstract_declarator '(' ')'
| direct_abstract_declarator '(' parameter_type_list ')'
;
initializer
: assignment_expression
| '{' initializer_list '}'
| '{' initializer_list ',' '}'
;
initializer_list
: initializer
| initializer_list ',' initializer
;
statement
: labeled_statement
| compound_statement
| expression_statement
| selection_statement
| iteration_statement
| jump_statement
;
labeled_statement
: IDENTIFIER ':' statement
| CASE constant_expression ':' statement
| DEFAULT ':' statement
;
compound_statement
: '{' '}'
| '{' statement_list '}'
| '{' declaration_list '}'
| '{' declaration_list statement_list '}'
;
declaration_list
: declaration
| declaration_list declaration
;
statement_list
: statement
| statement_list statement
;
expression_statement
: ';'
| expression ';'
;
selection_statement
: IF '(' expression ')' statement %prec IF_WITHOUT_ELSE
| IF '(' expression ')' statement ELSE statement
| SWITCH '(' expression ')' statement
;
iteration_statement
: WHILE '(' expression ')' statement
| DO statement WHILE '(' expression ')' ';'
| FOR '(' expression_statement expression_statement ')' statement
| FOR '(' expression_statement expression_statement expression ')' statement
;
jump_statement
: GOTO IDENTIFIER ';'
| CONTINUE ';'
| BREAK ';'
| RETURN ';'
| RETURN expression ';'
;
translation_unit
: external_declaration
| translation_unit external_declaration
;
external_declaration
: function_definition
| declaration
;
function_definition
: declaration_specifiers declarator declaration_list compound_statement
| declaration_specifiers declarator compound_statement
| declarator declaration_list compound_statement
| declarator compound_statement
;


@@ -0,0 +1,84 @@
/* description: ClassyLang grammar. Very classy. */
/*
To build parser:
$ ./bin/jison examples/classy.jison examples/classy.jisonlex
*/
/* author: Zach Carter */
%right ASSIGN
%left OR
%nonassoc EQUALITY GREATER
%left PLUS MINUS
%left TIMES
%right NOT
%left DOT
%%
pgm
: cdl MAIN LBRACE vdl el RBRACE ENDOFFILE
;
cdl
: c cdl
|
;
c
: CLASS id EXTENDS id LBRACE vdl mdl RBRACE
;
vdl
: VAR t id SEMICOLON vdl
|
;
mdl
: t id LPAREN t id RPAREN LBRACE vdl el RBRACE mdl
|
;
t
: NATTYPE
| id
;
id
: ID
;
el
: e SEMICOLON el
| e SEMICOLON
;
e
: NATLITERAL
| NUL
| id
| NEW id
| THIS
| IF LPAREN e RPAREN LBRACE el RBRACE ELSE LBRACE el RBRACE
| FOR LPAREN e SEMICOLON e SEMICOLON e RPAREN LBRACE el RBRACE
| READNAT LPAREN RPAREN
| PRINTNAT LPAREN e RPAREN
| e PLUS e
| e MINUS e
| e TIMES e
| e EQUALITY e
| e GREATER e
| NOT e
| e OR e
| e DOT id
| id ASSIGN e
| e DOT id ASSIGN e
| id LPAREN e RPAREN
| e DOT id LPAREN e RPAREN
| LPAREN e RPAREN
;

samples/Jison/lex.jison Normal file

@@ -0,0 +1,145 @@
// `%nonassoc` tells the parser compiler (JISON) that these tokens cannot occur more than once,
// i.e. input like '//a' (tokens '/', '/' and 'a') is not a legal input while '/a' (tokens '/' and 'a')
// *is* legal input for this grammar.
%nonassoc '/' '/!'
// Likewise for `%left`: this informs the LALR(1) grammar compiler (JISON) that these tokens
// *can* occur repeatedly, e.g. 'a?*' and even 'a**' are considered legal inputs given this
// grammar!
//
// Token `RANGE_REGEX` may seem the odd one out here but really isn't: given the `regex_base`
// choice/rule `regex_base range_regex`, which is recursive, this grammar tells JISON that
// any input matching a sequence like `regex_base range_regex range_regex` *is* legal.
// If you do not want that to be legal, you MUST adjust the grammar rule set to match
// your actual intent.
%left '*' '+' '?' RANGE_REGEX
%%
lex
: definitions include '%%' rules '%%' EOF
{{ $$ = {macros: $1, rules: $4};
if ($2) $$.actionInclude = $2;
return $$; }}
| definitions include '%%' rules EOF
{{ $$ = {macros: $1, rules: $4};
if ($2) $$.actionInclude = $2;
return $$; }}
;
include
: action
|
;
definitions
: definitions definition
{ $$ = $1; $$.concat($2); }
| definition
{ $$ = [$1]; }
;
definition
: name regex
{ $$ = [$1, $2]; }
;
name
: NAME
{ $$ = yytext; }
;
rules
: rules rule
{ $$ = $1; $$.push($2); }
| rule
{ $$ = [$1]; }
;
rule
: regex action
{ $$ = [$1, $2]; }
;
action
: ACTION
{ $$ = yytext; }
;
regex
: start_caret regex_list end_dollar
{ $$ = $1+$2+$3; }
;
start_caret
: '^'
{ $$ = '^'; }
|
{ $$ = ''; }
;
end_dollar
: '$'
{ $$ = '$'; }
|
{ $$ = ''; }
;
regex_list
: regex_list '|' regex_chain
{ $$ = $1+'|'+$3; }
| regex_chain
;
regex_chain
: regex_chain regex_base
{ $$ = $1+$2;}
| regex_base
{ $$ = $1;}
;
regex_base
: '(' regex_list ')'
{ $$ = '('+$2+')'; }
| regex_base '+'
{ $$ = $1+'+'; }
| regex_base '*'
{ $$ = $1+'*'; }
| regex_base '?'
{ $$ = $1+'?'; }
| '/' regex_base
{ $$ = '(?=' + $regex_base + ')'; }
| '/!' regex_base
{ $$ = '(?!' + $regex_base + ')'; }
| name_expansion
| regex_base range_regex
{ $$ = $1+$2; }
| any_group_regex
| '.'
{ $$ = '.'; }
| string
;
name_expansion
: '{' name '}'
{{ $$ = '{'+$2+'}'; }}
;
any_group_regex
: ANY_GROUP_REGEX
{ $$ = yytext; }
;
range_regex
: RANGE_REGEX
{ $$ = yytext; }
;
string
: STRING_LIT
{ $$ = yy.prepareString(yytext.substr(1, yyleng-2)); }
;

samples/Jolie/common.iol Normal file

@@ -0,0 +1,37 @@
include "types/Binding.iol"
constants {
Location_Exam = "socket://localhost:8000"
}
type StartExamRequest:void {
.examName:string
.studentName:string
.student:Binding
}
type MakeQuestionRequest:void {
.question:string
.examName:string
.studentName:string
}
type DecisionMessage:void {
.studentName:string
.examName:string
}
interface ExamInterface {
OneWay:
startExam(StartExamRequest),
pass(DecisionMessage), fail(DecisionMessage)
RequestResponse:
makeQuestion(MakeQuestionRequest)(int)
}
interface StudentInterface {
OneWay:
sendMessage(string)
RequestResponse:
makeQuestion(MakeQuestionRequest)(int)
}

samples/Jolie/exam.ol Normal file

@@ -0,0 +1,39 @@
include "common.iol"
cset {
studentName:
StartExamRequest.studentName
DecisionMessage.studentName
MakeQuestionRequest.studentName,
examName:
StartExamRequest.examName
DecisionMessage.examName
MakeQuestionRequest.examName
}
execution { concurrent }
outputPort Student {
Interfaces: StudentInterface
}
inputPort ExamInput {
Location: Location_Exam
Protocol: sodep
Interfaces: ExamInterface
}
main
{
startExam( examRequest );
Student << examRequest.student;
makeQuestion( question )( answer ) {
makeQuestion@Student( question )( answer )
};
[ pass( message ) ] {
sendMessage@Student( "You passed!" )
}
[ fail( message ) ] {
sendMessage@Student( "You failed!" )
}
}

samples/Jolie/examiner.ol Normal file

@@ -0,0 +1,26 @@
include "common.iol"
include "ui/swing_ui.iol"
include "console.iol"
outputPort Exam {
Location: Location_Exam
Protocol: sodep
Interfaces: ExamInterface
}
main
{
question.studentName = "John";
question.examName = "SPLG";
question.question = "Random question";
makeQuestion@Exam( question )( answer );
showYesNoQuestionDialog@SwingUI( "Do you want to accept answer " + answer + " ?" )( decision );
message.studentName = "John";
message.examName = "SPLG";
if ( decision == 0 ) {
pass@Exam( message )
} else {
fail@Exam( message )
}
}

samples/Jolie/hanoi.ol Normal file

@@ -0,0 +1,84 @@
// https://github.com/jolie/website/blob/master/docs/documentation/locations/code/local.ol
include "runtime.iol"
include "string_utils.iol"
type HanoiRequest: void{
.src: string
.aux: string
.dst: string
.n: int
.sid?: string
}
type HanoiResponse: void {
.move?: string
}
interface LocalOperations{
RequestResponse:
hanoiSolver( HanoiRequest )( HanoiResponse )
}
interface ExternalOperations{
RequestResponse:
hanoi( HanoiRequest )( string )
}
outputPort Self{
Interfaces: LocalOperations
}
inputPort Self {
Location: "local"
Interfaces: LocalOperations
}
inputPort PowerService {
Location: "socket://localhost:8000"
Protocol: http{
.format = "html"
}
Interfaces: ExternalOperations
}
execution { concurrent }
init
{
getLocalLocation@Runtime()( Self.location )
}
main
{
[ hanoi( request )( response ){
getRandomUUID@StringUtils()(request.sid);
hanoiSolver@Self( request )( subRes );
response = subRes.move
}]{ nullProcess }
[ hanoiSolver( request )( response ){
if ( request.n > 0 ){
subReq.n = request.n;
subReq.n--;
with( request ){
subReq.aux = .dst;
subReq.dst = .aux;
subReq.src = .src;
subReq.sid = .sid
};
hanoiSolver@Self( subReq )( response );
response.move += "<br>" +
++global.counters.(request.sid) +
") Move from " + request.src +
" to " + request.dst + ";";
with ( request ){
subReq.src = .aux;
subReq.aux = .src;
subReq.dst = .dst
};
hanoiSolver@Self( subReq )( subRes );
response.move += subRes.move
}
}]{ nullProcess }
}

samples/Jolie/student.ol Normal file

@@ -0,0 +1,29 @@
include "common.iol"
include "ui/swing_ui.iol"
include "console.iol"
outputPort Exam {
Location: Location_Exam
Protocol: sodep
Interfaces: ExamInterface
}
inputPort StudentInput {
Location: "socket://localhost:8001/"
Protocol: sodep
Interfaces: StudentInterface
}
main
{
request.studentName = "John";
request.examName = "SPLG";
request.student.location = "socket://localhost:8001/";
request.student.protocol = "sodep";
startExam@Exam( request );
makeQuestion( question )( answer ) {
showYesNoQuestionDialog@SwingUI( question.question )( answer )
};
sendMessage( message );
println@Console( message )()
}


@@ -0,0 +1,49 @@
- label: 'desired label name'
- connection: connection_name
- include: filename_or_pattern
# Possibly more include declarations
- persist_for: N (seconds | minutes | hours)
- case_sensitive: true | false
- week_start_day: monday | tuesday | wednesday | thursday | friday | saturday | sunday
- value_formats:
- name: desired_format_name
value_format: 'excel-style formatting string'
# Possibly more value formats
- explore: view_name
label: 'desired label name'
description: 'description string'
symmetric_aggregates: true | false
hidden: true | false
fields: [field_or_set, field_or_set, …]
sql_always_where: SQL WHERE condition
always_filter:
field_name: 'looker filter expression'
conditionally_filter:
field_name: 'looker filter expression'
unless: [field_or_set, field_or_set, …]
access_filter_fields: [fully_scoped_field, fully_scoped_field, …]
always_join: [view_name, view_name, …]
joins:
- join: view_name
type: left_outer | full_outer | inner | cross
relationship: one_to_one | many_to_one | one_to_many | many_to_many
from: view_name
sql_table_name: table_name
view_label: 'desired label name'
fields: [field_or_set, field_or_set, …]
required_joins: [view_name, view_name, …]
foreign_key: dimension_name
sql_on: SQL ON clause
# Possibly more join declarations
persist_for: N (seconds | minutes | hours)
from: view_name
view: view_name
case_sensitive: true | false
sql_table_name: table_name
cancel_grouping_fields: [fully_scoped_field, fully_scoped_field, …]
# Possibly more explore declarations


@@ -0,0 +1,90 @@
- view: view_name
sql_table_name: table_name
suggestions: true | false
derived_table:
sql: SQL query
persist_for: N (seconds | minutes | hours)
sql_trigger_value: SQL query
distribution: column_name
distribution_style: ALL | EVEN
sortkeys: [column_name, column_name, …]
indexes: [column_name, column_name, …]
sets:
set_name:
- field_or_set
- field_or_set
- …
# Possibly more set declarations
fields:
- (dimension | dimension_group | measure | filter): field_name
label: 'desired label name'
view_label: 'desired label name'
group_label: 'desired label name'
description: 'description string'
hidden: true | false
alias: [old_field_name, old_field_name, …]
value_format: 'excel-style formatting string'
value_format_name: format_name
html: HTML expression using Liquid template elements
sql: SQL expression to generate the field value
required_fields: [field_name, field_name, …]
drill_fields: [field_or_set, field_or_set, …]
can_filter: true | false
fanout_on: repeated_record_name
# DIMENSION SPECIFIC PARAMETERS
type: dimension_field_type
primary_key: true | false
sql_case:
value: SQL condition
value: SQL condition
# Possibly more sql_case statements
alpha_sort: true | false
tiers: [N, N, …]
style: classic | interval | integer | relational
sql_latitude: SQL expression to generate a latitude
sql_longitude: SQL expression to generate a longitude
suggestable: true | false
suggest_persist_for: N (seconds | minutes | hours)
suggest_dimension: dimension_name
suggest_explore: explore_name
suggestions: ['suggestion string', 'suggestion string', …]
bypass_suggest_restrictions: true | false
full_suggestions: true | false
skip_drill_filter: true | false
case_sensitive: true | false
order_by_field: dimension_name
map_layer: name_of_map_layer
links:
- label: 'desired label name'
url: desired_url
icon_url: url_of_an_ico_file
# Possibly more links
# DIMENSION GROUP SPECIFIC PARAMETERS
timeframes: [timeframe, timeframe, …]
convert_tz: true | false
datatype: epoch | timestamp | datetime | date | yyyymmdd
# MEASURE SPECIFIC PARAMETERS
type: measure_field_type
direction: row | column
approximate: true | false
approximate_threshold: N
sql_distinct_key: SQL expression to define repeated entities
list_field: dimension_name
filters:
dimension_name: 'looker filter expression'
# Possibly more filters statements
# FILTER SPECIFIC PARAMETERS
default_value: 'desired default value'
# Possibly more dimension or measure declarations


@@ -1,9 +1,14 @@
---
type: grammar
name: css.tmbundle
license: permissive
curated: true
---
# Installation
You can install this bundle in TextMate by opening the preferences and going to the bundles tab. After installation it will be automatically updated for you.
# General
* [Bundle Styleguide](http://kb.textmate.org/bundle_styleguide) — _before you make changes_
* [Commit Styleguide](http://kb.textmate.org/commit_styleguide) — _before you send a pull request_
* [Writing Bug Reports](http://kb.textmate.org/writing_bug_reports) — _before you report an issue_
# License
If not otherwise specified (see below), files in this repository fall under the following license:
@@ -12,4 +17,4 @@ If not otherwise specified (see below), files in this repository fall under the
express or implied warranty, and with no claim as to its
suitability for any purpose.
An exception is made for files in readable text which contain their own license information, or files where an accompanying file exists (in the same directory) with a “-license” suffix added to the base-name name of the original file, and an extension of txt, html, or similar. For example “tidy” is accompanied by “tidy-license.txt”.


@@ -0,0 +1,192 @@
---
uti: com.xamarin.workbook
platforms:
- Console
---
# Using C# 6
Some examples from Xamarin's [intro to C# 6](https://developer.xamarin.com/guides/cross-platform/advanced/csharp_six/).
* Null-conditional operator
* String Interpolation
* Expression-bodied Function Members
* Auto-property Initialization
* Index Initializers
* using static
## Null-conditional operator
The `?.` operator automatically does a null-check before referencing the
specified member. The example string array below has a `null` entry:
```csharp
var names = new string[] { "Foo", null };
```
In C# 5, a null-check is required before accessing the `.Length` property:
```csharp
// C# 5
int secondLength = 0;
if (names[1] != null)
secondLength = names[1].Length;
```
C# 6 allows the length to be queried in a single line; the entire
expression evaluates to `null` if any object in the chain is null.
```csharp
var length0 = names[0]?.Length; // 3
var length1 = names[1]?.Length; // null
```
This can be used in conjunction with the `??` null coalescing operator
to set a default value (such as `0`) in the example below:
```csharp
var lengths = names.Select (n => n?.Length ?? 0); // [3, 0]
```
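The null-conditional operator short-circuits method calls as well as property access; a minimal sketch, reusing the `names` array declared above:

```csharp
var names = new string[] { "Foo", null };
var upper0 = names[0]?.ToUpper(); // "FOO"
var upper1 = names[1]?.ToUpper(); // null: ToUpper is never invoked, so no NullReferenceException
```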
## String Interpolation
Previously strings were built in a number of different ways:
```csharp
var animal = "Monkeys";
var food = "bananas";
var out1 = String.Format ("{0} love to eat {1}", animal, food);
var out2 = animal + " love to eat " + food;
// or even StringBuilder
```
C# 6 provides a simpler syntax where the variable name can be
embedded directly in the string:
```csharp
$"{animal} love to eat {food}"
```
String-formatting can also be done with this syntax:
```csharp
var values = new int[] { 1, 2, 3, 4, 12, 123456 };
foreach (var s in values.Select (i => $"The value is {i,10:N2}.")) {
Console.WriteLine (s);
}
```
## Expression-bodied Function Members
The `ToString` override in the following class is an expression-bodied
function, a more succinct declaration syntax.
```csharp
class Person
{
public string FirstName { get; }
public string LastName { get; }
public Person (string firstname, string lastname)
{
FirstName = firstname;
LastName = lastname;
}
// note there is no explicit `return` keyword
public override string ToString () => $"{LastName}, {FirstName}";
}
```
`void` expression-bodied functions are also allowed so long as
the expression is a statement:
```csharp
public void Log(string message) => System.Console.WriteLine($"{DateTime.Now.ToString ("s", System.Globalization.CultureInfo.InvariantCulture )}: {message}");
```
This simple example calls these two methods:
```csharp
Log(new Person("James", "Bond").ToString())
```
## Auto-property Initialization
Properties (i.e. those declared with `{ get; set; }`) can be initialized
inline in C# 6:
```csharp
class Todo
{
public bool Done { get; set; } = false;
public DateTime Created { get; } = DateTime.Now;
public string Description { get; }
public Todo (string description)
{
this.Description = description; // can assign (only in constructor!)
}
public override string ToString () => $"'{Description}' was created on {Created}";
}
```
```csharp
new Todo("buy apples")
```
## Index Initializers
Dictionary-style data structures can now be initialized
with a simple object-initializer-like syntax:
```csharp
var userInfo = new Dictionary<string,object> {
["Created"] = DateTime.Now,
["Due"] = DateTime.Now.AddSeconds(60 * 60 * 24),
["Task"] = "buy lettuce"
};
```
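For comparison, a sketch of the same dictionary built with the collection-initializer syntax that was already available before C# 6 (keys and values mirror the example above):

```csharp
var userInfo5 = new Dictionary<string,object> {
	{ "Created", DateTime.Now },
	{ "Due", DateTime.Now.AddSeconds(60 * 60 * 24) },
	{ "Task", "buy lettuce" }
};
```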
## using static
Enumerations, and certain classes such as System.Math, are primarily
holders of static values and functions. In C# 6, you can import all
static members of a type with a single using static statement:
```csharp
using static System.Math;
```
C# 6 code can then reference the static members directly, avoiding
repetition of the class name (e.g. `Math.PI` becomes `PI`):
```csharp
public class Location
{
public Location (double lat, double @long) {Latitude = lat; Longitude = @long;}
public double Latitude = 0; public double Longitude = 0;
}
static public double MilesBetween(Location loc1, Location loc2)
{
double rlat1 = PI * loc1.Latitude / 180;
double rlat2 = PI * loc2.Latitude / 180;
double theta = loc1.Longitude - loc2.Longitude;
double rtheta = PI * theta / 180;
double dist =
Sin(rlat1) * Sin(rlat2) + Cos(rlat1) *
Cos(rlat2) * Cos(rtheta);
dist = Acos(dist);
dist = dist*180/PI;
dist = dist*60*1.1515;
return dist; //miles
}
```
```csharp
MilesBetween (new Location(-12,22), new Location(-13,33))
```


@@ -0,0 +1 @@
_This_ is a **Markdown** readme.


@@ -0,0 +1,26 @@
class {
constructor() {
this.state = { count:0 };
}
increment() {
this.state.count++;
}
}
style {
.count {
color:#09c;
font-size:3em;
}
.example-button {
font-size:1em;
padding:0.5em;
}
}
<div.count>
${state.count}
</div>
<button.example-button on-click('increment')>
Click me!
</button>

samples/Marko/hello.marko Normal file

@@ -0,0 +1,15 @@
$ var name = 'Frank';
$ var colors = ['red', 'green', 'blue'];
<h1>
Hello ${name}!
</h1>
<ul if(colors.length)>
<li style={color: color} for(color in colors)>
${color}
</li>
</ul>
<div else>
No colors!
</div>


@@ -0,0 +1,36 @@
static const colors = ['red', 'green', 'blue'];
static const defaultColor = [255, 0, 0];
class {
onInput(input) {
this.state = { color: input.color || defaultColor };
}
updateColor() {
this.state.color = colors.map((color) => {
return parseInt(this.getEl(color + 'Input').value, 10);
});
}
getStyleColor() {
return 'rgb(' + this.state.color.join(',') + ')';
}
}
<div.rgb-sliders>
<div.inputs>
<for(i, color in colors)>
<div>
<label for-key=color+"Input">
${color}:
</label>
<input type="range" max="255"
key=color+"Input"
on-input('updateColor')
value=state.color[i] >
</div>
</for>
</div>
<div.color style={backgroundColor: component.getStyleColor()}>
</div>
</div>


@@ -0,0 +1,51 @@
project('test', ['c'],
version: '0.1.0'
)
# This is a comment test('foo')
add_global_arguments(['-foo'])
add_global_link_arguments(['-foo'])
gnome = import('gnome') # As is this
gnome.do_something('test')
meson.source_root()
foreach foo: bar
foreach baz : foo
message(baz)
endforeach
endforeach
blah = '''
afjoakjflajf # Test
lflkasjf
test\'test
test\\\\test
test\ntest
'''
foo = ''
foo = ''''''
foo = 'string'
foo = '''string2'''
foo = 12314
foo = 1231.1231
foo = true
foo = false
foo = ['te\'st', 1, 3.3, '''test''']
foo += 1231
foo = '@0@'.format('test')
foo = include_directories('foo', kwarg: 'bar', include_directories: 'foo')
foo = true ? 'true' : 'false'
foo = 2 - 1 + 3 % 8 / 4 * 3
if true and false
elif false or true
elif true not false
elif foo == 12
elif (foo != 124) and (foo <= 200)
else
endif


@@ -0,0 +1,3 @@
option('with-something', type: 'boolean',
value: true,
)

samples/P4/l2.p4 Normal file

@@ -0,0 +1,329 @@
/*
Copyright 2013-present Barefoot Networks, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
/*
* Layer-2 processing
*/
header_type l2_metadata_t {
fields {
lkp_pkt_type : 3;
lkp_mac_sa : 48;
lkp_mac_da : 48;
lkp_mac_type : 16;
l2_nexthop : 16; /* next hop from l2 */
l2_nexthop_type : 1; /* ecmp or nexthop */
l2_redirect : 1; /* l2 redirect action */
l2_src_miss : 1; /* l2 source miss */
l2_src_move : IFINDEX_BIT_WIDTH; /* l2 source interface mis-match */
stp_group: 10; /* spanning tree group id */
stp_state : 3; /* spanning tree port state */
bd_stats_idx : 16; /* ingress BD stats index */
learning_enabled : 1; /* is learning enabled */
port_vlan_mapping_miss : 1; /* port vlan mapping miss */
same_if_check : IFINDEX_BIT_WIDTH; /* same interface check */
}
}
metadata l2_metadata_t l2_metadata;
#ifndef L2_DISABLE
/*****************************************************************************/
/* Spanning tree lookup */
/*****************************************************************************/
action set_stp_state(stp_state) {
modify_field(l2_metadata.stp_state, stp_state);
}
table spanning_tree {
reads {
ingress_metadata.ifindex : exact;
l2_metadata.stp_group: exact;
}
actions {
set_stp_state;
}
size : SPANNING_TREE_TABLE_SIZE;
}
#endif /* L2_DISABLE */
control process_spanning_tree {
#ifndef L2_DISABLE
if (l2_metadata.stp_group != STP_GROUP_NONE) {
apply(spanning_tree);
}
#endif /* L2_DISABLE */
}
#ifndef L2_DISABLE
/*****************************************************************************/
/* Source MAC lookup */
/*****************************************************************************/
action smac_miss() {
modify_field(l2_metadata.l2_src_miss, TRUE);
}
action smac_hit(ifindex) {
bit_xor(l2_metadata.l2_src_move, ingress_metadata.ifindex, ifindex);
}
table smac {
reads {
ingress_metadata.bd : exact;
l2_metadata.lkp_mac_sa : exact;
}
actions {
nop;
smac_miss;
smac_hit;
}
size : MAC_TABLE_SIZE;
}
/*****************************************************************************/
/* Destination MAC lookup */
/*****************************************************************************/
action dmac_hit(ifindex) {
modify_field(ingress_metadata.egress_ifindex, ifindex);
bit_xor(l2_metadata.same_if_check, l2_metadata.same_if_check, ifindex);
}
action dmac_multicast_hit(mc_index) {
modify_field(intrinsic_metadata.mcast_grp, mc_index);
#ifdef FABRIC_ENABLE
modify_field(fabric_metadata.dst_device, FABRIC_DEVICE_MULTICAST);
#endif /* FABRIC_ENABLE */
}
action dmac_miss() {
modify_field(ingress_metadata.egress_ifindex, IFINDEX_FLOOD);
#ifdef FABRIC_ENABLE
modify_field(fabric_metadata.dst_device, FABRIC_DEVICE_MULTICAST);
#endif /* FABRIC_ENABLE */
}
action dmac_redirect_nexthop(nexthop_index) {
modify_field(l2_metadata.l2_redirect, TRUE);
modify_field(l2_metadata.l2_nexthop, nexthop_index);
modify_field(l2_metadata.l2_nexthop_type, NEXTHOP_TYPE_SIMPLE);
}
action dmac_redirect_ecmp(ecmp_index) {
modify_field(l2_metadata.l2_redirect, TRUE);
modify_field(l2_metadata.l2_nexthop, ecmp_index);
modify_field(l2_metadata.l2_nexthop_type, NEXTHOP_TYPE_ECMP);
}
action dmac_drop() {
drop();
}
table dmac {
reads {
ingress_metadata.bd : exact;
l2_metadata.lkp_mac_da : exact;
}
actions {
#ifdef OPENFLOW_ENABLE
openflow_apply;
openflow_miss;
#endif /* OPENFLOW_ENABLE */
nop;
dmac_hit;
dmac_multicast_hit;
dmac_miss;
dmac_redirect_nexthop;
dmac_redirect_ecmp;
dmac_drop;
}
size : MAC_TABLE_SIZE;
support_timeout: true;
}
#endif /* L2_DISABLE */
control process_mac {
#ifndef L2_DISABLE
apply(smac);
apply(dmac);
#endif /* L2_DISABLE */
}
#ifndef L2_DISABLE
/*****************************************************************************/
/* MAC learn notification */
/*****************************************************************************/
field_list mac_learn_digest {
ingress_metadata.bd;
l2_metadata.lkp_mac_sa;
ingress_metadata.ifindex;
}
action generate_learn_notify() {
generate_digest(MAC_LEARN_RECEIVER, mac_learn_digest);
}
table learn_notify {
reads {
l2_metadata.l2_src_miss : ternary;
l2_metadata.l2_src_move : ternary;
l2_metadata.stp_state : ternary;
}
actions {
nop;
generate_learn_notify;
}
size : LEARN_NOTIFY_TABLE_SIZE;
}
#endif /* L2_DISABLE */
control process_mac_learning {
#ifndef L2_DISABLE
if (l2_metadata.learning_enabled == TRUE) {
apply(learn_notify);
}
#endif /* L2_DISABLE */
}
/*****************************************************************************/
/* Validate packet */
/*****************************************************************************/
action set_unicast() {
modify_field(l2_metadata.lkp_pkt_type, L2_UNICAST);
}
action set_unicast_and_ipv6_src_is_link_local() {
modify_field(l2_metadata.lkp_pkt_type, L2_UNICAST);
modify_field(ipv6_metadata.ipv6_src_is_link_local, TRUE);
}
action set_multicast() {
modify_field(l2_metadata.lkp_pkt_type, L2_MULTICAST);
add_to_field(l2_metadata.bd_stats_idx, 1);
}
action set_multicast_and_ipv6_src_is_link_local() {
modify_field(l2_metadata.lkp_pkt_type, L2_MULTICAST);
modify_field(ipv6_metadata.ipv6_src_is_link_local, TRUE);
add_to_field(l2_metadata.bd_stats_idx, 1);
}
action set_broadcast() {
modify_field(l2_metadata.lkp_pkt_type, L2_BROADCAST);
add_to_field(l2_metadata.bd_stats_idx, 2);
}
action set_malformed_packet(drop_reason) {
modify_field(ingress_metadata.drop_flag, TRUE);
modify_field(ingress_metadata.drop_reason, drop_reason);
}
table validate_packet {
reads {
#ifndef __TARGET_BMV2__
l2_metadata.lkp_mac_sa mask 0x010000000000 : ternary;
#else
l2_metadata.lkp_mac_sa : ternary;
#endif
l2_metadata.lkp_mac_da : ternary;
l3_metadata.lkp_ip_type : ternary;
l3_metadata.lkp_ip_ttl : ternary;
l3_metadata.lkp_ip_version : ternary;
#ifndef __TARGET_BMV2__
ipv4_metadata.lkp_ipv4_sa mask 0xFF000000 : ternary;
#else
ipv4_metadata.lkp_ipv4_sa : ternary;
#endif
#ifndef IPV6_DISABLE
#ifndef __TARGET_BMV2__
ipv6_metadata.lkp_ipv6_sa mask 0xFFFF0000000000000000000000000000 : ternary;
#else
ipv6_metadata.lkp_ipv6_sa : ternary;
#endif
#endif /* IPV6_DISABLE */
}
actions {
nop;
set_unicast;
set_unicast_and_ipv6_src_is_link_local;
set_multicast;
set_multicast_and_ipv6_src_is_link_local;
set_broadcast;
set_malformed_packet;
}
size : VALIDATE_PACKET_TABLE_SIZE;
}
control process_validate_packet {
if (ingress_metadata.drop_flag == FALSE) {
apply(validate_packet);
}
}
/*****************************************************************************/
/* Egress BD lookup */
/*****************************************************************************/
action set_egress_bd_properties() {
}
table egress_bd_map {
reads {
egress_metadata.bd : exact;
}
actions {
nop;
set_egress_bd_properties;
}
size : EGRESS_BD_MAPPING_TABLE_SIZE;
}
control process_egress_bd {
apply(egress_bd_map);
}
/*****************************************************************************/
/* Egress VLAN decap */
/*****************************************************************************/
action remove_vlan_single_tagged() {
modify_field(ethernet.etherType, vlan_tag_[0].etherType);
remove_header(vlan_tag_[0]);
}
action remove_vlan_double_tagged() {
modify_field(ethernet.etherType, vlan_tag_[1].etherType);
remove_header(vlan_tag_[0]);
remove_header(vlan_tag_[1]);
}
table vlan_decap {
reads {
vlan_tag_[0] : valid;
vlan_tag_[1] : valid;
}
actions {
nop;
remove_vlan_single_tagged;
remove_vlan_double_tagged;
}
size: VLAN_DECAP_TABLE_SIZE;
}
control process_vlan_decap {
apply(vlan_decap);
}

samples/P4/mirror_acl.p4 Normal file

@@ -0,0 +1,39 @@
// Copyright 2015, Barefoot Networks, Inc.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
action set_mirror_id(session_id) {
clone_ingress_pkt_to_egress(session_id);
}
table mirror_acl {
reads {
ingress_metadata.if_label : ternary;
ingress_metadata.bd_label : ternary;
/* ip acl */
ingress_metadata.lkp_ipv4_sa : ternary;
ingress_metadata.lkp_ipv4_da : ternary;
ingress_metadata.lkp_ip_proto : ternary;
/* mac acl */
ingress_metadata.lkp_mac_sa : ternary;
ingress_metadata.lkp_mac_da : ternary;
ingress_metadata.lkp_mac_type : ternary;
}
actions {
nop;
set_mirror_id;
}
size : INGRESS_MIRROR_ACL_TABLE_SIZE;
}


@@ -0,0 +1,12 @@
create or replace procedure print_bool(
p_bool in BOOLEAN,
p_true_value in varchar2 default 'TRUE',
p_false_value in varchar2 := 'FALSE'
)
as
begin
dbms_output.put_line(case when p_bool then p_true_value else p_false_value end);
end print_bool;
/

samples/PLSQL/videodb.ddl Normal file

@@ -0,0 +1,48 @@
CREATE TABLE users (
user_name varchar2(40),
first_name varchar2(40),
last_name varchar2(40),
email varchar2(40),
password varchar2(40),
created_date DATE,
total_credits NUMBER,
credit_change_date DATE,
PRIMARY KEY (user_name)
);
/
CREATE TABLE users_videos (
video_id NUMBER,
video_name varchar2(40),
user_name varchar2(40),
description varchar2(512),
upload_date DATE,
PRIMARY KEY (video_id),
CONSTRAINT "USERS_VIDEOS_FK1" FOREIGN KEY ("USER_NAME") REFERENCES "USERS"("USER_NAME")
);
/
create or replace procedure print_user_videos(
p_user_name in users.user_name%type
)
AUTHID DEFINER
as
type t_user_videos is table of users_videos%rowtype
index by pls_integer;
l_videos t_user_videos;
begin
select *
bulk collect into l_videos
from users_videos
where user_name = p_user_name;
for i in 1..l_videos.COUNT
loop
dbms_output.put_line(l_videos(i).video_name);
end loop;
end print_user_videos;
/

samples/Pep8/div.pep Normal file

@@ -0,0 +1,34 @@
main: SUBSP 8, i
DECI 0, s
DECI 2, s
CALL div
DECO 4, s
CHARO '\n', i
DECO 6, s
CHARO '\n', i
STOP
; Divides two numbers following the Euclidean method
;
; Parameters:
; SP + 2: Dividend
; SP + 4: Divider
; Returns:
; SP + 6: Quotient
; SP + 8: Remainder
div: LDX 0, i
LDA dividend, s
divlp: CPA divider, s
BRLT divout
ADDX 1, i
SUBA divider, s
BR divlp
divout: STX quot, s
STA rem, s
RET0
dividend: .EQUATE 2
divider: .EQUATE 4
quot: .EQUATE 6
rem: .EQUATE 8
.END
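The `div` routine above computes quotient and remainder by repeated subtraction. A minimal Python sketch of the same Euclidean loop (the function name is illustrative, not part of the sample):

```python
def divmod_by_subtraction(dividend, divisor):
    """Quotient and remainder via repeated subtraction,
    mirroring the divlp loop in div.pep."""
    quotient = 0
    while dividend >= divisor:   # CPA divider / BRLT divout
        dividend -= divisor      # SUBA divider, s
        quotient += 1            # ADDX 1, i
    return quotient, dividend   # X holds the quotient, A the remainder
```

As in the assembly, the dividend register doubles as the remainder once the loop falls through.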

samples/Pep8/flag.pep Normal file

@@ -0,0 +1,23 @@
_start: LDA 0,i
LDX 0,i
LDA 20, i
ADDA 51, i
CPA 0,i
BRLT s3
BR s4
s1: LDBYTEA s3, x
NOTA
STBYTEA s3, x
ADDX 1,i
CPX 12, i
BRNE s1
s2: STOP
s4: LDA 31, d
LDX 50, d
RET0
STOP
s3: CPX -27746, d
ANDX -8241, i
SUBA -12337, sxf
LDX -12289, sx
.END

samples/Pep8/linked.pep Normal file

@@ -0,0 +1,675 @@
; Linked list of integers API
;
; Contains the basis of the structure and a
; variety of available functions to call on it.
;
; Calling conventions:
;
; - When the number of arguments is <= 2, the fastcall convention will be used:
; Arguments will be passed by registers, no assumption is made concerning the
; state of the registers during execution, they will need to be saved.
;
; - When the number of arguments exceeds 2, the cdecl convention will be used:
; Arguments will be passed on the stack, no assumption is made concerning the
; state of the registers during execution, they will need to be saved.
; Simple test program, do not include when using the library
main: SUBSP 4, i
DECI mnelmt, s
CALL newlst
LDX mnlst, s
CALL lstgetst
LDX mnlst, s
CALL lstsetst
LDX mnlst, s
CALL lstgetst
LDX mnlst, s
CALL shftest
LDX mnlst, s
CALL ushftest
LDX mnlst, s
CALL shftest
ADDSP 4, i
STOP
; Pointer to the list
mnlst: .EQUATE 0
; Element read
mnelmt: .EQUATE 2
; TESTS
; Simple test for the get operation
; Gets the first element of the list and prints it
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
lstgetst: SUBSP 2, i
LDA 0, i
CALL lstget
STA 0, s
DECO 0, s
CHARO '\n', i
ADDSP 2, i
RET0
; Test for the set operation
; Sets the first element of the list to a given value
; The value is read from stdin
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
lstsetst: SUBSP 6, i
STX 0, s
DECI 4, s
LDA 0, i
STA 2, s
CALL lstset
ADDSP 6, i
RET0
; Tests shift operation on a list
; Gets the last element of the list and prints it
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
shftest: SUBSP 2, i
CALL lstshft
STA 0, s
DECO 0, s
CHARO '\n', i
ADDSP 2, i
RET0
; Tests unshift operation on a list
; Unshifts a new element read from keyboard
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
ushftest: SUBSP 2, i
DECI 0, s
LDA 0, s
CALL lstunshf
ADDSP 2, i
RET0
; LIBRARY
; Creates a new list with `element` as head
;
; Parameters:
; SP + 4: Element
;
; Returns:
; SP + 2: Pointer to the list
newlst: LDA lstlen, i
CALL new
STX 2, s
CALL newnode
SUBSP 2, i
STX 0, s
LDX nodeelmt, i
LDA 6, s
STA 0, sxf
LDA 0, s
LDX lsthead, i
STA 4, sxf
ADDSP 2, i
RET0
; Gets a node at specified index in a list
;
; Parameters:
; - A: Index
; - X: Pointer to the list
;
; Returns:
; - A: Error code (0 if no error was produced)
; - X: Pointer to the node
;
; Errors:
; -1: Index < 0
; -2: Index >= list.length
nodeat: SUBSP 10, i
STA ndaind, s
STX ndalst, s
LDX lsthead, i
LDA ndalst, sxf
STA ndanode, s
LDA ndaind, s
CPA 0, i
LDA 0, i
STA ndacurri, s
BRGE ndagez
LDA -1, i
ADDSP 10, i
RET0
ndagez: LDX ndalst, s
CALL listlen
STA ndalstln, s
LDA ndaind, s
CPA ndalstln, s
BRLT ndalp
LDA -2, i
ADDSP 10, i
RET0
ndalp: LDA ndacurri, s
CPA ndaind, s
BREQ ndaout
LDX nodenxt, i
LDA ndanode, sxf
STA ndanode, s
LDA ndacurri, s
ADDA 1, i
STA ndacurri, s
BR ndalp
ndaout: LDX ndanode, s
LDA 0, i
ADDSP 10, i
RET0
ndaind: .EQUATE 0
ndanode: .EQUATE 2
ndalst: .EQUATE 4
ndalstln: .EQUATE 6
ndacurri: .EQUATE 8
; Length of the list passed as a parameter
;
; Parameters:
; - X: List
;
; Returns:
; - A: Length
listlen: SUBSP 4, i
STX lenode, s
LDX lenode, sf
STX lenode, s
LDA 0, i
STA lencpt, s
llenlp: LDA lenode, s
CPA 0, i
BREQ lenout
LDA lencpt, s
ADDA 1, i
STA lencpt, s
LDX nodenxt, i
LDA lenode, sxf
STA lenode, s
BR llenlp
lenout: LDA lencpt, s
ADDSP 4, i
RET0
lenode: .EQUATE 0
lencpt: .EQUATE 2
; Gets an element in a list at a specified index
;
; Parameters:
; - A: Index
; - X: Address of the list
;
; Returns:
; - A: Element value
;
; Error:
; If out of bounds, prints an error message and stops the program
lstget: SUBSP 2, i
STA 0, s
CALL nodeat
CPA 0, i
BRNE getoob
LDA 0, x
ADDSP 2, i
RET0
; Out of bounds
getoob: STRO getstrob, d
DECO 0, s
CHARO '\n', i
STOP
; String for out of bounds error
getstrob: .ASCII "Invalid index on get, index = \x00"
; Sets an element in a list at a specified index to a new value
;
; Parameters:
; - SP + 2: Pointer to the list
; - SP + 4: Index
; - SP + 6: Element
;
; Returns:
; - A: 0 if all went well, an error code otherwise (analogous to the error codes in nodeat)
lstset: CHARO '\n', i
DECO lstsetlp, s
CHARO ' ', i
DECO lstsetin, s
CHARO ' ', i
DECO lstsetel, s
CHARO '\n', i
SUBSP 2, i
LDX lstsetlp, s
LDA lstsetin, s
CALL nodeat
CPA 0, i
BRNE lstsetrt
STX lstsetnp, s
LDA lstsetel, s
LDX nodeelmt, i
STA lstsetnp, sxf
LDA 0, i
lstsetrt: ADDSP 2, i
RET0
; Pointer to the list
lstsetlp: .EQUATE 4
; Element to set the value at
lstsetel: .EQUATE 8
; Index of the node
lstsetin: .EQUATE 6
; Pointer to the node
lstsetnp: .EQUATE 0
; Removes the first element of the list in parameter and returns its value
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; - A: Element removed
lstshft: SUBSP 8, i
STX lshflp, s
LDX lsthead, i
LDA lshflp, sxf
CPA 0, i
BREQ shfterr
STA lshfohd, s
LDX nodenxt, i
LDA lshfohd, sxf
STA lshfnhd, s
LDX lsthead, i
STA lshflp, sxf
LDX nodeelmt, i
LDA lshfohd, sxf
ADDSP 8, i
RET0
shfterr: STRO shfterrm, d
STOP
; Pointer to the list
lshflp: .EQUATE 0
; Pointer to the old head
lshfohd: .EQUATE 2
; Old head's element
lshfhdel: .EQUATE 4
; Pointer to the new head
lshfnhd: .EQUATE 6
; Error message on shift
shfterrm: .ASCII "Cannot do shift on empty list.\n\x00"
; Inserts a new element at the beginning of a list
;
; Parameters:
; - X: Pointer to the list
; - A: Element to add to the list
;
; Returns:
; - A: Error code, 0 if all right, a code otherwise
lstunshf: SUBSP 8, i
STA lunshelm, s
STX lunslp, s
CALL newnode
STX lunsnhd, s
LDX lsthead, i
LDA lunslp, sxf
STA lunsohd, s
LDX nodenxt, i
LDA lunsohd, s
STA lunsnhd, sxf
LDA lunshelm, s
LDX nodeelmt, i
STA lunsohd, sxf
LDX lsthead, i
LDA lunsnhd, s
STA lunslp, sxf
ADDSP 8, i
RET0
; Pointer to the list
lunslp: .EQUATE 0
; Pointer to the old head
lunsohd: .EQUATE 2
; Pointer to the new head
lunsnhd: .EQUATE 4
; Element to add
lunshelm: .EQUATE 6
; Finds whether or not an element is present in a list
;
; Parameters:
; - X: Pointer to the list
; - A: Element to be found
;
; Returns:
; - A: 0 if element was not found, 1 if it was
lstfnd: SUBSP 6, i
STX lstfndlp, s
STA lstfndel, s
LDX lsthead, i
LDA lstfndlp, sxf
STA lstfndnd, s
fndloop: CPA 0, i
BREQ notfnd
LDX nodeelmt, i
LDA lstfndnd, sxf
CPA lstfndel, s
BREQ found
LDX nodenxt, i
LDA lstfndnd, sxf
STA lstfndnd, s
BR fndloop
notfnd: LDA 0, i
ADDSP 6, i
RET0
found: LDA 1, i
ADDSP 6, i
RET0
; Pointer to the list
lstfndlp: .EQUATE 0
; Element to search
lstfndel: .EQUATE 2
; Current node
lstfndnd: .EQUATE 4
; Pushes a new element at the end of the list
;
; Parameters:
; - X: Pointer to the list
; - A: Element to push
;
; Returns:
; - A: 0 if all went well, an error code otherwise
lstpsh: SUBSP 8, i
STX lpshlp, s
STA lpshel, s
CALL newnode
STX lpshnd, s
LDX lpshlp, s
CALL listlen
CPA 0, i
BREQ lpshshft
SUBA 1, i
LDX lpshlp, s
CALL nodeat
STX lpshlnd, s
LDX nodenxt, i
LDA lpshnd, s
STA lpshlnd, sxf
LDA lpshel, s
LDX nodeelmt, i
STA lpshnd, sxf
ADDSP 8, i
RET0
lpshshft: LDX lpshlp, s
LDA lpshel, s
CALL lstunshf
ADDSP 8, i
RET0
; Pointer to the list
lpshlp: .EQUATE 0
; Element to add to the list
lpshel: .EQUATE 2
; Node to add to the list
lpshnd: .EQUATE 4
; Node to append
lpshlnd: .EQUATE 6
; Pops the last element of a list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; - A: Element removed from the list
lstpop: SUBSP 6, i
STX lpoplp, s
CALL listlen
CPA 0, i
BRNE poperrem
CPA 1, i
BREQ popshft
SUBA 2, i
LDX lpoplp, s
CALL nodeat
STX lpopndpr, s
LDX nodenxt, i
LDA lpopndpr, sxf
LDA 0, i
LDX nodenxt, i
STA lpopndpr, sxf
STA lpoplnd, s
LDX nodeelmt, i
LDA lpoplnd, s
ADDSP 6, i
RET0
poperrem: STRO poperrsm, d
STOP
popshft: LDX lpoplp, s
CALL lstshft
ADDSP 6, i
RET0
; Pointer to the list
lpoplp: .EQUATE 0
; Node to remove
lpoplnd: .EQUATE 2
; New last node
lpopndpr: .EQUATE 4
; Message to print when popping an empty list
poperrsm: .ASCII "Error: cannot pop an empty list.\n\x00"
; Inserts an element in a list at a given position
;
; REQUIRES: Non-empty list
;
; Parameters:
; - SP + 2: Pointer to the list
; - SP + 4: Index to insert at
; - SP + 6: Element to add
;
; Returns:
; - A: Error code: 0 if all went well, -1 if index < 0, -2 if index > list.length
lstinsat: SUBSP 6, i
LDA lstinsid, s
CPA 0, i
BRLT lstinslz
BREQ lstinush
LDX lstinslp, s
CALL listlen
CPA lstinsel, s
BRLT lstinsgl
BREQ lstinpsh
LDX lstinslp, s
LDA lstinsel, s
SUBA 1, i
CALL nodeat
STX lstinsnd, s
LDX nodenxt, i
LDA lstinsnd, sxf
STA lstinscx, s
CALL newnode
STX lstinscn, s
LDX nodeelmt, i
LDA lstinsel, s
STA lstinscn, sxf
LDX nodenxt, i
LDA lstinscx, s
STA lstinscn, sxf
LDA lstinscn, s
LDX nodenxt, i
STA lstinsnd, sxf
ADDSP 6, i
RET0
lstinush: LDX lstinslp, s
LDA lstinsel, s
CALL lstunshf
ADDSP 6, i
RET0
lstinpsh: LDX lstinslp, s
LDA lstinsel, s
CALL lstpsh
ADDSP 6, i
RET0
; Insert with index < 0
lstinslz: LDA -1, i
ADDSP 6, i
RET0
; Insert with index > list.length
lstinsgl: LDA -2, i
ADDSP 6, i
RET0
; List pointer
lstinslp: .EQUATE 8
; Index of the newly created node
lstinsid: .EQUATE 10
; Element to add
lstinsel: .EQUATE 12
; Node to change the pointer to the next
lstinsnd: .EQUATE 0
; Node to insert
lstinscn: .EQUATE 2
; Pointer to the node after the created one (might be null)
lstinscx: .EQUATE 4
; Removes a node at a given index in a list,
; returns the element previously contained
;
; Parameters:
; - X: Pointer to the list
; - A: Index of the element
;
; Returns:
; - A: Element removed
;
; Error:
; In case of error, the program aborts with an error message
lstremat: SUBSP 8, i
STX lremlp, s
STA lremid, s
CPA 0, i
BRLT lstremob
BREQ lstremz
CALL listlen
CPA lremid, s
BRGE lstremob
SUBA 1, i
CPA lremid, s
BREQ lrempop
LDA lremid, s
LDX lremlp, s
CALL nodeat
STX lremnd, s
LDA lremid, s
SUBA 1, i
LDX lremlp, s
CALL nodeat
STX lrempnd, s
LDX nodenxt, i
LDA lremnd, sxf
STA lrempnd, sxf
LDX nodeelmt, i
LDA lremnd, sxf
ADDSP 8, i
RET0
lstremz: LDX lremlp, s
CALL lstshft
ADDSP 8, i
RET0
lrempop: LDX lremlp, s
CALL lstpop
ADDSP 8, i
RET0
lstremob: STRO lremobst, d
DECO lremid, s
CHARO '\n', i
STOP
; Pointer to the list
lremlp: .EQUATE 0
; Index to remove an element at
lremid: .EQUATE 2
; Pointer to the node before the removed element
lrempnd: .EQUATE 4
; Pointer to the node to remove
lremnd: .EQUATE 6
; Error out of bounds string for remove_at
lremobst: .ASCII "Error: Out of bounds in remove_at, index = \x00"
; Creates a new node from scratch
; Sets its content to 0/NULL
;
; Parameters:
; void
;
; Return:
; - X: Address of the node
newnode: LDA nodeln, i
SUBSP 2, i
CALL new
STX 0, s
LDA 0, i
LDX nodenxt, i
STA 0, sxf
LDX nodeelmt, i
STA 0, sxf
LDX 0, s
ADDSP 2, i
RET0
; Allocates a new structure in the heap
;
; Parameters:
; - A: Length of the structure to allocate (bytes)
;
; Returns:
; - X: Address of the allocated structure
new: ADDA hpptr, d
LDX hpptr, d
STA hpptr, d
RET0
; Node in a linked list
;
; Contains two fields:
; - Element: Offset 0
; - Next: Offset 2
;
nodeln: .EQUATE 4
nodeelmt: .EQUATE 0
nodenxt: .EQUATE 2
; Linked list capsule
;
; Contains one field:
; - Head: Offset 0
;
lstlen: .EQUATE 2
lsthead: .EQUATE 0
; Pointer to the next available byte on the heap
hpptr: .ADDRSS heap
; Start of the heap
heap: .BLOCK 1
.END
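The `new` routine at the bottom of linked.pep is a bump allocator: it returns the current heap pointer and advances it by the requested size, and nothing is ever freed. A hedged Python sketch of that allocator and the 4-byte node layout (element at offset 0, next pointer at offset 2); all names are illustrative:

```python
class BumpHeap:
    """Sketch of linked.pep's heap: hpptr is the next free byte."""
    NODE_ELMT, NODE_NXT, NODE_LEN = 0, 2, 4   # nodeelmt / nodenxt / nodeln

    def __init__(self, size=4096):
        self.mem = bytearray(size)
        self.hpptr = 0                         # hpptr: .ADDRSS heap

    def new(self, nbytes):
        addr = self.hpptr                      # LDX hpptr, d
        self.hpptr += nbytes                   # ADDA hpptr / STA hpptr
        return addr

    def newnode(self):
        addr = self.new(self.NODE_LEN)
        # newnode zeroes both 2-byte fields before returning the address
        self.mem[addr:addr + self.NODE_LEN] = bytes(self.NODE_LEN)
        return addr
```

Successive `newnode()` calls hand out addresses 0, 4, 8, ... exactly as consecutive `CALL newnode` invocations do against the Pep/8 heap.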

samples/Pep8/msq.pep Normal file

@@ -0,0 +1,434 @@
; Reads a square from stdin, then computes whether it is a magic square or not.
;
; A Magic Square is a square following a specific set of rules, namely:
; - The sum of each row must be the same as the sum of the diagonal
; - The sum of the anti-diagonal must be the same as the sum of the diagonal
; - The sum of each column must be the same as the sum of the diagonal
;
; If any column, row, or anti-diagonal does not follow the aforementioned rules,
; the program will output its number to stdout.
;
; Columns are identified by a negative digit, ranging from -1 to -n
; The anti-diagonal is identified by the number 0.
; Finally, rows are identified by a positive integer, ranging from 1 to n.
;
; Formatting:
; First, a number `n` is read from stdin; it determines the side length of the square
; Then the data for the square is entered: `n*n` entries will be read
; The data is sequentially added to the square in memory, from the upper-left corner
; to the lower-right corner, in a zig-zag pattern
;
; Example:
; 3
; 4 9 3
; 3 5 7
; 8 1 6
;
; Limitation: Since there is no dynamic allocation, the size
; of the square is capped at a maximum of 32*32.
; Any size lower than 1 or higher than 32 will produce
; an error and the termination of the program.
;_start
DECI sidelen, d
LDA sidelen, d
CPA 1, i
BRLT sderror
CPA 32, i
BRGT sderror
LDX sidelen, d
CALL mult
STA sqlen, d
CALL fillsq
LDA sidelen, d
LDX square, i
CALL diagsum
STA dgsm, d
CALL colsums
LDA sidelen, d
LDX square, i
CALL cdiagsum
CPA dgsm, d
BREQ cnt
DECO 0, i
CHARO '\n', i
cnt: STA cdsm, d
CALL rowsums
STOP
el: .BLOCK 2
; Length of a side of the square
sidelen: .WORD 0
; Total length of the square
sqlen: .BLOCK 2
; 32 * 32 square of integers
square: .BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 8
; Prints an error and terminates the program
sderror: STRO stderr, d
STOP
; Computes the sum of each column
; If the sum is not the same as dgsm, its index will be printed (in negative form)
;
; Parameters: A: Size of a side of the square
; X: Address of the square
;
; Return: void
colsums: STA clsmsqsz, d
STX clsmsqad, d
SUBA 1, i
STA clsmyp, d
clssmlp: CPA 0, i
BRLT clsmout
STA cscolid, d
LDA clsmsqsz, d
LDX clsmsqad, d
CALL colsum
CPA dgsm, d
BREQ clsdecpt
LDX clsmyp, d
NEGX
STX clsmyp, d
DECO clsmyp, d
CHARO '\n', i
LDX clsmyp, d
NEGX
STX clsmyp, d
clsdecpt: LDA clsmyp, d
SUBA 1, i
STA clsmyp, d
BR clssmlp
clsmout: RET0
clsmsqad: .BLOCK 2
clsmsqsz: .BLOCK 2
clsmyp: .BLOCK 2
; Compute the sum of each row
; Prints its index if the value does not match dgsum
;
; Parameters: A: Size of a side of the square
; X: Address of the square
;
; Returns: void
rowsums: STA maxrows, d
STX rowssqad, d
LDA 0, i
STA tmprwsm, d
STA rowid, d
rwsmslp: CPA maxrows, d
BRGE rwsmsout
STA rwxpos, d
LDA maxrows, d
LDX rowssqad, d
CALL rowsum
CPA dgsm, d
STA tmprwsm, d
BREQ rwinccpt
DECO rowid, d
CHARO '\n', i
rwinccpt: LDA rowid, d
ADDA 1, i
STA rowid, d
BR rwsmslp
rwsmsout: RET0
; Number of rows to compute
maxrows: .BLOCK 2
; Square address
rowssqad: .BLOCK 2
; Current rowid
rowid: .BLOCK 2
; Current rowsum
tmprwsm: .BLOCK 2
; Gets an element at the indexes given as parameter
; The square is supposed to contain only integers
; No check will be made on the correctness of the indexes
;
; Parameters: A: Size of a side of the square (in elements)
; X: Base address of the square
; xpos: Position in X for the element (0-indexed)
; ypos: Position in Y for the element (0-indexed)
;
; Return: A will contain the element
;
; Side-effects: Registers A and X will neither be saved nor restored upon call
; ypos will be altered
elemat: STX elsqaddr, d
ASLA
LDX xpos, d
CALL mult
STA xpos, d
LDX ypos, d
ASLX
STX ypos, d
ADDA ypos, d
ADDA elsqaddr, d
STA elsqaddr, d
LDX elsqaddr, d
LDA 0, x
RET0
; X-index in square (in elements)
xpos: .BLOCK 2
; Y-index in square (in elements)
ypos: .BLOCK 2
; Address to fetch elements at
elsqaddr: .BLOCK 2
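`elemat` turns a pair of 0-based indexes into a byte address for a row-major square of 2-byte integers: it doubles the side length, multiplies by the x index, doubles the y index, and adds both to the base address. The same arithmetic as a one-line Python sketch (names illustrative):

```python
def elem_addr(base, n, x, y, word=2):
    """Byte address of square[x][y] in a row-major n-by-n array of
    2-byte words, as elemat computes it: base + word*(x*n + y)."""
    return base + word * (x * n + y)
```

For a 3x3 square at base 0, element (1, 2) lives at byte offset 2*(1*3 + 2) = 10.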
; Fills the square with input from the user
;
; Pass via register A the number of inputs to be read
fillsq: LDX 0, i
filloop: SUBA 1, i
CPA 0, i
BRLT fillout
DECI square, x
ADDX 2, i
BR filloop
fillout: RET0
; Computes the sum of the digits of a column
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square
; X: Base address of the square
; cscolid: Identifier of the column (0-based)
;
; Return: A: Sum of the digits of the column
colsum: STA csclsqsz, d
STX csclsqad, d
LDA 0, i
STA csclsum, d
STA csclxpos, d
clsmloop: CPA csclsqsz, d
BRGE colout
LDA cscolid, d
STA ypos, d
LDA csclxpos, d
STA xpos, d
LDA csclsqsz, d
LDX csclsqad, d
CALL elemat
ADDA csclsum, d
STA csclsum, d
LDA csclxpos, d
ADDA 1, i
STA csclxpos, d
BR clsmloop
colout: LDA csclsum, d
RET0
; Identifier of the column which sum is to be computed
cscolid: .BLOCK 2
; Temporary for x position
csclxpos: .BLOCK 2
; Base address of the square
csclsqad: .BLOCK 2
; Size of a side of the square
csclsqsz: .BLOCK 2
; Sum of the column
csclsum: .BLOCK 2
; Computes the sum of the digits of a row
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square
; X: Base address of the square
; rwxpos: Row index (0-based)
;
; Returns: A: Sum of the digits of the row
rowsum: STA rwsqsz, d
STX rwbsqadr, d
LDA 0,i
STA rwsum, d
STA rwypos, d
rwsumlp: LDA rwypos, d
CPA rwsqsz, d
BRGE rwsumout
STA ypos, d
LDA rwxpos, d
STA xpos, d
LDA rwsqsz, d
LDX rwbsqadr, d
CALL elemat
ADDA rwsum, d
STA rwsum, d
LDA rwypos, d
ADDA 1, i
STA rwypos, d
BR rwsumlp
rwsumout: LDA rwsum, d
RET0
; Square size (in elements)
rwsqsz: .BLOCK 2
; Square base address
rwbsqadr: .BLOCK 2
; Position of the row to compute
rwxpos: .BLOCK 2
; Current column visited
rwypos: .BLOCK 2
; Sum of the row
rwsum: .BLOCK 2
; Computes the sum for the antidiagonal of a square
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square (elements)
; X: Base address of the square
;
; Returns: A: Sum of the antidiagonal
cdiagsum: STA cdsqsz, d
SUBA 1,i
STA cdtmpy, d
LDA 0, i
STA cdtmpx, d
STA cdsum, d
STX cdsqaddr, d
cdiaglp: LDA cdtmpx, d
STA xpos, d
LDA cdtmpy, d
STA ypos, d
CPA 0, i
BRLT cdout
LDA cdsqsz, d
LDX cdsqaddr, d
CALL elemat
ADDA cdsum, d
STA cdsum,d
LDA cdtmpx, d
ADDA 1, i
STA cdtmpx, d
LDA cdtmpy, d
SUBA 1, i
STA cdtmpy, d
BR cdiaglp
cdout: LDA cdsum, d
RET0
; Temporary handle for square size (elements)
cdsqsz: .BLOCK 2
; Square address
cdsqaddr: .BLOCK 2
; Keep x address
cdtmpx: .BLOCK 2
; Keep y address
cdtmpy: .BLOCK 2
; Sum of antidiagonal
cdsum: .BLOCK 2
; Computes the sum for the diagonal of a square
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square (elements)
; X: Base address of the square
;
; Returns: A: Sum of the diagonal
;
diagsum: STA dsqsz, d
STX dsqaddr, d
LDA 0, i
STA tmpsum, d
STA curra, d
dglp: CPA dsqsz, d
BRGE dglpout
STA xpos, d
STA ypos, d
LDA dsqsz, d
LDX dsqaddr, d
CALL elemat
ADDA tmpsum, d
STA tmpsum, d
LDA curra, d
ADDA 1, i
STA curra, d
BR dglp
dglpout: LDA tmpsum, d
RET0
; Address of the square
dsqaddr: .BLOCK 2
; Size of a side of the square (elements)
dsqsz: .BLOCK 2
; Current value of the x and y indexes
curra: .BLOCK 2
; Sum of the values
tmpsum: .BLOCK 2
; Multiplies two ints
;
; Parameters:
; Register A : Left part of the multiplication
; Register X : Right part of the multiplication
;
; Return:
; Register A : Result of the multiplication
;
; Side-effects:
; Uses multmp as a temporary value
mult: STA multmp, d
LDA 0, i
muloop: CPX 0, i
BRLE mulout
ADDA multmp, d
SUBX 1, i
BR muloop
mulout: RET0
; Temporary variable for mult function
; Holds the initial value of A
multmp: .WORD 0
; For debugging purposes
; Prints the content of the square to stdout
;
; Parameters: A: Size of a side
; X: Base address of square
;
; Side-effects:
; Consider variables sidesz, sqaddr, sqmaxa as local, they will be written
; Registers A and X will neither be saved nor restored upon call
printsq: STA sidesz, d
STX sqaddr, d
LDX sidesz, d
CALL mult
ASLA
ADDA sqaddr, d
STA sqmaxa, d
LDX sqaddr, d
LDA 0, i
priloop: DECO 0, x
CHARO ' ', i
ADDX 2, i
CPX sqmaxa, d
BREQ priout
ADDA 1, i
CPA sidesz, d
BRLT priloop
LDA 0, i
CHARO '\n', i
BR priloop
priout: RET0
; Size of a side of the square
sidesz: .BLOCK 2
; Address of the square
sqaddr: .BLOCK 2
; Maximum address to iterate upon
sqmaxa: .BLOCK 2
; ------------------ GLOBALLY ACCESSIBLE SYMBOLS -------------------- ;
;
; Sum of the diagonal for the square
; Reference value for magic-square
dgsm: .WORD 0
; Sum of the counter-diagonal
cdsm: .WORD 0
; Input error string
stderr: .ASCII "A number between 1 and 32 (both inclusive) must be entered as value for the size of the square for the program to work.\n\x00"
.END
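The checks that msq.pep spreads across `diagsum`, `cdiagsum`, `colsums`, and `rowsums` fit in a few lines of Python. This is a hedged sketch of the documented behavior, not a transliteration: the main diagonal's sum is the reference, and failing columns, the anti-diagonal, and failing rows are reported as -1..-n, 0, and 1..n respectively, per the convention stated in the sample's header comment.

```python
def magic_square_report(square):
    """Return the identifiers of all lines whose sum differs from
    the main diagonal's sum; an empty list means a magic square."""
    n = len(square)
    diag = sum(square[i][i] for i in range(n))               # diagsum
    bad = []
    if sum(square[i][n - 1 - i] for i in range(n)) != diag:  # cdiagsum
        bad.append(0)
    for c in range(n):                                       # colsums
        if sum(square[r][c] for r in range(n)) != diag:
            bad.append(-(c + 1))
    for r in range(n):                                       # rowsums
        if sum(square[r]) != diag:
            bad.append(r + 1)
    return bad
```

A true magic square such as [[2,7,6],[9,5,1],[4,3,8]] yields an empty report; any mismatching line shows up under its signed identifier.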

samples/Pep8/qsort.pep Normal file

@@ -0,0 +1,227 @@
; Sorts a statically defined array using the recursive implementation
; of the quicksort algorithm.
;
; In this implementation, the pivot is supposed to be the rightmost
; value of the slice of the array being sorted.
;
; Note that the code presented below should work on any array,
; whether defined statically or dynamically.
;
; Calling conventions:
; Except when mentioned otherwise, every parameter is to be passed on the stack.
; The return values are also on the stack.
; No assumption is to be made on the content of a register on a function call.
; The values of the registers are to be locally saved for further use if necessary.
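The scheme described above, rightmost element as pivot with an in-place partition, is the classic Lomuto variant. A compact Python rendering for reference (illustrative, not part of the sample; `st` plays the role of `pstind`, `i` of `piter`):

```python
def part(arr, lo, hi):
    """Lomuto partition: pivot = arr[hi]; returns the pivot's final index."""
    pivot = arr[hi]
    st = lo                          # pstind: next slot for a small value
    for i in range(lo, hi):          # piter walks the slice
        if arr[i] <= pivot:
            arr[i], arr[st] = arr[st], arr[i]   # swap(arr, i, st_index)
            st += 1
    arr[hi], arr[st] = arr[st], arr[hi]         # swap(arr, piv, st_index)
    return st

def qsort(arr, lo, hi):
    """Recursive quicksort over arr[lo..hi], inclusive bounds as in qsort.pep."""
    if lo < hi:
        p = part(arr, lo, hi)
        qsort(arr, lo, p - 1)
        qsort(arr, p + 1, hi)
```

As in the assembly, both bounds are inclusive and the recursion excludes the settled pivot index.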
main: SUBSP 4, i
LDA 11, i
ASLA
STA 2, s
LDA arr, i
STA 0, s
CALL printarr
SUBSP 2, i
LDA arr, i
STA 0, s
LDA 0, i
STA 2, s
LDA 10, i
STA 4, s
CALL qsort
ADDSP 2, i
CHARO '\n', i
LDA 11, i
ASLA
STA 2, s
LDA arr, i
STA 0, s
CALL printarr
STOP
; Sorts an array using the quicksort algorithm
;
; Parameters:
; - SP + 2: Address of the array
; - SP + 4: Left bound
; - SP + 6: Right bound
; Returns:
; void
qsort: SUBSP 2, i
LDA qsarrlb, s
CPA qsarrrb, s
BRGE qsortout
SUBSP 6, i
LDA 10, s
STA 0, s
LDA 12, s
STA 2, s
LDA 14, s
STA 4, s
CALL part
LDA 10, s
STA 0, s
LDA 12, s
STA 2, s
LDA 6, s
SUBA 1, i
STA 4, s
CALL qsort
LDA 10, s
STA 0, s
LDA 6, s
ADDA 1, i
STA 2, s
LDA 14, s
STA 4, s
CALL qsort
ADDSP 6, i
qsortout: ADDSP 2, i
RET0
; Address of the array
qsarradd: .EQUATE 4
; Left bound
qsarrlb: .EQUATE 6
; Right bound
qsarrrb: .EQUATE 8
; Pivot value returned by the part command
qsortp: .EQUATE 0
; Partitions an array in two according to the quicksort rules.
;
; All values lower than the pivot end up on the left,
; all values greater than the pivot end up on the right,
; and the pivot's final index is returned.
;
; Parameters:
; - SP + 2: Address of the array
; - SP + 4: Left bound
; - SP + 6: Right bound
;
; Returns:
; - SP + 8: Pivot final index
part: SUBSP 8, i
LDA parrrb, s
STA partpiv, s
LDA parrlb, s
STA pstind, s
STA piter, s
partflp: CPA parrrb, s
BRGE partout
LDX piter, s
ASLX
LDA paraddr, sxf
STA parrival, s
LDX partpiv, s
ASLX
LDA paraddr, sxf
CPA parrival, s
BRLT parlpinc
SUBSP 6, i ; Call swap(arr, i, st_index)
LDA 16, s
STA 0, s
LDA 8, s
STA 2, s
LDA 10, s
STA 4, s
CALL swap
ADDSP 6, i
LDA pstind, s
ADDA 1, i
STA pstind, s
parlpinc: LDA piter, s
ADDA 1, i
STA piter, s
BR partflp
partout: SUBSP 6, i ; Call swap(arr, piv, st_index)
LDA 16, s
STA 0, s
LDA 12, s
STA 2, s
LDA 10, s
STA 4, s
CALL swap
ADDSP 6, i
LDA pstind, s
ADDSP 8, i
STA 8, s
RET0
; Address of the array
paraddr: .EQUATE 10
; Left bound
parrlb: .EQUATE 12
; Right bound
parrrb: .EQUATE 14
; Pivot value
partpiv: .EQUATE 6
; st_index
pstind: .EQUATE 4
; For iterator value
piter: .EQUATE 2
; arr[i] value
parrival: .EQUATE 0
; Swaps the value of two elements of an array of integers
;
; Parameters:
; - SP + 2: Address of the array
; - SP + 4: Index of the 1st element to swap
; - SP + 6: Index of the 2nd element to swap
;
; Returns:
; void
swap: SUBSP 2, i
LDX fstelind, s
ASLX
LDA arraddr, sxf
STA swaptmp, s
LDX secelind, s
ASLX
LDA arraddr, sxf
LDX fstelind, s
ASLX
STA arraddr, sxf
LDA swaptmp, s
LDX secelind, s
ASLX
STA arraddr, sxf
ADDSP 2, i
RET0
; Temporary value for the swap
swaptmp: .EQUATE 0
; Address of the array on which the swap is done
arraddr: .EQUATE 4
; Index of the first element
fstelind: .EQUATE 6
; Index of the second element
secelind: .EQUATE 8
; Prints the content of an array
;
; Parameters:
; SP + 2: Address of the array
; SP + 4: Length of the array
;
; Returns:
; void
printarr: LDX 0, i
parrlp: CPX 4, s
BRGE parrout
DECO 2, sxf
CHARO ' ', i
ADDX 2, i
BR parrlp
parrout: RET0
; Unsorted array for testing purposes
arr: .WORD 9
.WORD 5
.WORD 8
.WORD 10
.WORD 4
.WORD 7
.WORD 0
.WORD 3
.WORD 2
.WORD 1
.WORD 6
.END

samples/Pep8/stri_buf.pep Normal file

@@ -0,0 +1,61 @@
main:
; Reads a string from stdin and returns the buffer it was read into.
; Reading stops at the first \n character encountered.
;
; Parameters:
; void
;
; Returns:
; - X: Address of the buffer
stri: SUBSP 2, i
LDA 32, i
CALL new
CPX buflen, s
BRGE strinlrg
strinlrg: LDA buflen, d
LDX 2, i
CALL mult
STA buflen, s
CALL new
buflen: .EQUATE 0
; Copies the content of a buffer to another one
;
; Parameters:
; - SP + 2: Destination buffer
; - SP + 4: Source buffer
; - SP + 6: Length to copy
memcpy: LDX 0, i
memcplp: CPX cpylen, s
BREQ memcpout
LDBYTEA srcbuf, sxf
STBYTEA dstbuf, sxf
ADDX 1, i
BR memcplp
memcpout: RET0
; Destination buffer
dstbuf: .EQUATE 2
; Source buffer
srcbuf: .EQUATE 4
; Copy length
cpylen: .EQUATE 6
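The copy loop above moves one byte per iteration (LDBYTEA/STBYTEA) until `cpylen` bytes are done; the same logic as a Python sketch:

```python
def memcpy(dst, src, length):
    # Copy `length` bytes from src to dst one byte at a time,
    # mirroring the LDBYTEA/STBYTEA loop in the assembly.
    for i in range(length):
        dst[i] = src[i]
    return dst

buf = bytearray(5)
memcpy(buf, b"hello", 5)
```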
; Allocates a new structure in the heap
;
; Parameters:
; - A: Length of the structure to allocate (bytes)
;
; Returns:
; - X: Address of the allocated structure
new: ADDA hpptr, d
LDX hpptr, d
STA hpptr, d
RET0
; Pointer to the next available byte on the heap
hpptr: .ADDRSS heap
; Start of the heap
heap: .BLOCK 1
.END
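The `new` routine above is a bump allocator: it hands back the current value of `hpptr` and advances the pointer by the requested size, and nothing is ever freed. A sketch of the same policy in Python (the base address is illustrative):

```python
class Heap:
    # Minimal bump allocator mirroring `new`: hpptr is the next free
    # address; allocation returns the old pointer and bumps it forward.
    def __init__(self, base=0):
        self.hpptr = base

    def new(self, size):
        addr = self.hpptr        # LDX hpptr, d
        self.hpptr += size       # ADDA hpptr, d / STA hpptr, d
        return addr

h = Heap(base=0x1000)
a = h.new(32)   # first block starts at the heap base
b = h.new(8)    # next block starts 32 bytes later
```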

View File

@@ -0,0 +1,50 @@
main: SUBSP 34, i
LDA 31, i
STA 0, s
CALL fgets
ADDSP 2, i
CALL ststro
STOP
; Reads a string from stdin, stops reading when one of the following is true:
; - Read a \n
; - Read a maximum of `max` chars
;
; Parameters:
; - SP + 2: `max`, the maximum number of chars to read
; - SP + 4: `buffer` of length `max` + 1
; Returns:
; void
fgets: LDX 0, i
LDA 0, i
fgetslp: CHARI buffer, sx
LDBYTEA buffer, sx
CPA '\n', i
BREQ fout
CPX max, s
BREQ fout
ADDX 1, i
BR fgetslp
fout: LDA '\x00', i
STBYTEA buffer, sx
RET0
max: .EQUATE 2
buffer: .EQUATE 4
; Prints a string stored in stack
;
; Parameters:
; SP + 2: `string`
; Returns:
; void
ststro: LDX 0, i
LDA 0, i
strolp: LDBYTEA string, sx
CPA '\x00', i
BREQ strout
CHARO string, sx
ADDX 1, i
BR strolp
strout: RET0
string: .EQUATE 2
.END
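The `fgets` routine above reads characters until a newline or `max` characters, then writes a NUL terminator where reading stopped; a hedged Python sketch of the same loop:

```python
import io

def fgets(stream, max_chars):
    # Read characters until EOF, '\n', or max_chars characters,
    # mirroring the CHARI loop; the assembly then NUL-terminates
    # the buffer in place instead of returning a new string.
    buf = []
    for _ in range(max_chars):
        ch = stream.read(1)
        if ch == '' or ch == '\n':
            break
        buf.append(ch)
    return ''.join(buf)

line = fgets(io.StringIO("hello\nworld"), 31)  # stops at the newline
```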

samples/Python/argparse.pyi Normal file

@@ -0,0 +1,159 @@
# Stubs for argparse (Python 3.4)
from typing import (
Any, Callable, Iterable, List, IO, Optional, Sequence, Tuple, Type, Union,
TypeVar, overload
)
import sys
_T = TypeVar('_T')
if sys.version_info >= (3,):
_Text = str
else:
_Text = Union[str, unicode]
ONE_OR_MORE = ... # type: str
OPTIONAL = ... # type: str
PARSER = ... # type: str
REMAINDER = ... # type: str
SUPPRESS = ... # type: str
ZERO_OR_MORE = ... # type: str
class ArgumentError(Exception): ...
class ArgumentParser:
if sys.version_info >= (3, 5):
def __init__(self,
prog: Optional[str] = ...,
usage: Optional[str] = ...,
description: Optional[str] = ...,
epilog: Optional[str] = ...,
parents: Sequence[ArgumentParser] = ...,
formatter_class: Type[HelpFormatter] = ...,
prefix_chars: _Text = ...,
fromfile_prefix_chars: Optional[str] = ...,
argument_default: Optional[str] = ...,
conflict_handler: _Text = ...,
add_help: bool = ...,
allow_abbrev: bool = ...) -> None: ...
else:
def __init__(self,
prog: Optional[_Text] = ...,
usage: Optional[_Text] = ...,
description: Optional[_Text] = ...,
epilog: Optional[_Text] = ...,
parents: Sequence[ArgumentParser] = ...,
formatter_class: Type[HelpFormatter] = ...,
prefix_chars: _Text = ...,
fromfile_prefix_chars: Optional[_Text] = ...,
argument_default: Optional[_Text] = ...,
conflict_handler: _Text = ...,
add_help: bool = ...) -> None: ...
def add_argument(self,
*name_or_flags: Union[_Text, Sequence[_Text]],
action: Union[_Text, Type[Action]] = ...,
nargs: Union[int, _Text] = ...,
const: Any = ...,
default: Any = ...,
type: Union[Callable[[str], _T], FileType] = ...,
choices: Iterable[_T] = ...,
required: bool = ...,
help: _Text = ...,
metavar: Union[_Text, Tuple[_Text, ...]] = ...,
dest: _Text = ...,
version: _Text = ...) -> None: ... # weirdly documented
def parse_args(self, args: Optional[Sequence[_Text]] = ...,
namespace: Optional[Namespace] = ...) -> Namespace: ...
def add_subparsers(self, title: _Text = ...,
description: Optional[_Text] = ...,
prog: _Text = ...,
parser_class: Type[ArgumentParser] = ...,
action: Type[Action] = ...,
option_string: _Text = ...,
dest: Optional[_Text] = ...,
help: Optional[_Text] = ...,
metavar: Optional[_Text] = ...) -> _SubParsersAction: ...
def add_argument_group(self, title: Optional[_Text] = ...,
description: Optional[_Text] = ...) -> _ArgumentGroup: ...
def add_mutually_exclusive_group(self, required: bool = ...) -> _MutuallyExclusiveGroup: ...
def set_defaults(self, **kwargs: Any) -> None: ...
def get_default(self, dest: _Text) -> Any: ...
def print_usage(self, file: Optional[IO[str]] = ...) -> None: ...
def print_help(self, file: Optional[IO[str]] = ...) -> None: ...
def format_usage(self) -> str: ...
def format_help(self) -> str: ...
def parse_known_args(self, args: Optional[Sequence[_Text]] = ...,
namespace: Optional[Namespace] = ...) -> Tuple[Namespace, List[str]]: ...
def convert_arg_line_to_args(self, arg_line: _Text) -> List[str]: ...
def exit(self, status: int = ..., message: Optional[_Text] = ...) -> None: ...
def error(self, message: _Text) -> None: ...
class HelpFormatter:
# not documented
def __init__(self, prog: _Text, indent_increment: int = ...,
max_help_position: int = ...,
width: Optional[int] = ...) -> None: ...
class RawDescriptionHelpFormatter(HelpFormatter): ...
class RawTextHelpFormatter(HelpFormatter): ...
class ArgumentDefaultsHelpFormatter(HelpFormatter): ...
if sys.version_info >= (3,):
class MetavarTypeHelpFormatter(HelpFormatter): ...
class Action:
def __init__(self,
option_strings: Sequence[_Text],
dest: _Text = ...,
nargs: Optional[Union[int, _Text]] = ...,
const: Any = ...,
default: Any = ...,
type: Union[Callable[[str], _T], FileType, None] = ...,
choices: Optional[Iterable[_T]] = ...,
required: bool = ...,
help: Optional[_Text] = ...,
metavar: Union[_Text, Tuple[_Text, ...]] = ...) -> None: ...
def __call__(self, parser: ArgumentParser, namespace: Namespace,
values: Union[_Text, Sequence[Any], None],
option_string: _Text = ...) -> None: ...
class Namespace:
def __getattr__(self, name: _Text) -> Any: ...
def __setattr__(self, name: _Text, value: Any) -> None: ...
class FileType:
if sys.version_info >= (3, 4):
def __init__(self, mode: _Text = ..., bufsize: int = ...,
encoding: Optional[_Text] = ...,
errors: Optional[_Text] = ...) -> None: ...
elif sys.version_info >= (3,):
def __init__(self,
mode: _Text = ..., bufsize: int = ...) -> None: ...
else:
def __init__(self,
mode: _Text = ..., bufsize: Optional[int] = ...) -> None: ...
def __call__(self, string: _Text) -> IO[Any]: ...
class _ArgumentGroup:
def add_argument(self,
*name_or_flags: Union[_Text, Sequence[_Text]],
action: Union[_Text, Type[Action]] = ...,
nargs: Union[int, _Text] = ...,
const: Any = ...,
default: Any = ...,
type: Union[Callable[[str], _T], FileType] = ...,
choices: Iterable[_T] = ...,
required: bool = ...,
help: _Text = ...,
metavar: Union[_Text, Tuple[_Text, ...]] = ...,
dest: _Text = ...,
version: _Text = ...) -> None: ...
def add_mutually_exclusive_group(self, required: bool = ...) -> _MutuallyExclusiveGroup: ...
class _MutuallyExclusiveGroup(_ArgumentGroup): ...
class _SubParsersAction:
# TODO: Type keyword args properly.
def add_parser(self, name: _Text, **kwargs: Any) -> ArgumentParser: ...
# not documented
class ArgumentTypeError(Exception): ...
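The stub above types the public argparse surface; a small usage example consistent with those signatures (the program name and flags are illustrative, not from the source):

```python
import argparse

# Build a parser using only calls declared in the stub above.
parser = argparse.ArgumentParser(prog="demo", description="stub smoke test")
parser.add_argument("--count", type=int, default=1, help="how many times")
parser.add_argument("name", help="positional argument")
group = parser.add_mutually_exclusive_group()
group.add_argument("--loud", action="store_true")
group.add_argument("--quiet", action="store_true")

ns = parser.parse_args(["--count", "2", "world"])
```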


@@ -0,0 +1,12 @@
# rules for scala
# https://github.com/bazelbuild/rules_scala#getting-started
# pull rule definitions from git
git_repository(
name = "io_bazel_rules_scala",
remote = "https://github.com/bazelbuild/rules_scala.git",
commit = "73743b830ae98d13a946b25ad60cad5fee58e6d3", # update this as needed
)
# load the desired scala rules for this workspace
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_repositories")
scala_repositories()
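Once `scala_repositories()` has run, BUILD files in the workspace can use the Scala rules; a hypothetical Starlark sketch (target, source, and class names are illustrative, not from the source):

```python
# BUILD -- hypothetical targets using the rules_scala workspace above
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_binary", "scala_library")

scala_library(
    name = "greeter_lib",
    srcs = ["Greeter.scala"],
)

scala_binary(
    name = "greeter",
    main_class = "Greeter",
    deps = [":greeter_lib"],
)
```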

samples/R/import.Rd Normal file

@@ -0,0 +1,72 @@
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/hello.R
\name{import}
\alias{import}
\title{Import a module into the current scope}
\usage{
import(module, attach, attach_operators = TRUE)
}
\arguments{
\item{module}{an identifier specifying the full module path}
\item{attach}{if \code{TRUE}, attach the newly loaded module to the object
search path (see \code{Details})}
\item{attach_operators}{if \code{TRUE}, attach operators of module to the
object search path, even if \code{attach} is \code{FALSE}}
}
\value{
the loaded module environment (invisible)
}
\description{
\code{module = import('module')} imports a specified module and makes its
code available via the environment-like object it returns.
}
\details{
Modules are loaded in an isolated environment which is returned, and
optionally attached to the object search path of the current scope (if
argument \code{attach} is \code{TRUE}).
\code{attach} defaults to \code{FALSE}. However, in interactive code it is
often helpful to attach packages by default. Therefore, in interactive code
invoked directly from the terminal only (i.e. not within modules),
\code{attach} defaults to the value of \code{options('import.attach')}, which
can be set to \code{TRUE} or \code{FALSE} depending on the user's preference.
\code{attach_operators} causes \emph{operators} to be attached by default,
because operators can only be invoked in R if they are found in the search
path. Not attaching them therefore drastically limits a module's usefulness.
Modules are searched in the module search path \code{options('import.path')}.
This is a vector of paths to consider, from the highest to the lowest
priority. The current directory is \emph{always} considered first. That is,
if a file \code{a.r} exists both in the current directory and in a module
search path, the local file \code{./a.r} will be loaded.
Module names can be fully qualified to refer to nested paths. See
\code{Examples}.
}
\note{
Unlike for packages, attaching happens \emph{locally}: if
\code{import} is executed in the global environment, the effect is the same.
Otherwise, the imported module is inserted as the parent of the current
\code{environment()}. When used (globally) \emph{inside} a module, the newly
imported module is only available inside the module's search path, not
outside it (nor in other modules which might be loaded).
}
\examples{
# `a.r` is a file in the local directory containing a function `f`.
a = import('a')
a$f()
# b/c.r is a file in path `b`, containing a function `g`.
import('b/c', attach = TRUE)
g() # No module name qualification necessary
}
\seealso{
\code{unload}
\code{reload}
\code{module_name}
}

samples/Reason/JSX.re Normal file

@@ -0,0 +1,483 @@
type component = {displayName: string};
let module Bar = {
let createElement c::c=? children => {
displayName: "test"
};
};
let module Nesting = {
let createElement children => {
displayName: "test"
};
};
let module Much = {
let createElement children => {
displayName: "test"
};
};
let module Foo = {
let createElement a::a=? b::b=? children => {
displayName: "test"
};
};
let module One = {
let createElement
test::test=?
foo::foo=?
children => {
displayName: "test"
};
let createElementobvioustypo
test::test
children => {
displayName: "test"
};
};
let module Two = {
let createElement foo::foo=? children => {
displayName: "test"
};
};
let module Sibling = {
let createElement
foo::foo=?
(children: list component) => {
displayName: "test"
};
};
let module Test = {
let createElement yo::yo=? children => {
displayName: "test"
};
};
let module So = {
let createElement children => {
displayName: "test"
};
};
let module Foo2 = {
let createElement children => {
displayName: "test"
};
};
let module Text = {
let createElement children => {
displayName: "test"
};
};
let module Exp = {
let createElement children => {
displayName: "test"
};
};
let module Pun = {
let createElement intended::intended=? children => {
displayName: "test"
};
};
let module Namespace = {
let module Foo = {
let createElement
intended::intended=?
anotherOptional::x=100
children => {
displayName: "test"
};
};
};
let module LotsOfArguments = {
let createElement
argument1::argument1=?
argument2::argument2=?
argument3::argument3=?
argument4::argument4=?
argument5::argument5=?
argument6::argument6=?
children => {
displayName: "test"
};
};
let div argument1::argument1=? children => {
displayName: "test"
};
let module List1 = {
let createElement children => {
displayName: "test"
};
};
let module List2 = {
let createElement children => {
displayName: "test"
};
};
let module List3 = {
let createElement children => {
displayName: "test"
};
};
let (/><) a b => a + b;
let (><) a b => a + b;
let (/>) a b => a + b;
let (><\/) a b => a + b;
let tag1 = 5 />< 6;
let tag2 = 5 >< 7;
let tag3 = 5 /> 7;
let tag4 = 5 ><\/ 7;
let b = 2;
let selfClosing = <Foo />;
let selfClosing2 = <Foo a=1 b=true />;
let selfClosing3 =
<Foo
a="really long values that should"
b="cause the entire thing to wrap"
/>;
let a = <Foo> <Bar c=(fun a => a + 2) /> </Foo>;
let a3 = <So> <Much> <Nesting /> </Much> </So>;
let a4 =
<Sibling>
<One test=true foo=b />
<Two foo=b />
</Sibling>;
let a5 = <Foo> "testing a string here" </Foo>;
let a6 =
<Foo2>
<Text> "testing a string here" </Text>
<Test yo=1 />
<Text> "another string" </Text>
<Bar />
<Exp> (2 + 4) </Exp>
</Foo2>;
let intended = true;
let punning = <Pun intended />;
let namespace = <Namespace.Foo />;
let c = <Foo />;
let d = <Foo />;
let spaceBefore =
<So> <Much> <Nesting /> </Much> </So>;
let spaceBefore2 = <So> <Much /> </So>;
let siblingNotSpaced =
<So> <Much /> <Much /> </So>;
let jsxInList = [<Foo />];
let jsxInList2 = [<Foo />];
let jsxInListA = [<Foo />];
let jsxInListB = [<Foo />];
let jsxInListC = [<Foo />];
let jsxInListD = [<Foo />];
let jsxInList3 = [<Foo />, <Foo />, <Foo />];
let jsxInList4 = [<Foo />, <Foo />, <Foo />];
let jsxInList5 = [<Foo />, <Foo />];
let jsxInList6 = [<Foo />, <Foo />];
let jsxInList7 = [<Foo />, <Foo />];
let jsxInList8 = [<Foo />, <Foo />];
let testFunc b => b;
let jsxInFnCall = testFunc <Foo />;
let lotsOfArguments =
<LotsOfArguments
argument1=1
argument2=2
argument3=3
argument4=4
argument5=5
argument6="test">
<Namespace.Foo />
</LotsOfArguments>;
let lowerCase = <div argument1=1 />;
let b = 0;
let d = 0;
/*
* Should pun the first example:
*/
let a = <Foo a> 5 </Foo>;
let a = <Foo a=b> 5 </Foo>;
let a = <Foo a=b b=d> 5 </Foo>;
let a = <Foo a> 0.55 </Foo>;
let a = Foo.createElement "" [@JSX];
let ident = <Foo> a </Foo>;
let fragment1 = <> <Foo /> <Foo /> </>;
let fragment2 = <> <Foo /> <Foo /> </>;
let fragment3 = <> <Foo /> <Foo /> </>;
let fragment4 = <> <Foo /> <Foo /> </>;
let fragment5 = <> <Foo /> <Foo /> </>;
let fragment6 = <> <Foo /> <Foo /> </>;
let fragment7 = <> <Foo /> <Foo /> </>;
let fragment8 = <> <Foo /> <Foo /> </>;
let fragment9 = <> 2 2 2 2 </>;
let fragment10 = <> 2.2 3.2 4.6 1.2 </>;
let fragment11 = <> "str" </>;
let fragment12 = <> (6 + 2) (6 + 2) (6 + 2) </>;
let fragment13 = <> fragment11 fragment11 </>;
let listOfItems1 = <List1> 1 2 3 4 5 </List1>;
let listOfItems2 =
<List2> 1.0 2.8 3.8 4.0 5.1 </List2>;
let listOfItems3 =
<List3> fragment11 fragment11 </List3>;
/*
* Several sequential simple jsx expressions must be separated with a space.
*/
let thisIsRight a b => ();
let tagOne children => ();
let tagTwo children => ();
/* thisIsWrong <tagOne /><tagTwo />; */
thisIsRight <tagOne /> <tagTwo />;
/* thisIsWrong <tagOne> </tagOne><tagTwo> </tagTwo>; */
thisIsRight <tagOne /> <tagTwo />;
let a children => ();
let b children => ();
let thisIsOkay =
<List1> <a /> <b /> <a /> <b /> </List1>;
let thisIsAlsoOkay =
<List1> <a /> <b /> </List1>;
/* Doesn't make any sense, but suppose you defined an
infix operator to compare jsx */
<a /> < <b />;
<a /> > <b />;
<a /> < <b />;
<a /> > <b />;
let listOfListOfJsx = [<> </>];
let listOfListOfJsx = [<> <Foo /> </>];
let listOfListOfJsx = [
<> <Foo /> </>,
<> <Bar /> </>
];
let listOfListOfJsx = [
<> <Foo /> </>,
<> <Bar /> </>,
...listOfListOfJsx
];
let sameButWithSpaces = [<> </>];
let sameButWithSpaces = [<> <Foo /> </>];
let sameButWithSpaces = [
<> <Foo /> </>,
<> <Bar /> </>
];
let sameButWithSpaces = [
<> <Foo /> </>,
<> <Bar /> </>,
...sameButWithSpaces
];
/*
* Test named tag right next to an open bracket.
*/
let listOfJsx = [];
let listOfJsx = [<Foo />];
let listOfJsx = [<Foo />, <Bar />];
let listOfJsx = [<Foo />, <Bar />, ...listOfJsx];
let sameButWithSpaces = [];
let sameButWithSpaces = [<Foo />];
let sameButWithSpaces = [<Foo />, <Bar />];
let sameButWithSpaces = [
<Foo />,
<Bar />,
...sameButWithSpaces
];
/**
* Test no conflict with polymorphic variant types.
*/
type thisType = [ | `Foo | `Bar];
type t 'a = [< thisType] as 'a;
let asd =
<One test=true foo=2> "a" "b" </One> [@foo];
let asd2 =
One.createElementobvioustypo
test::false
["a", "b"]
[@JSX]
[@foo];
let span
test::(test: bool)
foo::(foo: int)
children => 1;
let asd =
<span test=true foo=2> "a" "b" </span> [@foo];
/* "video" call doesn't end with a list, so the expression isn't converted to JSX */
let video test::(test: bool) children => children;
let asd2 = video test::false 10 [@JSX] [@foo];
let div children => 1;
((fun () => div) ()) [] [@JSX];
let myFun () =>
<>
<Namespace.Foo
intended=true
anotherOptional=200
/>
<Namespace.Foo
intended=true
anotherOptional=200
/>
<Namespace.Foo
intended=true anotherOptional=200>
<Foo />
<Foo />
<Foo />
<Foo />
<Foo />
<Foo />
<Foo />
</Namespace.Foo>
</>;
let myFun () => <> </>;
let myFun () =>
<>
<Namespace.Foo
intended=true
anotherOptional=200
/>
<Namespace.Foo
intended=true
anotherOptional=200
/>
<Namespace.Foo
intended=true anotherOptional=200>
<Foo />
<Foo />
<Foo />
<Foo />
<Foo />
<Foo />
<Foo />
</Namespace.Foo>
</>;
/**
* Children should wrap without forcing attributes to.
*/
<Foo a=10 b=0>
<Bar />
<Bar />
<Bar />
<Bar />
</Foo>;
/**
* Failing test cases:
*/
/* let res = <Foo a=10 b=(<Foo a=200 />) > */
/* <Bar /> */
/* </Foo>; */
/* let res = <Foo a=10 b=(<Foo a=200 />) />; */

samples/Reason/Layout.re Normal file
File diff suppressed because it is too large

samples/Reason/Machine.re Normal file

@@ -0,0 +1,344 @@
open Format;
let module Endo = {
type t 'a = 'a => 'a;
};
let module Syntax = {
let module Var = {
type t = int;
};
let module Term = {
type t =
| App t t
| Lam t
| Var Var.t
;
};
let module Sub = {
type t 'a =
| Cmp (t 'a) (t 'a)
| Dot 'a (t 'a)
| Id
| Shift
;
let map f sgm => {
let rec go = fun
| Cmp sgm0 sgm1 => Cmp (go sgm0) (go sgm1)
| Dot a sgm => Dot (f a) (go sgm)
| Id => Id
| Shift => Shift
;
go sgm;
};
let rec apply sgm e =>
switch (sgm, e) {
| (sgm, Term.App e0 e1) => Term.App (apply sgm e0) (apply sgm e1)
| (sgm, Term.Lam e) => Term.Lam (apply (Dot (Term.Var 0) (Cmp sgm Shift)) e)
| (Dot e _, Term.Var 0) => e
| (Dot _ sgm, Term.Var i) => apply sgm (Term.Var (i - 1))
| (Id, Term.Var i) => Term.Var i
| (Shift, Term.Var i) => Term.Var (i + 1)
| (Cmp rho sgm, e) => apply sgm (apply rho e)
};
};
};
let module Zip = {
open Syntax;
type t 'a =
| App0 (t 'a) 'a
| App1 'a (t 'a)
| Halt
| Lam (t 'a)
;
let map f sgm => {
let rec go = fun
| App0 zip e1 => App0 (go zip) (f e1)
| App1 e0 zip => App1 (f e0) (go zip)
| Halt => Halt
| Lam zip => Lam (go zip)
;
go sgm;
};
let rec apply zip acc => switch zip {
| App0 zip e1 => apply zip (Term.App acc e1)
| App1 e0 zip => apply zip (Term.App e0 acc)
| Halt => acc
| Lam zip => apply zip (Term.Lam acc)
};
};
let module Clo = {
open Syntax;
type t =
| Clo Term.t (Sub.t t);
let rec from (Clo term sgm) => Sub.apply (Sub.map from sgm) term;
};
let module Pretty = {
let module Delim = {
type t = string;
let pp prev next fmt token => if (prev < next) { fprintf fmt "%s" token };
};
let module Prec = {
type t = int;
open Syntax.Term;
let calc = fun
| App _ _ => 1
| Lam _ => 2
| Var _ => 0
;
};
let module Name = {
type t = string;
let suffix = {
let script = fun
| 0 => ""
| 1 => ""
| 2 => ""
| 3 => ""
| 4 => ""
| 5 => ""
| 6 => ""
| 7 => ""
| 8 => ""
| 9 => ""
| _ => failwith "bad subscript";
let rec go acc => fun
| 0 => acc
| n => go (script (n mod 10) ^ acc) (n / 10);
go ""
};
let gen = {
let offset = 97;
let width = 26;
fun () i => {
let code = i mod width + offset;
let char = Char.chr code;
let prime = i / width;
let suffix = suffix prime;
let name = Char.escaped char ^ suffix;
Some name;
}
};
};
let module Env = {
type t = {
used: list Name.t,
rest: Stream.t Name.t,
};
let mk () => {
let used = [];
let rest = Stream.from @@ Name.gen ();
{ used, rest };
};
};
type printer 'a = Env.t => Prec.t => formatter => 'a => unit;
let module Term = {
open Syntax.Term;
let rec pp ({ Env.used: used, rest } as env) prev fmt e => {
let next = Prec.calc e;
switch e {
| App e0 e1 =>
fprintf fmt "@[%a%a@ %a%a@]"
(Delim.pp prev next) "("
(pp env 1) e0
(pp env 0) e1
(Delim.pp prev next) ")"
| Lam e =>
let name = Stream.next rest;
let env = { ...env, Env.used: [name, ...used] };
fprintf fmt "%aλ%a.%a%a"
(Delim.pp prev next) "("
(pp_print_string) name
(pp env next) e
(Delim.pp prev next) ")"
| Var index =>
fprintf fmt "%s" @@ try (List.nth used index) {
| _ => "#" ^ string_of_int index
}
}
};
};
let module Sub = {
open Syntax.Sub;
let rec pp pp_elem env prev fmt => fun
| Cmp sgm1 sgm0 =>
fprintf fmt "@[%a;@ %a@]"
(pp pp_elem env prev) sgm1
(pp pp_elem env prev) sgm0
| Dot e sgm =>
fprintf fmt "@[%a@ ·@ %a@]"
(pp_elem env prev) e
(pp pp_elem env prev) sgm
| Id =>
fprintf fmt "ι"
| Shift =>
fprintf fmt ""
;
};
let module Clo = {
let rec pp env prev fmt (Clo.Clo e sgm) => {
let next = Prec.calc e;
fprintf fmt "@[%a%a%a[%a]@]"
(Delim.pp prev next) "("
(Term.pp env next) e
(Delim.pp prev next) ")"
(Sub.pp pp env next) sgm
};
};
let module Zip = {
open Zip;
let rec pp pp_elem env prev fmt => fun
| App0 zip elem =>
fprintf fmt "inl@[<v -1>⟨@,%a@,%a⟩@]"
(pp pp_elem env prev) zip
(pp_elem env prev) elem
| App1 elem zip =>
fprintf fmt "inr@[<v -1>⟨@,%a@,%a⟩@]"
(pp_elem env prev) elem
(pp pp_elem env prev) zip
| Halt =>
fprintf fmt "halt"
| Lam zip =>
fprintf fmt "lam@[<v -1>⟨@,%a⟩@]"
(pp pp_elem env prev) zip
;
};
};
let module Machine = {
type t = {
clo: Clo.t,
ctx: Zip.t Clo.t,
};
let into e => {
open Clo;
open Syntax.Sub;
let clo = Clo e Id;
let ctx = Zip.Halt;
{ clo, ctx }
};
let from { clo, ctx } => Zip.apply (Zip.map Clo.from ctx) (Clo.from clo);
let pp fmt rule state => {
fprintf fmt "@[<v>ctx ::@[<v -5>@,%a@]@,clo ::@[<v -5>@,%a@]@,rule ::@[<v -5>@,%a@]@,term ::@[<v -5>@,%a@]@]@."
(Pretty.Zip.pp Pretty.Clo.pp (Pretty.Env.mk ()) 2) state.ctx
(Pretty.Clo.pp (Pretty.Env.mk ()) 2) state.clo
(pp_print_string) rule
(Pretty.Term.pp (Pretty.Env.mk ()) 2) (from state)
};
let halted state => {
open Clo;
open Syntax.Sub;
open Syntax.Term;
switch state {
| { clo: Clo (Var _) Id, _ } => true
| _ => false
} [@warning "-4"];
};
let step state => {
open Clo;
open Syntax.Sub;
open Syntax.Term;
let rule = ref "";
let state = switch state {
/* left */
| { clo: Clo (App e0 e1) sgm, ctx } =>
let clo = Clo e0 sgm;
let ctx = Zip.App0 ctx (Clo e1 sgm);
rule := "LEFT";
{ clo, ctx };
/* beta */
| { clo: Clo (Lam e) sgm, ctx: Zip.App0 ctx c0 } =>
let clo = Clo e (Cmp (Dot c0 sgm) Id);
rule := "BETA";
{ clo, ctx };
/* lambda */
| { clo: Clo (Lam e) sgm, ctx } =>
let clo = Clo e (Cmp (Dot (Clo (Var 0) Id) (Cmp sgm Shift)) Id);
let ctx = Zip.Lam ctx;
rule := "LAMBDA";
{ clo, ctx };
/* associate */
| { clo: Clo (Var n) (Cmp (Cmp pi rho) sgm), ctx } =>
let clo = Clo (Var n) (Cmp pi (Cmp rho sgm));
rule := "ASSOCIATE";
{ clo, ctx };
/* head */
| { clo: Clo (Var 0) (Cmp (Dot (Clo e pi) _) sgm), ctx } =>
let clo = Clo e (Cmp pi sgm);
rule := "HEAD";
{ clo, ctx };
/* tail */
| { clo: Clo (Var n) (Cmp (Dot (Clo _ _) rho) sgm), ctx } =>
let clo = Clo (Var (n - 1)) (Cmp rho sgm);
rule := "TAIL";
{ clo, ctx };
/* shift */
| { clo: Clo (Var n) (Cmp Shift sgm), ctx } =>
let clo = Clo (Var (n + 1)) sgm;
rule := "SHIFT";
{ clo, ctx };
/* id */
| { clo: Clo (Var n) (Cmp Id sgm), ctx } =>
let clo = Clo (Var n) sgm;
rule := "ID";
{ clo, ctx };
| _ =>
pp std_formatter !rule state;
failwith "bad state";
} [@warning "-4"];
pp std_formatter !rule state;
state;
};
let norm e => {
let count = ref 0;
let state = ref (into e);
while (not (halted !state)) {
fprintf std_formatter "@\n--- step[%d] ---@\n" !count;
incr count;
state := step !state;
};
from !state;
};
};
let module Test = {
open Syntax.Term;
let l e => Lam e;
let ( *@ ) e0 e1 => App e0 e1;
let ff = l (l (Var 1));
let tt = l (l (Var 0));
let zero = l (l (Var 1));
let succ = l (l (l (Var 0 *@ Var 2)));
let one = succ *@ zero;
let two = succ *@ one;
let three = succ *@ two;
let const = l (l (Var 1));
let fix = l (l (Var 1 *@ (Var 0 *@ Var 0)) *@ l (Var 1 *@ (Var 0 *@ Var 0)));
let add = fix *@ l (l (l (Var 1 *@ Var 0 *@ l (succ *@ Var 3 *@ Var 0 *@ Var 1))));
let init = l (l (Var 0) *@ l (l (Var 1)));
};
let module Run = {
let go () => Machine.norm Test.init;
};
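Machine.re above normalizes λ-terms with a small-step machine over closures and explicit substitutions. Stripped of that machinery, the function `Machine.norm` computes is ordinary normal-order normalization of de Bruijn terms; a hedged Python sketch (term encoding and helper names are my own, not from the source):

```python
# Terms as tuples: ('var', i), ('lam', body), ('app', f, a),
# with de Bruijn indices as in Syntax.Term above.

def shift(t, d, c=0):
    # Add d to every free variable (index >= cutoff c).
    tag = t[0]
    if tag == 'var':
        return ('var', t[1] + d) if t[1] >= c else t
    if tag == 'lam':
        return ('lam', shift(t[1], d, c + 1))
    return ('app', shift(t[1], d, c), shift(t[2], d, c))

def subst(t, s, j=0):
    # Replace variable j in t with s, shifting s under binders.
    tag = t[0]
    if tag == 'var':
        return s if t[1] == j else t
    if tag == 'lam':
        return ('lam', subst(t[1], shift(s, 1), j + 1))
    return ('app', subst(t[1], s, j), subst(t[2], s, j))

def norm(t):
    # Normal-order reduction to full normal form.
    if t[0] == 'app':
        f = norm(t[1])
        if f[0] == 'lam':       # beta step, as in the BETA rule
            return norm(shift(subst(f[1], shift(t[2], 1)), -1))
        return ('app', f, norm(t[2]))
    if t[0] == 'lam':
        return ('lam', norm(t[1]))
    return t

I = ('lam', ('var', 0))           # λx.x
K = ('lam', ('lam', ('var', 1)))  # λx.λy.x, cf. `const` in Test
```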


@@ -0,0 +1,308 @@
/*
* Copyright (c) 2015-present, Facebook, Inc.
* All rights reserved.
*
*/
let startedMerlin: ref (option Js.Unsafe.any) = {contents: None};
let fixedEnv = Js.Unsafe.js_expr "require('../lib/fixedEnv')";
/* This and the subsequent big js blocks are copied over from Nuclide. More convenient for now. */
let findNearestMerlinFile' = Js.Unsafe.js_expr {|
function findNearestMerlinFile(beginAtFilePath) {
var path = require('path');
var fs = require('fs');
var fileDir = path.dirname(beginAtFilePath);
var currentPath = path.resolve(fileDir);
do {
var fileToFind = path.join(currentPath, '.merlin');
var hasFile = fs.existsSync(fileToFind);
if (hasFile) {
return path.dirname(currentPath);
}
if (path.dirname(currentPath) === currentPath) {
// Bail
return '.';
}
currentPath = path.dirname(currentPath);
} while (true);
}
|};
let findNearestMerlinFile beginAtFilePath::path => {
let result = Js.Unsafe.fun_call findNearestMerlinFile' [|Js.Unsafe.inject (Js.string path)|];
Js.to_string result
};
let createMerlinReaderFnOnce' = Js.Unsafe.js_expr {|
function(ocamlMerlinPath, ocamlMerlinFlags, dotMerlinDir, fixedEnv) {
var spawn = require('child_process').spawn;
// To split while stripping out any leading/trailing space, we match on all
// *non*-whitespace.
var items = ocamlMerlinFlags === '' ? [] : ocamlMerlinFlags.split(/\s+/);
var merlinProcess = spawn(ocamlMerlinPath, items, {cwd: dotMerlinDir});
merlinProcess.stderr.on('data', function(d) {
console.error('Ocamlmerlin: something wrong happened:');
console.error(d.toString());
});
merlinProcess.stdout.on('close', function(d) {
console.error('Ocamlmerlin: closed.');
});
var cmdQueue = [];
var hasStartedReading = false;
var readline = require('readline');
var reader = readline.createInterface({
input: merlinProcess.stdout,
terminal: false,
});
return function(cmd, resolve, reject) {
cmdQueue.push([resolve, reject]);
if (!hasStartedReading) {
hasStartedReading = true;
reader.on('line', function(line) {
var response;
try {
response = JSON.parse(line);
} catch (err) {
response = null;
}
var resolveReject = cmdQueue.shift();
var resolve = resolveReject[0];
var reject = resolveReject[1];
if (!response || !Array.isArray(response) || response.length !== 2) {
reject(new Error('Unexpected ocamlmerlin output format: ' + line));
return;
}
var status = response[0];
var content = response[1];
var errorResponses = {
'failure': true,
'error': true,
'exception': true,
};
if (errorResponses[status]) {
reject(new Error('Ocamlmerlin returned an error: ' + line));
return;
}
resolve(content);
});
}
merlinProcess.stdin.write(JSON.stringify(cmd));
};
}
|};
let createMerlinReaderFnOnce
pathToMerlin::pathToMerlin
merlinFlags::merlinFlags
dotMerlinPath::dotMerlinPath =>
Js.Unsafe.fun_call
createMerlinReaderFnOnce'
[|
Js.Unsafe.inject (Js.string pathToMerlin),
Js.Unsafe.inject (Js.string merlinFlags),
Js.Unsafe.inject (Js.string dotMerlinPath),
Js.Unsafe.inject fixedEnv
|];
let startMerlinProcess path::path =>
switch startedMerlin.contents {
| Some readerFn => ()
| None =>
let atomReasonPathToMerlin = Atom.Config.get "atom-reason.pathToMerlin";
let atomReasonMerlinFlags = Atom.Config.get "atom-reason.merlinFlags";
let atomReasonMerlinLogFile = Atom.Config.get "atom-reason.merlinLogFile";
switch atomReasonMerlinLogFile {
| JsonString "" => ()
| JsonString s => Atom.Env.setEnvVar "MERLIN_LOG" s
| _ => ()
};
let readerFn =
createMerlinReaderFnOnce
pathToMerlin::(Atom.JsonValue.unsafeExtractString atomReasonPathToMerlin)
merlinFlags::(Atom.JsonValue.unsafeExtractString atomReasonMerlinFlags)
dotMerlinPath::(findNearestMerlinFile beginAtFilePath::path);
startedMerlin.contents = Some readerFn
};
let readOneLine cmd::cmd resolve reject =>
switch startedMerlin.contents {
| None => raise Not_found
| Some readerFn =>
Js.Unsafe.fun_call
readerFn
[|
Js.Unsafe.inject cmd,
Js.Unsafe.inject (Js.wrap_callback resolve),
Js.Unsafe.inject (Js.wrap_callback reject)
|]
};
/* contextify is important for avoiding different buffers calling the backing merlin at the same time. */
/* https://github.com/the-lambda-church/merlin/blob/d98a08d318ca14d9c702bbd6eeadbb762d325ce7/doc/dev/PROTOCOL.md#contextual-commands */
let contextify query::query path::path => Js.Unsafe.obj [|
("query", Js.Unsafe.inject query),
("context", Js.Unsafe.inject (Js.array [|Js.string "auto", Js.string path|]))
|];
let prepareCommand text::text path::path query::query resolve reject => {
startMerlinProcess path;
/* These two commands should be run before every main command. */
readOneLine
cmd::(
contextify
/* The protocol command tells Merlin which API version we want to use. (2 for us) */
query::(
Js.array [|
Js.Unsafe.inject (Js.string "protocol"),
Js.Unsafe.inject (Js.string "version"),
Js.Unsafe.inject (Js.number_of_float 2.)
|]
)
path::path
)
(
fun _ =>
readOneLine
cmd::(
contextify
/* The tell command allows us to synchronize our text with Merlin's internal buffer. */
query::(
Js.array [|Js.string "tell", Js.string "start", Js.string "end", Js.string text|]
)
path::path
)
(fun _ => readOneLine cmd::(contextify query::query path::path) resolve reject)
reject
)
reject
};
let positionToJsMerlinPosition (line, col) => Js.Unsafe.obj [|
/* lines (rows) are 1-based for merlin, not 0-based, like for Atom */
("line", Js.Unsafe.inject (Js.number_of_float (float_of_int (line + 1)))),
("col", Js.Unsafe.inject (Js.number_of_float (float_of_int col)))
|];
/* Actual merlin commands we'll use. */
let getTypeHint path::path text::text position::position resolve reject =>
prepareCommand
text::text
path::path
query::(
Js.array [|
Js.Unsafe.inject (Js.string "type"),
Js.Unsafe.inject (Js.string "enclosing"),
Js.Unsafe.inject (Js.string "at"),
Js.Unsafe.inject (positionToJsMerlinPosition position)
|]
)
resolve
reject;
let getAutoCompleteSuggestions
path::path
text::text
position::position
prefix::prefix
resolve
reject =>
prepareCommand
text::text
path::path
query::(
Js.array [|
Js.Unsafe.inject (Js.string "complete"),
Js.Unsafe.inject (Js.string "prefix"),
Js.Unsafe.inject (Js.string prefix),
Js.Unsafe.inject (Js.string "at"),
Js.Unsafe.inject (positionToJsMerlinPosition position),
Js.Unsafe.inject (Js.string "with"),
Js.Unsafe.inject (Js.string "doc")
|]
)
resolve
reject;
let getDiagnostics path::path text::text resolve reject =>
prepareCommand
text::text
path::path
query::(Js.array [|Js.Unsafe.inject (Js.string "errors")|])
resolve
reject;
let locate path::path text::text extension::extension position::position resolve reject =>
prepareCommand
text::text
path::path
query::(
Js.array [|
Js.Unsafe.inject (Js.string "locate"),
Js.Unsafe.inject (Js.string ""),
Js.Unsafe.inject (Js.string extension),
Js.Unsafe.inject (Js.string "at"),
Js.Unsafe.inject (positionToJsMerlinPosition position)
|]
)
resolve
reject;
/* reject */
let getOccurrences path::path text::text position::position resolve reject =>
prepareCommand
text::text
path::path
query::(
Js.array [|
Js.Unsafe.inject (Js.string "occurrences"),
Js.Unsafe.inject (Js.string "ident"),
Js.Unsafe.inject (Js.string "at"),
Js.Unsafe.inject (positionToJsMerlinPosition position)
|]
)
resolve
reject;
let destruct
path::path
text::text
startPosition::startPosition
endPosition::endPosition
resolve
reject =>
prepareCommand
text::text
path::path
query::(
Js.array [|
Js.Unsafe.inject (Js.string "case"),
Js.Unsafe.inject (Js.string "analysis"),
Js.Unsafe.inject (Js.string "from"),
Js.Unsafe.inject (positionToJsMerlinPosition startPosition),
Js.Unsafe.inject (Js.string "to"),
Js.Unsafe.inject (positionToJsMerlinPosition endPosition)
|]
)
resolve
reject;
let getOutline path::path text::text resolve reject =>
prepareCommand
text::text
path::path
query::(Js.array [|Js.Unsafe.inject (Js.string "outline")|])
resolve
reject;
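The commands assembled above are plain JSON values written to merlin's stdin, one per line. A minimal Python sketch of the same `contextify` wrapping and the two preparatory commands built in `prepareCommand` (shapes taken from the code above; the file path and buffer text are made-up examples):

```python
import json

def contextify(query, path):
    # Wrap a query with a context so concurrent buffers don't clobber
    # each other's state in the shared merlin process (see merlin's
    # PROTOCOL.md on contextual commands).
    return {"query": query, "context": ["auto", path]}

# Hypothetical path/text, mirroring the handshake and buffer sync above:
path = "/tmp/Foo.re"
protocol_cmd = contextify(["protocol", "version", 2], path)
tell_cmd = contextify(["tell", "start", "end", "let x = 1;"], path)

print(json.dumps(protocol_cmd))
print(json.dumps(tell_cmd))
```

Each line would then be written to the merlin process exactly as `readOneLine` does above.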

samples/Reason/Syntax.re

@@ -0,0 +1,989 @@
/* Copyright (c) 2015-present, Facebook, Inc. All rights reserved. */
[@@@autoFormat let wrap = 80; let shift = 2];
Modules.run ();
Polymorphism.run ();
Variants.run ();
BasicStructures.run ();
TestUtils.printSection "General Syntax";
/* Won't work! */
/* let matchingFunc a = match a with */
/* `Thingy x => (print_string "matched thingy x"); x */
/* | `Other x => (print_string "matched other x"); x;; */
/* */
let matchingFunc a =>
switch a {
| `Thingy x =>
print_string "matched thingy x";
let zz = 10;
zz
| `Other x =>
print_string "matched other x";
x
};
type firstTwoShouldBeGroupedInParens =
(int => int) => int => int;
type allParensCanBeRemoved =
int => int => int => int;
type firstTwoShouldBeGroupedAndFirstThree =
((int => int) => int) => int;
/* Same thing now but with type constructors instead of each int */
type firstTwoShouldBeGroupedInParens =
(list int => list int) => list int => list int;
type allParensCanBeRemoved =
list int => list int => list int => list int;
type firstTwoShouldBeGroupedAndFirstThree =
((list int => list int) => list int) =>
list int;
type myRecordType = {
firstTwoShouldBeGroupedInParens:
(int => int) => int => int,
allParensCanBeRemoved:
int => int => int => int,
firstTwoShouldBeGroupedAndFirstThree:
((int => int) => int) => int
};
type firstNamedArgShouldBeGroupedInParens =
first::(int => int) => second::int => int;
type allParensCanBeRemoved =
first::int => second::int => third::int => int;
type firstTwoShouldBeGroupedAndFirstThree =
first::((int => int) => int) => int;
/* Same thing now, but with type constructors instead of int */
type firstNamedArgShouldBeGroupedInParens =
first::(list int => list int) =>
second::list int =>
list int;
type allParensCanBeRemoved =
first::list int =>
second::list int =>
third::list int =>
list int;
type firstTwoShouldBeGroupedAndFirstThree =
first::((list int => list int) => list int) =>
list int;
type firstNamedArgShouldBeGroupedInParens =
first::(int => int)? =>
second::int list? =>
int;
/* The arrow necessitates parens around the next two args. The ? isn't what
* makes the parens necessary. */
type firstNamedArgShouldBeGroupedInParensAndSecondNamedArg =
first::(int => int)? =>
second::(int => int)? =>
int;
type allParensCanBeRemoved =
first::int? =>
second::int? =>
third::int? =>
int;
type firstTwoShouldBeGroupedAndFirstThree =
first::((int => int) => int) => int;
type noParens =
one::int => int => int => two::int => int;
type noParensNeeded =
one::int => int => int => two::int => int;
type firstNamedArgNeedsParens =
one::(int => int => int) => two::int => int;
/* Now, let's try type aliasing */
/* Unless wrapped in parens, types between arrows may not be aliased, may not
* themselves be arrows. */
type parensRequiredAroundFirstArg =
(list int as 'a) => int as 'a;
type parensRequiredAroundReturnType =
(list int as 'a) => (int as 'a);
type parensRequiredAroundReturnType =
(list int as 'a) => (int as 'a) as 'b;
type noParensNeededWhenInTuple =
(list int as 'a, list int as 'b) as 'entireThing;
type myTypeDef 'a = list 'a;
type instatiatedTypeDef = myTypeDef int => int;
/* Test a type attribute for good measure */
/* We should clean up all of the attribute tagging eventually, but for now,
* let's make it super ugly to get out of the way of all the formatting/parsing
* implementations (fewer conflicts during parsing, fewer edge cases during
* printing).
*/
type something = (
int,
int [@lookAtThisAttribute]
);
type longWrappingTypeDefinitionExample =
M_RK__G.Types.instance
(TGRecognizer.tGFields unit unit)
(TGRecognizer.tGMethods unit unit);
type semiLongWrappingTypeDefinitionExample =
M_RK__Gesture.Types.instance
TGRecognizerFinal.tGFields
TGRecognizerFinal.tGMethods;
type semiLongWrappingTypeWithConstraint =
M_RK__Gesture.Types.instance
'a
TGRecognizerFinal.tGFields
TGRecognizerFinal.tGMethods
constraint 'a = (unit, unit);
type onelineConstrain = 'a constraint 'a = int;
/* This must be in trunk but not in this branch of OCaml */
/* type withNestedRecords = MyConstructor {myField: int} */
type colors =
| Red int
| Black int
| Green int;
/* Another approach is to require declared variants to wrap any record */
/* type myRecord = MyRecord {name: int}; */
/* let myValue = MyRecord {name: int}; */
/* This would force importing of the module */
/* This would also lend itself naturally to pattern matching - and avoid having
to use `.` operator at all since you normally destructure. */
type nameBlahType = {nameBlah: int};
let myRecord = {nameBlah: 20};
let myRecordName = myRecord.nameBlah;
let {nameBlah}: nameBlahType = {nameBlah: 20};
print_int nameBlah;
let {nameBlah: aliasedToThisVar}: nameBlahType = {
nameBlah: 20
};
print_int aliasedToThisVar;
let desiredFormattingForWrappedLambda:
int => int => int => nameBlahType =
/*
fun is
pre- /firstarg\
fix /-coupled--\
|-\ /-to-prefix--\ */
fun curriedArg anotherArg lastArg => {
nameBlah: 10
};
type longerInt = int;
let desiredFormattingForWrappedLambdaWrappedArrow:
longerInt =>
longerInt =>
longerInt =>
nameBlahType =
/*
fun is
pre- /firstarg\
fix /-coupled--\
|-\ /-to-prefix--\ */
fun curriedArg anotherArg lastArg => {
nameBlah: 10
};
let desiredFormattingForWrappedLambdaReturnOnNewLine
/*
fun is
pre- /firstarg\
fix /-coupled--\
|-\ /-to-prefix--\ */
curriedArg
anotherArg
lastArg => {
nameBlah: 10
};
/*
let is
pre-
fix /-function binding name---\
|-\ / is coupled to prefix \ */
let desiredFormattingForWrappedSugar
curriedArg
anotherArg
lastArg => {
nameBlah: 10
};
/*
let is
pre-
fix /-function binding name---\
|-\ / is coupled to prefix \ */
let desiredFormattingForWrappedSugarReturnOnNewLine
curriedArg
anotherArg
lastArg => {
nameBlah: 10
};
/*
let : type t1 t2. t1 * t2 list -> t1 = ...
let rec f : 't1 't2. 't1 * 't2 list -> 't1 =
fun (type t1) (type t2) -> (... : t1 * t2 list -> t1)
*/
type point = {x: int, y: int};
type point3D = {x: int, y: int, z: int};
let point2D = {x: 20, y: 30};
let point3D: point3D = {
x: 10,
y: 11,
z: 80 /* Optional Comma */
};
let printPoint (p: point) => {
print_int p.x;
print_int p.y
};
let addPoints (p1: point, p2: point) => {
x: p1.x + p2.x,
y: p1.y + p2.y
};
let res1 = printPoint point2D;
let res2 =
printPoint {x: point3D.x, y: point3D.y};
/*
When () were used to indicate sequences, the parser used seq_expr not only
for grouping sequences, but also to form standard precedences.
/------- sequence_expr ------\
let res3 = printPoint (addPoints (point2D, point3D));
Interestingly, it knew that tuples aren't sequences.
To move towards semi delimited, semi-terminated, braces-grouped sequences:
while allowing any non-sequence expression to be grouped on parens, we make
an explicit rule that allows one single non-semi ended expression to be
grouped in parens.
Actually: We will allow an arbitrary number of semi-delimited expressions to
be wrapped in parens, but the braces grouped semi delimited (sequence)
expressions must *also* be terminated with a semicolon.
This allows the parser to distinguish between
let x = {a}; /* Record {a:a} */
let x = {a;}; /* Single item sequence returning identifier {a} */
*/
let res3 =
printPoint (
addPoints (
point2D,
{x: point3D.x, y: point3D.y}
)
);
type person = {age: int, name: string};
type hiredPerson = {
age: string,
name: string,
dateHired: int
};
let o: person = {name: "bob", age: 10};
/* Parens needed? Nope! */
let o: person = {name: "bob", age: 10};
let printPerson (p: person) => {
let q: person = p;
p.name ^ p.name
};
/* let dontParseMeBro x y:int = x = y;*/
/* With this unification, anywhere you see `= fun` you can just omit it */
let blah a => a; /* Done */
let blah a => a; /* Done (almost) */
let blah a b => a; /* Done */
let blah a b => a; /* Done (almost) */
/* More than one consecutive pattern must have a single case */
type blah = {blahBlah: int};
let blah a {blahBlah} => a;
let blah a {blahBlah} => a;
let module TryToExportTwice = {
let myVal = "hello";
};
/*
Unifying top level module syntax with local module syntax is probably a bad
idea at the moment because it makes it more difficult to continue to support
`let .. in` bindings. We can distinguish local modules for `let..in` that
just happen to be defined at the top level (but not exported).
let MyModule = {let myVal = 20;} in
MyModule.x
Wait, where would this ever be valid, even if we continued to support
`let..in`?
*/
let onlyDoingThisTopLevelLetToBypassTopLevelSequence = {
let x = {
print_int 1;
print_int 20 /* Missing trailing SEMI */
};
let x = {
print_int 1;
print_int 20; /* Ensure missing middle SEMI reported well */
print_int 20
};
let x = {
print_int 1;
print_int 20;
10
/* Comment in final position */
}; /* Missing final SEMI */
x + x
};
type hasA = {a: int};
let a = 10;
let returnsASequenceExpressionWithASingleIdentifier
() => a;
let thisReturnsA () => a;
let thisReturnsAAsWell () => a;
let recordVal: int = (thisReturnsARecord ()).a;
Printf.printf
"\nproof that thisReturnsARecord: %n\n"
recordVal;
Printf.printf
"\nproof that thisReturnsA: %n\n"
(thisReturnsA ());
/* Pattern matching */
let blah arg =>
switch arg {
/* Comment before Bar */
| /* Comment between bar/pattern */ Red _ => 1
/* Comment Before non-first bar */
| /* Comment between bar/pattern */ Black _ => 0
| Green _ => 0
};
/* Any function that pattern matches a multicase match is interpreted as a
 * single arg that is then matched on. Instead of the above `blah` example: */
let blah =
fun
| Red _ => 1
| Black _ => 0
| Green _ => 1;
/* `fun a => a` is read as "a function that maps a to a". Then the */
/* above example is read: "a function that 'either maps' Red to.. or maps .." */
/* The first bar is read as "either maps" */
/* Curried form is not supported:
let blah x | Red _ => 1 | Black _ => 0;
There's no sugar rule for dropping => fun, only = fun
*/
/* let blahCurriedX x => fun /* See, nothing says we can drop the => fun */ */
/* |(Red x | Black x | Green x) => 1 /* With some effort, we can amend the sugar rule that would */ */
/* | Black x => 0 /* Allow us to drop any => fun.. Just need to make pattern matching */ */
/* | Green x => 0; /* Support that */ */
/* */
let blahCurriedX x =>
fun
| Red x
| Black x
| Green x =>
1 /* With some effort, we can amend the sugar rule that would */
| Black x => 0 /* Allow us to drop any => fun.. Just need to make pattern matching */
| Green x => 0; /* Support that */
let sameThingInLocal = {
let blahCurriedX x =>
fun
| Red x
| Black x
| Green x =>
1 /* With some effort, we can amend the sugar rule that would */
| Black x => 0 /* Allow us to drop any => fun.. Just need to make pattern matching */
| Green x => 0; /* Support that */
blahCurriedX
};
/* This should be parsed/printed exactly as the previous */
let blahCurriedX x =>
fun
| Red x
| Black x
| Green x => 1
| Black x => 0
| Green x => 0;
/* Any time there are multiple match cases we require a leading BAR */
let v = Red 10;
let Black x | Red x | Green x = v; /* So this NON-function still parses */
/* This doesn't parse, however (and it doesn't in OCaml either):
let | Black x | Red x | Green x = v;
*/
print_int x;
/* Scoping: Let sequences. Familiar syntax for lexical ML style scope and
sequences. */
let res = {
let a = "a starts out as";
{
print_string a;
let a = 20;
print_int a
};
print_string a
};
let res = {
let a = "first its a string";
let a = 20;
print_int a;
print_int a;
print_int a
};
let res = {
let a = "a is always a string";
print_string a;
let b = 30;
print_int b
};
/* let result = LyList.map (fun | [] => true | _ => false) []; */
/* OTHERWISE: You cannot tell if a is the first match case falling through or
* a curried first arg */
/* let blah = fun a | patt => 0 | anotherPatt => 1; */
/* let blah a patt => 0 | anotherPatt => 1; */
/*simple pattern EQUALGREATER expr */
let blah a {blahBlah} => a;
/* match_case */
/* pattern EQUALGREATER expr */
let blah =
fun
| Red _ => 1
| Black _ => 0
| Green _ => 0;
/* Won't work! */
/* let arrowFunc = fun a b => print_string "returning aplusb from arrow"; a + b;; */
let arrowFunc a b => {
print_string "returning aplusb from arrow";
a + b
};
let add a b => {
let extra = {
print_string "adding";
0
};
let anotherExtra = 0;
extra + a + b + anotherExtra
};
print_string (string_of_int (add 4 34));
let dummy _ => 10;
dummy res1;
dummy res2;
dummy res3;
/* Some edge cases */
let myFun firstArg (Red x | Black x | Green x) =>
firstArg + x;
let matchesWithWhen a =>
switch a {
| Red x when 1 > 0 => 10
| Red _ => 10
| Black x => 10
| Green x => 10
};
let matchesWithWhen =
fun
| Red x when 1 > 0 => 10
| Red _ => 10
| Black x => 10
| Green x => 10;
let matchesOne (`Red x) => 10;
/*
Typical OCaml would make you *wrap the functions in parens*! This is because it
can't tell if a semicolon is a sequence operator. Even if we had records use
commas to separate fields,
*/
type adders = {
addTwoNumbers: int => int => int,
addThreeNumbers: int => int => int => int,
addThreeNumbersTupled: (int, int, int) => int
};
let myRecordWithFunctions = {
addTwoNumbers: fun a b => a + b,
addThreeNumbers: fun a b c => a + b + c,
addThreeNumbersTupled: fun (a, b, c) =>
a + b + c
};
let result =
myRecordWithFunctions.addThreeNumbers 10 20 30;
let result =
myRecordWithFunctions.addThreeNumbersTupled (
10,
20,
30
);
let lookTuplesRequireParens = (1, 2);
/* let thisDoesntParse = 1, 2; */
let tupleInsideAParenSequence = {
print_string "look, a tuple inside a sequence";
let x = 10;
(x, x)
};
let tupleInsideALetSequence = {
print_string "look, a tuple inside a sequence";
let x = 10;
(x, x)
};
/* We *require* that function return types be wrapped in
parenthesis. In this example, there's no ambiguity */
let makeIncrementer (delta: int) :(int => int) =>
fun a => a + delta;
/* We could even force that consistency with let bindings - it's allowed
currently but not forced.
*/
let myAnnotatedValBinding: int = 10;
/* Class functions (constructors) and methods are unified in the same way */
class classWithNoArg = {
method x = 0;
method y = 0;
};
/* This parses but doesn't type check
class myClass init => object
method x => init
method y => init
end;
*/
let myFunc (a: int) (b: int) :(int, int) => (
a,
b
);
let myFunc (a: int) (b: int) :list int => [1];
let myFunc (a: int) (b: int) :point => {
x: a,
y: b
};
let myFunc (a: int, b: int) :point => {
x: a,
y: b
};
type myThing = (int, int);
type stillARecord = {name: string, age: int};
/* Rebase latest OCaml to get the following: And fixup
`generalized_constructor_arguments` according to master. */
/* type ('a, 'b) myOtherThing = Leaf {first:'a, second: 'b} | Null; */
type branch 'a 'b = {first: 'a, second: 'b};
type myOtherThing 'a 'b =
| Leaf (branch 'a 'b)
| Null;
type yourThing = myOtherThing int int;
/* Conveniently - this parses exactly how you would intend! No *need* to wrap
in an extra [], but it doesn't hurt */
/* FIXME type lookAtThesePolyVariants = list [`Red] ; */
/* FIXME type bracketsGroupMultipleParamsAndPrecedence = list (list (list [`Red])); */
/* FIXME type youCanWrapExtraIfYouWant = (list [`Red]); */
/* FIXME type hereAreMultiplePolyVariants = list [`Red | `Black]; */
/* FIXME type hereAreMultiplePolyVariantsWithOptionalWrapping = list ([`Red | `Black]); */
/*
/* Proposal: ES6 style lambdas: */
/* Currying */
let lookES6Style = (`Red x) (`Black y) => { };
let lookES6Style (`Red x) (`Black y) => { };
/* Matching the single argument */
let lookES6Style = oneArg => match oneArg with
| `Red x => x
| `Black x => x;
/* The "trick" to currying that we already have is basically the same - we just
* have to reword it a bit:
* From:
* "Any time you see [let x = fun ...] just replace it with [let x ...]"
* To:
* "Any time you see [let x = ... => ] just replace it with [let x ... => ]"
*/
let lookES6Style oneArg => match oneArg with
| `Red x => x
| `Black x => x;
*/
/** Current OCaml Named Arguments. Any aliasing is more than just aliasing!
OCaml allows full on pattern matching of named args. */
/*
A: let named ~a ~b = aa + bb in
B: let namedAlias ~a:aa ~b:bb = aa + bb in
C: let namedAnnot ~(a:int) ~(b:int) = a + b in
D: let namedAliasAnnot ~a:(aa:int) ~b:(bb:int) = aa + bb in
E: let optional ?a ?b = 10 in
F: let optionalAlias ?a:aa ?b:bb = 10 in
G: let optionalAnnot ?(a:int option) ?(b:int option) = 10 in
H: let optionalAliasAnnot ?a:(aa:int option) ?b:(bb:int option) = 10 in
/*
Look! When a default is provided, annotation causes inferred type of argument
to not be "option" since it's automatically destructured (because we know it
will always be available one way or another.)
*/
I: let defOptional ?(a=10) ?(b=10) = 10 in
J: let defOptionalAlias ?a:(aa=10) ?b:(bb=10) = 10 in
K: let defOptionalAnnot ?(a:int=10) ?(b:int=10) = 10 in
\ \
\label_let_pattern opt_default: no longer needed in SugarML
L: let defOptionalAliasAnnot ?a:(aa:int=10) ?b:(bb:int=10) = 10 in
\ \
\let_pattern: still a useful syntactic building block in SugarML
*/
/**
* In Reason, the syntax for named args uses double semicolon, since
* the syntax for lists uses ES6 style [], freeing up the ::.
*/
let a = 10;
let b = 20;
/*A*/
let named a::a b::b => a + b;
type named = a::int => b::int => int;
/*B*/
let namedAlias a::aa b::bb => aa + bb;
let namedAlias a::aa b::bb => aa + bb;
type namedAlias = a::int => b::int => int;
/*C*/
let namedAnnot a::(a: int) b::(b: int) => 20;
/*D*/
let namedAliasAnnot a::(aa: int) b::(bb: int) => 20;
/*E*/
let myOptional a::a=? b::b=? () => 10;
type named = a::int? => b::int? => unit => int;
/*F*/
let optionalAlias a::aa=? b::bb=? () => 10;
/*G*/
let optionalAnnot a::(a: int)=? b::(b: int)=? () => 10;
/*H*/
let optionalAliasAnnot
a::(aa: int)=?
b::(bb: int)=?
() => 10;
/*I: */
let defOptional a::a=10 b::b=10 () => 10;
type named = a::int? => b::int? => unit => int;
/*J*/
let defOptionalAlias a::aa=10 b::bb=10 () => 10;
/*K*/
let defOptionalAnnot
a::(a: int)=10
b::(b: int)=10
() => 10;
/*L*/
let defOptionalAliasAnnot
a::(aa: int)=10
b::(bb: int)=10
() => 10;
/*M: Invoking them - Punned */
let resNotAnnotated = named a::a b::b;
/*N:*/
let resAnnotated: int = named a::a b::b;
/*O: Invoking them */
let resNotAnnotated = named a::a b::b;
/*P: Invoking them */
let resAnnotated: int = named a::a b::b;
/*Q: Here's why "punning" doesn't work! */
/* Is b:: punned with a final non-named arg, or is b:: supplied b as one named arg? */
let b = 20;
let resAnnotated = named a::a b::b;
/*R: Proof that there are no ambiguities with return values being annotated */
let resAnnotated: ty = named a::a b;
/*S: Explicitly passed optionals are a nice way to say "use the default value"*/
let explictlyPassed =
myOptional a::?None b::?None;
/*T: Annotating the return value of the entire function call */
let explictlyPassedAnnotated: int =
myOptional a::?None b::?None;
/*U: Explicitly passing optional with identifier expression */
let a = None;
let explictlyPassed = myOptional a::?a b::?None;
let explictlyPassedAnnotated: int =
myOptional a::?a b::?None;
let nestedLet = {
let _ = 1;
()
};
let nestedLet = {
let _ = 1;
()
};
let nestedLet = {
let _ = 1;
()
};
let nestedLet = {
let _ = 1;
2
};
/*
* Showing many combinations of type annotations and named arguments.
*/
type typeWithNestedNamedArgs =
outerOne::(
innerOne::int => innerTwo::int => int
) =>
outerTwo::int =>
int;
type typeWithNestedOptionalNamedArgs =
outerOne::
(innerOne::int => innerTwo::int => int)? =>
outerTwo::int? =>
int;
type typeWithNestedOptionalNamedArgs =
outerOne::list string? => outerTwo::int? => int;
let x =
callSomeFunction
withArg::10 andOtherArg::wrappedArg;
let res = {
(constraintedSequenceItem: string);
(dontKnowWheYoudWantToActuallyDoThis: string)
};
let res = {
(
butTheyWillBePrintedWithAppropriateSpacing: string
);
(soAsToInstillBestDevelopmentPractices: string)
};
let x = [
(eachItemInListCanBeAnnotated: int),
(typeConstraints: float),
(
tupleConstraints: int,
andNotFunctionInvocations: int
)
];
let x = [
(butWeWillPrint: int),
(themAsSpaceSeparated: float),
(toInfluenceYour: int, developmentHabbits: int)
];
let newRecord = {
...(annotatedSpreadRecord: someRec),
x: y
};
let newRecord = {
...(annotatedSpreadRecord: someRec),
blah: 0,
foo: 1
};
let newRecord = {
...(
youCanEvenCallMethodsHereAndAnnotate them: someRec
),
blah: 0,
foo: 1
};
let newRecord = {
...(
youCanEvenCallMethodsHereAndAnnotate
them named::10: someRec
),
blah: 0,
foo: 1
};
let something: thing blah = aTypeAnnotation;
let something: thing blah = thisIsANamedArg;
let something: thing blah = aTypeAnnotation;
let something: blah = thisIsANamedArg thing;
let something: blah = typeAnnotation thing;
let newRecord = {
...(
heresAFunctionWithNamedArgs argOne::i: annotatedResult
),
soAsToInstill: 0,
developmentHabbits: 1
};
[@@@thisIsAThing];
let x = 10;
/* Ensure that the parenthesis are preserved here because they are
* important:
*/
let something =
fun
| None => (
fun
| [] => "emptyList"
| [_, ..._] => "nonEmptyList"
)
| Some _ => (
fun
| [] => "emptyList"
| [_, ..._] => "nonEmptyList"
);
/* A | B = X; */
let A | B = X;
/* A | (B | C) = X; */
let A | (B | C) = X;
/* (A | B) | (C | D) = X; */
let A | B | (C | D) = X;
/* A | B | (C | D) = X; */
let A | B | (C | D) = X;
/* (A | B) | C = X; */
let A | B | C = X;
/* A | B | C = X; */
let A | B | C = X;
/** External function declaration
*
*/
external f : int => int = "foo";
let x = {contents: 0};
let unitVal = x.contents = 210;


@@ -0,0 +1,19 @@
-\*-
(?:
\s*
(?= [^:;\s]+ \s* -\*-)
|
(?:
.*?[;\s]
|
(?<=-\*-)
)
mode\s*:\s*
)
([^:;\s]+)
(?=
[\s;] | (?<![-*]) -\*-
)
.*?
-\*-
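The pattern above is an Emacs modeline (file-variables) matcher used for language detection; its first group captures the mode name from either the short `-*-mode-*-` form or the explicit `mode:` key. A sketch of it in Python's verbose-regex mode, with test strings of my own choosing:

```python
import re

# Emacs modeline matcher transcribed into Python verbose mode;
# group 1 captures the mode name.
EMACS_MODELINE = re.compile(r"""
    -\*-
    (?:
        \s*
        (?= [^:;\s]+ \s* -\*-)          # short form: -*-mode-*-
        |
        (?:
            .*?[;\s]                    # long form: skip earlier variables
            |
            (?<=-\*-)
        )
        mode\s*:\s*                     # explicit "mode:" key
    )
    ([^:;\s]+)                          # the mode name itself
    (?=
        [\s;] | (?<![-*]) -\*-
    )
    .*?
    -\*-
""", re.VERBOSE)

print(EMACS_MODELINE.search("# -*- mode: python -*-").group(1))  # python
print(EMACS_MODELINE.search("/* -*-c++-*- */").group(1))         # c++
```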


@@ -0,0 +1,27 @@
(?:
(?:\s|^)
vi
(?:m[<=>]?\d+|m)?
|
[\t\x20]
ex
)
(?=
: (?=\s* set? \s [^\n:]+ :) |
: (?!\s* set? \s)
)
(?:
(?:\s|\s*:\s*)
\w*
(?:
\s*=
(?:[^\n\\\s]|\\.)*
)?
)*
[\s:]
(?:filetype|ft|syntax)
\s*=
(MODE_NAME_HERE)
(?=\s|:|$)
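The Vim counterpart above leaves a `(MODE_NAME_HERE)` placeholder to be substituted with a concrete mode-name pattern. A Python sketch, substituting a generic `\w+` for the placeholder (my assumption, not the grammar's actual substitution):

```python
import re

# Vim modeline matcher with MODE_NAME_HERE filled in by \w+;
# group 1 captures the filetype/syntax value.
VIM_MODELINE = re.compile(r"""
    (?:
        (?:\s|^)
        vi
        (?:m[<=>]?\d+|m)?            # vi, vim, vim>=700, ...
        |
        [\t\x20]
        ex
    )
    (?=
        : (?=\s* set? \s [^\n:]+ :) |
        : (?!\s* set? \s)
    )
    (?:
        (?:\s|\s*:\s*)
        \w*
        (?:
            \s*=
            (?:[^\n\\\s]|\\.)*
        )?
    )*
    [\s:]
    (?:filetype|ft|syntax)
    \s*=
    (\w+)                            # substituted for MODE_NAME_HERE
    (?=\s|:|$)
""", re.VERBOSE)

print(VIM_MODELINE.search("# vim: set ft=ruby:").group(1))  # ruby
```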


@@ -0,0 +1 @@
\b(\d*1[1-3]th|\d*0th|(?:(?!11st)\d)*1st|\d*2nd|(?:(?!13rd)\d*)3rd|\d*[4-9]th)\b
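The one-liner above matches well-formed English ordinals (including the 11th–13th specials) while rejecting mismatched suffixes such as "11st" or "13rd". A quick Python check with examples of my own:

```python
import re

# English ordinal matcher: 1st/2nd/3rd/4th..., the 11th-13th specials,
# and rejection of malformed suffixes like "11st" and "13rd".
ORDINAL = re.compile(
    r"\b(\d*1[1-3]th|\d*0th|(?:(?!11st)\d)*1st|\d*2nd|(?:(?!13rd)\d*)3rd|\d*[4-9]th)\b"
)

for word in ["1st", "2nd", "3rd", "4th", "11th", "12th", "13th", "21st", "103rd"]:
    assert ORDINAL.fullmatch(word), word
for word in ["11st", "13rd", "21th", "2rd"]:
    assert not ORDINAL.fullmatch(word), word
```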


@@ -0,0 +1 @@
/^([^\/#\?]*:?\/\/)?(\/?(?:[^\/#\?]+\/)*)?([^\/#\?]+)?(?:\/(?=$))?(\?[^#]*)?(#.*)?$/
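The final pattern splits a URL-like string into scheme, directory path, basename, query, and fragment groups. A Python transcription, with a sample URL of my own choosing:

```python
import re

# URL splitter: (1) scheme+"//", (2) host+directories, (3) basename,
# (4) query, (5) fragment.  Absent parts come back as None or "".
URL_PARTS = re.compile(
    r"^([^/#?]*:?//)?(/?(?:[^/#?]+/)*)?([^/#?]+)?(?:/(?=$))?(\?[^#]*)?(#.*)?$"
)

m = URL_PARTS.match("https://example.com/a/b/c.txt?x=1#top")
print(m.groups())
# ('https://', 'example.com/a/b/', 'c.txt', '?x=1', '#top')
```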

samples/Roff/trekmanual.nr

@@ -0,0 +1,897 @@
.\" $FreeBSD: src/games/trek/DOC/trekmanual.nr,v 1.1.1.1.14.1 2002/08/15 18:10:56 schweikh Exp $
.\" $DragonFly: src/games/trek/DOC/trekmanual.nr,v 1.2 2003/06/17 04:25:25 dillon Exp $
.br
.po 10
.if n \!.
.sp 15
.tr ^ \"
.ce 88
^****^^^^*****^^^^^^*^^^^^^****^
*^^^^^^^^^^*^^^^^^^*^*^^^^^*^^^*
^***^^^^^^^*^^^^^^*****^^^^****^
^^^^*^^^^^^*^^^^^^*^^^*^^^^*^^*^
****^^^^^^^*^^^^^^*^^^*^^^^*^^^*
*****^^^^****^^^^^*****^^^^*^^^*
^^*^^^^^^*^^^*^^^^*^^^^^^^^*^^*^
^^*^^^^^^****^^^^^***^^^^^^***^^
^^*^^^^^^*^^*^^^^^*^^^^^^^^*^^*^
^^*^^^^^^*^^^*^^^^*****^^^^*^^^*
by
Eric Allman
University of California
Berkeley
.ce 0
.tr ^^
.de HE
'sp 4
'tl 'STAR TREK''%'
'sp 3
..
.de FO
'bp
..
.wh 0 HE
.wh -5 FO
.de pp
.sp
.ti +4
..
.bp 1
.ce
INTRODUCTION
.pp
Well, the federation is once again at war with the Klingon empire.
It is up to you,
as captain of the U.S.S. Enterprise,
to wipe out the invasion fleet and save the Federation.
.pp
For the purposes of the game
the galaxy is divided into 64 quadrants
on an eight by eight grid,
with quadrant 0,0 in the upper left hand corner.
Each quadrant is divided into 100 sectors
on a ten by ten grid.
Each sector contains one object
(e.g., the Enterprise, a Klingon, or a star).
.pp
Navigation is handled in degrees,
with zero being straight up
and ninety being to the right.
Distances are measured in quadrants.
One tenth quadrant is one sector.
.pp
The galaxy contains starbases,
at which you can dock to refuel,
repair damages, etc.
The galaxy also contains stars.
Stars usually have a knack for getting in your way,
but they can be triggered into going nova
by shooting a photon torpedo at one,
thereby (hopefully) destroying any adjacent Klingons.
This is not a good practice however,
because you are penalized for destroying stars.
Also, a star will sometimes go supernova,
which obliterates an entire quadrant.
You must never stop in a supernova quadrant,
although you may "jump over" one.
.pp
Some starsystems
have inhabited planets.
Klingons can attack inhabited planets
and enslave the populace,
which they then put to work building more Klingon battle cruisers.
.bp
.ce
STARTING UP THE GAME
.pp
To request the game, issue the command
.sp
.ti +12
/usr/games/trek
.sp
from the shell.
If a filename is stated,
a log of the game is written
onto that file.
If omitted,
the file is not written.
If the "-a" flag is stated before the filename,
that file is appended to
rather than created.
.pp
The game will ask you what length game
you would like.
Valid responses are "short", "medium", and "long".
Ideally the length of the game does not
affect the difficulty,
but currently the shorter games
tend to be harder than the longer ones.
You may also type "restart",
which restarts a previously saved game.
.pp
You will then be prompted for the skill,
to which you must respond
"novice", "fair", "good", "expert",
"commodore", or "impossible".
You should start out with a novice
and work up,
but if you really want to see how fast
you can be slaughtered,
start out with an impossible game.
.pp
In general,
throughout the game,
if you forget what is appropriate
the game will tell you what it expects
if you just type in
a question mark.
.pp
To get a copy of these rules,
execute the command
.sp
.ti +12
nroff /usr/games/trekmanual.nr
.sp
.bp
.ce
ISSUING COMMANDS
.pp
If the game expects you to enter a command,
.hc ^
it will say ^"Command:\ "
and wait for your response.
Most commands can be abbreviated.
.pp
At almost any time you can type more than one thing on a line.
For example,
to move straight up one quadrant,
you can type
.ti +12
move 0 1
.br
or you could just type
.ti +12
move
.br
and the game would prompt you with
.ti +12
Course:
.br
to which you could type
.ti +12
0 1
.br
The "1" is the distance,
which could be put on still another line.
Also, the "move" command
could have been abbreviated
"mov", "mo", or just "m".
.pp
If you are partway through a command
and you change your mind,
you can usually type "-1"
to cancel the command.
.pp
Klingons generally cannot hit you
if you don't consume anything
(e.g., time or energy),
so some commands are considered "free".
As soon as you consume anything though -- POW!
.bp
.de **
.if \\n+l .**
.as x *
..
.de bl
.nr l \\w'\\$1' -\\w'*'
.ds x ****
.**
.sp 3
.ne 3
\\*x
.br
.if t *\h'\w'*'u'\fB\\$1\fP\h'\w'*'u'*
.if n * \\$1 *
.br
\\*x
.sp
.in +8
.nf
..
.de FF
.in -8
.fi
..
.if !\n(.V .ta \w'Full Commands: '+1
.if \n(.V .ta \w'Full Commands: 'u
.ce
THE COMMANDS
.bl "Short Range Scan"
Mnemonic: srscan
Shortest Abbreviation: s
Full Commands: srscan
srscan yes/no
Consumes: nothing
.FF
.pp
The short range scan
gives you a picture
of the quadrant you are in,
and (if you say "yes")
a status report
which tells you
a whole bunch
of interesting stuff.
You can get a status report alone
by using the
.ul
status
command.
An example follows:
.sp
.nf
.in +4
Short range sensor scan
0 1 2 3 4 5 6 7 8 9
0 . . . . . . . * . * 0 stardate 3702.16
1 . . E . . . . . . . 1 condition RED
2 . . . . . . . . . * 2 position 0,3/1,2
3 * . . . . # . . . . 3 warp factor 5.0
4 . . . . . . . . . . 4 total energy 4376
5 . . * . * . . . . . 5 torpedoes 9
6 . . . @ . . . . . 6 shields down, 78%
7 . . . . . . . . . . 7 Klingons left 3
8 . . . K . . . . . . 8 time left 6.43
9 . . . . . . * . . . 9 life support damaged, reserves = 2.4
0 1 2 3 4 5 6 7 8 9
Distressed Starsystem Marcus XII
.in +8
.ti -8
The cast of characters is as follows:
E the hero
K the villain
# the starbase
* stars
@ inhabited starsystem
\&. empty space
a black hole
.in -12
.fi
.pp
The name of the starsystem is listed underneath
the short range scan.
The word "distressed", if present,
means that the starsystem
is under attack.
.pp
Short range scans are absolutely free.
They use no time, no energy,
and they don't give the Klingons
another chance to hit you.
.bl "Status Report"
Mnemonic: status
Shortest Abbreviation: st
Consumes: nothing
.FF
.pp
This command gives you information
about the current status
of the game and your ship, as follows:
.in +8
.de qq
.sp
.ti -4
..
.qq
Stardate -- The current stardate.
.qq
Condition -- as follows:
.in +4
.nf
RED -- in battle
YELLOW -- low on energy
GREEN -- normal state
DOCKED -- docked at starbase
CLOAKED -- the cloaking device is activated
.fi
.in -4
.qq
Position -- Your current quadrant and sector.
.qq
Warp Factor -- The speed you will move at
when you move under warp power
(with the
.ul
move
command).
.qq
Total Energy -- Your energy reserves.
If they drop to zero,
you die.
Energy regenerates,
but the higher the skill of the game,
the slower it regenerates.
.qq
Torpedoes -- How many photon torpedoes you have left.
.qq
Shields -- Whether your shields are up or down,
and how effective they are if up
(what percentage of a hit they will absorb).
.qq
Klingons Left -- Guess.
.qq
Time Left -- How long the Federation can hold out
if you sit on your fat ass and do nothing.
If you kill Klingons quickly,
this number goes up,
otherwise,
it goes down.
If it hits zero,
the Federation is conquered.
.qq
Life Support -- If "active", everything is fine.
If "damaged", your reserves tell you
how long you have
to repair your life support
or get to a starbase
before you starve, suffocate,
or something equally unpleasant.
.qq
Current Crew -- The number of crew members
left.
This figure does not include officers.
.qq
Brig Space -- The space left in your brig
for Klingon captives.
.qq
Klingon Power -- The number of units
needed to kill a Klingon.
Remember, as Klingons fire at you
they use up their own energy,
so you probably need somewhat less
than this.
.qq
Skill, Length -- The skill and length
of the game you are playing.
.in -8
.pp
Status information is absolutely free.
.bl "Long Range Scan"
Mnemonic: lrscan
Shortest Abbreviation: l
Consumes: nothing
.FF
.pp
Long range scan gives you information about the
eight quadrants
that surround the quadrant
you're in.
A sample long range scan follows:
.sp
.in +12
.nf
Long range scan for quadrant 0,3
2 3 4
-------------------
! * ! * ! * !
-------------------
0 ! 108 ! 6 ! 19 !
-------------------
1 ! 9 ! /// ! 8 !
-------------------
.sp
.in -12
.fi
.pp
The three digit numbers
tell the number of objects
in the quadrants.
The units digit tells the number of stars,
the tens digit the number of starbases,
and the hundreds digit is the number of Klingons.
"*" indicates the negative energy barrier
at the edge of the galaxy,
which you cannot enter.
"///" means that that is a supernova quadrant
and must not be entered.
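The three-digit encoding above unpacks with integer division. A minimal sketch in Python (the function name is illustrative; this is not trek's source):

```python
def decode_lrscan(cell):
    """Split a long range scan cell into (klingons, starbases, stars).

    Hundreds digit: Klingons; tens digit: starbases; units digit: stars.
    """
    klingons, rest = divmod(cell, 100)
    starbases, stars = divmod(rest, 10)
    return klingons, starbases, stars

# 108 decodes as one Klingon, no starbase, and eight stars.
print(decode_lrscan(108))  # (1, 0, 8)
```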
.bl "Damage Report"
Mnemonic: damages
Shortest Abbreviation: da
Consumes: nothing
.FF
.pp
A damage report tells you what devices are damaged
and how long it will take to repair them.
Repairs proceed faster
when you are docked
at a starbase.
.bl "Set Warp Factor"
Mnemonic: warp
Shortest Abbreviation: w
Full Command: warp factor
Consumes: nothing
.FF
.pp
The warp factor tells the speed of your starship
when you move under warp power
(with the
.ul
move
command).
The higher the warp factor,
the faster you go,
and the more energy you use.
.pp
The minimum warp factor is 1.0
and the maximum is 10.0.
At speeds above warp 6
there is danger of the warp engines
being damaged.
The probability of this
increases at higher warp speeds.
Above warp 9.0 there is a chance of entering
a time warp.
.bl "Move Under Warp Power"
Mnemonic: move
Shortest Abbreviation: m
Full Command: move course distance
Consumes: time and energy
.FF
.pp
This is the usual way of moving.
The course is in degrees and the distance is in quadrants.
To move one sector specify a distance of 0.1.
.pp
Time is consumed proportionately to
the inverse of the warp factor squared,
and directly to the distance.
Energy is consumed proportionately to the warp factor cubed,
and directly to the distance.
If you move with your shields up
it doubles the amount of energy consumed.
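The two proportionalities can be written out directly. In this Python sketch the scale factors k_time and k_energy are placeholders, since the manual does not give trek's actual constants:

```python
def warp_move_cost(distance, warp_factor, shields_up=False,
                   k_time=1.0, k_energy=1.0):
    """Estimate the cost of a warp move.

    Time varies directly with distance and inversely with the square
    of the warp factor; energy varies directly with distance and with
    the cube of the warp factor.  Moving with shields up doubles the
    energy consumed.  k_time and k_energy are illustrative only.
    """
    time = k_time * distance / warp_factor ** 2
    energy = k_energy * distance * warp_factor ** 3
    if shields_up:
        energy *= 2
    return time, energy
```

At warp 2, a one-quadrant move takes a quarter of the base time but eight times the base energy, which is why high warp factors are expensive.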
.pp
When you move in a quadrant containing Klingons,
they get a chance to attack you.
.pp
The computer detects navigation errors.
If the computer is out,
you run the risk of running into things.
.pp
The course is determined by the
Space Inertial Navigation System
[SINS].
As described in
Star Fleet Technical Order TO:02:06:12,
the SINS is calibrated,
after which it becomes the base for navigation.
If damaged,
navigation becomes inaccurate.
When it is fixed,
Spock recalibrates it,
however,
it cannot be calibrated extremely accurately
until you dock at starbase.
.bl "Move Under Impulse Power"
Mnemonic: impulse
Shortest Abbreviation: i
Full Command: impulse course distance
Consumes: time and energy
.FF
.pp
The impulse engines give you a chance to maneuver
when your warp engines are damaged;
however, they are incredibly slow
(0.095 quadrants/stardate).
They require 20 units of energy to engage,
and ten units per sector to move.
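Those figures make the impulse cost easy to tabulate. A sketch, where the ten-sectors-per-quadrant conversion is an assumption taken from the short range scan grid:

```python
IMPULSE_SPEED = 0.095  # quadrants per stardate, from the text above

def impulse_cost(sectors):
    """Time and energy for an impulse move: 20 units of energy to
    engage the engines plus 10 per sector, at 0.095 quadrants per
    stardate (ten sectors assumed to make one quadrant)."""
    energy = 20 + 10 * sectors
    time = (sectors / 10) / IMPULSE_SPEED
    return time, energy
```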
.pp
The same comments about the computer and the SINS
apply as above.
.pp
There is no penalty to move under impulse power
with shields up.
.bl "Deflector Shields"
Mnemonic: shields
Shortest Abbreviation: sh
Full Command: shields up/down
Consumes: energy
.FF
.pp
Shields protect you from Klingon attack
and nearby novas.
As they protect you,
they weaken.
A shield which is 78% effective
will absorb 78% of a hit
and let 22% in to hurt you.
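The absorption rule amounts to a one-line computation; a sketch (the function name is illustrative):

```python
def damage_through_shields(hit, effectiveness):
    """Damage that reaches the ship: shields at the given
    effectiveness (0.78 for 78%) absorb that fraction of a hit
    and let the remainder through."""
    return hit * (1.0 - effectiveness)

# A 100-unit hit against 78% shields lets about 22 units through.
```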
.pp
The Klingons have a chance to attack you
every time you raise or lower shields.
Shields do not rise and lower
instantaneously,
so the hit you receive
will be computed with the shields
at an intermediate effectiveness.
.pp
It takes energy to raise shields,
but not to drop them.
.bl "Cloaking Device"
Mnemonic: cloak
Shortest Abbreviation: cl
Full Command: cloak up/down
Consumes: energy
.FF
.pp
When you are cloaked,
Klingons cannot see you,
and hence they do not fire at you.
The cloak is useful for entering
a quadrant
and selecting a good position,
however,
weapons cannot be fired through
the cloak
due to the huge energy drain
that it requires.
.pp
The cloak up command
only starts the cloaking process;
Klingons will continue
to fire at you
until you do something
which consumes time.
.bl "Fire Phasers"
Mnemonic: phasers
Shortest Abbreviation: p
Full Commands: phasers automatic amount
phasers manual amt1 course1 spread1 ...
Consumes: energy
.FF
.pp
Phasers are energy weapons;
the energy comes from your ship's reserves
("total energy" on a srscan).
It takes about 250 units of hits
to kill a Klingon.
Hits are cumulative as long as you stay
in the quadrant.
.pp
Phasers become less effective
the further from a Klingon you are.
Adjacent Klingons receive about
90% of what you fire,
at five sectors about 60%,
and at ten sectors about 35%.
They have no effect outside of the quadrant.
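The quoted figures suggest a distance falloff. The sketch below linearly interpolates between them for illustration; trek's actual attenuation curve is not given in this manual:

```python
# (distance in sectors, fraction of fired energy delivered)
PHASER_POINTS = [(1, 0.90), (5, 0.60), (10, 0.35)]

def phaser_effectiveness(distance):
    """Fraction of phaser energy delivered at a given range,
    interpolated from the figures quoted above."""
    if distance <= 1:
        return 0.90
    for (d0, e0), (d1, e1) in zip(PHASER_POINTS, PHASER_POINTS[1:]):
        if distance <= d1:
            t = (distance - d0) / (d1 - d0)
            return e0 + t * (e1 - e0)
    return 0.0  # phasers have no effect outside the quadrant

def phaser_hit(units_fired, distance):
    """Hit delivered to a Klingon at the given range."""
    return units_fired * phaser_effectiveness(distance)
```

By this estimate, delivering the roughly 250 units needed to kill a Klingon costs noticeably more than 250 units from your reserves unless the target is adjacent.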
.pp
Phasers cannot be fired while shields are up;
to do so would fry you.
They have no effect on starbases or stars.
.pp
In automatic mode
the computer decides how to divide up the energy
among the Klingons present;
in manual mode you do that yourself.
.pp
In manual mode firing
you specify a direction,
amount (number of units to fire)
and spread (0 -> 1.0)
for each of the six phaser banks.
A zero amount
terminates the manual input.
.bl "Fire Photon Torpedoes"
Mnemonic: torpedo
Shortest Abbreviation: t
Full Command: torpedo course [yes/no] [burst angle]
Consumes: torpedoes
.FF
.pp
Torpedoes are projectile weapons -- there are no partial hits.
You either hit your target or you don't.
A hit on a Klingon destroys him.
A hit on a starbase destroys that starbase
(woops!).
Hitting a star usually causes it to go nova,
and occasionally supernova.
.pp
Photon torpedoes cannot be aimed precisely.
They can be fired with shields up,
but they get even more random
as they pass through the shields.
.pp
Torpedoes may be fired in bursts of three.
If this is desired,
the burst angle is the angle
between the three shots,
which may vary from one to fifteen.
The word "no"
says that a burst is not wanted;
the word "yes"
(which may be omitted
if stated on the same line as the course)
says that a burst is wanted.
.pp
Photon torpedoes
have no effect
outside the quadrant.
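Assuming the burst angle is the separation between adjacent shots, the three courses of a burst work out as follows (an illustrative sketch, not trek's source):

```python
def burst_courses(course, burst_angle):
    """Courses (in degrees) of the three shots in a torpedo burst,
    assuming the burst angle separates adjacent shots.  The angle
    may vary from one to fifteen."""
    if not 1 <= burst_angle <= 15:
        raise ValueError("burst angle must be between 1 and 15")
    return [(course - burst_angle) % 360,
            course % 360,
            (course + burst_angle) % 360]
```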
.bl "Onboard Computer Request"
Mnemonic: computer
Shortest Abbreviation: c
Full Command: computer request; request;...
Consumes: nothing
.FF
.pp
The computer command gives you access to the facilities
of the onboard computer,
which allows you to do all sorts of fascinating stuff.
Computer requests are:
.in +8
.qq
score -- Shows your current score.
.qq
course quad/sect -- Computes the course and distance from wherever
you are to the given location.
If you type "course /x,y"
you will be given the course
to sector x,y in the current quadrant.
.qq
move quad/sect -- Identical to the course
request,
except that the move is executed.
.qq
chart -- prints a chart of the known galaxy,
i.e.,
everything that you have seen with a long range scan.
The format is the same as on a long range scan,
except that "..." means
that you don't yet know what is there,
and ".1." means that you know that a starbase
exists, but you don't know anything else.
"$$$" means the quadrant
that you are currently in.
.qq
trajectory -- prints the course and distance
to all the Klingons in the quadrant.
.qq
warpcost dist warp_factor -- computes the cost in time and energy
to move `dist' quadrants at warp `warp_factor'.
.qq
impcost dist -- same as warpcost for impulse engines.
.qq
pheff range -- tells how effective your phasers are
at a given range.
.qq
distresslist -- gives a list of currently distressed
starbases
and starsystems.
.in -8
.pp
More than one request may be stated
on a line
by separating them
with semicolons.
.bl "Dock at Starbase"
Mnemonic: dock
Shortest Abbreviation: do
Consumes: nothing
.FF
.pp
You may dock at a starbase
when you are in one of the eight
adjacent sectors.
.pp
When you dock you are resupplied
with energy, photon torpedoes, and life support reserves.
Repairs are also done faster at starbase.
Any prisoners you have taken
are unloaded.
You do not receive points
for taking prisoners
until this time.
.pp
Starbases have their own deflector shields,
so you are safe from attack while docked.
.bl "Undock from Starbase"
Mnemonic: undock
Shortest Abbreviation: u
Consumes: nothing
.FF
.pp
This just allows you to leave starbase
so that you may proceed on your way.
.bl "Rest"
Mnemonic: rest
Shortest Abbreviation: r
Full Command: rest time
Consumes: time
.FF
.pp
This command allows you to rest to repair damages.
It is not advisable to rest while under attack.
.bl "Call Starbase For Help"
Mnemonic: help
Shortest Abbreviation: help
Consumes: nothing
.FF
.pp
You may call starbase for help via your subspace radio.
Starbase has long range transporter beams to get you.
Problem is,
they can't always rematerialize you.
.pp
You should avoid using this command unless absolutely necessary,
for the above reason and because it counts heavily against you
in the scoring.
.bl "Capture Klingon"
Mnemonic: capture
Shortest Abbreviation: ca
Consumes: time
.FF
.pp
You may request that a Klingon surrender
to you.
If he accepts,
you get to take captives
(but only as many as your brig
can hold).
It is good if you do this,
because you get points for captives.
Also,
if you ever get captured,
you want to be sure that the Federation
has prisoners to exchange for you.
.pp
You must go to a starbase
to turn over your prisoners
to Federation authorities.
.bl "Visual Scan"
Mnemonic: visual
Shortest Abbreviation: v
Full Command: visual course
Consumes: time
.FF
.pp
When your short range scanners are out,
you can still see what is out "there"
by doing a visual scan.
Unfortunately,
you can only see three sectors at one time,
and it takes 0.005 stardates to perform.
.pp
The three sectors in the general direction
of the course specified
are examined
and displayed.
.bl "Abandon Ship"
Mnemonic: abandon
Shortest Abbreviation: abandon
Consumes: nothing
.FF
.pp
The officers escape the Enterprise in the shuttlecraft.
If the transporter is working
and there is an inhabitable starsystem
in the area,
the crew beams down,
otherwise you leave them to die.
You are given an old but still usable ship,
the Faire Queene.
.bl "Ram"
Mnemonic: ram
Shortest Abbreviation: ram
Full Command: ram course distance
Consumes: time and energy
.FF
.pp
This command is identical to "move",
except that the computer
doesn't stop you
from making navigation errors.
.pp
You get very nearly slaughtered
if you ram anything.
.bl "Self Destruct"
Mnemonic: destruct
Shortest Abbreviation: destruct
Consumes: everything
.FF
.pp
Your starship is self-destructed.
Chances are you will destroy
any Klingons
(and stars,
and starbases)
left in your quadrant.
.bl "Terminate the Game"
Mnemonic: terminate
Shortest Abbreviation: terminate
Full Command: terminate yes/no
.FF
.pp
Cancels the current game.
No score is computed.
If you answer yes,
a new game will be started,
otherwise trek exits.
.bl "Call the Shell"
Mnemonic: shell
Shortest Abbreviation: shell
.FF
.pp
Temporarily escapes to the shell.
When you log out of the shell
you will return to the game.
.bp
.ce
SCORING
.in +4
.pp
The scoring algorithm is rather complicated.
Basically,
you get points for each Klingon you kill,
for your Klingon per stardate kill rate,
and a bonus if you win the game.
You lose
points for the number of Klingons left
in the galaxy
at the end of the game,
for getting killed,
for each star, starbase, or inhabited starsystem
you destroy,
for calling for help,
and for each casualty you incur.
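A sketch of this scheme, with made-up weights; the manual fixes only the sign of each term, not trek's actual numbers:

```python
def score(klingons_killed, stardates_elapsed, won, klingons_left,
          killed=False, stars=0, starbases=0, starsystems=0,
          helps=0, casualties=0):
    """Illustrative scoring: kills, kill rate, and winning count for
    you; everything else counts against you.  All weights are
    invented for this sketch."""
    s = 100 * klingons_killed
    if stardates_elapsed > 0:
        s += 50 * klingons_killed / stardates_elapsed  # kill rate
    if won:
        s += 500
    s -= 100 * klingons_left
    if killed:
        s -= 200
    s -= 10 * stars + 300 * starbases + 300 * starsystems
    s -= 100 * helps + 50 * casualties
    return s
```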
.pp
You will be promoted
if you play very well.
You will never get a promotion if you
call for help,
abandon the Enterprise,
get killed,
destroy a starbase or inhabited starsystem,
or destroy too many stars.
.bp
.ce
REFERENCE PAGE
.sp 2
.ta 36 56
.nf
.ul
Command Uses Consumes
ABANDON shuttlecraft, -
transporter
CApture subspace radio time
CLoak Up/Down cloaking device energy
Computer request; request;... computer -
DAmages - -
DESTRUCT computer -
DOck - -
HELP subspace radio -
Impulse course distance impulse engines, time, energy
computer, SINS
Lrscan L.R. sensors -
Move course distance warp engines, time, energy
computer, SINS
Phasers Automatic amount phasers, computer energy
Phasers Manual amt1 course1 spread1 ... phasers energy
Torpedo course [Yes] angle/No torpedo tubes torpedoes
RAM course distance warp engines, time, energy
computer, SINS
Rest time - time
SHELL - -
SHields Up/Down shields energy
Srscan [Yes/No] S.R. sensors -
STatus - -
TERMINATE Yes/No - -
Undock - -
Visual course - time
Warp warp_factor - -
.fi


@@ -0,0 +1,2 @@
require "pp"
IRB.conf[:AUTO_INDENT] = true


@@ -0,0 +1,95 @@
// From https://github.com/Unity-Technologies/PostProcessing,
// licensed under MIT licence.
Shader "Hidden/Post FX/Depth Of Field"
{
    Properties
    {
        _MainTex ("", 2D) = "black"
    }

    CGINCLUDE
        #pragma exclude_renderers d3d11_9x
        #pragma target 3.0
    ENDCG

    SubShader
    {
        Cull Off ZWrite Off ZTest Always

        // (0) Downsampling, prefiltering & CoC
        Pass
        {
            CGPROGRAM
                #pragma multi_compile __ UNITY_COLORSPACE_GAMMA
                #pragma vertex VertDOF
                #pragma fragment FragPrefilter
                #include "DepthOfField.cginc"
            ENDCG
        }

        // (1) Pass 0 + temporal antialiasing
        Pass
        {
            CGPROGRAM
                #pragma vertex VertDOF
                #pragma fragment FragPrefilter
                #define PREFILTER_TAA
                #include "DepthOfField.cginc"
            ENDCG
        }

        // (2-5) Bokeh filter with disk-shaped kernels
        Pass
        {
            CGPROGRAM
                #pragma vertex VertDOF
                #pragma fragment FragBlur
                #define KERNEL_SMALL
                #include "DepthOfField.cginc"
            ENDCG
        }

        Pass
        {
            CGPROGRAM
                #pragma vertex VertDOF
                #pragma fragment FragBlur
                #define KERNEL_MEDIUM
                #include "DepthOfField.cginc"
            ENDCG
        }

        Pass
        {
            CGPROGRAM
                #pragma vertex VertDOF
                #pragma fragment FragBlur
                #define KERNEL_LARGE
                #include "DepthOfField.cginc"
            ENDCG
        }

        Pass
        {
            CGPROGRAM
                #pragma vertex VertDOF
                #pragma fragment FragBlur
                #define KERNEL_VERYLARGE
                #include "DepthOfField.cginc"
            ENDCG
        }

        // (6) Postfilter blur
        Pass
        {
            CGPROGRAM
                #pragma vertex VertDOF
                #pragma fragment FragPostBlur
                #include "DepthOfField.cginc"
            ENDCG
        }
    }

    FallBack Off
}


@@ -0,0 +1,112 @@
// From https://github.com/Unity-Technologies/PostProcessing,
// licensed under MIT licence.
Shader "Hidden/Post FX/Fog"
{
    Properties
    {
        _MainTex("Main Texture", 2D) = "white" {}
    }

    CGINCLUDE
        #pragma multi_compile __ FOG_LINEAR FOG_EXP FOG_EXP2

        #include "UnityCG.cginc"
        #include "Common.cginc"

        #define SKYBOX_THREASHOLD_VALUE 0.9999

        struct Varyings
        {
            float2 uv : TEXCOORD0;
            float4 vertex : SV_POSITION;
        };

        Varyings VertFog(AttributesDefault v)
        {
            Varyings o;
            o.vertex = UnityObjectToClipPos(v.vertex);
            o.uv = UnityStereoScreenSpaceUVAdjust(v.texcoord, _MainTex_ST);
            return o;
        }

        sampler2D _CameraDepthTexture;

        half4 _FogColor;
        float _Density;
        float _Start;
        float _End;

        half ComputeFog(float z)
        {
            half fog = 0.0;
        #if FOG_LINEAR
            fog = (_End - z) / (_End - _Start);
        #elif FOG_EXP
            fog = exp2(-_Density * z);
        #else // FOG_EXP2
            fog = _Density * z;
            fog = exp2(-fog * fog);
        #endif
            return saturate(fog);
        }

        float ComputeDistance(float depth)
        {
            float dist = depth * _ProjectionParams.z;
            dist -= _ProjectionParams.y;
            return dist;
        }

        half4 FragFog(Varyings i) : SV_Target
        {
            half4 color = tex2D(_MainTex, i.uv);
            float depth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv);
            depth = Linear01Depth(depth);
            float dist = ComputeDistance(depth) - _Start;
            half fog = 1.0 - ComputeFog(dist);
            return lerp(color, _FogColor, fog);
        }

        half4 FragFogExcludeSkybox(Varyings i) : SV_Target
        {
            half4 color = tex2D(_MainTex, i.uv);
            float depth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv);
            depth = Linear01Depth(depth);
            float skybox = depth < SKYBOX_THREASHOLD_VALUE;
            float dist = ComputeDistance(depth) - _Start;
            half fog = 1.0 - ComputeFog(dist);
            return lerp(color, _FogColor, fog * skybox);
        }
    ENDCG

    SubShader
    {
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
                #pragma vertex VertFog
                #pragma fragment FragFog
            ENDCG
        }

        Pass
        {
            CGPROGRAM
                #pragma vertex VertFog
                #pragma fragment FragFogExcludeSkybox
            ENDCG
        }
    }
}


@@ -0,0 +1,337 @@
// From https://github.com/Unity-Technologies/PostProcessing,
// licensed under MIT licence.
Shader "Hidden/Post FX/Uber Shader"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
_AutoExposure ("", 2D) = "" {}
_BloomTex ("", 2D) = "" {}
_Bloom_DirtTex ("", 2D) = "" {}
_GrainTex ("", 2D) = "" {}
_LogLut ("", 2D) = "" {}
_UserLut ("", 2D) = "" {}
_Vignette_Mask ("", 2D) = "" {}
_ChromaticAberration_Spectrum ("", 2D) = "" {}
_DitheringTex ("", 2D) = "" {}
}
CGINCLUDE
#pragma target 3.0
#pragma multi_compile __ UNITY_COLORSPACE_GAMMA
#pragma multi_compile __ EYE_ADAPTATION
#pragma multi_compile __ CHROMATIC_ABERRATION
#pragma multi_compile __ DEPTH_OF_FIELD DEPTH_OF_FIELD_COC_VIEW
#pragma multi_compile __ BLOOM
#pragma multi_compile __ BLOOM_LENS_DIRT
#pragma multi_compile __ COLOR_GRADING COLOR_GRADING_LOG_VIEW
#pragma multi_compile __ USER_LUT
#pragma multi_compile __ GRAIN
#pragma multi_compile __ VIGNETTE_CLASSIC VIGNETTE_ROUND VIGNETTE_MASKED
#pragma multi_compile __ DITHERING
#include "UnityCG.cginc"
#include "Bloom.cginc"
#include "ColorGrading.cginc"
#include "UberSecondPass.cginc"
// Auto exposure / eye adaptation
sampler2D _AutoExposure;
// Chromatic aberration
half _ChromaticAberration_Amount;
sampler2D _ChromaticAberration_Spectrum;
// Depth of field
sampler2D_float _CameraDepthTexture;
sampler2D _DepthOfFieldTex;
float4 _DepthOfFieldTex_TexelSize;
float2 _DepthOfFieldParams; // x: distance, y: f^2 / (N * (S1 - f) * film_width * 2)
// Bloom
sampler2D _BloomTex;
float4 _BloomTex_TexelSize;
half2 _Bloom_Settings; // x: sampleScale, y: bloom.intensity
sampler2D _Bloom_DirtTex;
half _Bloom_DirtIntensity;
// Color grading & tonemapping
sampler2D _LogLut;
half3 _LogLut_Params; // x: 1 / lut_width, y: 1 / lut_height, z: lut_height - 1
half _ExposureEV; // EV (exp2)
// User lut
sampler2D _UserLut;
half4 _UserLut_Params; // @see _LogLut_Params
// Vignette
half3 _Vignette_Color;
half2 _Vignette_Center; // UV space
half3 _Vignette_Settings; // x: intensity, y: smoothness, z: roundness
sampler2D _Vignette_Mask;
half _Vignette_Opacity; // [0;1]
struct VaryingsFlipped
{
float4 pos : SV_POSITION;
float2 uv : TEXCOORD0;
float2 uvSPR : TEXCOORD1; // Single Pass Stereo UVs
float2 uvFlipped : TEXCOORD2; // Flipped UVs (DX/MSAA/Forward)
float2 uvFlippedSPR : TEXCOORD3; // Single Pass Stereo flipped UVs
};
VaryingsFlipped VertUber(AttributesDefault v)
{
VaryingsFlipped o;
o.pos = UnityObjectToClipPos(v.vertex);
o.uv = v.texcoord.xy;
o.uvSPR = UnityStereoScreenSpaceUVAdjust(v.texcoord.xy, _MainTex_ST);
o.uvFlipped = v.texcoord.xy;
#if UNITY_UV_STARTS_AT_TOP
if (_MainTex_TexelSize.y < 0.0)
o.uvFlipped.y = 1.0 - o.uvFlipped.y;
#endif
o.uvFlippedSPR = UnityStereoScreenSpaceUVAdjust(o.uvFlipped, _MainTex_ST);
return o;
}
half4 FragUber(VaryingsFlipped i) : SV_Target
{
float2 uv = i.uv;
half autoExposure = 1.0;
// Store the auto exposure value for later
#if EYE_ADAPTATION
{
autoExposure = tex2D(_AutoExposure, uv).r;
}
#endif
half3 color = (0.0).xxx;
#if DEPTH_OF_FIELD && CHROMATIC_ABERRATION
half4 dof = (0.0).xxxx;
#endif
//
// HDR effects
// ---------------------------------------------------------
// Chromatic Aberration
// Inspired by the method described in "Rendering Inside" [Playdead 2016]
// https://twitter.com/pixelmager/status/717019757766123520
#if CHROMATIC_ABERRATION
{
float2 coords = 2.0 * uv - 1.0;
float2 end = uv - coords * dot(coords, coords) * _ChromaticAberration_Amount;
float2 diff = end - uv;
int samples = clamp(int(length(_MainTex_TexelSize.zw * diff / 2.0)), 3, 16);
float2 delta = diff / samples;
float2 pos = uv;
half3 sum = (0.0).xxx, filterSum = (0.0).xxx;
#if DEPTH_OF_FIELD
float2 dofDelta = delta;
float2 dofPos = pos;
if (_MainTex_TexelSize.y < 0.0)
{
dofDelta.y = -dofDelta.y;
dofPos.y = 1.0 - dofPos.y;
}
half4 dofSum = (0.0).xxxx;
#endif
for (int i = 0; i < samples; i++)
{
half t = (i + 0.5) / samples;
half3 s = tex2Dlod(_MainTex, float4(UnityStereoScreenSpaceUVAdjust(pos, _MainTex_ST), 0, 0)).rgb;
half3 filter = tex2Dlod(_ChromaticAberration_Spectrum, float4(t, 0, 0, 0)).rgb;
sum += s * filter;
filterSum += filter;
pos += delta;
#if DEPTH_OF_FIELD
half4 sdof = tex2Dlod(_DepthOfFieldTex, float4(UnityStereoScreenSpaceUVAdjust(dofPos, _MainTex_ST), 0, 0)).rgba;
dofSum += sdof * half4(filter, 1);
dofPos += dofDelta;
#endif
}
color = sum / filterSum;
#if DEPTH_OF_FIELD
dof = dofSum / half4(filterSum, samples);
#endif
}
#else
{
color = tex2D(_MainTex, i.uvSPR).rgb;
}
#endif
// Apply auto exposure if any
color *= autoExposure;
// Gamma space... Gah.
#if UNITY_COLORSPACE_GAMMA
{
color = GammaToLinearSpace(color);
}
#endif
// Depth of field
#if DEPTH_OF_FIELD
{
#if !CHROMATIC_ABERRATION
half4 dof = tex2D(_DepthOfFieldTex, i.uvFlippedSPR);
#endif
color = color * dof.a + dof.rgb * autoExposure;
}
#elif DEPTH_OF_FIELD_COC_VIEW
{
// Calculate the radiuses of CoC.
half4 src = tex2D(_DepthOfFieldTex, uv);
float depth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uvFlippedSPR));
float coc = (depth - _DepthOfFieldParams.x) * _DepthOfFieldParams.y / depth;
coc *= 80;
// Visualize CoC (white -> red -> gray)
half3 rgb = lerp(half3(1, 0, 0), half3(1.0, 1.0, 1.0), saturate(-coc));
rgb = lerp(rgb, half3(0.4, 0.4, 0.4), saturate(coc));
// Black and white image overlay
rgb *= AcesLuminance(color) + 0.5;
// Gamma correction
#if !UNITY_COLORSPACE_GAMMA
{
rgb = GammaToLinearSpace(rgb);
}
#endif
color = rgb;
}
#endif
// HDR Bloom
#if BLOOM
{
half3 bloom = UpsampleFilter(_BloomTex, i.uvFlippedSPR, _BloomTex_TexelSize.xy, _Bloom_Settings.x) * _Bloom_Settings.y;
color += bloom;
#if BLOOM_LENS_DIRT
{
half3 dirt = tex2D(_Bloom_DirtTex, i.uvFlipped).rgb * _Bloom_DirtIntensity;
color += bloom * dirt;
}
#endif
}
#endif
// Procedural vignette
#if VIGNETTE_CLASSIC
{
half2 d = abs(uv - _Vignette_Center) * _Vignette_Settings.x;
d = pow(d, _Vignette_Settings.z); // Roundness
half vfactor = pow(saturate(1.0 - dot(d, d)), _Vignette_Settings.y);
color *= lerp(_Vignette_Color, (1.0).xxx, vfactor);
}
// Perfectly round vignette
#elif VIGNETTE_ROUND
{
half2 d = abs(uv - _Vignette_Center) * _Vignette_Settings.x;
d.x *= _ScreenParams.x / _ScreenParams.y;
half vfactor = pow(saturate(1.0 - dot(d, d)), _Vignette_Settings.y);
color *= lerp(_Vignette_Color, (1.0).xxx, vfactor);
}
// Masked vignette
#elif VIGNETTE_MASKED
{
half vfactor = tex2D(_Vignette_Mask, uv).a;
half3 new_color = color * lerp(_Vignette_Color, (1.0).xxx, vfactor);
color = lerp(color, new_color, _Vignette_Opacity);
}
#endif
// HDR color grading & tonemapping
#if COLOR_GRADING
{
color *= _ExposureEV; // Exposure is in ev units (or 'stops')
half3 colorLogC = saturate(LinearToLogC(color));
color = ApplyLut2d(_LogLut, colorLogC, _LogLut_Params);
}
#elif COLOR_GRADING_LOG_VIEW
{
color *= _ExposureEV;
color = saturate(LinearToLogC(color));
}
#endif
//
// All the following effects happen in LDR
// ---------------------------------------------------------
color = saturate(color);
// Back to gamma space if needed
#if UNITY_COLORSPACE_GAMMA
{
color = LinearToGammaSpace(color);
}
#endif
// LDR user lut
#if USER_LUT
{
color = saturate(color);
half3 colorGraded;
#if !UNITY_COLORSPACE_GAMMA
{
colorGraded = ApplyLut2d(_UserLut, LinearToGammaSpace(color), _UserLut_Params.xyz);
colorGraded = GammaToLinearSpace(colorGraded);
}
#else
{
colorGraded = ApplyLut2d(_UserLut, color, _UserLut_Params.xyz);
}
#endif
color = lerp(color, colorGraded, _UserLut_Params.w);
}
#endif
color = UberSecondPass(color, uv);
// Done !
return half4(color, 1.0);
}
ENDCG
SubShader
{
Cull Off ZWrite Off ZTest Always
// (0)
Pass
{
CGPROGRAM
#pragma vertex VertUber
#pragma fragment FragUber
ENDCG
}
}
}


@@ -0,0 +1,54 @@
$OpenBSD: COPYRIGHT,v 1.3 2003/06/02 20:18:36 millert Exp $
Copyright 1992, 1993, 1994 Henry Spencer. All rights reserved.
This software is not subject to any license of the American Telephone
and Telegraph Company or of the Regents of the University of California.
Permission is granted to anyone to use this software for any purpose on
any computer system, and to alter it and redistribute it, subject
to the following restrictions:
1. The author is not responsible for the consequences of use of this
software, no matter how awful, even if they arise from flaws in it.
2. The origin of this software must not be misrepresented, either by
explicit claim or by omission. Since few users ever read sources,
credits must appear in the documentation.
3. Altered versions must be plainly marked as such, and must not be
misrepresented as being the original software. Since few users
ever read sources, credits must appear in the documentation.
4. This notice may not be removed or altered.
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
/*-
* Copyright (c) 1994
* The Regents of the University of California. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. Neither the name of the University nor the names of its contributors
* may be used to endorse or promote products derived from this software
* without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
* SUCH DAMAGE.
*
* @(#)COPYRIGHT 8.1 (Berkeley) 3/16/94
*/


@@ -0,0 +1,339 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.
@@ -0,0 +1,24 @@
README for users interested in using MySQL as a triplestore backend
===================================================================
The KiWi Triple Store used by Apache Marmotta supports different database
backends, including H2, PostgreSQL and MySQL. However, for legal reasons,
we are not allowed to distribute the MySQL connector library together with
the Apache Marmotta source code or binaries, as it is licensed under the
GPL.
Nonetheless, it is possible to use MySQL by downloading and installing the
connector manually:
1. download and unpack the MySQL Connector/J from
http://dev.mysql.com/downloads/connector/j/
2. copy the mysql-connector-java-5.x.x.jar file to
a. the library directory of the application server
(e.g. $TOMCAT_HOME/lib)
-- OR --
b. the library directory of the Apache Marmotta Web application
(e.g. $TOMCAT_HOME/webapps/marmotta/WEB-INF/lib)
3. restart the application server
Apache Marmotta will then automatically be able to use the MySQL connector
to connect to a MySQL database. Please note that Marmotta requires at least
MySQL 5.x, because it makes use of nested queries and foreign keys.
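The three steps above can be sketched as a shell snippet. Note this is only an illustration: `TOMCAT_HOME`, the connector version in the jar name, and the `marmotta_lib_dir` helper are assumptions for the example, not part of the Marmotta distribution.

```shell
#!/bin/sh
# Sketch of the manual Connector/J install steps described above.
# Adjust TOMCAT_HOME and the jar version for your own setup.

# Hypothetical helper: resolve the library directory to copy the
# connector jar into, for either placement described in step 2.
marmotta_lib_dir() {
  # $1 = application server home, $2 = "server" or "webapp"
  if [ "$2" = "webapp" ]; then
    echo "$1/webapps/marmotta/WEB-INF/lib"
  else
    echo "$1/lib"
  fi
}

TOMCAT_HOME="${TOMCAT_HOME:-/opt/tomcat}"
JAR="mysql-connector-java-5.1.49.jar"   # example version number

# Step 1: download and unpack the archive manually from
#         http://dev.mysql.com/downloads/connector/j/
# Step 2: copy the jar into the chosen library directory, e.g.:
# cp "$JAR" "$(marmotta_lib_dir "$TOMCAT_HOME" server)"
# Step 3: restart the application server, e.g.:
# "$TOMCAT_HOME/bin/shutdown.sh" && "$TOMCAT_HOME/bin/startup.sh"

# Show where the jar would go for the default (server-wide) placement:
marmotta_lib_dir "$TOMCAT_HOME" server
```

Placing the jar in `$TOMCAT_HOME/lib` makes the driver visible to every web application on the server, while the `WEB-INF/lib` placement scopes it to the Marmotta webapp only.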
@@ -0,0 +1,384 @@
// Fixture taken from https://github.com/graphcool/console/blob/dev/src/components/onboarding/PlaygroundCPopup/PlaygroundCPopup.tsx
import * as React from 'react'
import {withRouter} from 'react-router'
import {connect} from 'react-redux'
import {bindActionCreators} from 'redux'
import {nextStep, selectExample} from '../../../actions/gettingStarted'
import {classnames} from '../../../utils/classnames'
import Loading from '../../Loading/Loading'
import {GettingStartedState} from '../../../types/gettingStarted'
import {Example} from '../../../types/types'
const classes: any = require('./PlaygroundCPopup.scss')
import {$p} from 'graphcool-styles'
import * as cx from 'classnames'
interface Tutorial {
title: string
description: string
image: string
link: string
}
const guides: Tutorial[] = [
{
title: 'Learnrelay.org',
description: 'A comprehensive, interactive introduction to Relay',
link: 'https://learnrelay.org/',
image: require('../../../assets/graphics/relay.png'),
},
{
title: 'GraphQL and the amazing Apollo Client',
description: 'Explore an Application built using React and Angular 2',
link: 'https://medium.com/google-developer-experts/graphql-and-the-amazing-apollo-client-fe57e162a70c',
image: require('../../../assets/graphics/apollo.png'),
},
{
title: 'Introducing Lokka',
description: 'A Simple JavaScript Client for GraphQL',
link: 'https://voice.kadira.io/introducing-lokka-a-simple-javascript-client-for-graphql-e0802695648c',
image: require('../../../assets/graphics/lokka.png'),
},
]
const examples = {
ReactRelay: {
path: 'react-relay-instagram-example',
description: 'React + Relay',
},
ReactApollo: {
path: 'react-apollo-instagram-example',
description: 'React + Apollo',
},
ReactNativeApollo: {
path: 'react-native-apollo-instagram-example',
description: 'React Native + Apollo',
},
AngularApollo: {
path: 'angular-apollo-instagram-example',
description: 'Angular + Apollo',
},
}
interface Props {
id: string
projectId: string
nextStep: () => Promise<void>
selectExample: (selectedExample: Example) => any
gettingStartedState: GettingStartedState
}
interface State {
mouseOver: boolean
}
class PlaygroundCPopup extends React.Component<Props, State> {
state = {
mouseOver: false,
}
refs: {
[key: string]: any
exampleAnchor: HTMLDivElement
congratsAnchor: HTMLDivElement
scroller: HTMLDivElement,
}
componentDidUpdate(prevProps: Props, prevState: State) {
if (prevProps.gettingStartedState.selectedExample !== this.props.gettingStartedState.selectedExample) {
this.refs.scroller.scrollTop += this.refs.exampleAnchor.getBoundingClientRect().top
}
if (prevProps.gettingStartedState.isCurrentStep('STEP5_WAITING')
&& this.props.gettingStartedState.isCurrentStep('STEP5_DONE')) {
this.refs.scroller.scrollTop += this.refs.congratsAnchor.getBoundingClientRect().top
const snd = new Audio(require('../../../assets/success.mp3') as string)
snd.volume = 0.5
snd.play()
}
}
render() {
const {mouseOver} = this.state
const {selectedExample} = this.props.gettingStartedState
const hovering = !this.props.gettingStartedState.isCurrentStep('STEP4_CLICK_TEASER_STEP5')
const downloadUrl = (example) => `${__BACKEND_ADDR__}/resources/getting-started-example?repository=${examples[example].path}&project_id=${this.props.projectId}&user=graphcool-examples` // tslint:disable-line
return (
<div
className='flex justify-center items-start w-100 h-100'
style={{
transition: 'background-color 0.3s ease',
backgroundColor: hovering ? 'rgba(255,255,255,0.7)' : 'transparent',
width: 'calc(100% - 266px)',
pointerEvents: 'none',
overflow: 'hidden',
}}
>
<div
ref='scroller'
className='flex justify-center w-100'
style={{
transition: 'height 0.5s ease',
height: hovering ? '100%' : mouseOver ? '190%' : '210%',
pointerEvents: hovering ? 'all' : 'none',
cursor: hovering ? 'auto' : 'pointer',
overflow: hovering ? 'auto' : 'hidden',
alignItems: selectedExample ? 'flex-start' : 'center',
}}
>
<div
className='bg-white br-2 shadow-2 mv-96'
style={{
minWidth: 600,
maxWidth: 800,
pointerEvents: 'all',
}}
onMouseLeave={() => this.setState({ mouseOver: false })}
onMouseEnter={() => {
this.setState({ mouseOver: true })
}}
onMouseOver={(e: any) => {
if (this.props.gettingStartedState.isCurrentStep('STEP4_CLICK_TEASER_STEP5')) {
this.props.nextStep()
}
}}
>
<div className='ma-16 tc pb-25'>
<div className='fw3 ma-38 f-38'>
You did it! Time to run an example.
</div>
<div className='fw2 f-16 mh-96 lh-1-4'>
You have successfully set up your own Instagram backend.{' '}
When building an app with Graphcool you can easily explore queries in the{' '}
playground and "copy &amp; paste" selected queries into your code.{' '}
Of course, to do so, you need to implement the frontend first.
</div>
<div className='fw2 f-16 mh-96 lh-1-4 mt-16'>
<div>We put together a simple app to show and add posts</div>
<div>using the backend you just built, to test and run it locally.</div>
</div>
</div>
<div className='ma-16 tc pb-25'>
<div className='fw3 ma-38 f-25'>
Select your preferred technology to download the example.
</div>
<div className='flex justify-between items-center w-100' ref='exampleAnchor'>
<div
className={classnames(
classes.exampleButton,
selectedExample === 'ReactRelay' ? classes.active : '',
)}
onClick={() => this.props.selectExample('ReactRelay')}
>
React + Relay
</div>
<div
className={classnames(
classes.exampleButton,
selectedExample === 'ReactApollo' ? classes.active : '',
)}
onClick={() => this.props.selectExample('ReactApollo')}
>
React + Apollo
</div>
<div
className={classnames(
classes.exampleButton,
selectedExample === 'ReactNativeApollo' ? classes.active : '',
)}
onClick={() => this.props.selectExample('ReactNativeApollo')}
>
React Native + Apollo
</div>
<div
className={classnames(
classes.exampleButton,
selectedExample === 'AngularApollo' ? classes.active : '',
)}
onClick={() => this.props.selectExample('AngularApollo')}
>
Angular + Apollo
</div>
</div>
</div>
{selectedExample &&
<div>
<div className='w-100'>
<iframe
className='w-100'
height='480'
allowFullScreen
frameBorder='0'
src={`https://www.youtube.com/embed/${this.getExampleVideoUrl(selectedExample)}`}
/>
</div>
<div
className='w-100 pa-25'
style={{
backgroundColor: '#FEF5D2',
}}
>
<div className='mt-25 mb-38 w-100 flex justify-center'>
<a
href={downloadUrl(selectedExample)}
className='pa-16 white'
style={{
backgroundColor: '#4A90E2',
}}
>
Download example
</a>
</div>
<div className='code dark-gray'>
<div className='black-50'>
# To see the example in action, run the following commands:
</div>
<div className='mv-16'>
<div className='black'>
npm install
</div>
<div className='black'>
npm start
</div>
</div>
<div className='black-50'>
# You can now open the app on localhost:3000
</div>
<div className='black-50'>
# Please come back to this page once you're done. We're waiting here. 💤
</div>
<div className={cx($p.w100, $p.flex, $p.flexRow, $p.justifyCenter, $p.mt25)}>
<a href='#' onClick={
(e: any) => {
e.preventDefault()
// we need to skip the 'STEP5_WAITING' step
this.props.nextStep()
this.props.nextStep()
this.props.nextStep()
}
}>
Skip
</a>
</div>
</div>
{this.props.gettingStartedState.isCurrentStep('STEP5_WAITING') &&
<div className='w-100 mv-96 flex justify-center'>
<Loading />
</div>
}
</div>
</div>
}
{this.props.gettingStartedState.isCurrentStep('STEP5_DONE') &&
<div className='w-100 mb-96' ref='congratsAnchor'>
<div className='flex items-center flex-column pv-60 fw1'>
<div className='f-96'>
🎉
</div>
<div className='f-38 mt-38'>
Congratulations!
</div>
<div className='f-38 mt-16'>
We knew you had it in you.
</div>
<div className='f-16 mv-38'>
Now go out there and build amazing things!
</div>
</div>
<div className='flex justify-between ph-25 pv-16'>
<div className='w-50 pr-16'>
<div className='ttu ls-2 f-16 fw1 lh-1-4'>
Get started on your own<br />with those excellent tutorials
</div>
<div className='mv-38'>
{guides.map(guide => this.renderBox(guide))}
</div>
</div>
<div className='w-50 pl-16'>
<div className='ttu ls-2 f-16 fw1 lh-1-4'>
Get more out of Graphcool<br />with our guides
</div>
<div className={`h-100 justify-start flex flex-column mv-38 ${classes.guides}`}>
<a
href='https://graph.cool/docs/tutorials/quickstart-2-daisheeb9x'
className={`${classes.one} fw4 black db flex items-center mb-25`}
target='_blank'
>
Declaring Relations
</a>
<a
href='https://graph.cool/docs/tutorials/quickstart-3-saigai7cha'
className={`${classes.two} fw4 black db flex items-center mb-25`}
target='_blank'
>
Implementing Business Logic
</a>
<a
href='https://graph.cool/docs/tutorials/thinking-in-terms-of-graphs-ahsoow1ool'
target='_blank'
className={`${classes.three} fw4 black db flex items-center mb-25`}
>
Thinking in terms of graphs
</a>
</div>
</div>
</div>
<div className='flex w-100 justify-center'>
<div
className='f-25 mv-16 pv-16 ph-60 ls-1 ttu pointer bg-accent white dim'
onClick={this.props.nextStep}
>
Finish Onboarding
</div>
</div>
</div>
}
</div>
</div>
</div>
)
}
private renderBox = (tutorial: Tutorial) => {
return (
<div key={tutorial.title} className='pa-16 mb-16 lh-1-4' style={{background: 'rgba(0,0,0,0.03)'}}>
<a className='flex items-center' href={tutorial.link} target='_blank'>
<div className='flex items-center justify-center' style={{ flex: '0 0 60px', height: 60 }}>
<img src={tutorial.image}/>
</div>
<div className='flex flex-column space-between ml-38'>
<div className='mb-6 dark-gray f-16'>
{tutorial.title}
</div>
<div className='fw1 mid-gray'>
{tutorial.description}
</div>
</div>
</a>
</div>
)
}
private getExampleVideoUrl = (example: Example): string => {
switch (example) {
case 'ReactRelay': return '_dj9Os2ev4M'
case 'ReactApollo': return '9nlwyPUPXjQ'
case 'ReactNativeApollo': return '9nlwyPUPXjQ'
case 'AngularApollo': return 'EzD5fJ-uniI'
}
}
}
const mapStateToProps = (state) => {
return {
gettingStartedState: state.gettingStarted.gettingStartedState,
}
}
const mapDispatchToProps = (dispatch) => {
return bindActionCreators({nextStep, selectExample}, dispatch)
}
export default connect(mapStateToProps, mapDispatchToProps)(withRouter(PlaygroundCPopup))