Compare commits


103 Commits

Author SHA1 Message Date
Colin Seymour
e60384b018 Release v5.1.0 (#3725)
* sublime-spintools now has a license so no need for whitelist

* Bump version: 5.0.12

* Use the more apt release of v5.1.0
2017-07-22 14:16:16 +01:00
Santiago M. Mola
470a82d9f5 shell: add more interpreters (#3708)
* ash: interpreter only; the extension is more commonly used for
  Kingdom of Loathing scripting, e.g. github.com/twistedmage/assorted-kol-scripts

* dash: interpreter only; the extension is more commonly used for
  dashboarding-related files

* ksh: extension was already present

* mksh

* pdksh
2017-07-20 10:33:28 +01:00
Santiago M. Mola
37979b26b0 improve .ms disambiguation (Unix Assembly / MAXScript) (#3707)
A few MAXScript files were misclassified as Unix Assembly.
Some of them can be found at github.com/davestewart/maxscript

* This commit changes the heuristic, which looked for labels
  such as ".LG7E0:", to match the full label including
  the colon. This reduced the number of MAXScript files
  misclassified as Unix Assembly, without introducing any new
  Unix Assembly misclassifications so far.

* add MAXScript sample rolloutCreator.ms, extracted from MIT repo:
  https://github.com/davestewart/maxscript/blob/master/3D/3ds2ae/02_resources/max%20scripts/3ds%20ax%20scripts/rolloutCreator.ms
2017-07-10 10:03:12 +01:00
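The tightened label matching described above can be sketched in Ruby. This is an illustration only, not the actual `lib/linguist/heuristics.rb` code; the regex and method name are assumptions:

```ruby
# Illustrative sketch — not the actual Linguist heuristic.
# Matching the whole label, colon included, avoids firing on MAXScript
# text that merely contains a ".Something" fragment.
STRICT_LABEL = /^\s*\.[A-Z_]\w*:/   # matches a full label such as ".LG7E0:"

def looks_like_unix_assembly?(data)
  data.lines.any? { |line| line =~ STRICT_LABEL }
end

puts looks_like_unix_assembly?(".LG7E0:\n  ret\n")      # true
puts looks_like_unix_assembly?("rollout test \"demo\"") # false
```

Requiring the trailing colon is what rules out MAXScript property accesses that happen to start with a dot.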
Santiago M. Mola
1a6df12902 fix Coq sample JsNumber.v (#3710)
It was fetched as HTML from GitHub instead of raw.
2017-07-10 09:41:36 +01:00
John Gardner
24e196df4e Add NCSA to license whitelist (#3714)
References:
* https://github.com/github/linguist/pull/3689#issuecomment-313665976
2017-07-08 00:59:05 +10:00
James Adams
8d178bfaed Improve Pan language support (#3691)
* Add a larger set of sample files for Pan

This is a fairly good cross section of Pan based on code from:
* https://github.com/quattor/template-library-examples
* https://github.com/quattor/template-library-core

* Add Pan language grammar
2017-07-03 18:49:15 +02:00
Chris Wilkinson
e9ec699931 Add xspec as XML file extension (#3702) 2017-07-03 18:39:39 +02:00
John Gardner
9a6c3f2c4d Register "cpanfile" as a Perl filename (#3704) 2017-07-03 20:45:20 +10:00
Theodore Dubois
648720301d Add misclassified C sample (#3652)
This sample is misclassified as Objective-C.
2017-06-30 17:11:50 +01:00
David Aylaian
c552e25bd7 Add C sample (#3698)
* Add C sample

Sample was incorrectly being identified as C++

* Changed asm.h license to the Unlicense

* Changed asm.h license to Apache 2.0
2017-06-30 09:32:16 +01:00
Justin Lecher
d5c8db3fb9 Add new language for the Easybuild framework (#3466)
The hpcugent/easybuild-framework is a Python framework for installing
applications in an HPC context. The actual package build descriptions are
written in Python but use .eb as their extension.

Signed-off-by: Justin Lecher <jlec@gentoo.org>
2017-06-26 09:07:36 +01:00
Mahmoud Samir Fayed
632bcdc1ad Added Ring Language Support (#3662)
* update .gitmodules

* Update grammars.yml

* Create hello.ring

* Create Natural.ring

* Create weblib.ring

* vendor/grammars/language-ring

* fix order in grammars.yml

* remove two files from samples

* delete hello.ring

* Update languages.yml - add the R

* Create hello.ring

* Create natural.ring

* Create weblib.ring

* Create grammars.yml

* Create .gitmodules

* Create languages.yml

* Create languages.yml

* Create language-ring.txt

* Update .gitmodules

Prefer HTTPS links.

* Update hello.ring

Sample file from "real" applications (under permissive license) to train the Bayesian classifier.

* Update languages.yml

* Update weblib.ring

Reduce the file size

* Update .gitmodules

* Update .gitmodules

* Update .gitmodules

* Update .gitmodules

* Update submodule : language-ring

* Update weblib.ring

Sample : Using the web library.

* Create weighthistory

Add Sample

* Rename weighthistory to weighthistory.ring

* Update weblib.ring
2017-06-24 16:22:01 +01:00
Colby Pines
6b221172c0 Update vendor.yml: skeleton.css (#3682) 2017-06-24 13:19:43 +02:00
Robert Koeninger
6f0d801375 Added syntax definition for Fantom language (#3660)
* Added mgiannini/sublime-factor as a submodule

Provided better color for Fantom
Added license for sublime-fantom
Specified tm_scope for Fantom

* Redirected submodule for Fantom to fork with updated grammar

* Triggering build

* Updating sublime-fantom submodule

* Updated submodule sublime-fantom

* Adding Fantom samples
2017-06-21 09:29:13 +02:00
John Gardner
128abe3533 Fix spelling of Perl 6 (#3672)
Resolves #3671.
2017-06-20 19:39:39 +10:00
Colin Seymour
9312353d20 Improve running from cloned repo docs (#3669)
* Improve running from cloned repo docs
2017-06-20 10:29:17 +02:00
John Gardner
b6460f8ed6 Add recognition and classification of WebAssembly (#3650) 2017-05-30 18:02:03 +10:00
andretshurotshka
60f864a138 Support for Type Language (#3593)
* Support for Type Language

* Update Type Language

* Add one more sample for Type Language

* Update Type Language grammar
2017-05-29 06:46:56 +01:00
Colin Seymour
ca6121e3ea Update MD5 digest for testing under Ruby 2.4 (#3643)
* Update md5 sums for Ruby 2.4

Ruby 2.4 deprecated Fixnum and Bignum in favour of a unified Integer class. The MD5 digests for the integers in our tests therefore reflect a class of Integer instead of Fixnum, so the digests need to be updated specifically for 2.4.

* Use Gem::Version for safer version comparison
2017-05-26 08:16:12 +01:00
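The version-gating mentioned above can be illustrated with `Gem::Version`, which compares versions semantically, unlike raw string comparison. Names here are illustrative, not the test-suite code:

```ruby
# Illustrative version gate — not the actual Linguist test code.
ruby_24_or_later = Gem::Version.new(RUBY_VERSION) >= Gem::Version.new("2.4")
digest_class = ruby_24_or_later ? "Integer" : "Fixnum"
puts digest_class

# String comparison gets version ordering wrong; Gem::Version does not:
puts "2.10" < "2.4"                                      # true (wrong order)
puts Gem::Version.new("2.10") > Gem::Version.new("2.4")  # true (correct)
```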
Colin Seymour
7c17b1f10c Bump to v5.0.11 (#3642) 2017-05-25 16:12:34 +01:00
Paul Chaignon
d490fc303f Support for CWeb language (#3592)
Move .w file extension for CWeb to its own entry.
2017-05-25 09:22:40 +01:00
Michael Hadley
20fdac95f6 Add Closure Templates (#3634)
* Add Closure Templates to languages.yml

* Run script/add-grammar

* Add sample

* Run script/set-language-ids

* Add codemirror_mime_type
2017-05-25 09:15:32 +01:00
Colin Seymour
234ee8b6d2 Update location of Reason grammar (#3639) 2017-05-25 08:59:02 +01:00
Ross Kirsling
58ab593a64 Switch Dart grammars (Sublime → Atom). (#3633) 2017-05-20 17:41:46 +01:00
John Gardner
ec1f6a4cd6 Add ".nr" as a Roff file extension (#3630) 2017-05-18 03:03:47 +10:00
Colin Seymour
3eea8212f4 Revert "Use Textmate's HAML grammar" (#3629)
* Revert "Use Textmate's HAML grammar (#3627)"

This reverts commit a1e09ae3e6.

* Add back missing grammar sources
2017-05-16 15:58:39 +01:00
Vicent Martí
a1e09ae3e6 Use Textmate's HAML grammar (#3627)
* Use Textmate's HAML grammar

* Whitelist license
2017-05-16 12:46:04 +02:00
Robert Koeninger
c1f76c26e5 Add Shen grammar to vendor/README.md (#3626)
* Added sublime-shen as submodule

* Specified tm_scope in languages.yml

* Imported copy of license

* Added Shen grammar repo to vendor/README.md
2017-05-16 08:12:45 +01:00
Robert Koeninger
0983f62e02 Add syntax grammar for Shen language (#3625)
* Added sublime-shen as submodule

* Specified tm_scope in languages.yml

* Imported copy of license
2017-05-15 15:06:09 +01:00
Samuel Gunadi
190e54c020 Add comp, tesc, and tese as GLSL extensions (#3614)
* Add comp, tesc, and tese as GLSL file extensions

* Add GLSL compute shader sample

* Add GLSL tessellation control shader sample

* Add GLSL tessellation evaluation shader sample

* Remove .comp from GLSL extensions

We have to be sure that most of the .comp files on GitHub are indeed GLSL compute shaders.

* Remove GLSL compute shader sample
2017-05-15 09:05:07 +01:00
Lucas Bajolet
ded651159d Add Pep8 Assembly language (#2070)
Pep/8 is a toy assembly language used in some universities for teaching
the basics of assembly and low-level programming.

Signed-off-by: Lucas Bajolet <lucas.bajolet@gmail.com>
2017-05-15 09:02:06 +01:00
Serghei Iakovlev
acbab53198 Update Zephir links (#3608) 2017-05-10 15:56:21 +01:00
Simen Bekkhus
fba4babdcd Don't show npm lockfiles by default (#3611) 2017-05-10 15:55:16 +01:00
Colin Seymour
eb6a213921 Revert "Revert "Switch the PHP grammar to the upstream repo (#3575)"" (#3616)
* Revert "Revert "Switch the PHP grammar to the upstream repo (#3575)" (#3603)"

This reverts commit e93f41f097.
2017-05-10 15:53:15 +01:00
Colin Seymour
5e2c79e950 Bump version to v5.0.10 (#3604) 2017-05-05 18:49:35 +01:00
Colin Seymour
e93f41f097 Revert "Switch the PHP grammar to the upstream repo (#3575)" (#3603)
* Revert "Switch the PHP grammar to the upstream repo (#3575)"

Manually reverting this as it breaks PHP syntax highlighting on
github.com.

* Update submodule ref
2017-05-05 17:11:29 +01:00
Colin Seymour
994bc1f135 Release v5.0.9 (#3597)
* Update all grammars

* Update atom-language-clean grammar to match

* Don't update reason grammar

There seems to be a problem with the 1.3.5 release: the conversion isn't producing a reason entry, so it doesn't match what's in grammars.yml

* Bump version to 5.0.9

* Update grammars

* Don't update javascript grammar

The current grammar has a known issue and is pending the fix in https://github.com/atom/language-javascript/pull/497
2017-05-03 14:49:26 +01:00
John Gardner
44f03e64c1 Merge heuristics for disambiguating ".t" files (#3587)
References: github/linguist#3546
2017-04-29 11:15:39 +02:00
Jacob Elder
4166f2e89d Clarify support for generated code (#3588)
* Clarify support for generated code

* Incorporate feedback

* TIL about how .gitattributes matching works
2017-04-28 16:20:22 -07:00
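For context, generated-code overrides of this kind use Linguist's `.gitattributes` attributes. A minimal illustration (the paths are invented; the `linguist-generated` and `linguist-vendored` attributes are the real ones):

```gitattributes
# Illustrative only — paths are made up.
*.min.js      linguist-generated=true
# A bare "*" does not match across directory separators in
# gitattributes patterns; "**" matches recursively.
snapshots/**  linguist-generated=true
vendor/*      linguist-vendored
```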
John Gardner
1a8f19c6f2 Fix numbering of ordered lists (#3586) 2017-04-28 14:02:38 -07:00
Santiago M. Mola
c0e242358a Fix heuristics after rename (#3556)
* fix Roff detection in heuristics

This affects extensions .l, .ms, .n and .rno.

Groff was renamed to Roff in 673aeb32b9851cc58429c4b598c876292aaf70c7,
but heuristic was not updated.

* replace FORTRAN with Fortran

It was already renamed in most places since 4fd8fce08574809aa58e9771e2a9da5d135127be
heuristics.rb was missing though.

* fix caseness of GCC Machine Description
2017-04-26 15:31:36 -07:00
thesave
eb38c8dcf8 [Add Language] Jolie (#3574)
* added support for Jolie language

* added support for Jolie language

* added samples for Jolie
2017-04-26 11:04:25 -07:00
Trent Schafer
f146b4afbd New extension support for PL/SQL language (#2735)
* Add additional PL/SQL file extensions

* Add PL/SQL samples for .ddl and .prc

* Fix sort order of PL/SQL extensions

* Restore vendor/grammars/assembly.

* Restore `pls` as primary PL/SQL extension

* Add tpb to go with tps
2017-04-26 11:03:01 -07:00
Nicolas Garnier
db15d0f5d2 Added MJML as an XML extension (#3582) 2017-04-26 19:24:57 +10:00
Michael Grafnetter
e6d57c771d Add .admx and .adml as extensions for XML (#3580)
* Add .admx and .adml as extensions for XML

* Fixed the order of extensions
2017-04-24 09:55:22 -07:00
Nathan Phillip Brink
eef0335c5f Clarify description of implicit alias. (#3578)
* Clarify description of implicit alias.

I was trying to look up the alias to use for DNS Zone. From the docs
the alias I should use would be dns zone, but in reality it is dns-zone.
This change updates the comments to describe how to derive the
implicit name of a given alias.

* Further clarify description of implicit alias.

@pchaigno requested replacing the Ruby with English.
2017-04-24 09:54:37 -07:00
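The implicit-alias derivation the doc change describes can be sketched as follows. This is an assumption about the rule ("DNS Zone" becomes "dns-zone"), not Linguist's exact code:

```ruby
# Assumed rule, per the doc change above: lowercase the language name
# and replace spaces with hyphens to get the implicit alias.
def implicit_alias(language_name)
  language_name.downcase.gsub(' ', '-')
end

puts implicit_alias("DNS Zone")  # dns-zone
```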
Christoph Pojer
461c27c066 Revert "Added Jest snapshot test files as generated src (#3572)" (#3579)
This reverts commit f38d6bd124.
2017-04-22 14:20:54 +02:00
Matvei Stefarov
59d67d6743 Treat vstemplate and vsixmanifest as XML (#3517) 2017-04-22 09:25:50 +01:00
Sandy Armstrong
7aeeb82d3d Treat Xamarin .workbook files as markdown (#3500)
* Treat Xamarin .workbook files as markdown

Xamarin Workbook files are interactive coding documents for C#, serialized as
markdown files. They include a YAML front matter header block with some
metadata. Interactive code cells are included as `csharp` fenced code blocks.

An example can be found here:
https://github.com/xamarin/Workbooks/blob/master/csharp/csharp6/csharp6.workbook

Treated as markdown, it would appear like so:
https://gist.github.com/sandyarmstrong/e331dfeaf89cbce89043a1c31faa1297

* Add .workbook sample

Source: https://github.com/xamarin/Workbooks/blob/master/csharp/csharp6/csharp6.workbook
2017-04-20 15:29:17 +01:00
Christophe Coevoet
c98ca20076 Switch the PHP grammar to the upstream repo (#3575)
* Switch the PHP grammar to the upstream repo

* Update all URLs pointing to the PHP grammar bundle
2017-04-20 14:40:44 +01:00
Paul Chaignon
4e0b5f02aa Fix usage line in binary (#3564)
Linguist cannot work on any directory; it needs to be a Git
repository.
2017-04-20 10:18:03 +01:00
Tim Jones
8da7cb805e Add .cginc extension to HLSL language (#3491)
* Add .cginc extension to HLSL language

* Move extension to correct position

* Add representative sample .cginc file
2017-04-20 09:48:48 +01:00
Dorian
e5e81a8560 Add .irbc and Rakefile to matching ruby filenames (#3457) 2017-04-20 09:41:31 +01:00
Tim Jones
dd53fa1585 Add ShaderLab language (#3490)
* Add ShaderLab language

* Update HLSL and ShaderLab grammars to latest version

* Add .shader extension back to GLSL language

* Add sample GLSL .shader files

Note that these are copies of existing GLSL samples, renamed to have
the .shader extension.
2017-04-20 09:04:08 +01:00
Daniel F Moisset
354a8f079a Add support for python typeshed (.pyi) extension (#3548) 2017-04-20 09:01:41 +01:00
Hank Brekke
f38d6bd124 Added Jest snapshot test files as generated src (#3572) 2017-04-20 08:58:39 +01:00
Santiago M. Mola
e80b92e407 Fix heuristic for Unix Assembly with .ms extension (#3550) 2017-04-06 22:01:42 +10:00
Martin Nowak
fa6ae1116f better heuristic distinction of .d files (#3145)
* fix benchmark

- require json for Hash.to_json

* better heuristic distinction of .d files

- properly recognize dtrace probes
- recognize \ in Makefile paths
- recognize single line `file.ext : dep.ext` make targets
- recognize D module, import, function, and unittest declarations
- add more representative D samples

D changed from 31.2% to 28.1%
DTrace changed from 33.5% to 32.5%
Makefile changed from 35.3% to 39.4%

See
https://gist.github.com/MartinNowak/fda24fdef64f2dbb05c5a5ceabf22bd3
for the scraper used to get a test corpus.
2017-03-30 18:25:53 +01:00
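The line-level cues the commit describes can be sketched as regexes. These are illustrations only, not the actual heuristics from Linguist:

```ruby
# Illustrative only — the kinds of cues described in the commit above.
DTRACE_PROBE = /^provider\s+\w+\s*\{/               # DTrace provider block
MAKE_TARGET  = /^[\w.\\\/ -]+\s*:\s*[\w.\\\/ -]+/   # "file.ext : dep.ext"
D_DECL       = /^\s*(module|import)\s+[\w.]+\s*;/   # D module/import decl

puts !!("module std.stdio;"  =~ D_DECL)        # true
puts !!("out.o : in.c"       =~ MAKE_TARGET)   # true
puts !!("provider mylib {"   =~ DTRACE_PROBE)  # true
```

Scoring which patterns a `.d` file matches is what shifts ambiguous files toward D, DTrace, or Makefile.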
Yuki Izumi
b7e27a9f58 .pod disambiguation heuristic fix (#3541)
Look for any line starting with "=\w+", not full lines, otherwise we
miss e.g. "=head1 HEADING".
2017-03-27 14:10:17 +11:00
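The fix can be illustrated by the two anchorings (illustrative regexes, not the exact heuristic):

```ruby
# Illustrative only — not the exact Linguist heuristic.
FULL_LINE  = /^=\w+$/   # old behaviour: misses "=head1 HEADING"
LINE_START = /^=\w+/    # fixed: any line *starting* with a directive

puts !!("=head1 HEADING" =~ FULL_LINE)   # false
puts !!("=head1 HEADING" =~ LINE_START)  # true
```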
Javier Honduvilla Coto
69ba4c5586 Update the Instrumenter doc ... (#3530)
... with an instance of the given `Instrumenter` instead of the class itself.
2017-03-23 06:11:45 +01:00
Rafer Hazen
c39d7fd6e8 Add data-engineering staff to maintainers list (#3533) 2017-03-22 07:06:58 -06:00
Yuki Izumi
44ed47cea1 Release v5.0.8 (#3535) 2017-03-22 16:41:36 +11:00
Yuki Izumi
de51cb08d2 Add .mdwn for Markdown (#3534) 2017-03-22 16:28:59 +11:00
Ronald Wampler
3dd2d08190 Add .mdown as an extension for Markdown (#3525)
* Add `.mdown` as an extension for Markdown

* Add `.mdown` sample
2017-03-22 16:14:54 +11:00
Yuki Izumi
3b625e1954 Release v5.0.7 (#3524)
* grammar update
* Release v5.0.7
2017-03-20 14:13:04 +11:00
Yuki Izumi
5c6f690b97 Prefer Markdown over GCC Machine Description (#3523)
* Add minimal Markdown sample
* Heuristic defaults to Markdown on no match
* Allow Linguist to detect empty blobs
2017-03-20 13:07:54 +11:00
Michael Rawlings
3bbfc907f3 [Add Language] Marko (#3519)
* add marko

* update marko
2017-03-17 09:46:20 +00:00
Colin Seymour
053b8bca97 GitHub.com now uses gitattributes overrides for syntax highlighting (#3518)
See https://github.com/github/linguist/issues/1792#issuecomment-286379822 for more details.
2017-03-15 22:42:08 -07:00
Yves Siegrist
7fb3db6203 Add .eye files to be used as ruby (#3509)
Usually files that are used for [eye](https://github.com/kostya/eye) have the file extension `.eye`.
An eye definition file always contains Ruby code.
2017-03-13 17:22:56 -07:00
Liav Turkia
ba09394f85 Added a demos folder and updated regexes (#3512)
I added case-insensitive matching to the regexes. In my repositories I have both a Docs and a Demos folder, and those wouldn't have been matched before; now they are.
2017-03-13 17:20:36 -07:00
Paul Chaignon
c59c88f16e Update grammar whitelist (#3510)
* Remove a few hashes for grammars with BSD licenses

There was an error in Licensee v8.8.2, which caused it to not
recognize some BSD licenses. v8.8.3 fixes it.

* Update submodules

Remove 2 grammars from the whitelist because their licenses were
added to a LICENSE file with a proper format (one that Licensee
detects).

MagicPython now supports all scopes that were previously supported
by language-python.
2017-03-13 17:19:06 -07:00
Brandon Black
8a6e74799a Merge branch 'master' of https://github.com/github/linguist 2017-03-13 17:13:48 -07:00
Brandon Black
4268769d2e adjusting travis config 2017-03-13 17:13:24 -07:00
NN
6601864084 Add wixproj as XML (#3511)
* Add wixproj as XML

WiX uses wixproj for projects.

* Add wixproj sample
2017-03-13 17:01:58 -07:00
Paul Chaignon
d57aa37fb7 Grammar for OpenSCAD from Textmate bundle (#3502) 2017-03-13 17:00:27 -07:00
Karl Pettersson
e72347fd98 Add alias for pandoc (#3493) 2017-03-13 16:59:35 -07:00
Brandon Black
1b429ea46b updating rubies 2017-03-10 00:00:19 -08:00
Paul Chaignon
9468ad4947 Fix grammar hashes (#3504)
* Update Licensee hashes for grammar licenses

Licensee v8.8 changed the way licenses are normalized, thus changing hashes for
some grammars

* Update Licensee

Prevent automatic updates to major releases
2017-03-09 23:57:35 -08:00
Nate Whetsell
733ef63193 Add Jison (#3488) 2017-02-22 00:24:50 -08:00
Brandon Black
9ca6a5841e Release v5.0.6 (#3489)
* grammar update

* bumping linguist version

* fixes for grammar updates
2017-02-21 23:13:15 -08:00
Brandon Black
41ace5fba0 using fork for php.tmbundle since updates are broken 2017-02-21 17:13:55 -08:00
Alex Arslan
cc4295b3b3 Update URL for Julia TextMate repo (#3487) 2017-02-21 17:05:59 -08:00
doug tangren
1e4ce80fd9 add support for detecting bazel WORKSPACE files (#3459)
* add support for detecting bazel WORKSPACE files

* Update languages.yml
2017-02-21 16:48:44 -08:00
Brandon Black
74a71fd90d fixing merge conflict 2017-02-21 16:28:34 -08:00
TingPing
9b08318456 Add Meson language (#3463) 2017-02-21 16:24:58 -08:00
Tim Jones
fa5b6b03dc Add grammar for HLSL (High Level Shading Language) (#3469) 2017-02-21 16:05:25 -08:00
Garen Torikian
cb59296fe0 Like ^docs?, sometimes one sample is enough (#3485) 2017-02-20 10:29:30 -08:00
Eloy Durán
f1be771611 Disambiguate TypeScript with tsx extension. (#3464)
Using the technique as discussed in #2761.
2017-02-20 10:17:18 +00:00
Alex Louden
b66fcb2529 Improve Terraform (.tf) / HCL (.hcl) syntax highlighting (#3392)
* Add Terraform grammar, and change .tf and .hcl files from using Ruby to Terraform sublime syntax

* Expand Terraform sample to demonstrate more language features

* Revert terraform sample change

* Add terraform sample - Dokku AWS deploy

* Updated to latest Terraform

* Update terraform string interpolation

* Update terraform to latest
2017-02-20 10:09:59 +00:00
Brandon Black
f7fe1fee66 Release v5.0.5 -- part Deux (#3479)
* bumping to v5.0.5

* relaxing rugged version requirement
2017-02-15 21:29:04 -08:00
Brandon Black
94367cc460 Update LICENSE 2017-02-15 14:11:37 -08:00
Phineas
72bec1fddc Update LICENSE Copyright Date to 2017 (#3476) 2017-02-15 14:11:13 -08:00
Brandon Black
4e2eba4ef8 Revert "Release v5.0.5" (#3477) 2017-02-15 12:48:45 -08:00
Colin Seymour
10457b6639 Release v5.0.5 (#3473)
Release v5.0.5

* Update submodules

* Update grammars

* Bump version to 5.0.5

* Relax dependency on rugged

It's probably not wise to depend on a beta version just yet.

* revert php.tmbundle grammar update

One of the changes in 3ed4837b43...010cc1c22c leads to breakage in snippet highlighting on github.com
2017-02-15 11:12:53 +00:00
Paul Chaignon
d58cbc68a6 Support for the P4 language
P4 is a language for describing the processing pipelines of network devices.
2017-02-15 06:53:46 +01:00
Colin Seymour
01de40faaa Return early in Classifier.classify if no languages supplied (#3471)
* Return early if no languages supplied

There's no need to tokenise the data when classifying without a limited language scope, as no scoring would be performed anyway.

* Add test for empty languages array
2017-02-13 18:22:54 +00:00
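The early return described above might look like this. A hypothetical sketch only; Linguist's real `Linguist::Classifier.classify` signature and internals may differ:

```ruby
# Hypothetical shape of the optimisation: bail out before tokenising
# when there is no language scope to score against.
class Classifier
  def self.classify(db, data, languages)
    return [] if languages.nil? || languages.empty?  # early return
    # ... otherwise tokenise `data` and score it against `languages` ...
    []
  end
end

p Classifier.classify({}, "puts 'hi'", [])  # => []
```

The guard clause means the (comparatively expensive) tokenisation step never runs for an empty candidate list.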
Paul Chaignon
62d285fce6 Fix head commit for TXL grammar (#3470) 2017-02-13 14:35:38 +01:00
Stefan Stölzle
0056095e8c Add .lkml to LookML (#3454)
* Add .lkml to LookML

* Limit .lkml to .view.lkml and .model.lkml

* Add lkml samples

* Fix extension order
2017-02-03 11:50:30 +01:00
Lars Brinkhoff
d6dc3a3991 Accommodate Markdown lines which begin with '>'. (#3452) 2017-02-02 11:58:52 -08:00
Greg Zimmerman
b524461b7c Add PowerShell color. Matches default console color for PowerShell. (#3448) 2017-02-02 11:13:01 -08:00
fix-fix
76d41697aa Use official HTML primary color (#3447)
Use primary color of HTML5 logo as defined on Logo FAQ page
2017-02-02 11:09:55 -08:00
John Gardner
32147b629e Register "Emakefile" as an Erlang filename (#3443) 2017-02-02 11:09:07 -08:00
John Gardner
e7b5e25bf8 Add support for regular expression data (#3441) 2017-02-02 11:08:20 -08:00
294 changed files with 14555 additions and 1297 deletions

.gitmodules

@@ -67,9 +67,6 @@
 [submodule "vendor/grammars/language-javascript"]
 	path = vendor/grammars/language-javascript
 	url = https://github.com/atom/language-javascript
-[submodule "vendor/grammars/language-python"]
-	path = vendor/grammars/language-python
-	url = https://github.com/atom/language-python
 [submodule "vendor/grammars/language-shellscript"]
 	path = vendor/grammars/language-shellscript
 	url = https://github.com/atom/language-shellscript
@@ -115,9 +112,6 @@
 [submodule "vendor/grammars/fancy-tmbundle"]
 	path = vendor/grammars/fancy-tmbundle
 	url = https://github.com/fancy-lang/fancy-tmbundle
-[submodule "vendor/grammars/dart-sublime-bundle"]
-	path = vendor/grammars/dart-sublime-bundle
-	url = https://github.com/guillermooo/dart-sublime-bundle
 [submodule "vendor/grammars/sublimetext-cuda-cpp"]
 	path = vendor/grammars/sublimetext-cuda-cpp
 	url = https://github.com/harrism/sublimetext-cuda-cpp
@@ -177,7 +171,7 @@
 	url = https://github.com/mokus0/Agda.tmbundle
 [submodule "vendor/grammars/Julia.tmbundle"]
 	path = vendor/grammars/Julia.tmbundle
-	url = https://github.com/nanoant/Julia.tmbundle
+	url = https://github.com/JuliaEditorSupport/Julia.tmbundle
 [submodule "vendor/grammars/ooc.tmbundle"]
 	path = vendor/grammars/ooc.tmbundle
 	url = https://github.com/nilium/ooc.tmbundle
@@ -390,7 +384,7 @@
 	url = https://github.com/textmate/c.tmbundle
 [submodule "vendor/grammars/zephir-sublime"]
 	path = vendor/grammars/zephir-sublime
-	url = https://github.com/vmg/zephir-sublime
+	url = https://github.com/phalcon/zephir-sublime
 [submodule "vendor/grammars/llvm.tmbundle"]
 	path = vendor/grammars/llvm.tmbundle
 	url = https://github.com/whitequark/llvm.tmbundle
@@ -802,7 +796,7 @@
 	url = https://github.com/perl6/atom-language-perl6
 [submodule "vendor/grammars/reason"]
 	path = vendor/grammars/reason
-	url = https://github.com/facebook/reason
+	url = https://github.com/chenglou/sublime-reason
 [submodule "vendor/grammars/language-xcompose"]
 	path = vendor/grammars/language-xcompose
 	url = https://github.com/samcv/language-xcompose
@@ -815,3 +809,57 @@
 [submodule "vendor/grammars/language-css"]
 	path = vendor/grammars/language-css
 	url = https://github.com/atom/language-css
+[submodule "vendor/grammars/language-regexp"]
+	path = vendor/grammars/language-regexp
+	url = https://github.com/Alhadis/language-regexp
+[submodule "vendor/grammars/Terraform.tmLanguage"]
+	path = vendor/grammars/Terraform.tmLanguage
+	url = https://github.com/alexlouden/Terraform.tmLanguage
+[submodule "vendor/grammars/shaders-tmLanguage"]
+	path = vendor/grammars/shaders-tmLanguage
+	url = https://github.com/tgjones/shaders-tmLanguage
+[submodule "vendor/grammars/language-meson"]
+	path = vendor/grammars/language-meson
+	url = https://github.com/TingPing/language-meson
+[submodule "vendor/grammars/atom-language-p4"]
+	path = vendor/grammars/atom-language-p4
+	url = https://github.com/TakeshiTseng/atom-language-p4
+[submodule "vendor/grammars/language-jison"]
+	path = vendor/grammars/language-jison
+	url = https://github.com/cdibbs/language-jison
+[submodule "vendor/grammars/openscad.tmbundle"]
+	path = vendor/grammars/openscad.tmbundle
+	url = https://github.com/tbuser/openscad.tmbundle
+[submodule "vendor/grammars/marko-tmbundle"]
+	path = vendor/grammars/marko-tmbundle
+	url = https://github.com/marko-js/marko-tmbundle
+[submodule "vendor/grammars/language-jolie"]
+	path = vendor/grammars/language-jolie
+	url = https://github.com/fmontesi/language-jolie
+[submodule "vendor/grammars/language-typelanguage"]
+	path = vendor/grammars/language-typelanguage
+	url = https://github.com/goodmind/language-typelanguage
+[submodule "vendor/grammars/sublime-shen"]
+	path = vendor/grammars/sublime-shen
+	url = https://github.com/rkoeninger/sublime-shen
+[submodule "vendor/grammars/Sublime-Pep8"]
+	path = vendor/grammars/Sublime-Pep8
+	url = https://github.com/R4PaSs/Sublime-Pep8
+[submodule "vendor/grammars/dartlang"]
+	path = vendor/grammars/dartlang
+	url = https://github.com/dart-atom/dartlang
+[submodule "vendor/grammars/language-closure-templates"]
+	path = vendor/grammars/language-closure-templates
+	url = https://github.com/mthadley/language-closure-templates
+[submodule "vendor/grammars/language-webassembly"]
+	path = vendor/grammars/language-webassembly
+	url = https://github.com/Alhadis/language-webassembly
+[submodule "vendor/grammars/language-ring"]
+	path = vendor/grammars/language-ring
+	url = https://github.com/MahmoudFayed/atom-language-ring
+[submodule "vendor/grammars/sublime-fantom"]
+	path = vendor/grammars/sublime-fantom
+	url = https://github.com/rkoeninger/sublime-fantom
+[submodule "vendor/grammars/language-pan"]
+	path = vendor/grammars/language-pan
+	url = https://github.com/quattor/language-pan

.travis.yml

@@ -1,20 +1,33 @@
language: ruby
sudo: false
addons:
apt:
packages:
- libicu-dev
- libicu48
before_install: script/travis/before_install
script:
- bundle exec rake
- script/licensed verify
rvm:
- 2.0.0
- 2.1
- 2.2
- 2.3.3
- 2.4.0
matrix:
allow_failures:
- rvm: 2.4.0
notifications:
disabled: true
git:
submodules: false
depth: 3
cache: bundler

CONTRIBUTING.md

@@ -10,15 +10,15 @@ We try only to add new extensions once they have some usage on GitHub. In most c
 To add support for a new extension:
-0. Add your extension to the language entry in [`languages.yml`][languages], keeping the extensions in alphabetical order.
-0. Add at least one sample for your extension to the [samples directory][samples] in the correct subdirectory.
-0. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
+1. Add your extension to the language entry in [`languages.yml`][languages], keeping the extensions in alphabetical order.
+1. Add at least one sample for your extension to the [samples directory][samples] in the correct subdirectory.
+1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
 In addition, if this extension is already listed in [`languages.yml`][languages] then sometimes a few more steps will need to be taken:
-0. Make sure that example `.yourextension` files are present in the [samples directory][samples] for each language that uses `.yourextension`.
-0. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.yourextension` files. (ping **@bkeepers** to help with this) to ensure we're not misclassifying files.
-0. If the Bayesian classifier does a bad job with the sample `.yourextension` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
+1. Make sure that example `.yourextension` files are present in the [samples directory][samples] for each language that uses `.yourextension`.
+1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.yourextension` files. (ping **@bkeepers** to help with this) to ensure we're not misclassifying files.
+1. If the Bayesian classifier does a bad job with the sample `.yourextension` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
## Adding a language
@@ -27,17 +27,17 @@ We try only to add languages once they have some usage on GitHub. In most cases
 To add support for a new language:
-0. Add an entry for your language to [`languages.yml`][languages]. Omit the `language_id` field for now.
-0. Add a grammar for your language: `script/add-grammar https://github.com/JaneSmith/MyGrammar`. Please only add grammars that have [one of these licenses][licenses].
-0. Add samples for your language to the [samples directory][samples] in the correct subdirectory.
-0. Add a `language_id` for your language using `script/set-language-ids`. **You should only ever need to run `script/set-language-ids --update`. Anything other than this risks breaking GitHub search :cry:**
-0. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
+1. Add an entry for your language to [`languages.yml`][languages]. Omit the `language_id` field for now.
+1. Add a grammar for your language: `script/add-grammar https://github.com/JaneSmith/MyGrammar`. Please only add grammars that have [one of these licenses][licenses].
+1. Add samples for your language to the [samples directory][samples] in the correct subdirectory.
+1. Add a `language_id` for your language using `script/set-language-ids`. **You should only ever need to run `script/set-language-ids --update`. Anything other than this risks breaking GitHub search :cry:**
+1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
 In addition, if your new language defines an extension that's already listed in [`languages.yml`][languages] (such as `.foo`) then sometimes a few more steps will need to be taken:
-0. Make sure that example `.foo` files are present in the [samples directory][samples] for each language that uses `.foo`.
-0. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.foo` files. (ping **@bkeepers** to help with this) to ensure we're not misclassifying files.
-0. If the Bayesian classifier does a bad job with the sample `.foo` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
+1. Make sure that example `.foo` files are present in the [samples directory][samples] for each language that uses `.foo`.
+1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.foo` files. (ping **@bkeepers** to help with this) to ensure we're not misclassifying files.
+1. If the Bayesian classifier does a bad job with the sample `.foo` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
Remember, the goal here is to try and avoid false positives!
@@ -67,6 +67,16 @@ For development you are going to want to checkout out the source. To get it, clo
cd linguist/
script/bootstrap
To run Linguist from the cloned repository, you will need to generate the code samples first:
bundle exec rake samples
Run this command each time a [sample][samples] has been modified.
To run Linguist from the cloned repository:
bundle exec bin/linguist --breakdown
To run the tests:
bundle exec rake test
@@ -80,12 +90,14 @@ Here's our current build status: [![Build Status](https://api.travis-ci.org/gith
Linguist is maintained with :heart: by:
- **@Alhadis**
- **@brandonblack** (GitHub staff)
- **@BenEddy** (GitHub staff)
- **@Caged** (GitHub staff)
- **@grantr** (GitHub staff)
- **@larsbrinkhoff**
- **@lildude** (GitHub staff)
- **@lizzhale** (GitHub staff)
- **@mikemcquaid** (GitHub staff)
- **@pchaigno**
- **@rafer** (GitHub staff)
- **@shreyasjoshis** (GitHub staff)
As Linguist is a production dependency for GitHub we have a couple of workflow restrictions:
@@ -96,21 +108,21 @@ As Linguist is a production dependency for GitHub we have a couple of workflow r
If you are the current maintainer of this gem:
1. Create a branch for the release: `git checkout -b cut-release-vxx.xx.xx`
1. Make sure your local dependencies are up to date: `script/bootstrap`
1. If grammar submodules have not been updated recently, update them: `git submodule update --remote && git commit -a`
1. Ensure that samples are updated: `bundle exec rake samples`
1. Ensure that tests are green: `bundle exec rake test`
1. Bump gem version in `lib/linguist/version.rb`, [like this](https://github.com/github/linguist/commit/8d2ea90a5ba3b2fe6e1508b7155aa4632eea2985).
1. Make a PR to github/linguist, [like this](https://github.com/github/linguist/pull/1238).
1. Build a local gem: `bundle exec rake build_gem`
1. Test the gem:
    1. Bump the Gemfile and Gemfile.lock versions for an app which relies on this gem
    1. Install the new gem locally
    1. Test behavior locally, branch deploy, whatever needs to happen
1. Merge github/linguist PR
1. Tag and push: `git tag vx.xx.xx; git push --tags`
1. Push to rubygems.org -- `gem push github-linguist-3.0.0.gem`
[grammars]: /grammars.yml
[languages]: /lib/linguist/languages.yml


@@ -1,4 +1,4 @@
Copyright (c) 2017 GitHub, Inc.
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation


@@ -15,10 +15,10 @@ See [Troubleshooting](#troubleshooting) and [`CONTRIBUTING.md`](/CONTRIBUTING.md
The Language stats bar displays languages percentages for the files in the repository. The percentages are calculated based on the bytes of code for each language as reported by the [List Languages](https://developer.github.com/v3/repos/#list-languages) API. If the bar is reporting a language that you don't expect:
1. Click on the name of the language in the stats bar to see a list of the files that are identified as that language.
1. If you see files that you didn't write, consider moving the files into one of the [paths for vendored code](/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
1. If the files are being misclassified, search for [open issues][issues] to see if anyone else has already reported the issue. Any information you can add, especially links to public repositories, is helpful.
1. If there are no reported issues of this misclassification, [open an issue][new-issue] and include a link to the repository or a sample of the code that is being misclassified.
### There's a problem with the syntax highlighting of a file
@@ -32,13 +32,15 @@ Linguist supports a number of different custom overrides strategies for language
### Using gitattributes
Add a `.gitattributes` file to your project and use standard git-style path matchers for the files you want to override to set `linguist-documentation`, `linguist-language`, `linguist-vendored`, and `linguist-generated`. `.gitattributes` will be used to determine language statistics and will be used to syntax highlight files. You can also manually set syntax highlighting using [Vim or Emacs modelines](#using-emacs-or-vim-modelines).
```
$ cat .gitattributes
*.rb linguist-language=Java
```
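Once the attribute is in place, you can confirm git picks it up with `git check-attr`; Linguist reads the same attribute data. (`app.rb` below is a made-up file name for illustration.)

```
# Ask git which linguist-language value applies to a path covered by
# the "*.rb linguist-language=Java" rule above.
git check-attr linguist-language -- app.rb
# e.g.: app.rb: linguist-language: Java
```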
#### Vendored code
Checking code you didn't write, such as JavaScript libraries, into your git repo is a common practice, but this often inflates your project's language stats and may even cause your project to be labeled as another language. By default, Linguist treats all of the paths defined in [lib/linguist/vendor.yml](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml) as vendored and therefore doesn't include them in the language statistics for a repository.
Use the `linguist-vendored` attribute to vendor or un-vendor paths.
@@ -49,6 +51,8 @@ special-vendored-path/* linguist-vendored
```
jquery.js linguist-vendored=false
```
#### Documentation
Just like vendored files, Linguist excludes documentation files from your project's language stats. [lib/linguist/documentation.yml](lib/linguist/documentation.yml) lists common documentation paths and excludes them from the language statistics for your repository.
Use the `linguist-documentation` attribute to mark or unmark paths as documentation.
@@ -59,19 +63,18 @@ project-docs/* linguist-documentation
```
docs/formatter.rb linguist-documentation=false
```
#### Generated code
Not all plain text files are true source files. Generated files like minified js and compiled CoffeeScript can be detected and excluded from language stats. As an added bonus, unlike vendored and documentation files, these files are suppressed in diffs.
```ruby
Linguist::FileBlob.new("underscore.min.js").generated? # => true
```
See [Linguist::Generated#generated?](https://github.com/github/linguist/blob/master/lib/linguist/generated.rb).
```
$ cat .gitattributes
Api.elm linguist-generated=true
```
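One of the signals `Linguist::Generated` uses for minified JavaScript is average line length. A simplified sketch of that idea — the threshold of 110 mirrors the shipped heuristic, but treat the exact value and method name here as illustrative:

```ruby
# Simplified version of Linguist's minified-file signal: generated or
# minified sources tend to have a very long average line length.
def looks_minified?(source)
  lines = source.split("\n")
  return false if lines.empty?
  (lines.sum(&:length) / lines.length) > 110
end
```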
### Using Emacs or Vim modelines
If you do not want to use `.gitattributes` to override the syntax highlighting used on GitHub.com, you can use Vim or Emacs style modelines to set the language for a single file. Modelines can be placed anywhere within a file and are respected when determining how to syntax-highlight a file on GitHub.com
##### Vim
```


@@ -4,6 +4,7 @@ require 'rake/testtask'
require 'yaml'
require 'yajl'
require 'open-uri'
require 'json'
task :default => :test


@@ -75,7 +75,7 @@ elsif File.file?(path)
else
abort <<-HELP
Linguist v#{Linguist::VERSION}
Detect language type for a file, or, given a repository, determine language breakdown.
Usage: linguist <path>
linguist <path> [--breakdown] [--json]


@@ -16,7 +16,7 @@ Gem::Specification.new do |s|
s.add_dependency 'charlock_holmes', '~> 0.7.3'
s.add_dependency 'escape_utils', '~> 1.1.0'
s.add_dependency 'mime-types', '>= 1.19'
s.add_dependency 'rugged', '>= 0.25.1'
s.add_development_dependency 'minitest', '>= 5.0'
s.add_development_dependency 'mocha'
@@ -26,5 +26,5 @@ Gem::Specification.new do |s|
s.add_development_dependency 'yajl-ruby'
s.add_development_dependency 'color-proximity', '~> 0.2.1'
s.add_development_dependency 'licensed'
s.add_development_dependency 'licensee', '~> 8.8.0'
end


@@ -56,6 +56,8 @@ vendor/grammars/MQL5-sublime:
vendor/grammars/MagicPython:
- source.python
- source.regexp.python
- text.python.console
- text.python.traceback
vendor/grammars/Modelica:
- source.modelica
vendor/grammars/NSIS:
@@ -98,6 +100,8 @@ vendor/grammars/Sublime-Modula-2:
- source.modula2
vendor/grammars/Sublime-Nit:
- source.nit
vendor/grammars/Sublime-Pep8/:
- source.pep8
vendor/grammars/Sublime-QML:
- source.qml
vendor/grammars/Sublime-REBOL:
@@ -130,6 +134,8 @@ vendor/grammars/TLA:
- source.tla
vendor/grammars/TXL:
- source.txl
vendor/grammars/Terraform.tmLanguage:
- source.terraform
vendor/grammars/Textmate-Gosu-Bundle:
- source.gosu.2
vendor/grammars/UrWeb-Language-Definition:
@@ -180,6 +186,9 @@ vendor/grammars/atom-language-1c-bsl:
- source.sdbl
vendor/grammars/atom-language-clean:
- source.clean
- text.restructuredtext.clean
vendor/grammars/atom-language-p4:
- source.p4
vendor/grammars/atom-language-perl6:
- source.meta-info
- source.perl6fe
@@ -220,7 +229,6 @@ vendor/grammars/capnproto.tmbundle:
vendor/grammars/carto-atom:
- source.css.mss
vendor/grammars/ceylon-sublimetext:
- source.ceylon
vendor/grammars/chapel-tmbundle:
- source.chapel
@@ -241,11 +249,9 @@ vendor/grammars/cython:
- source.cython
vendor/grammars/d.tmbundle:
- source.d
vendor/grammars/dartlang:
- source.dart
- source.pubspec
- text.dart-analysis-output
- text.dart-doccomments
- source.yaml-ext
vendor/grammars/desktop.tmbundle:
- source.desktop
vendor/grammars/diff.tmbundle:
@@ -352,6 +358,8 @@ vendor/grammars/language-click:
- source.click
vendor/grammars/language-clojure:
- source.clojure
vendor/grammars/language-closure-templates:
- text.html.soy
vendor/grammars/language-coffee-script:
- source.coffee
- source.litcoffee
@@ -399,6 +407,13 @@ vendor/grammars/language-javascript:
- source.js
- source.js.regexp
- source.js.regexp.replacement
- source.jsdoc
vendor/grammars/language-jison:
- source.jison
- source.jisonlex
- source.jisonlex-injection
vendor/grammars/language-jolie:
- source.jolie
vendor/grammars/language-jsoniq:
- source.jq
- source.xq
@@ -406,20 +421,28 @@ vendor/grammars/language-less:
- source.css.less
vendor/grammars/language-maxscript:
- source.maxscript
vendor/grammars/language-meson:
- source.meson
vendor/grammars/language-ncl:
- source.ncl
vendor/grammars/language-ninja:
- source.ninja
vendor/grammars/language-pan:
- source.pan
vendor/grammars/language-povray:
- source.pov-ray sdl
vendor/grammars/language-python:
- text.python.console
- text.python.traceback
vendor/grammars/language-regexp:
- source.regexp
- source.regexp.extended
vendor/grammars/language-renpy:
- source.renpy
vendor/grammars/language-restructuredtext:
- text.restructuredtext
vendor/grammars/language-ring:
- source.ring
vendor/grammars/language-roff:
- source.ditroff
- source.ditroff.desc
- source.ideal
- source.pic
- text.roff
@@ -436,11 +459,15 @@ vendor/grammars/language-toc-wow:
- source.toc
vendor/grammars/language-turing:
- source.turing
vendor/grammars/language-typelanguage:
- source.tl
vendor/grammars/language-viml:
- source.viml
vendor/grammars/language-wavefront:
- source.wavefront.mtl
- source.wavefront.obj
vendor/grammars/language-webassembly:
- source.webassembly
vendor/grammars/language-xbase:
- source.harbour
vendor/grammars/language-xcompose:
@@ -474,6 +501,8 @@ vendor/grammars/make.tmbundle:
- source.makefile
vendor/grammars/mako-tmbundle:
- text.html.mako
vendor/grammars/marko-tmbundle:
- text.marko
vendor/grammars/mathematica-tmbundle:
- source.mathematica
vendor/grammars/matlab.tmbundle:
@@ -511,6 +540,8 @@ vendor/grammars/ooc.tmbundle:
- source.ooc
vendor/grammars/opa.tmbundle:
- source.opa
vendor/grammars/openscad.tmbundle:
- source.scad
vendor/grammars/oz-tmbundle/Syntaxes/Oz.tmLanguage:
- source.oz
vendor/grammars/parrot:
@@ -567,6 +598,9 @@ vendor/grammars/scilab.tmbundle:
- source.scilab
vendor/grammars/secondlife-lsl:
- source.lsl
vendor/grammars/shaders-tmLanguage:
- source.hlsl
- source.shaderlab
vendor/grammars/smali-sublime:
- source.smali
vendor/grammars/smalltalk-tmbundle:
@@ -594,6 +628,8 @@ vendor/grammars/sublime-cirru:
- source.cirru
vendor/grammars/sublime-clips:
- source.clips
vendor/grammars/sublime-fantom:
- source.fan
vendor/grammars/sublime-glsl:
- source.essl
- source.glsl
@@ -615,6 +651,8 @@ vendor/grammars/sublime-rexx:
- source.rexx
vendor/grammars/sublime-robot-plugin:
- text.robot
vendor/grammars/sublime-shen:
- source.shen
vendor/grammars/sublime-spintools:
- source.regexp.spin
- source.spin


@@ -15,9 +15,9 @@ class << Linguist
# see Linguist::LazyBlob and Linguist::FileBlob for examples
#
# Returns Language or nil.
def detect(blob, allow_empty: false)
# Bail early if the blob is binary or empty.
return nil if blob.likely_binary? || blob.binary? || (!allow_empty && blob.empty?)
Linguist.instrument("linguist.detection", :blob => blob) do
# Call each strategy until one candidate is returned.
@@ -74,7 +74,7 @@ class << Linguist
# end
# end
#
# Linguist.instrumenter = CustomInstrumenter.new
#
# The instrumenter must conform to the `ActiveSupport::Notifications`
# interface, which defines `#instrument` and accepts:


@@ -95,7 +95,7 @@ module Linguist
# Returns sorted Array of result pairs. Each pair contains the
# String language name and a Float score.
def classify(tokens, languages)
return [] if tokens.nil? || languages.empty?
tokens = Tokenizer.tokenize(tokens) if tokens.is_a?(String)
scores = {}


@@ -9,11 +9,12 @@
## Documentation directories ##
- ^[Dd]ocs?/
- (^|/)[Dd]ocumentation/
- (^|/)[Jj]avadoc/
- ^[Mm]an/
- ^[Ee]xamples/
- ^[Dd]emos?/
## Documentation files ##
@@ -27,4 +28,4 @@
- (^|/)[Rr]eadme(\.|$)
# Samples folders
- ^[Ss]amples?/


@@ -57,7 +57,7 @@ module Linguist
composer_lock? ||
node_modules? ||
go_vendor? ||
npm_shrinkwrap_or_package_lock? ||
godeps? ||
generated_by_zephir? ||
minified_files? ||
@@ -326,11 +326,11 @@ module Linguist
!!name.match(/vendor\/((?!-)[-0-9A-Za-z]+(?<!-)\.)+(com|edu|gov|in|me|net|org|fm|io)/)
end
# Internal: Is the blob a generated npm shrinkwrap or package lock file?
#
# Returns true or false.
def npm_shrinkwrap_or_package_lock?
name.match(/npm-shrinkwrap\.json/) || name.match(/package-lock\.json/)
end
# Internal: Is the blob part of Godeps/,


@@ -125,11 +125,18 @@ module Linguist
end
disambiguate ".d" do |data|
# see http://dlang.org/spec/grammar
# ModuleDeclaration | ImportDeclaration | FuncDeclaration | unittest
if /^module\s+[\w.]*\s*;|import\s+[\w\s,.:]*;|\w+\s+\w+\s*\(.*\)(?:\(.*\))?\s*{[^}]*}|unittest\s*(?:\(.*\))?\s*{[^}]*}/.match(data)
Language["D"]
# see http://dtrace.org/guide/chp-prog.html, http://dtrace.org/guide/chp-profile.html, http://dtrace.org/guide/chp-opt.html
elsif /^(\w+:\w*:\w*:\w*|BEGIN|END|provider\s+|(tick|profile)-\w+\s+{[^}]*}|#pragma\s+D\s+(option|attributes|depends_on)\s|#pragma\s+ident\s)/.match(data)
Language["DTrace"]
# path/target : dependency \
# target : \
# : dependency
# path/file.ext1 : some/path/../file.ext2
elsif /([\/\\].*:\s+.*\s\\$|: \\$|^ : |^[\w\s\/\\.]+\w+\.\w+\s*:\s+[\w\s\/\\.]+\w+\.\w+)/.match(data)
Language["Makefile"]
end
end
@@ -158,7 +165,7 @@ module Linguist
elsif data.include?("flowop")
Language["Filebench WML"]
elsif fortran_rx.match(data)
Language["Fortran"]
end
end
@@ -166,7 +173,7 @@ module Linguist
if /^: /.match(data)
Language["Forth"]
elsif fortran_rx.match(data)
Language["Fortran"]
end
end
@@ -219,7 +226,7 @@ module Linguist
elsif /^(%[%{}]xs|<.*>)/.match(data)
Language["Lex"]
elsif /^\.[a-z][a-z](\s|$)/i.match(data)
Language["Roff"]
elsif /^\((de|class|rel|code|data|must)\s/.match(data)
Language["PicoLisp"]
end
@@ -260,10 +267,12 @@ module Linguist
end
disambiguate ".md" do |data|
if /(^[-a-z0-9=#!\*\[|>])|<\//i.match(data) || data.empty?
Language["Markdown"]
elsif /^(;;|\(define_)/.match(data)
Language["GCC Machine Description"]
else
Language["Markdown"]
end
end
@@ -287,9 +296,9 @@ module Linguist
disambiguate ".ms" do |data|
if /^[.'][a-z][a-z](\s|$)/i.match(data)
Language["Roff"]
elsif /(?<!\S)\.(include|globa?l)\s/.match(data) || /(?<!\/\*)(\A|\n)\s*\.[A-Za-z][_A-Za-z0-9]*:/.match(data.gsub(/"([^\\"]|\\.)*"|'([^\\']|\\.)*'|\\\s*(?:--.*)?\n/, ""))
Language["Unix Assembly"]
else
Language["MAXScript"]
end
@@ -297,7 +306,7 @@ module Linguist
disambiguate ".n" do |data|
if /^[.']/.match(data)
Language["Roff"]
elsif /^(module|namespace|using)\s/.match(data)
Language["Nemerle"]
end
@@ -331,20 +340,20 @@ module Linguist
elsif /use strict|use\s+v?5\./.match(data)
Language["Perl"]
elsif /^(use v6|(my )?class|module)/.match(data)
Language["Perl 6"]
end
end
disambiguate ".pm" do |data|
if /^\s*(?:use\s+v6\s*;|(?:\bmy\s+)?class|module)\b/.match(data)
Language["Perl 6"]
elsif /\buse\s+(?:strict\b|v?5\.)/.match(data)
Language["Perl"]
end
end
disambiguate ".pod" do |data|
if /^=\w+\b/.match(data)
Language["Pod"]
else
Language["Perl"]
@@ -383,7 +392,7 @@ module Linguist
if /^\.!|^\.end lit(?:eral)?\b/i.match(data)
Language["RUNOFF"]
elsif /^\.\\" /.match(data)
Language["Roff"]
end
end
@@ -434,10 +443,12 @@ module Linguist
end
disambiguate ".t" do |data|
if /^\s*%[ \t]+|^\s*var\s+\w+\s*:=\s*\w+/.match(data)
Language["Turing"]
elsif /^\s*(?:use\s+v6\s*;|\bmodule\b|\b(?:my\s+)?class\b)/.match(data)
Language["Perl 6"]
elsif /\buse\s+(?:strict\b|v?5\.)/.match(data)
Language["Perl"]
end
end
@@ -465,5 +476,13 @@ module Linguist
Language["Scilab"]
end
end
disambiguate ".tsx" do |data|
if /^\s*(import.+(from\s+|require\()['"]react|\/\/\/\s*<reference\s)/.match(data)
Language["TypeScript"]
elsif /^\s*<\?xml\s+version/i.match(data)
Language["XML"]
end
end
end
end


@@ -455,7 +455,6 @@ C:
- ".cats"
- ".h"
- ".idc"
interpreters:
- tcc
ace_mode: c_cpp
@@ -589,6 +588,13 @@ CSV:
extensions:
- ".csv"
language_id: 51
CWeb:
type: programming
extensions:
- ".w"
tm_scope: none
ace_mode: text
language_id: 657332628
Cap'n Proto:
type: programming
tm_scope: source.capnp
@@ -687,6 +693,18 @@ Clojure:
filenames:
- riemann.config
language_id: 62
Closure Templates:
type: markup
group: HTML
ace_mode: soy_template
codemirror_mode: soy
codemirror_mime_type: text/x-soy
alias:
- soy
extensions:
- ".soy"
tm_scope: text.html.soy
language_id: 357046146
CoffeeScript:
type: programming
tm_scope: source.coffee
@@ -1066,6 +1084,16 @@ Eagle:
codemirror_mode: xml
codemirror_mime_type: text/xml
language_id: 97
Easybuild:
type: data
group: Python
ace_mode: python
codemirror_mode: python
codemirror_mime_type: text/x-python
tm_scope: source.python
extensions:
- ".eb"
language_id: 342840477
Ecere Projects:
type: data
group: JavaScript
@@ -1156,6 +1184,7 @@ Erlang:
- ".xrl"
- ".yrl"
filenames:
- Emakefile
- rebar.config
- rebar.config.lock
- rebar.lock
@@ -1212,10 +1241,10 @@ Fancy:
language_id: 109
Fantom:
type: programming
color: "#14253c"
extensions:
- ".fan"
tm_scope: source.fan
ace_mode: text
language_id: 110
Filebench WML:
@@ -1358,6 +1387,8 @@ GLSL:
- ".glslv"
- ".gshader"
- ".shader"
- ".tesc"
- ".tese"
- ".vert"
- ".vrx"
- ".vsh"
@@ -1587,17 +1618,18 @@ HCL:
ace_mode: ruby
codemirror_mode: ruby
codemirror_mime_type: text/x-ruby
tm_scope: source.terraform
language_id: 144
HLSL:
type: programming
extensions:
- ".hlsl"
- ".cginc"
- ".fx"
- ".fxh"
- ".hlsli"
ace_mode: text
tm_scope: source.hlsl
language_id: 145
HTML:
type: markup
@@ -1605,7 +1637,7 @@ HTML:
ace_mode: html
codemirror_mode: htmlmixed
codemirror_mime_type: text/html
color: "#e34c26"
aliases:
- xhtml
extensions:
@@ -2022,6 +2054,33 @@ JavaScript:
interpreters:
- node
language_id: 183
Jison:
type: programming
group: Yacc
extensions:
- ".jison"
tm_scope: source.jison
ace_mode: text
language_id: 284531423
Jison Lex:
type: programming
group: Lex
extensions:
- ".jisonlex"
tm_scope: source.jisonlex
ace_mode: text
language_id: 406395330
Jolie:
type: programming
extensions:
- ".ol"
- ".iol"
interpreters:
- jolie
color: "#843179"
ace_mode: text
tm_scope: source.jolie
language_id: 998078858
Julia:
type: programming
extensions:
@@ -2291,6 +2350,8 @@ LookML:
color: "#652B81"
extensions:
- ".lookml"
- ".model.lkml"
- ".view.lkml"
tm_scope: source.yaml
language_id: 211
LoomScript:
@@ -2418,6 +2479,7 @@ Makefile:
- Makefile.frag
- Makefile.in
- Makefile.inc
- Makefile.wat
- makefile
- makefile.sco
- mkfile
@@ -2437,6 +2499,8 @@ Mako:
language_id: 221
Markdown:
type: prose
aliases:
- pandoc
ace_mode: markdown
codemirror_mode: gfm
codemirror_mime_type: text/x-gfm
@@ -2444,12 +2508,27 @@ Markdown:
extensions:
- ".md"
- ".markdown"
- ".mdown"
- ".mdwn"
- ".mkd"
- ".mkdn"
- ".mkdown"
- ".ron"
- ".workbook"
tm_scope: source.gfm
language_id: 222
Marko:
group: HTML
type: markup
tm_scope: text.marko
extensions:
- ".marko"
aliases:
- markojs
ace_mode: text
codemirror_mode: htmlmixed
codemirror_mime_type: text/html
language_id: 932782397
Mask:
type: markup
color: "#f97732"
@@ -2534,6 +2613,15 @@ Mercury:
- ".moo"
tm_scope: source.mercury
language_id: 229
Meson:
type: programming
color: "#007800"
filenames:
- meson.build
- meson_options.txt
tm_scope: source.meson
ace_mode: text
language_id: 799141244
Metal:
type: programming
color: "#8f14e9"
@@ -2902,7 +2990,7 @@ OpenSCAD:
type: programming
extensions:
- ".scad"
tm_scope: source.scad
ace_mode: scad
language_id: 266
OpenType Feature File:
@@ -2949,6 +3037,14 @@ Oz:
codemirror_mode: oz
codemirror_mime_type: text/x-oz
language_id: 270
P4:
type: programming
color: "#7055b5"
extensions:
- ".p4"
tm_scope: source.p4
ace_mode: text
language_id: 348895984
PAWN:
type: programming
color: "#dbb284"
@@ -2994,12 +3090,21 @@ PLSQL:
color: "#dad8d8"
extensions:
- ".pls"
- ".bdy"
- ".ddl"
- ".fnc"
- ".pck"
- ".pkb"
- ".pks"
- ".plb"
- ".plsql"
- ".prc"
- ".spc"
- ".sql"
- ".tpb"
- ".tps"
- ".trg"
- ".vw"
language_id: 273
PLpgSQL:
type: programming
@@ -3025,7 +3130,7 @@ Pan:
color: "#cc0000"
extensions:
- ".pan"
tm_scope: source.pan
ace_mode: text
language_id: 276
Papyrus:
@@ -3085,6 +3190,14 @@ Pascal:
codemirror_mode: pascal
codemirror_mime_type: text/x-pascal
language_id: 281
Pep8:
type: programming
color: "#C76F5B"
extensions:
- ".pep"
ace_mode: text
tm_scope: source.pep8
language_id: 840372442
Perl:
type: programming
tm_scope: source.perl
@@ -3104,10 +3217,12 @@ Perl:
- ".pod"
- ".psgi"
- ".t"
filenames:
- cpanfile
interpreters:
- perl
language_id: 282
Perl 6:
type: programming
color: "#0000fb"
extensions:
@@ -3226,6 +3341,7 @@ PowerBuilder:
language_id: 292
PowerShell:
type: programming
color: "#012456"
ace_mode: powershell
codemirror_mode: powershell
codemirror_mime_type: application/x-powershell
@@ -3353,6 +3469,7 @@ Python:
- ".lmi"
- ".py3"
- ".pyde"
- ".pyi"
- ".pyp"
- ".pyt"
- ".pyw"
@@ -3368,6 +3485,7 @@ Python:
- SConscript
- SConstruct
- Snakefile
- WORKSPACE
- wscript
interpreters:
- python
@@ -3601,6 +3719,17 @@ Redcode:
tm_scope: none
ace_mode: text
language_id: 321
Regular Expression:
type: data
extensions:
- ".regexp"
- ".regex"
aliases:
- regexp
- regex
ace_mode: text
tm_scope: source.regexp
language_id: 363378884
Ren'Py:
type: programming
aliases:
@@ -3619,6 +3748,14 @@ RenderScript:
tm_scope: none
ace_mode: text
language_id: 323
Ring:
type: programming
color: "#0e60e3"
extensions:
- .ring
tm_scope: source.ring
ace_mode: text
language_id: 431
RobotFramework:
type: programming
extensions:
@@ -3651,6 +3788,7 @@ Roff:
- ".me"
- ".ms"
- ".n"
- ".nr"
- ".rno"
- ".roff"
- ".tmac"
@@ -3689,10 +3827,10 @@ Ruby:
extensions:
- ".rb"
- ".builder"
- ".eye"
- ".fcgi"
- ".gemspec"
- ".god"
- ".jbuilder"
- ".mspec"
- ".pluginspec"
@@ -3714,6 +3852,7 @@ Ruby:
- jruby
- rbx
filenames:
- ".irbrc"
- ".pryrc"
- Appraisals
- Berksfile
@@ -3729,6 +3868,7 @@ Ruby:
- Mavenfile
- Podfile
- Puppetfile
- Rakefile
- Snapfile
- Thorfile
- Vagrantfile
@@ -3947,6 +4087,13 @@ Self:
tm_scope: none
ace_mode: text
language_id: 345
ShaderLab:
type: programming
extensions:
- ".shader"
ace_mode: text
tm_scope: source.shaderlab
language_id: 664257356
Shell:
type: programming
color: "#89e051"
@@ -3975,7 +4122,12 @@ Shell:
- PKGBUILD
- gradlew
interpreters:
- ash
- bash
- dash
- ksh
- mksh
- pdksh
- rc
- sh
- zsh
@@ -4000,7 +4152,7 @@ Shen:
color: "#120F14"
extensions:
- ".shen"
tm_scope: source.shen
ace_mode: text
language_id: 348
Slash:
@@ -4298,6 +4450,7 @@ Text:
- ".no"
filenames:
- COPYING
- COPYRIGHT.regex
- FONTLOG
- INSTALL
- INSTALL.mysql
@@ -4360,6 +4513,15 @@ Twig:
codemirror_mode: twig
codemirror_mime_type: text/x-twig
language_id: 377
Type Language:
type: data
aliases:
- tl
extensions:
- ".tl"
tm_scope: source.tl
ace_mode: text
language_id: 632765617
TypeScript:
type: programming
color: "#2b7489"
@@ -4557,6 +4719,20 @@ Web Ontology Language:
tm_scope: text.xml
ace_mode: xml
language_id: 394
WebAssembly:
type: programming
color: "#04133b"
extensions:
- ".wast"
- ".wat"
aliases:
- wast
- wasm
tm_scope: source.webassembly
ace_mode: lisp
codemirror_mode: commonlisp
codemirror_mime_type: text/x-common-lisp
language_id: 956556503
WebIDL:
type: programming
extensions:
@@ -4597,9 +4773,9 @@ XCompose:
type: data
filenames:
- ".XCompose"
- XCompose
- xcompose
tm_scope: config.xcompose
ace_mode: text
language_id: 225167241
XML:
@@ -4613,6 +4789,8 @@ XML:
- wsdl
extensions:
- ".xml"
- ".adml"
- ".admx"
- ".ant"
- ".axml"
- ".builds"
@@ -4640,6 +4818,7 @@ XML:
- ".kml"
- ".launch"
- ".mdpolicy"
- ".mjml"
- ".mm"
- ".mod"
- ".mxml"
@@ -4678,8 +4857,11 @@ XML:
- ".ux"
- ".vbproj"
- ".vcxproj"
- ".vsixmanifest"
- ".vssettings"
- ".vstemplate"
- ".vxml"
- ".wixproj"
- ".wsdl"
- ".wsf"
- ".wxi"
@@ -4695,6 +4877,7 @@ XML:
- ".xml.dist"
- ".xproj"
- ".xsd"
- ".xspec"
- ".xul"
- ".zcml"
filenames:


@@ -72,6 +72,9 @@
# Normalize.css
- (^|/)normalize\.(css|less|scss|styl)$
# Skeleton.css
- (^|/)skeleton\.(css|less|scss|styl)$
# Bourbon css
- (^|/)[Bb]ourbon/.*\.(css|less|scss|styl)$


@@ -1,3 +1,3 @@
module Linguist
VERSION = "5.1.0"
end

samples/C/asm.h (new file, 27 lines)

@@ -0,0 +1,27 @@
/* CarbonOS System/Kernel
* Copyright 2015-2017 David Aylaian
* Licensed under Apache 2.0: https://github.com/DavidAylaian/CarbonOS/blob/master/LICENSE.md
*/
#ifndef ASM_H
#define ASM_H
#include <stdint.h>
// macros for enabling and disabling interrupts
#define enable() asm("sti");
#define disable() asm("cli");
// inb instruction
uint8_t inb (uint16_t port) {
uint8_t val;
asm volatile ("inb %1, %0" : "=a"(val) : "Nd"(port));
return val;
}
// outb instruction
void outb (uint16_t port, uint8_t val) {
asm volatile ("outb %1, %0" : : "a"(val), "Nd"(port));
}
#endif

samples/C/cpuid.h (new file, 25 lines)

@@ -0,0 +1,25 @@
#ifndef CPUID_H
#define CPUID_H
#include "misc.h"
static inline void do_cpuid(dword_t *eax, dword_t *ebx, dword_t *ecx, dword_t *edx) {
dword_t leaf = *eax;
switch (leaf) {
case 0:
*eax = 0x01; // we support barely anything
*ebx = 0x756e6547; // Genu
*edx = 0x49656e69; // ineI
*ecx = 0x6c65746e; // ntel
break;
default: // if leaf is too high, use highest supported leaf
case 1:
*eax = 0x0; // say nothing about cpu model number
*ebx = 0x0; // processor number 0, flushes 0 bytes on clflush
*ecx = 0b00000000000000000000000000000000; // we support none of the features in ecx
*edx = 0b00000000000000000000000000000000; // we also support none of the features in edx
break;
}
}
#endif

samples/CWeb/sat-life.w (new file, 404 lines)

@@ -0,0 +1,404 @@
\datethis
@*Intro. This program generates clauses for the transition relation
from time $t$ to time $t+1$ in Conway's Game of Life, assuming that
all of the potentially live cells at time $t$ belong to a pattern
that's specified in |stdin|. The pattern is defined by one or more
lines representing rows of cells, where each line has `\..' in a
cell that's guaranteed to be dead at time~$t$, otherwise it has `\.*'.
The time is specified separately as a command-line parameter.
The Boolean variable for cell $(x,y)$ at time $t$ is named by its
so-called ``xty code,'' namely by the decimal value of~$x$, followed
by a code letter for~$t$, followed by the decimal value of~$y$. For
example, if $x=10$ and $y=11$ and $t=0$, the variable that indicates
liveness of the cell is \.{10a11}; and the corresponding variable
for $t=1$ is \.{10b11}.
Up to 19 auxiliary variables are used together with each xty code,
in order to construct clauses that define the successor state.
The names of these variables are obtained by appending one of
the following two-character combinations to the xty code:
\.{A2}, \.{A3}, \.{A4},
\.{B1}, \.{B2}, \.{B3}, \.{B4},
\.{C1}, \.{C2}, \.{C3}, \.{C4},
\.{D1}, \.{D2},
\.{E1}, \.{E2},
\.{F1}, \.{F2},
\.{G1}, \.{G2}.
These variables are derived from the Bailleux--Boufkhad method
of encoding cardinality constraints:
The auxiliary variable \.{A$k$} stands for the condition
``at least $k$ of the eight neighbors are alive.'' Similarly,
\.{B$k$} stands for ``at least $k$ of the first four neighbors
are alive,'' and \.{C$k$} accounts for the other four neighbors.
Codes \.D, \.E, \.F, and~\.G refer to pairs of neighbors.
Thus, for instance, \.{10a11C2} means that at least two of the
last four neighbors of cell $(10,11)$ are alive.
Those auxiliary variables receive values by means of up to 77 clauses per cell.
For example, if $u$ and~$v$ are the neighbors of cell~$z$ that correspond
to a pairing of type~\.D, there are six clauses
$$\bar u d_1,\quad
\bar v d_1,\quad
\bar u\bar v d_2,\quad
u v\bar d_1,\quad
u\bar d_2,\quad
v\bar d_2.$$
The sixteen clauses
$$\displaylines{\hfill
\bar d_1b_1,\quad
\bar e_1b_1,\quad
\bar d_2b_2,\quad
\bar d_1\bar e_1b_2,\quad
\bar e_2b_2,\quad
\bar d_2\bar e_1b_3,\quad
\bar d_1\bar e_2b_3,\quad
\bar d_2\bar e_2b_4,
\hfill\cr\hfill
d_1e_1\bar b_1,\quad
d_1e_2\bar b_2,\quad
d_2e_1\bar b_2,\quad
d_1\bar b_3,\quad
d_2e_2\bar b_3,\quad
e_1\bar b_3,\quad
d_2\bar b_4,\quad
e_2\bar b_4
\hfill}$$
define $b$ variables from $d$'s and $e$'s; and another sixteen
define $c$'s from $f$'s and $g$'s in the same fashion.
A similar set of 21 clauses will define the $a$'s from the $b$'s and $c$'s.
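The merge pattern is the same at every level, so the sixteen $b$-clauses can be verified mechanically: fixing the $d$'s and $e$'s to exact unary counts must force each $b_k$ to mean "total count at least $k$". A hypothetical Python sketch of that check (names are mine):

```python
from itertools import product

def forced_b(dc, ec):
    """All b-assignments consistent with the sixteen merge clauses,
    given unary counts d_j = (dc >= j) and e_k = (ec >= k)."""
    d = lambda j: dc >= j
    e = lambda k: ec >= k
    solutions = []
    for bs in product([False, True], repeat=4):
        b = lambda k: bs[k - 1]
        ok = True
        for j, k in product(range(3), repeat=2):
            if j + k == 0:
                continue
            # lower bound: ~d_j \/ ~e_k \/ b_{j+k}  (a literal is omitted when its index is 0)
            if (j == 0 or d(j)) and (k == 0 or e(k)) and not b(j + k):
                ok = False
            # upper bound: d_{3-j} \/ e_{3-k} \/ ~b_{5-j-k}
            if not (j and d(3 - j)) and not (k and e(3 - k)) and b(5 - j - k):
                ok = False
        if ok:
            solutions.append(bs)
    return solutions

for dc, ec in product(range(3), repeat=2):
    # Exactly one solution survives: b_k true iff dc + ec >= k.
    assert forced_b(dc, ec) == [tuple(dc + ec >= k for k in range(1, 5))]
```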
Once the $a$'s are defined, thus essentially counting the
live neighbors of cell $z$, the next
state~$z'$ is defined by five further clauses
$$\bar a_4\bar z',\quad
a_2\bar z',\quad
a_3z\bar z',\quad
\bar a_3a_4z',\quad
\bar a_2a_4\bar zz'.$$
For example, the last of these states that $z'$ will be true
(i.e., that cell $z$ will be alive at time $t+1$) if
$z$ is alive at time~$t$ and has $\ge2$ live neighbors
but not $\ge4$.
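These five clauses can likewise be checked against Life's transition rule by brute force over the neighbor count; a hypothetical Python sketch (names are mine):

```python
def next_state(n, z):
    """The unique z' the five clauses admit, given n live neighbors and state z."""
    a2, a3, a4 = n >= 2, n >= 3, n >= 4
    sols = [zp for zp in (False, True)
            if not (a4 and zp)                         # ~a4 ~z'
            and not (not a2 and zp)                    # a2 ~z'
            and not (not a3 and not z and zp)          # a3 z ~z'
            and not (a3 and not a4 and not zp)         # ~a3 a4 z'
            and not (a2 and not a4 and z and not zp)]  # ~a2 a4 ~z z'
    assert len(sols) == 1  # z' is completely determined
    return sols[0]

for n in range(9):
    for z in (False, True):
        # Life's rule: alive next iff 3 live neighbors, or alive with 2.
        assert next_state(n, z) == (n == 3 or (z and n == 2))
```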
Nearby cells can share auxiliary variables, according to a tricky scheme that
is worked out below. In consequence, the actual number of auxiliary variables
and clauses per cell is reduced from 19 and $77+5$ to 13 and $57+5$,
respectively, except at the boundaries.
@ So here's the overall outline of the program.
@d maxx 50 /* maximum number of lines in the pattern supplied by |stdin| */
@d maxy 50 /* maximum number of columns per line in |stdin| */
@c
#include <stdio.h>
#include <stdlib.h>
char p[maxx+2][maxy+2]; /* is cell $(x,y)$ potentially alive? */
char have_b[maxx+2][maxy+2]; /* did we already generate $b(x,y)$? */
char have_d[maxx+2][maxy+2]; /* did we already generate $d(x,y)$? */
char have_e[maxx+2][maxy+4]; /* did we already generate $e(x,y)$? */
char have_f[maxx+4][maxy+2]; /* did we already generate $f(x-2,y)$? */
int tt; /* time as given on the command line */
int xmax,ymax; /* the number of rows and columns in the input pattern */
int xmin=maxx,ymin=maxy; /* limits in the other direction */
char timecode[]="abcdefghijklmnopqrstuvwxyz"@|
"ABCDEFGHIJKLMNOPQRSTUVWXYZ"@|
"!\"#$%&'()*+,-./:;<=>?@@[\\]^_`{|}~"; /* codes for $0\le t\le83$ */
@q$@>
char buf[maxy+2]; /* input buffer */
unsigned int clause[4]; /* clauses are assembled here */
int clauseptr; /* this many literals are in the current clause */
@<Subroutines@>@;
main(int argc,char*argv[]) {
register int j,k,x,y;
@<Process the command line@>;
@<Input the pattern@>;
for (x=xmin-1;x<=xmax+1;x++) for (y=ymin-1;y<=ymax+1;y++) {
@<If cell $(x,y)$ is obviously dead at time $t+1$, |continue|@>;
a(x,y);
zprime(x,y);
}
}
@ @<Process the command line@>=
if (argc!=2 || sscanf(argv[1],"%d",&tt)!=1) {
fprintf(stderr,"Usage: %s t\n",argv[0]);
exit(-1);
}
if (tt<0 || tt>82) {
fprintf(stderr,"The time should be between 0 and 82 (not %d)!\n",tt);
exit(-2);
}
@ @<Input the pattern@>=
for (x=1;;x++) {
if (!fgets(buf,maxy+2,stdin)) break;
if (x>maxx) {
fprintf(stderr,"Sorry, the pattern should have at most %d rows!\n",maxx);
exit(-3);
}
for (y=1;buf[y-1]!='\n';y++) {
if (y>maxy) {
fprintf(stderr,"Sorry, the pattern should have at most %d columns!\n",
maxy);
exit(-4);
}
if (buf[y-1]=='*') {
p[x][y]=1;
if (y>ymax) ymax=y;
if (y<ymin) ymin=y;
if (x>xmax) xmax=x;
if (x<xmin) xmin=x;
}@+else if (buf[y-1]!='.') {
fprintf(stderr,"Unexpected character `%c' found in the pattern!\n",
buf[y-1]);
exit(-5);
}
}
}
@ @d pp(xx,yy) ((xx)>=0 && (yy)>=0? p[xx][yy]: 0)
@<If cell $(x,y)$ is obviously dead at time $t+1$, |continue|@>=
if (pp(x-1,y-1)+pp(x-1,y)+pp(x-1,y+1)+
pp(x,y-1)+p[x][y]+p[x][y+1]+
pp(x+1,y-1)+p[x+1][y]+p[x+1][y+1]<3) continue;
@ Clauses are assembled in the |clause| array (surprise), where we
put encoded literals.
The code for a literal is an unsigned 32-bit quantity, where the leading
bit is 1 if the literal should be complemented. The next three bits
specify the type of the literal (0 thru 7 for plain and \.A--\.G);
the next three bits specify an integer~$k$; and the next bit is zero.
That leaves room for two 12-bit fields, which specify $x$ and $y$.
Type 0 literals have $k=0$ for the ordinary xty code. However, the
value $k=1$ indicates that the time code should be for $t+1$ instead of~$t$.
And $k=2$ denotes a special ``tautology'' literal, which is always true.
If the tautology literal is complemented, we omit it from the clause;
otherwise we omit the entire clause.
Finally, $k=7$ denotes an auxiliary literal, used to avoid
clauses of length~4.
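The packing just described (sign bit, 3-bit type, 3-bit index, one zero bit, then two 12-bit coordinates) can be sketched outside the program like this; the helper names `pack` and `unpack` are hypothetical, not part of the program:

```python
SIGN = 1 << 31  # leading bit: the literal is complemented

def pack(bar, c, k, x, y):
    """Encode a literal: sign | type c (3 bits) | k (3 bits) | 0 | x (12 bits) | y (12 bits)."""
    return (SIGN if bar else 0) | (c << 28) | (k << 25) | (x << 12) | y

def unpack(lit):
    return (lit >> 31 & 1, lit >> 28 & 0x7, lit >> 25 & 0x7,
            lit >> 12 & 0xfff, lit & 0xfff)

# Round trip for the literal ~10a11C2 from the text: complemented, type C (3), k = 2.
assert unpack(pack(True, 3, 2, 10, 11)) == (1, 3, 2, 10, 11)
```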
Here's a subroutine that outputs the current clause and resets
the |clause| array.
@d taut (2<<25)
@d sign (1U<<31)
@<Sub...@>=
void outclause(void) {
register int c,k,x,y,p;
for (p=0;p<clauseptr;p++)
if (clause[p]==taut) goto done;
for (p=0;p<clauseptr;p++) if (clause[p]!=taut+sign) {
if (clause[p]>>31) printf(" ~");@+else printf(" ");
c=(clause[p]>>28)&0x7;
k=(clause[p]>>25)&0x7;
x=(clause[p]>>12)&0xfff;
y=clause[p]&0xfff;
if (c) printf("%d%c%d%c%d",
x,timecode[tt],y,c+'@@',k);
else if (k==7) printf("%d%c%dx",
x,timecode[tt],y);
else printf("%d%c%d",
x,timecode[tt+k],y);
}
printf("\n");
done: clauseptr=0;
}
@ And here's another, which puts a type-0 literal into |clause|.
@<Sub...@>=
void applit(int x,int y,int bar,int k) {
if (k==0 && (x<xmin || x>xmax || y<ymin || y>ymax || p[x][y]==0))
clause[clauseptr++]=(bar? 0: sign)+taut;
else clause[clauseptr++]=(bar? sign:0)+(k<<25)+(x<<12)+y;
}
@ The |d| and |e| subroutines are called for only one-fourth
of all cell addresses $(x,y)$. Indeed, one can show that
$x$ is always odd, and that $y\bmod4<2$.
Therefore we remember if we've seen $(x,y)$ before.
Slight trick: If |yy| is not in range, we avoid generating the
clause $\bar d_k$ twice.
@d newlit(x,y,c,k) clause[clauseptr++]=((c)<<28)+((k)<<25)+((x)<<12)+(y)
@d newcomplit(x,y,c,k)
clause[clauseptr++]=sign+((c)<<28)+((k)<<25)+((x)<<12)+(y)
@<Sub...@>=
void d(int x,int y) {
register x1=x-1,x2=x,yy=y+1;
if (have_d[x][y]!=tt+1) {
applit(x1,yy,1,0),newlit(x,y,4,1),outclause();
applit(x2,yy,1,0),newlit(x,y,4,1),outclause();
applit(x1,yy,1,0),applit(x2,yy,1,0),newlit(x,y,4,2),outclause();
applit(x1,yy,0,0),applit(x2,yy,0,0),newcomplit(x,y,4,1),outclause();
applit(x1,yy,0,0),newcomplit(x,y,4,2),outclause();
if (yy>=ymin && yy<=ymax)
applit(x2,yy,0,0),newcomplit(x,y,4,2),outclause();
have_d[x][y]=tt+1;
}
}
@#
void e(int x,int y) {
register x1=x-1,x2=x,yy=y-1;
if (have_e[x][y]!=tt+1) {
applit(x1,yy,1,0),newlit(x,y,5,1),outclause();
applit(x2,yy,1,0),newlit(x,y,5,1),outclause();
applit(x1,yy,1,0),applit(x2,yy,1,0),newlit(x,y,5,2),outclause();
applit(x1,yy,0,0),applit(x2,yy,0,0),newcomplit(x,y,5,1),outclause();
applit(x1,yy,0,0),newcomplit(x,y,5,2),outclause();
if (yy>=ymin && yy<=ymax)
applit(x2,yy,0,0),newcomplit(x,y,5,2),outclause();
have_e[x][y]=tt+1;
}
}
@ The |f| subroutine can't be shared quite so often. But we
do save a factor of~2, because $x+y$ is always even.
@<Sub...@>=
void f(int x,int y) {
register xx=x-1,y1=y,y2=y+1;
if (have_f[x][y]!=tt+1) {
applit(xx,y1,1,0),newlit(x,y,6,1),outclause();
applit(xx,y2,1,0),newlit(x,y,6,1),outclause();
applit(xx,y1,1,0),applit(xx,y2,1,0),newlit(x,y,6,2),outclause();
applit(xx,y1,0,0),applit(xx,y2,0,0),newcomplit(x,y,6,1),outclause();
applit(xx,y1,0,0),newcomplit(x,y,6,2),outclause();
if (xx>=xmin && xx<=xmax)
applit(xx,y2,0,0),newcomplit(x,y,6,2),outclause();
have_f[x][y]=tt+1;
}
}
@ The |g| subroutine cleans up the dregs, by somewhat tediously
locating the two neighbors that weren't handled by |d|, |e|, or~|f|.
No sharing is possible here.
@<Sub...@>=
void g(int x,int y) {
register x1,x2,y1,y2;
if (x&1) x1=x-1,y1=y,x2=x+1,y2=y^1;
else x1=x+1,y1=y,x2=x-1,y2=y-1+((y&1)<<1);
applit(x1,y1,1,0),newlit(x,y,7,1),outclause();
applit(x2,y2,1,0),newlit(x,y,7,1),outclause();
applit(x1,y1,1,0),applit(x2,y2,1,0),newlit(x,y,7,2),outclause();
applit(x1,y1,0,0),applit(x2,y2,0,0),newcomplit(x,y,7,1),outclause();
applit(x1,y1,0,0),newcomplit(x,y,7,2),outclause();
applit(x2,y2,0,0),newcomplit(x,y,7,2),outclause();
}
@ Fortunately the |b| subroutine {\it can\/} be shared (since |x| is always
odd), thus saving half of the sixteen clauses generated.
@<Sub...@>=
void b(int x,int y) {
register j,k,xx=x,y1=y-(y&2),y2=y+(y&2);
if (have_b[x][y]!=tt+1) {
d(xx,y1);
e(xx,y2);
for (j=0;j<3;j++) for (k=0;k<3;k++) if (j+k) {
if (j) newcomplit(xx,y1,4,j); /* $\bar d_j$ */
if (k) newcomplit(xx,y2,5,k); /* $\bar e_k$ */
newlit(x,y,2,j+k); /* $b_{j+k}$ */
outclause();
if (j) newlit(xx,y1,4,3-j); /* $d_{3-j}$ */
if (k) newlit(xx,y2,5,3-k); /* $e_{3-k}$ */
newcomplit(x,y,2,5-j-k); /* $\bar b_{5-j-k}$ */
outclause();
}
have_b[x][y]=tt+1;
}
}
@ The (unshared) |c| subroutine handles the other four neighbors,
by working with |f| and |g| instead of |d| and~|e|.
If |y=0|, the overlap rules set |y1=-1|, which can be problematic.
I've decided to avoid this case by omitting |f| when it is
guaranteed to be zero.
@<Sub...@>=
void c(int x,int y) {
register j,k,x1,y1;
if (x&1) x1=x+2,y1=(y-1)|1;
else x1=x,y1=y&-2;
g(x,y);
if (x1-1<xmin || x1-1>xmax || y1+1<ymin || y1>ymax)
@<Set |c| equal to |g|@>@;
else {
f(x1,y1);
for (j=0;j<3;j++) for (k=0;k<3;k++) if (j+k) {
if (j) newcomplit(x1,y1,6,j); /* $\bar f_j$ */
if (k) newcomplit(x,y,7,k); /* $\bar g_k$ */
newlit(x,y,3,j+k); /* $c_{j+k}$ */
outclause();
if (j) newlit(x1,y1,6,3-j); /* $f_{3-j}$ */
if (k) newlit(x,y,7,3-k); /* $g_{3-k}$ */
newcomplit(x,y,3,5-j-k); /* $\bar c_{5-j-k}$ */
outclause();
}
}
}
@ @<Set |c| equal to |g|@>=
{
for (k=1;k<3;k++) {
newcomplit(x,y,7,k),newlit(x,y,3,k),outclause(); /* $\bar g_k\lor c_k$ */
newlit(x,y,7,k),newcomplit(x,y,3,k),outclause(); /* $g_k\lor\bar c_k$ */
}
newcomplit(x,y,3,3),outclause(); /* $\bar c_3$ */
newcomplit(x,y,3,4),outclause(); /* $\bar c_4$ */
}
@ Totals over all eight neighbors are then deduced by the |a|
subroutine.
@<Sub...@>=
void a(int x,int y) {
register j,k,xx=x|1;
b(xx,y);
c(x,y);
for (j=0;j<5;j++) for (k=0;k<5;k++) if (j+k>1 && j+k<5) {
if (j) newcomplit(xx,y,2,j); /* $\bar b_j$ */
if (k) newcomplit(x,y,3,k); /* $\bar c_k$ */
newlit(x,y,1,j+k); /* $a_{j+k}$ */
outclause();
}
for (j=0;j<5;j++) for (k=0;k<5;k++) if (j+k>2 && j+k<6 && j*k) {
if (j) newlit(xx,y,2,j); /* $b_j$ */
if (k) newlit(x,y,3,k); /* $c_k$ */
newcomplit(x,y,1,j+k-1); /* $\bar a_{j+k-1}$ */
outclause();
}
}
@ Finally, as mentioned at the beginning, $z'$ is determined
from $z$, $a_2$, $a_3$, and $a_4$.
I actually generate six clauses, not five, in order to stick to
{\mc 3SAT}.
@<Sub...@>=
void zprime(int x,int y) {
newcomplit(x,y,1,4),applit(x,y,1,1),outclause(); /* $\bar a_4\bar z'$ */
newlit(x,y,1,2),applit(x,y,1,1),outclause(); /* $a_2\bar z'$ */
newlit(x,y,1,3),applit(x,y,0,0),applit(x,y,1,1),outclause();
/* $a_3z\bar z'$ */
newcomplit(x,y,1,3),newlit(x,y,1,4),applit(x,y,0,1),outclause();
/* $\bar a_3a_4z'$ */
applit(x,y,0,7),newcomplit(x,y,1,2),newlit(x,y,1,4),outclause();
/* $x\bar a_2a_4$ */
applit(x,y,1,7),applit(x,y,1,0),applit(x,y,0,1),outclause();
/* $\bar x\bar zz'$ */
}
@*Index.


@@ -0,0 +1,24 @@
{namespace Example}
/**
* Example
*/
{template .foo}
{@param count: int}
{@param? name: string}
{if isNonnull($name)}
<h1>{$name}</h1>
{/if}
<div class="content">
{switch $count}
{case 0}
{call Empty.view}
{param count: $count /}
{/call}
{default}
<h2>Wow, so many!</h2>
{/switch}
</div>
{/template}

File diff suppressed because it is too large

440
samples/D/aa.d Normal file

@@ -0,0 +1,440 @@
/**
* Implementation of associative arrays.
*
* Copyright: Martin Nowak 2015 -.
* License: $(LINK2 http://www.boost.org/LICENSE_1_0.txt, Boost License 1.0)
* Authors: Martin Nowak
*/
module core.aa;
import core.memory : GC;
private
{
// grow threshold
enum GROW_NUM = 4;
enum GROW_DEN = 5;
// shrink threshold
enum SHRINK_NUM = 1;
enum SHRINK_DEN = 8;
// grow factor
enum GROW_FAC = 4;
// growing the AA multiplies its size by GROW_FAC, so the shrink
// threshold must be smaller than the grow threshold divided by
// GROW_FAC to provide hysteresis
static assert(GROW_FAC * SHRINK_NUM * GROW_DEN < GROW_NUM * SHRINK_DEN);
// initial load factor (for literals), mean of both thresholds
enum INIT_NUM = (GROW_DEN * SHRINK_NUM + GROW_NUM * SHRINK_DEN) / 2;
enum INIT_DEN = SHRINK_DEN * GROW_DEN;
// magic hash constants to distinguish empty, deleted, and filled buckets
enum HASH_EMPTY = 0;
enum HASH_DELETED = 0x1;
enum HASH_FILLED_MARK = size_t(1) << 8 * size_t.sizeof - 1;
}
enum INIT_NUM_BUCKETS = 8;
struct AA(Key, Val)
{
this(size_t sz)
{
impl = new Impl(nextpow2(sz));
}
@property bool empty() const pure nothrow @safe @nogc
{
return !length;
}
@property size_t length() const pure nothrow @safe @nogc
{
return impl is null ? 0 : impl.length;
}
void opIndexAssign(Val val, in Key key)
{
// lazily alloc implementation
if (impl is null)
impl = new Impl(INIT_NUM_BUCKETS);
// get hash and bucket for key
immutable hash = calcHash(key);
// found a value => assignment
if (auto p = impl.findSlotLookup(hash, key))
{
p.entry.val = val;
return;
}
auto p = findSlotInsert(hash);
if (p.deleted)
--deleted;
// check load factor and possibly grow
else if (++used * GROW_DEN > dim * GROW_NUM)
{
grow();
p = findSlotInsert(hash);
assert(p.empty);
}
// update search cache and allocate entry
firstUsed = min(firstUsed, cast(uint)(p - buckets.ptr));
p.hash = hash;
p.entry = new Impl.Entry(key, val); // TODO: move
return;
}
ref inout(Val) opIndex(in Key key) inout @trusted
{
auto p = opIn_r(key);
assert(p !is null);
return *p;
}
inout(Val)* opIn_r(in Key key) inout @trusted
{
if (empty)
return null;
immutable hash = calcHash(key);
if (auto p = findSlotLookup(hash, key))
return &p.entry.val;
return null;
}
bool remove(in Key key)
{
if (empty)
return false;
immutable hash = calcHash(key);
if (auto p = findSlotLookup(hash, key))
{
// clear entry
p.hash = HASH_DELETED;
p.entry = null;
++deleted;
if (length * SHRINK_DEN < dim * SHRINK_NUM)
shrink();
return true;
}
return false;
}
Val get(in Key key, lazy Val val)
{
auto p = opIn_r(key);
return p is null ? val : *p;
}
ref Val getOrSet(in Key key, lazy Val val)
{
// lazily alloc implementation
if (impl is null)
impl = new Impl(INIT_NUM_BUCKETS);
// get hash and bucket for key
immutable hash = calcHash(key);
// found a value => assignment
if (auto p = impl.findSlotLookup(hash, key))
return p.entry.val;
auto p = findSlotInsert(hash);
if (p.deleted)
--deleted;
// check load factor and possibly grow
else if (++used * GROW_DEN > dim * GROW_NUM)
{
grow();
p = findSlotInsert(hash);
assert(p.empty);
}
// update search cache and allocate entry
firstUsed = min(firstUsed, cast(uint)(p - buckets.ptr));
p.hash = hash;
p.entry = new Impl.Entry(key, val);
return p.entry.val;
}
/**
Convert the AA to the type of the builtin language AA.
*/
Val[Key] toBuiltinAA() pure nothrow
{
return cast(Val[Key]) _aaFromCoreAA(impl, rtInterface);
}
private:
private this(inout(Impl)* impl) inout
{
this.impl = impl;
}
ref Val getLValue(in Key key)
{
// lazily alloc implementation
if (impl is null)
impl = new Impl(INIT_NUM_BUCKETS);
// get hash and bucket for key
immutable hash = calcHash(key);
// found a value => assignment
if (auto p = impl.findSlotLookup(hash, key))
return p.entry.val;
auto p = findSlotInsert(hash);
if (p.deleted)
--deleted;
// check load factor and possibly grow
else if (++used * GROW_DEN > dim * GROW_NUM)
{
grow();
p = findSlotInsert(hash);
assert(p.empty);
}
// update search cache and allocate entry
firstUsed = min(firstUsed, cast(uint)(p - buckets.ptr));
p.hash = hash;
p.entry = new Impl.Entry(key); // TODO: move
return p.entry.val;
}
static struct Impl
{
this(size_t sz)
{
buckets = allocBuckets(sz);
}
@property size_t length() const pure nothrow @nogc
{
assert(used >= deleted);
return used - deleted;
}
@property size_t dim() const pure nothrow @nogc
{
return buckets.length;
}
@property size_t mask() const pure nothrow @nogc
{
return dim - 1;
}
// find the first slot to insert a value with hash
inout(Bucket)* findSlotInsert(size_t hash) inout pure nothrow @nogc
{
for (size_t i = hash & mask, j = 1;; ++j)
{
if (!buckets[i].filled)
return &buckets[i];
i = (i + j) & mask;
}
}
// lookup a key
inout(Bucket)* findSlotLookup(size_t hash, in Key key) inout
{
for (size_t i = hash & mask, j = 1;; ++j)
{
if (buckets[i].hash == hash && key == buckets[i].entry.key)
return &buckets[i];
else if (buckets[i].empty)
return null;
i = (i + j) & mask;
}
}
void grow()
{
// If there are so many deleted entries, that growing would push us
// below the shrink threshold, we just purge deleted entries instead.
if (length * SHRINK_DEN < GROW_FAC * dim * SHRINK_NUM)
resize(dim);
else
resize(GROW_FAC * dim);
}
void shrink()
{
if (dim > INIT_NUM_BUCKETS)
resize(dim / GROW_FAC);
}
void resize(size_t ndim) pure nothrow
{
auto obuckets = buckets;
buckets = allocBuckets(ndim);
foreach (ref b; obuckets)
if (b.filled)
*findSlotInsert(b.hash) = b;
firstUsed = 0;
used -= deleted;
deleted = 0;
GC.free(obuckets.ptr); // safe to free b/c impossible to reference
}
static struct Entry
{
Key key;
Val val;
}
static struct Bucket
{
size_t hash;
Entry* entry;
@property bool empty() const
{
return hash == HASH_EMPTY;
}
@property bool deleted() const
{
return hash == HASH_DELETED;
}
@property bool filled() const
{
return cast(ptrdiff_t) hash < 0;
}
}
Bucket[] allocBuckets(size_t dim) @trusted pure nothrow
{
enum attr = GC.BlkAttr.NO_INTERIOR;
immutable sz = dim * Bucket.sizeof;
return (cast(Bucket*) GC.calloc(sz, attr))[0 .. dim];
}
Bucket[] buckets;
uint used;
uint deleted;
uint firstUsed;
}
RTInterface* rtInterface()() pure nothrow @nogc
{
static size_t aaLen(in void* pimpl) pure nothrow @nogc
{
auto aa = const(AA)(cast(const(Impl)*) pimpl);
return aa.length;
}
static void* aaGetY(void** pimpl, in void* pkey)
{
auto aa = AA(cast(Impl*)*pimpl);
auto res = &aa.getLValue(*cast(const(Key*)) pkey);
*pimpl = aa.impl; // might have changed
return res;
}
static inout(void)* aaInX(inout void* pimpl, in void* pkey)
{
auto aa = inout(AA)(cast(inout(Impl)*) pimpl);
return aa.opIn_r(*cast(const(Key*)) pkey);
}
static bool aaDelX(void* pimpl, in void* pkey)
{
auto aa = AA(cast(Impl*) pimpl);
return aa.remove(*cast(const(Key*)) pkey);
}
static immutable vtbl = RTInterface(&aaLen, &aaGetY, &aaInX, &aaDelX);
return cast(RTInterface*)&vtbl;
}
static size_t calcHash(in ref Key key)
{
return hashOf(key) | HASH_FILLED_MARK;
}
Impl* impl;
alias impl this;
}
package extern (C) void* _aaFromCoreAA(void* impl, RTInterface* rtIntf) pure nothrow;
private:
struct RTInterface
{
alias AA = void*;
size_t function(in AA aa) pure nothrow @nogc len;
void* function(AA* aa, in void* pkey) getY;
inout(void)* function(inout AA aa, in void* pkey) inX;
bool function(AA aa, in void* pkey) delX;
}
unittest
{
AA!(int, int) aa;
assert(aa.length == 0);
aa[0] = 1;
assert(aa.length == 1 && aa[0] == 1);
aa[1] = 2;
assert(aa.length == 2 && aa[1] == 2);
import core.stdc.stdio;
int[int] rtaa = aa.toBuiltinAA();
assert(rtaa.length == 2);
puts("length");
assert(rtaa[0] == 1);
assert(rtaa[1] == 2);
rtaa[2] = 3;
assert(aa[2] == 3);
}
unittest
{
auto aa = AA!(int, int)(3);
aa[0] = 0;
aa[1] = 1;
aa[2] = 2;
assert(aa.length == 3);
}
//==============================================================================
// Helper functions
//------------------------------------------------------------------------------
size_t nextpow2(in size_t n) pure nothrow @nogc
{
import core.bitop : bsr;
if (n < 2)
return 1;
return size_t(1) << bsr(n - 1) + 1;
}
pure nothrow @nogc unittest
{
// 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
foreach (const n, const pow2; [1, 1, 2, 4, 4, 8, 8, 8, 8, 16])
assert(nextpow2(n) == pow2);
}
T min(T)(T a, T b) pure nothrow @nogc
{
return a < b ? a : b;
}
T max(T)(T a, T b) pure nothrow @nogc
{
return b < a ? a : b;
}

187
samples/D/arrayops.d Normal file

@@ -0,0 +1,187 @@
/**
* Benchmark for array ops.
*
* Copyright: Copyright Martin Nowak 2016 -.
* License: $(LINK2 http://www.boost.org/LICENSE_1_0.txt, Boost License 1.0)
* Authors: Martin Nowak
*/
import core.cpuid, std.algorithm, std.datetime, std.meta, std.stdio, std.string,
std.range;
float[6] getLatencies(T, string op)()
{
enum N = (64 * (1 << 6) + 64) * T.sizeof;
auto a = Array!T(N), b = Array!T(N), c = Array!T(N);
float[6] latencies = float.max;
foreach (i, ref latency; latencies)
{
auto len = 1 << i;
foreach (_; 1 .. 32)
{
a[] = 24;
b[] = 4;
c[] = 2;
auto sw = StopWatch(AutoStart.yes);
foreach (off; size_t(0) .. size_t(64))
{
off = off * len + off;
enum op = op.replace("const", "2").replace("a",
"a[off .. off + len]").replace("b",
"b[off .. off + len]").replace("c", "c[off .. off + len]");
mixin(op ~ ";");
}
latency = min(latency, sw.peek.nsecs);
}
}
float[6] res = latencies[] / 1024;
return res;
}
float[4] getThroughput(T, string op)()
{
enum N = (40 * 1024 * 1024 + 64 * T.sizeof) / T.sizeof;
auto a = Array!T(N), b = Array!T(N), c = Array!T(N);
float[4] latencies = float.max;
size_t[4] lengths = [
8 * 1024 / T.sizeof, 32 * 1024 / T.sizeof, 512 * 1024 / T.sizeof, 32 * 1024 * 1024 / T
.sizeof
];
foreach (i, ref latency; latencies)
{
auto len = lengths[i] / 64;
foreach (_; 1 .. 4)
{
a[] = 24;
b[] = 4;
c[] = 2;
auto sw = StopWatch(AutoStart.yes);
foreach (off; size_t(0) .. size_t(64))
{
off = off * len + off;
enum op = op.replace("const", "2").replace("a",
"a[off .. off + len]").replace("b",
"b[off .. off + len]").replace("c", "c[off .. off + len]");
mixin(op ~ ";");
}
immutable nsecs = sw.peek.nsecs;
runMasked({latency = min(latency, nsecs);});
}
}
float[4] throughputs = void;
runMasked({throughputs = T.sizeof * lengths[] / latencies[];});
return throughputs;
}
string[] genOps()
{
string[] ops;
foreach (op1; ["+", "-", "*", "/"])
{
ops ~= "a " ~ op1 ~ "= b";
ops ~= "a " ~ op1 ~ "= const";
foreach (op2; ["+", "-", "*", "/"])
{
ops ~= "a " ~ op1 ~ "= b " ~ op2 ~ " c";
ops ~= "a " ~ op1 ~ "= b " ~ op2 ~ " const";
}
}
return ops;
}
void runOp(string op)()
{
foreach (T; AliasSeq!(ubyte, ushort, uint, ulong, byte, short, int, long, float,
double))
writefln("%s, %s, %(%.2f, %), %(%s, %)", T.stringof, op,
getLatencies!(T, op), getThroughput!(T, op));
}
struct Array(T)
{
import core.stdc.stdlib : free, malloc;
this(size_t n)
{
ary = (cast(T*) malloc(T.sizeof * n))[0 .. n];
}
~this()
{
free(ary.ptr);
}
T[] ary;
alias ary this;
}
version (X86)
version = SSE;
else version (X86_64)
version = SSE;
else
static assert(0, "unimplemented");
version (SSE)
{
uint mxcsr()
{
uint ret = void;
asm
{
stmxcsr ret;
}
return ret;
}
void mxcsr(uint val)
{
asm
{
ldmxcsr val;
}
}
// http://softpixel.com/~cwright/programming/simd/sse.php
enum FPU_EXCEPTION_MASKS = 1 << 12 | 1 << 11 | 1 << 10 | 1 << 9 | 1 << 8 | 1 << 7;
enum FPU_EXCEPTION_FLAGS = 1 << 5 | 1 << 4 | 1 << 3 | 1 << 2 | 1 << 1 | 1 << 0;
void maskFPUExceptions()
{
mxcsr = mxcsr | FPU_EXCEPTION_MASKS;
}
void unmaskFPUExceptions()
{
mxcsr = mxcsr & ~FPU_EXCEPTION_MASKS;
}
uint FPUExceptionFlags()
{
return mxcsr & FPU_EXCEPTION_FLAGS;
}
void clearFPUExceptionFlags()
{
mxcsr = mxcsr & ~FPU_EXCEPTION_FLAGS;
}
}
void runMasked(scope void delegate() dg)
{
assert(FPUExceptionFlags == 0);
maskFPUExceptions;
dg();
clearFPUExceptionFlags;
unmaskFPUExceptions;
}
void main()
{
unmaskFPUExceptions;
writefln("type, op, %(latency%s, %), %-(throughput%s, %)", iota(6)
.map!(i => 1 << i), ["8KB", "32KB", "512KB", "32MB"]);
foreach (op; mixin("AliasSeq!(%(%s, %))".format(genOps)))
runOp!op;
maskFPUExceptions;
}

3
samples/D/function.d Normal file

@@ -0,0 +1,3 @@
void foo()
{
}

6
samples/D/hello_world.d Normal file

@@ -0,0 +1,6 @@
import std.stdio;
void main()
{
writeln("Hello World");
}

7
samples/D/template.d Normal file

@@ -0,0 +1,7 @@
template Fib(size_t N)
{
static if (N < 2)
enum Fib = size_t(1);
else
enum Fib = Fib!(N - 2) + Fib!(N - 1);
}


@@ -0,0 +1,3 @@
void bar(T)(T t)
{
}

3
samples/D/unittest1.d Normal file

@@ -0,0 +1,3 @@
unittest
{
}

3
samples/D/unittest2.d Normal file

@@ -0,0 +1,3 @@
unittest("optional name")
{
}


@@ -0,0 +1,20 @@
# not really (there's an EB_bzip2 easyblock), but fine for use in unit tests
easyblock = 'ConfigureMake'
name = 'bzip2'
version = '1.0.6'
homepage = 'http://www.bzip.org/'
description = """bzip2 is a freely available, patent free, high-quality data compressor. It typically
compresses files to within 10% to 15% of the best available techniques (the PPM family of statistical
compressors), whilst being around twice as fast at compression and six times faster at decompression."""
toolchain = {'name': 'GCC', 'version': '4.9.2'}
toolchainopts = {'pic': True}
sources = [SOURCE_TAR_GZ]
source_urls = ['http://www.bzip.org/%(version)s']
builddependencies = [('gzip', '1.6')]
moduleclass = 'tools'


@@ -0,0 +1,7 @@
{"src/*", [
report,
verbose,
{i, "include"},
{outdir, "ebin"},
debug_info
]}.


@@ -0,0 +1,97 @@
/*
* Author: Robert Koeninger
* License: WTFPL (http://www.wtfpl.net/)
*/
class Spelling {
** Load sample text and offer corrections for input
static Void main(Str[] args) {
text := File.os("big.txt").readAllStr
counts := Str:Int[:] { def = 0 }
text.split.each |word| { counts[word] += 1 }
args.each |arg| { echo(correction(counts, arg)) }
}
static const Range letters := Range.makeInclusive(97, 122)
** Most probable spelling correction for `word`.
static Str correction(Str:Int counts, Str word) {
candidates(counts, word).max |x, y| { counts[x] <=> counts[y] }
}
** Generate possible spelling corrections for `word`.
static Str[] candidates(Str:Int counts, Str word) {
result := known(counts, Str[word])
if (result.size > 0) return result
result = known(counts, edits1(word))
if (result.size > 0) return result
result = known(counts, edits2(word))
if (result.size > 0) return result
return Str[word]
}
** The subset of `words` that appear in the map of `counts`.
static Str[] known(Str:Int counts, Str[] words) {
words.findAll |word, i| { counts[word] > 0 }.unique
}
** All edits that are one edit away from `word`.
static Str[] edits1(Str word) {
edits := Str[,]
for (i := 0; i < word.size; ++i) {
edits.add(delete(word, i))
if (i < word.size - 1) {
edits.add(transpose(word, i))
}
edits.addAll(replace(word, i))
edits.addAll(insert(word, i))
}
edits = edits.unique
edits.remove(word)
return edits
}
** Word with `i`th letter removed.
static Str delete(Str word, Int i) {
left := word.getRange(Range.makeExclusive(0, i))
right := word.getRange(Range.makeExclusive(i + 1, word.size))
return left + right
}
** Word with `i`th and `i+1`st letter swapped.
static Str transpose(Str word, Int i) {
left := word.getRange(Range.makeExclusive(0, i))
right := word.getRange(Range.makeExclusive(i, word.size))
first := right.get(0).toChar
second := right.get(1).toChar
rest := right.getRange(Range.makeExclusive(2, right.size))
return left + second + first + rest
}
** Word with `i`th letter replaced with every other letter.
static Str[] replace(Str word, Int i) {
left := word.getRange(Range.makeExclusive(0, i))
right := word.getRange(Range.makeExclusive(i + 1, word.size))
return letters.map |ch| { left + ch.toChar + right }
}
** Word with each letter inserted at `i`.
static Str[] insert(Str word, Int i) {
left := word.getRange(Range.makeExclusive(0, i))
right := word.getRange(Range.makeExclusive(i, word.size))
return letters.map |ch| { left + ch.toChar + right }
}
** All edits that are two edits away from `word`.
static Str[] edits2(Str word) {
(Str[])(edits1(word).map |w| { edits1(w) }.flatten)
}
}


@@ -0,0 +1,50 @@
/*
* Author: Robert Koeninger
* License: WTFPL (http://www.wtfpl.net/)
*/
mixin Expr
{
abstract Obj? eval()
}
class Constant : Expr
{
Obj? value
new make(Obj? value) { this.value = value }
override Obj? eval() { value }
}
enum class Op
{
plus,
minus
}
class Infix : Expr
{
Op op
Expr left
Expr right
new make(Op op, Expr left, Expr right)
{
this.op = op
this.left = left
this.right = right
}
override Obj? eval()
{
switch (op)
{
case Op.plus:
return (Int)left.eval() + (Int)right.eval()
case Op.minus:
return (Int)left.eval() - (Int)right.eval()
default:
throw Err("undefined Op")
}
}
}

161
samples/GLSL/SyLens.shader Normal file

@@ -0,0 +1,161 @@
#version 120
/*
Original Lens Distortion Algorithm from SSontech (Syntheyes)
http://www.ssontech.com/content/lensalg.htm
r2 is radius squared.
r2 = image_aspect*image_aspect*u*u + v*v
f = 1 + r2*(k + kcube*sqrt(r2))
u' = f*u
v' = f*v
*/
// Controls
uniform float kCoeff, kCube, uShift, vShift;
uniform float chroma_red, chroma_green, chroma_blue;
uniform bool apply_disto;
// Uniform inputs
uniform sampler2D input1;
uniform float adsk_input1_w, adsk_input1_h, adsk_input1_aspect, adsk_input1_frameratio;
uniform float adsk_result_w, adsk_result_h;
float distortion_f(float r) {
float f = 1 + (r*r)*(kCoeff + kCube * r);
return f;
}
float inverse_f(float r)
{
// Build a lookup table on the radius, as a fixed-size table.
// We will use a vec3 since we will store the multiplied number in the z coordinate.
// So to recap: x will be the radius, y will be the f(x) distortion, and z will be x * y.
vec3[48] lut;
// Since our LUT is shader-global, check whether it has already been computed.
// Flame has no overflow bbox so we can safely max out at the image edge, plus some cushion
float max_r = sqrt((adsk_input1_frameratio * adsk_input1_frameratio) + 1) + 0.1;
float incr = max_r / 48;
float lut_r = 0;
float f;
for(int i=0; i < 48; i++) {
f = distortion_f(lut_r);
lut[i] = vec3(lut_r, f, lut_r * f);
lut_r += incr;
}
float t;
// Now find the neighbouring elements
// only iterate to 46 since we will need
// 47 as i+1
for(int i=0; i < 47; i++) {
if(lut[i].z < r && lut[i+1].z > r) {
// BAM! our value is between these two segments
// get the T interpolant and mix
t = (r - lut[i].z) / (lut[i+1].z - lut[i].z);
return mix(lut[i].y, lut[i+1].y, t );
}
}
}
float aberrate(float f, float chroma)
{
return f + (f * chroma);
}
vec3 chromaticize_and_invert(float f)
{
vec3 rgb_f = vec3(aberrate(f, chroma_red), aberrate(f, chroma_green), aberrate(f, chroma_blue));
// We need to DIVIDE by F when we redistort, and x / y == x * (1 / y)
if(apply_disto) {
rgb_f = 1 / rgb_f;
}
return rgb_f;
}
void main(void)
{
vec2 px, uv;
float f = 1;
float r = 1;
px = gl_FragCoord.xy;
// Make sure we are still centered
px.x -= (adsk_result_w - adsk_input1_w) / 2;
px.y -= (adsk_result_h - adsk_input1_h) / 2;
// Push the destination coordinates into the [0..1] range
uv.x = px.x / adsk_input1_w;
uv.y = px.y / adsk_input1_h;
// And to Syntheyes UV which are [1..-1] on both X and Y
uv.x = (uv.x *2 ) - 1;
uv.y = (uv.y *2 ) - 1;
// Add UV shifts
uv.x += uShift;
uv.y += vShift;
// Make the X value the aspect value, so that the X coordinates go to [-aspect..aspect]
uv.x = uv.x * adsk_input1_frameratio;
// Compute the radius
r = sqrt(uv.x*uv.x + uv.y*uv.y);
// If we are redistorting, account for the oversize plate in the input, assume that
// the input aspect is the same
if(apply_disto) {
r = r / (float(adsk_input1_w) / float(adsk_result_w));
}
// Apply or remove disto, per channel honoring chromatic aberration
if(apply_disto) {
f = inverse_f(r);
} else {
f = distortion_f(r);
}
vec2[3] rgb_uvs = vec2[](uv, uv, uv);
// Compute distortions per component
vec3 rgb_f = chromaticize_and_invert(f);
// Apply the disto coefficients, per component
rgb_uvs[0] = rgb_uvs[0] * rgb_f.rr;
rgb_uvs[1] = rgb_uvs[1] * rgb_f.gg;
rgb_uvs[2] = rgb_uvs[2] * rgb_f.bb;
// Convert all the UVs back to the texture space, per color component
for(int i=0; i < 3; i++) {
uv = rgb_uvs[i];
// Back from [-aspect..aspect] to [-1..1]
uv.x = uv.x / adsk_input1_frameratio;
// Remove UV shifts
uv.x -= uShift;
uv.y -= vShift;
// Back to OGL UV
uv.x = (uv.x + 1) / 2;
uv.y = (uv.y + 1) / 2;
rgb_uvs[i] = uv;
}
// Sample the input plate, per component
vec4 sampled;
sampled.r = texture2D(input1, rgb_uvs[0]).r;
sampled.g = texture2D(input1, rgb_uvs[1]).g;
sampled.b = texture2D(input1, rgb_uvs[2]).b;
// and assign to the output
gl_FragColor.rgba = vec4(sampled.rgb, 1.0 );
}


@@ -0,0 +1,630 @@
//// High quality (Some browsers may freeze or crash)
//#define HIGHQUALITY
//// Medium quality (Should be fine on all systems, works on Intel HD2000 on Win7 but quite slow)
//#define MEDIUMQUALITY
//// Defaults
//#define REFLECTIONS
#define SHADOWS
//#define GRASS
//#define SMALL_WAVES
#define RAGGED_LEAVES
//#define DETAILED_NOISE
//#define LIGHT_AA // 2 sample SSAA
//#define HEAVY_AA // 2x2 RG SSAA
//#define TONEMAP
//// Configurations
#ifdef MEDIUMQUALITY
#define SHADOWS
#define SMALL_WAVES
#define RAGGED_LEAVES
#define TONEMAP
#endif
#ifdef HIGHQUALITY
#define REFLECTIONS
#define SHADOWS
//#define GRASS
#define SMALL_WAVES
#define RAGGED_LEAVES
#define DETAILED_NOISE
#define LIGHT_AA
#define TONEMAP
#endif
// Constants
const float eps = 1e-5;
const float PI = 3.14159265359;
const vec3 sunDir = vec3(0.79057,-0.47434, 0.0);
const vec3 skyCol = vec3(0.3, 0.5, 0.8);
const vec3 sandCol = vec3(0.9, 0.8, 0.5);
const vec3 treeCol = vec3(0.8, 0.65, 0.3);
const vec3 grassCol = vec3(0.4, 0.5, 0.18);
const vec3 leavesCol = vec3(0.3, 0.6, 0.2);
const vec3 leavesPos = vec3(-5.1,13.4, 0.0);
#ifdef TONEMAP
const vec3 sunCol = vec3(1.8, 1.7, 1.6);
#else
const vec3 sunCol = vec3(0.9, 0.85, 0.8);
#endif
const float exposure = 1.1; // Only used when tonemapping
// Description : Array and textureless GLSL 2D/3D/4D simplex
// noise functions.
// Author : Ian McEwan, Ashima Arts.
// License : Copyright (C) 2011 Ashima Arts. All rights reserved.
// Distributed under the MIT License. See LICENSE file.
// https://github.com/ashima/webgl-noise
vec3 mod289(vec3 x) {
return x - floor(x * (1.0 / 289.0)) * 289.0;
}
vec4 mod289(vec4 x) {
return x - floor(x * (1.0 / 289.0)) * 289.0;
}
vec4 permute(vec4 x) {
return mod289(((x*34.0)+1.0)*x);
}
vec4 taylorInvSqrt(vec4 r) {
return 1.79284291400159 - 0.85373472095314 * r;
}
float snoise(vec3 v) {
const vec2 C = vec2(1.0/6.0, 1.0/3.0) ;
const vec4 D = vec4(0.0, 0.5, 1.0, 2.0);
// First corner
vec3 i = floor(v + dot(v, C.yyy) );
vec3 x0 = v - i + dot(i, C.xxx) ;
// Other corners
vec3 g = step(x0.yzx, x0.xyz);
vec3 l = 1.0 - g;
vec3 i1 = min( g.xyz, l.zxy );
vec3 i2 = max( g.xyz, l.zxy );
// x0 = x0 - 0.0 + 0.0 * C.xxx;
// x1 = x0 - i1 + 1.0 * C.xxx;
// x2 = x0 - i2 + 2.0 * C.xxx;
// x3 = x0 - 1.0 + 3.0 * C.xxx;
vec3 x1 = x0 - i1 + C.xxx;
vec3 x2 = x0 - i2 + C.yyy; // 2.0*C.x = 1/3 = C.y
vec3 x3 = x0 - D.yyy; // -1.0+3.0*C.x = -0.5 = -D.y
// Permutations
i = mod289(i);
vec4 p = permute( permute( permute(
i.z + vec4(0.0, i1.z, i2.z, 1.0 ))
+ i.y + vec4(0.0, i1.y, i2.y, 1.0 ))
+ i.x + vec4(0.0, i1.x, i2.x, 1.0 ));
// Gradients: 7x7 points over a square, mapped onto an octahedron.
// The ring size 17*17 = 289 is close to a multiple of 49 (49*6 = 294)
float n_ = 0.142857142857; // 1.0/7.0
vec3 ns = n_ * D.wyz - D.xzx;
vec4 j = p - 49.0 * floor(p * ns.z * ns.z); // mod(p,7*7)
vec4 x_ = floor(j * ns.z);
vec4 y_ = floor(j - 7.0 * x_ ); // mod(j,N)
vec4 x = x_ *ns.x + ns.yyyy;
vec4 y = y_ *ns.x + ns.yyyy;
vec4 h = 1.0 - abs(x) - abs(y);
vec4 b0 = vec4( x.xy, y.xy );
vec4 b1 = vec4( x.zw, y.zw );
//vec4 s0 = vec4(lessThan(b0,0.0))*2.0 - 1.0;
//vec4 s1 = vec4(lessThan(b1,0.0))*2.0 - 1.0;
vec4 s0 = floor(b0)*2.0 + 1.0;
vec4 s1 = floor(b1)*2.0 + 1.0;
vec4 sh = -step(h, vec4(0.0));
vec4 a0 = b0.xzyw + s0.xzyw*sh.xxyy ;
vec4 a1 = b1.xzyw + s1.xzyw*sh.zzww ;
vec3 p0 = vec3(a0.xy,h.x);
vec3 p1 = vec3(a0.zw,h.y);
vec3 p2 = vec3(a1.xy,h.z);
vec3 p3 = vec3(a1.zw,h.w);
// Normalise gradients
vec4 norm = taylorInvSqrt(vec4(dot(p0,p0), dot(p1,p1), dot(p2, p2), dot(p3,p3)));
p0 *= norm.x;
p1 *= norm.y;
p2 *= norm.z;
p3 *= norm.w;
// Mix final noise value
vec4 m = max(0.6 - vec4(dot(x0,x0), dot(x1,x1), dot(x2,x2), dot(x3,x3)), 0.0);
m = m * m;
return 42.0 * dot( m*m, vec4( dot(p0,x0), dot(p1,x1),
dot(p2,x2), dot(p3,x3) ) );
}
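The mod289/permute helpers above implement a cheap polynomial hash; a Python sketch of the same arithmetic confirms the permutation stays inside the 0..289 ring:

```python
import math

def mod289(x):
    # Same as the GLSL helper: x - floor(x * (1/289)) * 289
    return x - math.floor(x * (1.0 / 289.0)) * 289.0

def permute(x):
    # Polynomial hash used by snoise: mod289(((x*34)+1)*x)
    return mod289(((x * 34.0) + 1.0) * x)

vals = [permute(float(i)) for i in range(289)]
```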
// Main
float fbm(vec3 p)
{
float final = snoise(p);
p *= 1.94; final += snoise(p) * 0.5;
#ifdef DETAILED_NOISE
p *= 3.75; final += snoise(p) * 0.25;
return final / 1.75;
#else
return final / 1.5;
#endif
}
float waterHeight(vec3 p)
{
float d = length(p.xz);
float h = sin(d * 1.5 + iGlobalTime * 3.0) * 12.0 / d; // Island waves
#ifdef SMALL_WAVES
h += fbm(p*0.5); // Other waves
#endif
return h;
}
vec3 bump(vec3 pos, vec3 rayDir)
{
float s = 2.0;
// Fade out waves to reduce aliasing
float dist = dot(pos, rayDir);
s *= dist < 2.0 ? 1.0 : 1.4142 / sqrt(dist);
// Calculate normal from heightmap
vec2 e = vec2(1e-2, 0.0);
vec3 p = vec3(pos.x, iGlobalTime*0.5, pos.z)*0.7;
float m = waterHeight(p)*s;
return normalize(vec3(
waterHeight(p+e.xyy)*s-m,
1.0,
waterHeight(p+e.yxy)*s-m
));
}
// Ray intersections
vec4 intersectSphere(vec3 rpos, vec3 rdir, vec3 pos, float rad)
{
vec3 op = pos - rpos;
float b = dot(op, rdir);
float det = b*b - dot(op, op) + rad*rad;
if (det > 0.0)
{
det = sqrt(det);
float t = b - det;
if (t > eps)
return vec4(-normalize(rpos+rdir*t-pos), t);
}
return vec4(0.0);
}
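intersectSphere solves the usual ray-sphere quadratic and keeps only the nearest positive root; the same math in a Python sketch, with an example ray chosen for illustration:

```python
import math

def intersect_sphere(rpos, rdir, center, rad):
    # Same quadratic as intersectSphere above: |rpos + t*rdir - center| = rad,
    # returning only the nearest root beyond the epsilon.
    op = [c - p for c, p in zip(center, rpos)]
    b = sum(o * d for o, d in zip(op, rdir))
    det = b * b - sum(o * o for o in op) + rad * rad
    if det > 0.0:
        t = b - math.sqrt(det)
        if t > 1e-5:
            return t
    return 0.0

# A ray along +Z from the origin hits a unit sphere centered at z=5 at t=4
t = intersect_sphere((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 1.0)
```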
vec4 intersectCylinder(vec3 rpos, vec3 rdir, vec3 pos, float rad)
{
vec3 op = pos - rpos;
vec2 rdir2 = normalize(rdir.yz);
float b = dot(op.yz, rdir2);
float det = b*b - dot(op.yz, op.yz) + rad*rad;
if (det > 0.0)
{
det = sqrt(det);
float t = b - det;
if (t > eps)
return vec4(-normalize(rpos.yz+rdir2*t-pos.yz), 0.0, t);
t = b + det;
if (t > eps)
return vec4(-normalize(rpos.yz+rdir2*t-pos.yz), 0.0, t);
}
return vec4(0.0);
}
vec4 intersectPlane(vec3 rayPos, vec3 rayDir, vec3 n, float d)
{
float t = -(dot(rayPos, n) + d) / dot(rayDir, n);
return vec4(n * sign(dot(rayDir, n)), t);
}
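intersectPlane is the standard ray-plane solve; a Python sketch of the same formula with an example ray:

```python
def intersect_plane(ray_pos, ray_dir, n, d):
    # Same solve as intersectPlane above: dot(ray_pos + t*ray_dir, n) + d = 0
    num = sum(p * c for p, c in zip(ray_pos, n)) + d
    den = sum(r * c for r, c in zip(ray_dir, n))
    return -num / den

# A ray at height 3 pointing straight down reaches the plane y = 0 at t = 3
t = intersect_plane((0.0, 3.0, 0.0), (0.0, -1.0, 0.0), (0.0, 1.0, 0.0), 0.0)
```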
// Helper functions
vec3 rotate(vec3 p, float theta)
{
float c = cos(theta), s = sin(theta);
return vec3(p.x * c + p.z * s, p.y,
p.z * c - p.x * s);
}
float impulse(float k, float x) // by iq
{
float h = k*x;
return h * exp(1.0 - h);
}
// Raymarched parts of scene
float grass(vec3 pos)
{
float h = length(pos - vec3(0.0, -7.0, 0.0)) - 8.0;
if (h > 2.0) return h; // Optimization (Avoid noise if too far away)
return h + snoise(pos * 3.0) * 0.1 + pos.y * 0.9;
}
float tree(vec3 pos)
{
pos.y -= 0.5;
float s = sin(pos.y*0.03);
float c = cos(pos.y*0.03);
mat2 m = mat2(c, -s, s, c);
vec3 p = vec3(m*pos.xy, pos.z);
float width = 1.0 - pos.y * 0.02 - clamp(sin(pos.y * 8.0) * 0.1, 0.05, 0.1);
return max(length(p.xz) - width, pos.y - 12.5);
}
vec2 scene(vec3 pos)
{
float vtree = tree(pos);
#ifdef GRASS
float vgrass = grass(pos);
float v = min(vtree, vgrass);
#else
float v = vtree;
#endif
return vec2(v, v == vtree ? 2.0 : 1.0);
}
vec3 normal(vec3 pos)
{
vec2 eps = vec2(1e-3, 0.0);
float h = scene(pos).x;
return normalize(vec3(
scene(pos-eps.xyy).x-h,
scene(pos-eps.yxy).x-h,
scene(pos-eps.yyx).x-h
));
}
float plantsShadow(vec3 rayPos, vec3 rayDir)
{
// Soft shadow taken from iq
float k = 6.0;
float t = 0.0;
float s = 1.0;
for (int i = 0; i < 30; i++)
{
vec3 pos = rayPos+rayDir*t;
vec2 res = scene(pos);
if (res.x < 0.001) return 0.0;
s = min(s, k*res.x/t);
t += max(res.x, 0.01);
}
return s*s*(3.0 - 2.0*s);
}
// Ray-traced parts of scene
vec4 intersectWater(vec3 rayPos, vec3 rayDir)
{
float h = sin(20.5 + iGlobalTime * 2.0) * 0.03;
float t = -(rayPos.y + 2.5 + h) / rayDir.y;
return vec4(0.0, 1.0, 0.0, t);
}
vec4 intersectSand(vec3 rayPos, vec3 rayDir)
{
return intersectSphere(rayPos, rayDir, vec3(0.0,-24.1,0.0), 24.1);
}
vec4 intersectTreasure(vec3 rayPos, vec3 rayDir)
{
return vec4(0.0);
}
vec4 intersectLeaf(vec3 rayPos, vec3 rayDir, float openAmount)
{
vec3 dir = normalize(vec3(0.0, 1.0, openAmount));
float offset = 0.0;
vec4 res = intersectPlane(rayPos, rayDir, dir, 0.0);
vec3 pos = rayPos+rayDir*res.w;
#ifdef RAGGED_LEAVES
offset = snoise(pos*0.8) * 0.3;
#endif
if (pos.y > 0.0 || length(pos * vec3(0.9, 2.0, 1.0)) > 4.0 - offset) res.w = 0.0;
vec4 res2 = intersectPlane(rayPos, rayDir, vec3(dir.xy, -dir.z), 0.0);
pos = rayPos+rayDir*res2.w;
#ifdef RAGGED_LEAVES
offset = snoise(pos*0.8) * 0.3;
#endif
if (pos.y > 0.0 || length(pos * vec3(0.9, 2.0, 1.0)) > 4.0 - offset) res2.w = 0.0;
if (res2.w > 0.0 && res2.w < res.w || res.w <= 0.0)
res = res2;
return res;
}
vec4 leaves(vec3 rayPos, vec3 rayDir)
{
float t = 1e20;
vec3 n = vec3(0.0);
rayPos -= leavesPos;
float sway = impulse(15.0, fract(iGlobalTime / PI * 0.125));
float upDownSway = sway * -sin(iGlobalTime) * 0.06;
float openAmount = sway * max(-cos(iGlobalTime) * 0.4, 0.0);
float angleOffset = -0.1;
for (float k = 0.0; k < 6.2; k += 0.75)
{
// Left-right
float alpha = k + (k - PI) * sway * 0.015;
vec3 p = rotate(rayPos, alpha);
vec3 d = rotate(rayDir, alpha);
// Up-down
angleOffset *= -1.0;
float theta = -0.4 +
angleOffset +
cos(k) * 0.35 +
upDownSway +
sin(iGlobalTime+k*10.0) * 0.03 * (sway + 0.2);
p = rotate(p.xzy, theta).xzy;
d = rotate(d.xzy, theta).xzy;
// Shift
p -= vec3(5.4, 0.0, 0.0);
// Intersect individual leaf
vec4 res = intersectLeaf(p, d, 1.0+openAmount);
if (res.w > 0.0 && res.w < t)
{
t = res.w;
n = res.xyz;
}
}
return vec4(n, t);
}
// Lighting
float shadow(vec3 rayPos, vec3 rayDir)
{
float s = 1.0;
// Intersect sand
//vec4 resSand = intersectSand(rayPos, rayDir);
//if (resSand.w > 0.0) return 0.0;
// Intersect plants
s = min(s, plantsShadow(rayPos, rayDir));
if (s < 0.0001) return 0.0;
// Intersect leaves
vec4 resLeaves = leaves(rayPos, rayDir);
if (resLeaves.w > 0.0 && resLeaves.w < 1e7) return 0.0;
return s;
}
vec3 light(vec3 p, vec3 n)
{
float s = 1.0;
#ifdef SHADOWS
s = shadow(p-sunDir*0.01, -sunDir);
#endif
vec3 col = sunCol * min(max(dot(n, sunDir), 0.0), s);
col += skyCol * (-n.y * 0.5 + 0.5) * 0.3;
return col;
}
vec3 lightLeaves(vec3 p, vec3 n)
{
float s = 1.0;
#ifdef SHADOWS
s = shadow(p-sunDir*0.01, -sunDir);
#endif
float ao = min(length(p - leavesPos) * 0.1, 1.0);
float ns = dot(n, sunDir);
float d = sqrt(max(ns, 0.0));
vec3 col = sunCol * min(d, s);
col += sunCol * max(-ns, 0.0) * vec3(0.3, 0.3, 0.1) * ao;
col += skyCol * (-n.y * 0.5 + 0.5) * 0.3 * ao;
return col;
}
vec3 sky(vec3 n)
{
return skyCol * (1.0 - n.y * 0.8);
}
// Ray-marching
vec4 plants(vec3 rayPos, vec3 rayDir)
{
float t = 0.0;
for (int i = 0; i < 40; i++)
{
vec3 pos = rayPos+rayDir*t;
vec2 res = scene(pos);
float h = res.x;
if (h < 0.001)
{
vec3 col = res.y == 2.0 ? treeCol : grassCol;
float uvFact = res.y == 2.0 ? 1.0 : 10.0;
vec3 n = normal(pos);
vec2 uv = vec2(n.x, pos.y * 0.5) * 0.2 * uvFact;
vec3 tex = texture2D(iChannel0, uv).rgb * 0.6 + 0.4;
float ao = min(length(pos - leavesPos) * 0.1, 1.0);
return vec4(col * light(pos, n) * ao * tex, t);
}
t += h;
}
return vec4(sky(rayDir), 1e8);
}
// Final combination
vec3 traceReflection(vec3 rayPos, vec3 rayDir)
{
vec3 col = vec3(0.0);
float t = 1e20;
// Intersect plants
vec4 resPlants = plants(rayPos, rayDir);
if (resPlants.w > 0.0 && resPlants.w < t)
{
t = resPlants.w;
col = resPlants.xyz;
}
// Intersect leaves
vec4 resLeaves = leaves(rayPos, rayDir);
if (resLeaves.w > 0.0 && resLeaves.w < t)
{
vec3 pos = rayPos + rayDir * resLeaves.w;
vec2 uv = (pos.xz - leavesPos.xz) * 0.3;
float tex = texture2D(iChannel0, uv).r * 0.6 + 0.5;
t = resLeaves.w;
col = leavesCol * lightLeaves(pos, resLeaves.xyz) * tex;
}
if (t > 1e7) return sky(rayDir);
return col;
}
vec3 trace(vec3 rayPos, vec3 rayDir)
{
vec3 col = vec3(0.0);
float t = 1e20;
// Intersect sand
vec4 resSand = intersectSand(rayPos, rayDir);
if (resSand.w > 0.0)
{
vec3 pos = rayPos + rayDir * resSand.w;
t = resSand.w;
col = sandCol * light(pos, resSand.xyz);
}
// Intersect treasure chest
vec4 resTreasure = intersectTreasure(rayPos, rayDir);
if (resTreasure.w > 0.0 && resTreasure.w < t)
{
vec3 pos = rayPos + rayDir * resTreasure.w;
t = resTreasure.w;
col = leavesCol * light(pos, resTreasure.xyz);
}
// Intersect leaves
vec4 resLeaves = leaves(rayPos, rayDir);
if (resLeaves.w > 0.0 && resLeaves.w < t)
{
vec3 pos = rayPos + rayDir * resLeaves.w;
vec2 uv = (pos.xz - leavesPos.xz) * 0.3;
float tex = texture2D(iChannel0, uv).r * 0.6 + 0.5;
t = resLeaves.w;
col = leavesCol * lightLeaves(pos, resLeaves.xyz) * tex;
}
// Intersect plants
vec4 resPlants = plants(rayPos, rayDir);
if (resPlants.w > 0.0 && resPlants.w < t)
{
t = resPlants.w;
col = resPlants.xyz;
}
// Intersect water
vec4 resWater = intersectWater(rayPos, rayDir);
if (resWater.w > 0.0 && resWater.w < t)
{
vec3 pos = rayPos + rayDir * resWater.w;
float dist = t - resWater.w;
vec3 n = bump(pos, rayDir);
float ct = -min(dot(n,rayDir), 0.0);
float fresnel = 0.9 - 0.9 * pow(1.0 - ct, 5.0);
vec3 trans = col * exp(-dist * vec3(1.0, 0.7, 0.4) * 3.0);
vec3 reflDir = normalize(reflect(rayDir, n));
vec3 refl = sky(reflDir);
#ifdef REFLECTIONS
if (dot(pos, rayDir) < -2.0)
refl = traceReflection(pos, reflDir).rgb;
#endif
t = resWater.w;
col = mix(refl, trans, fresnel);
}
if (t > 1e7) return sky(rayDir);
return col;
}
// Ray-generation
vec3 camera(vec2 px)
{
vec2 rd = (px / iResolution.yy - vec2(iResolution.x/iResolution.y*0.5-0.5, 0.0)) * 2.0 - 1.0;
float t = sin(iGlobalTime * 0.1) * 0.2;
vec3 rayDir = normalize(vec3(rd.x, rd.y, 1.0));
vec3 rayPos = vec3(0.0, 3.0, -18.0);
return trace(rayPos, rayDir);
}
void main(void)
{
#ifdef HEAVY_AA
vec3 col = camera(gl_FragCoord.xy+vec2(0.0,0.5))*0.25;
col += camera(gl_FragCoord.xy+vec2(0.25,0.0))*0.25;
col += camera(gl_FragCoord.xy+vec2(0.5,0.75))*0.25;
col += camera(gl_FragCoord.xy+vec2(0.75,0.25))*0.25;
#else
vec3 col = camera(gl_FragCoord.xy);
#ifdef LIGHT_AA
col = col * 0.5 + camera(gl_FragCoord.xy+vec2(0.5,0.5))*0.5;
#endif
#endif
#ifdef TONEMAP
// Optimized Haarm-Peter Duiker curve
vec3 x = max(vec3(0.0),col*exposure-0.004);
col = (x*(6.2*x+.5))/(x*(6.2*x+1.7)+0.06);
#else
col = pow(col, vec3(0.4545));
#endif
gl_FragColor = vec4(col, 1.0);
}


@@ -0,0 +1,98 @@
/**
* The MIT License (MIT)
*
* Copyright (c) 2016 Sascha Willems
*
* Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
*/
#version 450
#extension GL_ARB_separate_shader_objects : enable
#extension GL_ARB_shading_language_420pack : enable
// PN patch data
struct PnPatch
{
float b210;
float b120;
float b021;
float b012;
float b102;
float b201;
float b111;
float n110;
float n011;
float n101;
};
// tessellation levels
layout (binding = 0) uniform UBO
{
float tessLevel;
} ubo;
layout(vertices=3) out;
layout(location = 0) in vec3 inNormal[];
layout(location = 1) in vec2 inUV[];
layout(location = 0) out vec3 outNormal[3];
layout(location = 3) out vec2 outUV[3];
layout(location = 6) out PnPatch outPatch[3];
float wij(int i, int j)
{
return dot(gl_in[j].gl_Position.xyz - gl_in[i].gl_Position.xyz, inNormal[i]);
}
float vij(int i, int j)
{
vec3 Pj_minus_Pi = gl_in[j].gl_Position.xyz
- gl_in[i].gl_Position.xyz;
vec3 Ni_plus_Nj = inNormal[i]+inNormal[j];
return 2.0*dot(Pj_minus_Pi, Ni_plus_Nj)/dot(Pj_minus_Pi, Pj_minus_Pi);
}
void main()
{
// get data
gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
outNormal[gl_InvocationID] = inNormal[gl_InvocationID];
outUV[gl_InvocationID] = inUV[gl_InvocationID];
// set base
float P0 = gl_in[0].gl_Position[gl_InvocationID];
float P1 = gl_in[1].gl_Position[gl_InvocationID];
float P2 = gl_in[2].gl_Position[gl_InvocationID];
float N0 = inNormal[0][gl_InvocationID];
float N1 = inNormal[1][gl_InvocationID];
float N2 = inNormal[2][gl_InvocationID];
// compute control points
outPatch[gl_InvocationID].b210 = (2.0*P0 + P1 - wij(0,1)*N0)/3.0;
outPatch[gl_InvocationID].b120 = (2.0*P1 + P0 - wij(1,0)*N1)/3.0;
outPatch[gl_InvocationID].b021 = (2.0*P1 + P2 - wij(1,2)*N1)/3.0;
outPatch[gl_InvocationID].b012 = (2.0*P2 + P1 - wij(2,1)*N2)/3.0;
outPatch[gl_InvocationID].b102 = (2.0*P2 + P0 - wij(2,0)*N2)/3.0;
outPatch[gl_InvocationID].b201 = (2.0*P0 + P2 - wij(0,2)*N0)/3.0;
float E = ( outPatch[gl_InvocationID].b210
+ outPatch[gl_InvocationID].b120
+ outPatch[gl_InvocationID].b021
+ outPatch[gl_InvocationID].b012
+ outPatch[gl_InvocationID].b102
+ outPatch[gl_InvocationID].b201 ) / 6.0;
float V = (P0 + P1 + P2)/3.0;
outPatch[gl_InvocationID].b111 = E + (E - V)*0.5;
outPatch[gl_InvocationID].n110 = N0+N1-vij(0,1)*(P1-P0);
outPatch[gl_InvocationID].n011 = N1+N2-vij(1,2)*(P2-P1);
outPatch[gl_InvocationID].n101 = N2+N0-vij(2,0)*(P0-P2);
// set tess levels
gl_TessLevelOuter[gl_InvocationID] = ubo.tessLevel;
gl_TessLevelInner[0] = ubo.tessLevel;
}


@@ -0,0 +1,103 @@
/**
* The MIT License (MIT)
*
* Copyright (c) 2016 Sascha Willems
*
* Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
*/
#version 450
#extension GL_ARB_separate_shader_objects : enable
#extension GL_ARB_shading_language_420pack : enable
// PN patch data
struct PnPatch
{
float b210;
float b120;
float b021;
float b012;
float b102;
float b201;
float b111;
float n110;
float n011;
float n101;
};
layout (binding = 1) uniform UBO
{
mat4 projection;
mat4 model;
float tessAlpha;
} ubo;
layout(triangles, fractional_odd_spacing, ccw) in;
layout(location = 0) in vec3 iNormal[];
layout(location = 3) in vec2 iTexCoord[];
layout(location = 6) in PnPatch iPnPatch[];
layout(location = 0) out vec3 oNormal;
layout(location = 1) out vec2 oTexCoord;
#define uvw gl_TessCoord
void main()
{
vec3 uvwSquared = uvw * uvw;
vec3 uvwCubed = uvwSquared * uvw;
// extract control points
vec3 b210 = vec3(iPnPatch[0].b210, iPnPatch[1].b210, iPnPatch[2].b210);
vec3 b120 = vec3(iPnPatch[0].b120, iPnPatch[1].b120, iPnPatch[2].b120);
vec3 b021 = vec3(iPnPatch[0].b021, iPnPatch[1].b021, iPnPatch[2].b021);
vec3 b012 = vec3(iPnPatch[0].b012, iPnPatch[1].b012, iPnPatch[2].b012);
vec3 b102 = vec3(iPnPatch[0].b102, iPnPatch[1].b102, iPnPatch[2].b102);
vec3 b201 = vec3(iPnPatch[0].b201, iPnPatch[1].b201, iPnPatch[2].b201);
vec3 b111 = vec3(iPnPatch[0].b111, iPnPatch[1].b111, iPnPatch[2].b111);
// extract control normals
vec3 n110 = normalize(vec3(iPnPatch[0].n110, iPnPatch[1].n110, iPnPatch[2].n110));
vec3 n011 = normalize(vec3(iPnPatch[0].n011, iPnPatch[1].n011, iPnPatch[2].n011));
vec3 n101 = normalize(vec3(iPnPatch[0].n101, iPnPatch[1].n101, iPnPatch[2].n101));
// compute texcoords
oTexCoord = gl_TessCoord[2]*iTexCoord[0] + gl_TessCoord[0]*iTexCoord[1] + gl_TessCoord[1]*iTexCoord[2];
// Barycentric normal
vec3 barNormal = gl_TessCoord[2]*iNormal[0] + gl_TessCoord[0]*iNormal[1] + gl_TessCoord[1]*iNormal[2];
vec3 pnNormal = iNormal[0]*uvwSquared[2] + iNormal[1]*uvwSquared[0] + iNormal[2]*uvwSquared[1]
+ n110*uvw[2]*uvw[0] + n011*uvw[0]*uvw[1]+ n101*uvw[2]*uvw[1];
oNormal = ubo.tessAlpha*pnNormal + (1.0-ubo.tessAlpha) * barNormal;
// compute interpolated pos
vec3 barPos = gl_TessCoord[2]*gl_in[0].gl_Position.xyz
+ gl_TessCoord[0]*gl_in[1].gl_Position.xyz
+ gl_TessCoord[1]*gl_in[2].gl_Position.xyz;
// save some computations
uvwSquared *= 3.0;
// compute PN position
vec3 pnPos = gl_in[0].gl_Position.xyz*uvwCubed[2]
+ gl_in[1].gl_Position.xyz*uvwCubed[0]
+ gl_in[2].gl_Position.xyz*uvwCubed[1]
+ b210*uvwSquared[2]*uvw[0]
+ b120*uvwSquared[0]*uvw[2]
+ b201*uvwSquared[2]*uvw[1]
+ b021*uvwSquared[0]*uvw[1]
+ b102*uvwSquared[1]*uvw[2]
+ b012*uvwSquared[1]*uvw[0]
+ b111*6.0*uvw[0]*uvw[1]*uvw[2];
// final position and normal
vec3 finalPos = (1.0-ubo.tessAlpha)*barPos + ubo.tessAlpha*pnPos;
gl_Position = ubo.projection * ubo.model * vec4(finalPos,1.0);
}
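The PN position above is a cubic Bézier triangle evaluated in barycentric coordinates; a scalar Python sketch of the same weighting (corner and control values assumed) shows the surface interpolates the corners exactly:

```python
def pn_pos(uvw, P, b):
    # Scalar version of the cubic PN-triangle evaluation above.
    # P: the three corner positions; b: control values b210..b111.
    # Index mapping follows the shader: corner 0 is weighted by uvw[2], etc.
    u, v, w = uvw
    u2, v2, w2 = u * u, v * v, w * w
    return (P[0] * w2 * w + P[1] * u2 * u + P[2] * v2 * v
            + b['210'] * 3.0 * w2 * u + b['120'] * 3.0 * u2 * w
            + b['201'] * 3.0 * w2 * v + b['021'] * 3.0 * u2 * v
            + b['102'] * 3.0 * v2 * w + b['012'] * 3.0 * v2 * u
            + b['111'] * 6.0 * u * v * w)

ctrl = {k: 0.5 for k in ('210', '120', '201', '021', '102', '012', '111')}
# At a corner's tess coords every mixed term vanishes
p0 = pn_pos((0.0, 0.0, 1.0), (1.0, 2.0, 3.0), ctrl)
p1 = pn_pos((1.0, 0.0, 0.0), (1.0, 2.0, 3.0), ctrl)
```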

samples/HCL/main.tf (new file, 135 lines)

@@ -0,0 +1,135 @@
resource "aws_security_group" "elb_sec_group" {
description = "Allow traffic from the internet to ELB port 80"
vpc_id = "${var.vpc_id}"
ingress {
from_port = 80
to_port = 80
protocol = "tcp"
cidr_blocks = ["${split(",", var.allowed_cidr_blocks)}"]
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
}
resource "aws_security_group" "dokku_allow_ssh_from_internal" {
description = "Allow git access over ssh from the private subnet"
vpc_id = "${var.vpc_id}"
ingress {
from_port = 22
to_port = 22
protocol = "tcp"
cidr_blocks = ["${var.private_subnet_cidr}"]
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
}
resource "aws_security_group" "allow_from_elb_to_instance" {
description = "Allow traffic from the ELB to the private instance"
vpc_id = "${var.vpc_id}"
ingress {
security_groups = ["${aws_security_group.elb_sec_group.id}"]
from_port = 80
to_port = 80
protocol = "tcp"
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
}
resource "aws_instance" "dokku" {
ami = "ami-47a23a30"
instance_type = "${var.instance_type}"
associate_public_ip_address = false
key_name = "${var.key_name}"
subnet_id = "${var.private_subnet_id}"
vpc_security_group_ids = [
"${var.bastion_sec_group_id}",
"${aws_security_group.allow_from_elb_to_instance.id}",
"${aws_security_group.dokku_allow_ssh_from_internal.id}"
]
tags {
Name = "${var.name}"
}
connection {
user = "ubuntu"
private_key = "${var.private_key}"
bastion_host = "${var.bastion_host}"
bastion_port = "${var.bastion_port}"
bastion_user = "${var.bastion_user}"
bastion_private_key = "${var.bastion_private_key}"
}
provisioner "file" {
source = "${path.module}/../scripts/install-dokku.sh"
destination = "/home/ubuntu/install-dokku.sh"
}
provisioner "remote-exec" {
inline = [
"chmod +x /home/ubuntu/install-dokku.sh",
"HOSTNAME=${var.hostname} /home/ubuntu/install-dokku.sh"
]
}
}
resource "aws_elb" "elb_dokku" {
name = "elb-dokku-${var.name}"
subnets = ["${var.public_subnet_id}"]
security_groups = ["${aws_security_group.elb_sec_group.id}"]
listener {
instance_port = 80
instance_protocol = "http"
lb_port = 80
lb_protocol = "http"
}
health_check {
healthy_threshold = 2
unhealthy_threshold = 2
timeout = 3
target = "HTTP:80/"
interval = 30
}
instances = ["${aws_instance.dokku.id}"]
cross_zone_load_balancing = false
idle_timeout = 400
tags {
Name = "elb-dokku-${var.name}"
}
}
resource "aws_route53_record" "dokku-deploy" {
zone_id = "${var.zone_id}"
name = "deploy.${var.hostname}"
type = "A"
ttl = "300"
records = ["${aws_instance.dokku.private_ip}"]
}
resource "aws_route53_record" "dokku-wildcard" {
zone_id = "${var.zone_id}"
name = "*.${var.hostname}"
type = "CNAME"
ttl = "300"
records = ["${aws_elb.elb_dokku.dns_name}"]
}
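The module above consumes several input variables; a minimal variables.tf sketch, with names taken from the references above and everything else assumed, might look like:

```hcl
variable "vpc_id" {}
variable "allowed_cidr_blocks" {}
variable "private_subnet_cidr" {}
variable "private_subnet_id" {}
variable "public_subnet_id" {}
variable "instance_type" {}
variable "key_name" {}
variable "name" {}
variable "hostname" {}
variable "zone_id" {}
variable "private_key" {}
variable "bastion_host" {}
variable "bastion_port" {}
variable "bastion_user" {}
variable "bastion_private_key" {}
variable "bastion_sec_group_id" {}
```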

samples/HLSL/bloom.cginc (new file, 89 lines)

@@ -0,0 +1,89 @@
// From https://github.com/Unity-Technologies/PostProcessing/blob/master/PostProcessing/Resources/Shaders/Bloom.cginc
// Licensed under the MIT license
#ifndef __BLOOM__
#define __BLOOM__
#include "Common.cginc"
// Brightness function
half Brightness(half3 c)
{
return Max3(c);
}
// 3-tap median filter
half3 Median(half3 a, half3 b, half3 c)
{
return a + b + c - min(min(a, b), c) - max(max(a, b), c);
}
// Downsample with a 4x4 box filter
half3 DownsampleFilter(sampler2D tex, float2 uv, float2 texelSize)
{
float4 d = texelSize.xyxy * float4(-1.0, -1.0, 1.0, 1.0);
half3 s;
s = DecodeHDR(tex2D(tex, uv + d.xy));
s += DecodeHDR(tex2D(tex, uv + d.zy));
s += DecodeHDR(tex2D(tex, uv + d.xw));
s += DecodeHDR(tex2D(tex, uv + d.zw));
return s * (1.0 / 4.0);
}
// Downsample with a 4x4 box filter + anti-flicker filter
half3 DownsampleAntiFlickerFilter(sampler2D tex, float2 uv, float2 texelSize)
{
float4 d = texelSize.xyxy * float4(-1.0, -1.0, 1.0, 1.0);
half3 s1 = DecodeHDR(tex2D(tex, uv + d.xy));
half3 s2 = DecodeHDR(tex2D(tex, uv + d.zy));
half3 s3 = DecodeHDR(tex2D(tex, uv + d.xw));
half3 s4 = DecodeHDR(tex2D(tex, uv + d.zw));
// Karis's luma weighted average (using brightness instead of luma)
half s1w = 1.0 / (Brightness(s1) + 1.0);
half s2w = 1.0 / (Brightness(s2) + 1.0);
half s3w = 1.0 / (Brightness(s3) + 1.0);
half s4w = 1.0 / (Brightness(s4) + 1.0);
half one_div_wsum = 1.0 / (s1w + s2w + s3w + s4w);
return (s1 * s1w + s2 * s2w + s3 * s3w + s4 * s4w) * one_div_wsum;
}
half3 UpsampleFilter(sampler2D tex, float2 uv, float2 texelSize, float sampleScale)
{
#if MOBILE_OR_CONSOLE
// 4-tap bilinear upsampler
float4 d = texelSize.xyxy * float4(-1.0, -1.0, 1.0, 1.0) * (sampleScale * 0.5);
half3 s;
s = DecodeHDR(tex2D(tex, uv + d.xy));
s += DecodeHDR(tex2D(tex, uv + d.zy));
s += DecodeHDR(tex2D(tex, uv + d.xw));
s += DecodeHDR(tex2D(tex, uv + d.zw));
return s * (1.0 / 4.0);
#else
// 9-tap bilinear upsampler (tent filter)
float4 d = texelSize.xyxy * float4(1.0, 1.0, -1.0, 0.0) * sampleScale;
half3 s;
s = DecodeHDR(tex2D(tex, uv - d.xy));
s += DecodeHDR(tex2D(tex, uv - d.wy)) * 2.0;
s += DecodeHDR(tex2D(tex, uv - d.zy));
s += DecodeHDR(tex2D(tex, uv + d.zw)) * 2.0;
s += DecodeHDR(tex2D(tex, uv)) * 4.0;
s += DecodeHDR(tex2D(tex, uv + d.xw)) * 2.0;
s += DecodeHDR(tex2D(tex, uv + d.zy));
s += DecodeHDR(tex2D(tex, uv + d.wy)) * 2.0;
s += DecodeHDR(tex2D(tex, uv + d.xy));
return s * (1.0 / 16.0);
#endif
}
#endif // __BLOOM__
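The non-mobile branch of UpsampleFilter is a 3x3 tent filter: its weights are the outer product of [1 2 1] with itself and sum to the 1/16 normalization used above. A quick Python check:

```python
# Tent-filter weights from the 9-tap branch of UpsampleFilter
row = (1, 2, 1)
weights = [a * b for a in row for b in row]
total = sum(weights)  # matches the 1/16 normalization in the shader
```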


@@ -0,0 +1,39 @@
digit [0-9]
id [a-zA-Z][a-zA-Z0-9]*
%%
"//".* /* ignore comment */
"main" return 'MAIN';
"class" return 'CLASS';
"extends" return 'EXTENDS';
"nat" return 'NATTYPE';
"if" return 'IF';
"else" return 'ELSE';
"for" return 'FOR';
"printNat" return 'PRINTNAT';
"readNat" return 'READNAT';
"this" return 'THIS';
"new" return 'NEW';
"var" return 'VAR';
"null" return 'NUL';
{digit}+ return 'NATLITERAL';
{id} return 'ID';
"==" return 'EQUALITY';
"=" return 'ASSIGN';
"+" return 'PLUS';
"-" return 'MINUS';
"*" return 'TIMES';
">" return 'GREATER';
"||" return 'OR';
"!" return 'NOT';
"." return 'DOT';
"{" return 'LBRACE';
"}" return 'RBRACE';
"(" return 'LPAREN';
")" return 'RPAREN';
";" return 'SEMICOLON';
\s+ /* skip whitespace */
"." throw 'Illegal character';
<<EOF>> return 'ENDOFFILE';


@@ -0,0 +1,29 @@
%%
\n+ {yy.freshLine = true;}
\s+ {yy.freshLine = false;}
"y{"[^}]*"}" {yytext = yytext.substr(2, yyleng - 3); return 'ACTION';}
[a-zA-Z_][a-zA-Z0-9_-]* {return 'NAME';}
'"'([^"]|'\"')*'"' {return 'STRING_LIT';}
"'"([^']|"\'")*"'" {return 'STRING_LIT';}
"|" {return '|';}
"["("\]"|[^\]])*"]" {return 'ANY_GROUP_REGEX';}
"(" {return '(';}
")" {return ')';}
"+" {return '+';}
"*" {return '*';}
"?" {return '?';}
"^" {return '^';}
"/" {return '/';}
"\\"[a-zA-Z0] {return 'ESCAPE_CHAR';}
"$" {return '$';}
"<<EOF>>" {return '$';}
"." {return '.';}
"%%" {return '%%';}
"{"\d+(","\s?\d+|",")?"}" {return 'RANGE_REGEX';}
/"{" %{if (yy.freshLine) { this.input('{'); return '{'; } else { this.unput('y'); }%}
"}" %{return '}';%}
"%{"(.|\n)*?"}%" {yytext = yytext.substr(2, yyleng - 4); return 'ACTION';}
. {/* ignore bad characters */}
<<EOF>> {return 'EOF';}

samples/Jison/ansic.jison (new file, 418 lines)

@@ -0,0 +1,418 @@
%token IDENTIFIER CONSTANT STRING_LITERAL SIZEOF
%token PTR_OP INC_OP DEC_OP LEFT_OP RIGHT_OP LE_OP GE_OP EQ_OP NE_OP
%token AND_OP OR_OP MUL_ASSIGN DIV_ASSIGN MOD_ASSIGN ADD_ASSIGN
%token SUB_ASSIGN LEFT_ASSIGN RIGHT_ASSIGN AND_ASSIGN
%token XOR_ASSIGN OR_ASSIGN TYPE_NAME
%token TYPEDEF EXTERN STATIC AUTO REGISTER
%token CHAR SHORT INT LONG SIGNED UNSIGNED FLOAT DOUBLE CONST VOLATILE VOID
%token STRUCT UNION ENUM ELLIPSIS
%token CASE DEFAULT IF ELSE SWITCH WHILE DO FOR GOTO CONTINUE BREAK RETURN
%nonassoc IF_WITHOUT_ELSE
%nonassoc ELSE
%start translation_unit
%%
primary_expression
: IDENTIFIER
| CONSTANT
| STRING_LITERAL
| '(' expression ')'
;
postfix_expression
: primary_expression
| postfix_expression '[' expression ']'
| postfix_expression '(' ')'
| postfix_expression '(' argument_expression_list ')'
| postfix_expression '.' IDENTIFIER
| postfix_expression PTR_OP IDENTIFIER
| postfix_expression INC_OP
| postfix_expression DEC_OP
;
argument_expression_list
: assignment_expression
| argument_expression_list ',' assignment_expression
;
unary_expression
: postfix_expression
| INC_OP unary_expression
| DEC_OP unary_expression
| unary_operator cast_expression
| SIZEOF unary_expression
| SIZEOF '(' type_name ')'
;
unary_operator
: '&'
| '*'
| '+'
| '-'
| '~'
| '!'
;
cast_expression
: unary_expression
| '(' type_name ')' cast_expression
;
multiplicative_expression
: cast_expression
| multiplicative_expression '*' cast_expression
| multiplicative_expression '/' cast_expression
| multiplicative_expression '%' cast_expression
;
additive_expression
: multiplicative_expression
| additive_expression '+' multiplicative_expression
| additive_expression '-' multiplicative_expression
;
shift_expression
: additive_expression
| shift_expression LEFT_OP additive_expression
| shift_expression RIGHT_OP additive_expression
;
relational_expression
: shift_expression
| relational_expression '<' shift_expression
| relational_expression '>' shift_expression
| relational_expression LE_OP shift_expression
| relational_expression GE_OP shift_expression
;
equality_expression
: relational_expression
| equality_expression EQ_OP relational_expression
| equality_expression NE_OP relational_expression
;
and_expression
: equality_expression
| and_expression '&' equality_expression
;
exclusive_or_expression
: and_expression
| exclusive_or_expression '^' and_expression
;
inclusive_or_expression
: exclusive_or_expression
| inclusive_or_expression '|' exclusive_or_expression
;
logical_and_expression
: inclusive_or_expression
| logical_and_expression AND_OP inclusive_or_expression
;
logical_or_expression
: logical_and_expression
| logical_or_expression OR_OP logical_and_expression
;
conditional_expression
: logical_or_expression
| logical_or_expression '?' expression ':' conditional_expression
;
assignment_expression
: conditional_expression
| unary_expression assignment_operator assignment_expression
;
assignment_operator
: '='
| MUL_ASSIGN
| DIV_ASSIGN
| MOD_ASSIGN
| ADD_ASSIGN
| SUB_ASSIGN
| LEFT_ASSIGN
| RIGHT_ASSIGN
| AND_ASSIGN
| XOR_ASSIGN
| OR_ASSIGN
;
expression
: assignment_expression
| expression ',' assignment_expression
;
constant_expression
: conditional_expression
;
declaration
: declaration_specifiers ';'
| declaration_specifiers init_declarator_list ';'
;
declaration_specifiers
: storage_class_specifier
| storage_class_specifier declaration_specifiers
| type_specifier
| type_specifier declaration_specifiers
| type_qualifier
| type_qualifier declaration_specifiers
;
init_declarator_list
: init_declarator
| init_declarator_list ',' init_declarator
;
init_declarator
: declarator
| declarator '=' initializer
;
storage_class_specifier
: TYPEDEF
| EXTERN
| STATIC
| AUTO
| REGISTER
;
type_specifier
: VOID
| CHAR
| SHORT
| INT
| LONG
| FLOAT
| DOUBLE
| SIGNED
| UNSIGNED
| struct_or_union_specifier
| enum_specifier
| TYPE_NAME
;
struct_or_union_specifier
: struct_or_union IDENTIFIER '{' struct_declaration_list '}'
| struct_or_union '{' struct_declaration_list '}'
| struct_or_union IDENTIFIER
;
struct_or_union
: STRUCT
| UNION
;
struct_declaration_list
: struct_declaration
| struct_declaration_list struct_declaration
;
struct_declaration
: specifier_qualifier_list struct_declarator_list ';'
;
specifier_qualifier_list
: type_specifier specifier_qualifier_list
| type_specifier
| type_qualifier specifier_qualifier_list
| type_qualifier
;
struct_declarator_list
: struct_declarator
| struct_declarator_list ',' struct_declarator
;
struct_declarator
: declarator
| ':' constant_expression
| declarator ':' constant_expression
;
enum_specifier
: ENUM '{' enumerator_list '}'
| ENUM IDENTIFIER '{' enumerator_list '}'
| ENUM IDENTIFIER
;
enumerator_list
: enumerator
| enumerator_list ',' enumerator
;
enumerator
: IDENTIFIER
| IDENTIFIER '=' constant_expression
;
type_qualifier
: CONST
| VOLATILE
;
declarator
: pointer direct_declarator
| direct_declarator
;
direct_declarator
: IDENTIFIER
| '(' declarator ')'
| direct_declarator '[' constant_expression ']'
| direct_declarator '[' ']'
| direct_declarator '(' parameter_type_list ')'
| direct_declarator '(' identifier_list ')'
| direct_declarator '(' ')'
;
pointer
: '*'
| '*' type_qualifier_list
| '*' pointer
| '*' type_qualifier_list pointer
;
type_qualifier_list
: type_qualifier
| type_qualifier_list type_qualifier
;
parameter_type_list
: parameter_list
| parameter_list ',' ELLIPSIS
;
parameter_list
: parameter_declaration
| parameter_list ',' parameter_declaration
;
parameter_declaration
: declaration_specifiers declarator
| declaration_specifiers abstract_declarator
| declaration_specifiers
;
identifier_list
: IDENTIFIER
| identifier_list ',' IDENTIFIER
;
type_name
: specifier_qualifier_list
| specifier_qualifier_list abstract_declarator
;
abstract_declarator
: pointer
| direct_abstract_declarator
| pointer direct_abstract_declarator
;
direct_abstract_declarator
: '(' abstract_declarator ')'
| '[' ']'
| '[' constant_expression ']'
| direct_abstract_declarator '[' ']'
| direct_abstract_declarator '[' constant_expression ']'
| '(' ')'
| '(' parameter_type_list ')'
| direct_abstract_declarator '(' ')'
| direct_abstract_declarator '(' parameter_type_list ')'
;
initializer
: assignment_expression
| '{' initializer_list '}'
| '{' initializer_list ',' '}'
;
initializer_list
: initializer
| initializer_list ',' initializer
;
statement
: labeled_statement
| compound_statement
| expression_statement
| selection_statement
| iteration_statement
| jump_statement
;
labeled_statement
: IDENTIFIER ':' statement
| CASE constant_expression ':' statement
| DEFAULT ':' statement
;
compound_statement
: '{' '}'
| '{' statement_list '}'
| '{' declaration_list '}'
| '{' declaration_list statement_list '}'
;
declaration_list
: declaration
| declaration_list declaration
;
statement_list
: statement
| statement_list statement
;
expression_statement
: ';'
| expression ';'
;
selection_statement
: IF '(' expression ')' statement %prec IF_WITHOUT_ELSE
| IF '(' expression ')' statement ELSE statement
| SWITCH '(' expression ')' statement
;
iteration_statement
: WHILE '(' expression ')' statement
| DO statement WHILE '(' expression ')' ';'
| FOR '(' expression_statement expression_statement ')' statement
| FOR '(' expression_statement expression_statement expression ')' statement
;
jump_statement
: GOTO IDENTIFIER ';'
| CONTINUE ';'
| BREAK ';'
| RETURN ';'
| RETURN expression ';'
;
translation_unit
: external_declaration
| translation_unit external_declaration
;
external_declaration
: function_definition
| declaration
;
function_definition
: declaration_specifiers declarator declaration_list compound_statement
| declaration_specifiers declarator compound_statement
| declarator declaration_list compound_statement
| declarator compound_statement
;
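The `selection_statement` rules above contain the classic dangling-else ambiguity, which the `%prec IF_WITHOUT_ELSE` annotation resolves by shifting, i.e. binding an `else` to the nearest unmatched `if`. A hedged sketch in Python (function names are hypothetical) contrasting the two possible readings of `if (a) if (b) x; else y;`:

```python
# The two readings of: if (a) if (b) x; else y;
# %prec IF_WITHOUT_ELSE makes the parser prefer nearest_if below.
def nearest_if(a, b):
    if a:
        if b:
            return "x"
        else:          # else bound to the inner if (what the grammar chooses)
            return "y"
    return None

def outer_if(a, b):
    if a:
        if b:
            return "x"
    else:              # else bound to the outer if (the rejected reading)
        return "y"
    return None

assert nearest_if(True, False) == "y"   # inner binding runs the else branch
assert outer_if(True, False) is None    # outer binding skips it entirely
```

The two functions differ observably on the same inputs, which is exactly why the grammar must pick one reading.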


@@ -0,0 +1,84 @@
/* description: ClassyLang grammar. Very classy. */
/*
To build parser:
$ ./bin/jison examples/classy.jison examples/classy.jisonlex
*/
/* author: Zach Carter */
%right ASSIGN
%left OR
%nonassoc EQUALITY GREATER
%left PLUS MINUS
%left TIMES
%right NOT
%left DOT
%%
pgm
: cdl MAIN LBRACE vdl el RBRACE ENDOFFILE
;
cdl
: c cdl
|
;
c
: CLASS id EXTENDS id LBRACE vdl mdl RBRACE
;
vdl
: VAR t id SEMICOLON vdl
|
;
mdl
: t id LPAREN t id RPAREN LBRACE vdl el RBRACE mdl
|
;
t
: NATTYPE
| id
;
id
: ID
;
el
: e SEMICOLON el
| e SEMICOLON
;
e
: NATLITERAL
| NUL
| id
| NEW id
| THIS
| IF LPAREN e RPAREN LBRACE el RBRACE ELSE LBRACE el RBRACE
| FOR LPAREN e SEMICOLON e SEMICOLON e RPAREN LBRACE el RBRACE
| READNAT LPAREN RPAREN
| PRINTNAT LPAREN e RPAREN
| e PLUS e
| e MINUS e
| e TIMES e
| e EQUALITY e
| e GREATER e
| NOT e
| e OR e
| e DOT id
| id ASSIGN e
| e DOT id ASSIGN e
| id LPAREN e RPAREN
| e DOT id LPAREN e RPAREN
| LPAREN e RPAREN
;
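The `%right`/`%left`/`%nonassoc` block at the top of this grammar orders precedence from lowest (`ASSIGN`) to highest (`DOT`) and fixes associativity. A small illustrative check (plain Python arithmetic standing in for the grammar's operators) of why the left/right distinction matters:

```python
# %left MINUS makes  a MINUS b MINUS c  group as (a - b) - c;
# a right-associative grouping would compute something different.
left_grouping = (10 - 3) - 2      # what a left-associative parse computes
right_grouping = 10 - (3 - 2)     # the grouping %left rules out
assert left_grouping == 5
assert right_grouping == 9
assert (10 - 3 - 2) == left_grouping
```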

samples/Jison/lex.jison Normal file

@@ -0,0 +1,145 @@
// `%nonassoc` tells the parser compiler (JISON) that these tokens cannot occur more than once,
// i.e. input like '//a' (tokens '/', '/' and 'a') is not a legal input while '/a' (tokens '/' and 'a')
// *is* legal input for this grammar.
%nonassoc '/' '/!'
// Likewise for `%left`: this informs the LALR(1) grammar compiler (JISON) that these tokens
// *can* occur repeatedly, e.g. 'a?*' and even 'a**' are considered legal inputs given this
// grammar!
//
// Token `RANGE_REGEX` may seem the odd one out here but really isn't: given the `regex_base`
// choice/rule `regex_base range_regex`, which is recursive, this grammar tells JISON that
// any input matching a sequence like `regex_base range_regex range_regex` *is* legal.
// If you do not want that to be legal, you MUST adjust the grammar rule set to match
// your actual intent.
%left '*' '+' '?' RANGE_REGEX
%%
lex
: definitions include '%%' rules '%%' EOF
{{ $$ = {macros: $1, rules: $4};
if ($2) $$.actionInclude = $2;
return $$; }}
| definitions include '%%' rules EOF
{{ $$ = {macros: $1, rules: $4};
if ($2) $$.actionInclude = $2;
return $$; }}
;
include
: action
|
;
definitions
: definitions definition
{ $$ = $1; $$.concat($2); }
| definition
{ $$ = [$1]; }
;
definition
: name regex
{ $$ = [$1, $2]; }
;
name
: NAME
{ $$ = yytext; }
;
rules
: rules rule
{ $$ = $1; $$.push($2); }
| rule
{ $$ = [$1]; }
;
rule
: regex action
{ $$ = [$1, $2]; }
;
action
: ACTION
{ $$ = yytext; }
;
regex
: start_caret regex_list end_dollar
{ $$ = $1+$2+$3; }
;
start_caret
: '^'
{ $$ = '^'; }
|
{ $$ = ''; }
;
end_dollar
: '$'
{ $$ = '$'; }
|
{ $$ = ''; }
;
regex_list
: regex_list '|' regex_chain
{ $$ = $1+'|'+$3; }
| regex_chain
;
regex_chain
: regex_chain regex_base
{ $$ = $1+$2;}
| regex_base
{ $$ = $1;}
;
regex_base
: '(' regex_list ')'
{ $$ = '('+$2+')'; }
| regex_base '+'
{ $$ = $1+'+'; }
| regex_base '*'
{ $$ = $1+'*'; }
| regex_base '?'
{ $$ = $1+'?'; }
| '/' regex_base
{ $$ = '(?=' + $regex_base + ')'; }
| '/!' regex_base
{ $$ = '(?!' + $regex_base + ')'; }
| name_expansion
| regex_base range_regex
{ $$ = $1+$2; }
| any_group_regex
| '.'
{ $$ = '.'; }
| string
;
name_expansion
: '{' name '}'
{{ $$ = '{'+$2+'}'; }}
;
any_group_regex
: ANY_GROUP_REGEX
{ $$ = yytext; }
;
range_regex
: RANGE_REGEX
{ $$ = yytext; }
;
string
: STRING_LIT
{ $$ = yy.prepareString(yytext.substr(1, yyleng-2)); }
;
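The `'/' regex_base` and `'/!' regex_base` alternatives above translate lex-style trailing context into lookahead assertions, emitting `(?=…)` and `(?!…)` respectively. A quick hedged check of that correspondence using Python's `re` module, which shares the same assertion syntax as JavaScript regexes:

```python
import re

# regex_base: '/' regex_base  -> '(?=' ... ')'   (positive lookahead)
#             '/!' regex_base -> '(?!' ... ')'   (negative lookahead)
assert re.match(r"a(?=b)", "ab")            # 'a/b': 'a' only when 'b' follows
assert re.match(r"a(?=b)", "ac") is None
assert re.match(r"a(?!b)", "ac")            # 'a/!b': 'a' only when 'b' does not follow
assert re.match(r"a(?!b)", "ab") is None
```

Note the lookahead checks the following input without consuming it, matching how lex trailing context is matched but not returned as part of the token.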

samples/Jolie/common.iol Normal file

@@ -0,0 +1,37 @@
include "types/Binding.iol"
constants {
Location_Exam = "socket://localhost:8000"
}
type StartExamRequest:void {
.examName:string
.studentName:string
.student:Binding
}
type MakeQuestionRequest:void {
.question:string
.examName:string
.studentName:string
}
type DecisionMessage:void {
.studentName:string
.examName:string
}
interface ExamInterface {
OneWay:
startExam(StartExamRequest),
pass(DecisionMessage), fail(DecisionMessage)
RequestResponse:
makeQuestion(MakeQuestionRequest)(int)
}
interface StudentInterface {
OneWay:
sendMessage(string)
RequestResponse:
makeQuestion(MakeQuestionRequest)(int)
}

samples/Jolie/exam.ol Normal file

@@ -0,0 +1,39 @@
include "common.iol"
cset {
studentName:
StartExamRequest.studentName
DecisionMessage.studentName
MakeQuestionRequest.studentName,
examName:
StartExamRequest.examName
DecisionMessage.examName
MakeQuestionRequest.examName
}
execution { concurrent }
outputPort Student {
Interfaces: StudentInterface
}
inputPort ExamInput {
Location: Location_Exam
Protocol: sodep
Interfaces: ExamInterface
}
main
{
startExam( examRequest );
Student << examRequest.student;
makeQuestion( question )( answer ) {
makeQuestion@Student( question )( answer )
};
[ pass( message ) ] {
sendMessage@Student( "You passed!" )
}
[ fail( message ) ] {
sendMessage@Student( "You failed!" )
}
}

samples/Jolie/examiner.ol Normal file

@@ -0,0 +1,26 @@
include "common.iol"
include "ui/swing_ui.iol"
include "console.iol"
outputPort Exam {
Location: Location_Exam
Protocol: sodep
Interfaces: ExamInterface
}
main
{
question.studentName = "John";
question.examName = "SPLG";
question.question = "Random question";
makeQuestion@Exam( question )( answer );
showYesNoQuestionDialog@SwingUI( "Do you want to accept answer " + answer + " ?" )( decision );
message.studentName = "John";
message.examName = "SPLG";
if ( decision == 0 ) {
pass@Exam( message )
} else {
fail@Exam( message )
}
}

samples/Jolie/hanoi.ol Normal file

@@ -0,0 +1,84 @@
// https://github.com/jolie/website/blob/master/docs/documentation/locations/code/local.ol
include "runtime.iol"
include "string_utils.iol"
type HanoiRequest: void{
.src: string
.aux: string
.dst: string
.n: int
.sid?: string
}
type HanoiReponse: void {
.move?: string
}
interface LocalOperations{
RequestResponse:
hanoiSolver( HanoiRequest )( HanoiReponse )
}
interface ExternalOperations{
RequestResponse:
hanoi( HanoiRequest )( string )
}
outputPort Self{
Interfaces: LocalOperations
}
inputPort Self {
Location: "local"
Interfaces: LocalOperations
}
inputPort PowerService {
Location: "socket://localhost:8000"
Protocol: http{
.format = "html"
}
Interfaces: ExternalOperations
}
execution { concurrent }
init
{
getLocalLocation@Runtime()( Self.location )
}
main
{
[ hanoi( request )( response ){
getRandomUUID@StringUtils()(request.sid);
hanoiSolver@Self( request )( subRes );
response = subRes.move
}]{ nullProcess }
[ hanoiSolver( request )( response ){
if ( request.n > 0 ){
subReq.n = request.n;
subReq.n--;
with( request ){
subReq.aux = .dst;
subReq.dst = .aux;
subReq.src = .src;
subReq.sid = .sid
};
hanoiSolver@Self( subReq )( response );
response.move += "<br>" +
++global.counters.(request.sid) +
") Move from " + request.src +
" to " + request.dst + ";";
with ( request ){
subReq.src = .aux;
subReq.aux = .src;
subReq.dst = .dst
};
hanoiSolver@Self( subReq )( subRes );
response.move += subRes.move
}
}]{ nullProcess }
}
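The `hanoiSolver` operation above is the standard Towers of Hanoi recursion: solve n−1 discs from `src` to `aux` (swapping `aux`/`dst` in the sub-request), record one move `src` → `dst`, then solve n−1 discs from `aux` to `dst`. A minimal Python sketch of the same recursion (hypothetical function, not part of the Jolie sample), which produces 2^n − 1 moves:

```python
def hanoi(n, src, aux, dst, moves):
    # Mirrors hanoiSolver: recurse with aux/dst swapped, record one move,
    # then recurse with src/aux swapped.
    if n > 0:
        hanoi(n - 1, src, dst, aux, moves)
        moves.append((src, dst))
        hanoi(n - 1, aux, src, dst, moves)
    return moves

moves = hanoi(3, "src", "aux", "dst", [])
assert len(moves) == 2**3 - 1            # 7 moves for three discs
assert moves[0] == ("src", "dst")
```

The Jolie version additionally threads a session id (`sid`) and a global per-session counter so each HTTP client sees its own numbered move list.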

samples/Jolie/student.ol Normal file

@@ -0,0 +1,29 @@
include "common.iol"
include "ui/swing_ui.iol"
include "console.iol"
outputPort Exam {
Location: Location_Exam
Protocol: sodep
Interfaces: ExamInterface
}
inputPort StudentInput {
Location: "socket://localhost:8001/"
Protocol: sodep
Interfaces: StudentInterface
}
main
{
request.studentName = "John";
request.examName = "SPLG";
request.student.location = "socket://localhost:8001/";
request.student.protocol = "sodep";
startExam@Exam( request );
makeQuestion( question )( answer ) {
showYesNoQuestionDialog@SwingUI( question.question )( answer )
};
sendMessage( message );
println@Console( message )()
}


@@ -0,0 +1,49 @@
- label: 'desired label name'
- connection: connection_name
- include: filename_or_pattern
# Possibly more include declarations
- persist_for: N (seconds | minutes | hours)
- case_sensitive: true | false
- week_start_day: monday | tuesday | wednesday | thursday | friday | saturday | sunday
- value_formats:
- name: desired_format_name
value_format: 'excel-style formatting string'
# Possibly more value formats
- explore: view_name
label: 'desired label name'
description: 'description string'
symmetric_aggregates: true | false
hidden: true | false
fields: [field_or_set, field_or_set, …]
sql_always_where: SQL WHERE condition
always_filter:
field_name: 'looker filter expression'
conditionally_filter:
field_name: 'looker filter expression'
unless: [field_or_set, field_or_set, …]
access_filter_fields: [fully_scoped_field, fully_scoped_field, …]
always_join: [view_name, view_name, …]
joins:
- join: view_name
type: left_outer | full_outer | inner | cross
relationship: one_to_one | many_to_one | one_to_many | many_to_many
from: view_name
sql_table_name: table_name
view_label: 'desired label name'
fields: [field_or_set, field_or_set, …]
required_joins: [view_name, view_name, …]
foreign_key: dimension_name
sql_on: SQL ON clause
# Possibly more join declarations
persist_for: N (seconds | minutes | hours)
from: view_name
view: view_name
case_sensitive: true | false
sql_table_name: table_name
cancel_grouping_fields: [fully_scoped_field, fully_scoped_field, …]
# Possibly more explore declarations


@@ -0,0 +1,90 @@
- view: view_name
sql_table_name: table_name
suggestions: true | false
derived_table:
sql: SQL query
persist_for: N (seconds | minutes | hours)
sql_trigger_value: SQL query
distribution: column_name
distribution_style: ALL | EVEN
sortkeys: [column_name, column_name, …]
indexes: [column_name, column_name, …]
sets:
set_name:
- field_or_set
- field_or_set
- …
# Possibly more set declarations
fields:
- (dimension | dimension_group | measure | filter): field_name
label: 'desired label name'
view_label: 'desired label name'
group_label: 'desired label name'
description: 'description string'
hidden: true | false
alias: [old_field_name, old_field_name, …]
value_format: 'excel-style formatting string'
value_format_name: format_name
html: HTML expression using Liquid template elements
sql: SQL expression to generate the field value
required_fields: [field_name, field_name, …]
drill_fields: [field_or_set, field_or_set, …]
can_filter: true | false
fanout_on: repeated_record_name
# DIMENSION SPECIFIC PARAMETERS
type: dimension_field_type
primary_key: true | false
sql_case:
value: SQL condition
value: SQL condition
# Possibly more sql_case statements
alpha_sort: true | false
tiers: [N, N, …]
style: classic | interval | integer | relational
sql_latitude: SQL expression to generate a latitude
sql_longitude: SQL expression to generate a longitude
suggestable: true | false
suggest_persist_for: N (seconds | minutes | hours)
suggest_dimension: dimension_name
suggest_explore: explore_name
suggestions: ['suggestion string', 'suggestion string', …]
bypass_suggest_restrictions: true | false
full_suggestions: true | false
skip_drill_filter: true | false
case_sensitive: true | false
order_by_field: dimension_name
map_layer: name_of_map_layer
links:
- label: 'desired label name'
url: desired_url
icon_url: url_of_an_ico_file
# Possibly more links
# DIMENSION GROUP SPECIFIC PARAMETERS
timeframes: [timeframe, timeframe, …]
convert_tz: true | false
datatype: epoch | timestamp | datetime | date | yyyymmdd
# MEASURE SPECIFIC PARAMETERS
type: measure_field_type
direction: row | column
approximate: true | false
approximate_threshold: N
sql_distinct_key: SQL expression to define repeated entities
list_field: dimension_name
filters:
dimension_name: 'looker filter expression'
# Possibly more filters statements
# FILTER SPECIFIC PARAMETERS
default_value: 'desired default value'
# Possibly more dimension or measure declarations


@@ -0,0 +1,176 @@
-------------------------------------------------------------------------------
--
-- File: rolloutCreator.ms
-- Description: Localization friendly helper struct for dynamically creating rollouts
-- By: Ravi Karra [Discreet] ravi.karra@discreet.com
--
-- Version: 1.01
-- Version: 1.02 - Larry Minton [Discreet]
-- changed <string1> += <string2> to append string1 string2
-- added addText method
-- Declarations:
/*
rolloutCreator <rollout_name> <rollout_caption> [width:] [height:]
creates an instance of rolloutCreator, assign it to a variable
width - width of the rollout/dialog to be created
height - height of the rollout/dialog to be created
eg:
rci = rolloutCreator "myRollout" "My Rollout"
.begin()
this function needs to be called immediately after the instance is created, this does the initialization
.addLocal <local_name> [init:]
<local_name>
name of the local
[init:]
what the local should be initialized to
.addControl <control_type> <control_name> <control_caption> [paramStr:<string>] =
adds a control to the rollout
<control_type>
can be any of named rolloutControls eg: #button, #spinner, #activeXControl etc
<control_name>
variable name of the control by which it is referred eg: #btnButton
<control_caption>
caption of the control "My Button"
[paramStr:]
an optional string representation of all the keyword parameters that needs to be passed to the control
eg: "width:100 height:20 align:#right"
eg:
rci.addControl #button #myButton "My Button"
.addHandler <control_name> <event_type> [paramStr:<string>] [codeStr:<string>] [filter:<boolean>]
adds an event handler for the controls previously added
<control_name>
the variable passed during the control creation
<event_type>
any of the events supported by the control, eg: #changed, #pressed, #selected
[paramStr:<string>]
an optional string representation of all the positional and keyword parameters that are passed to the event
[codeStr:<string>]
a string representation of the event handler code, if the string contains sub-strings, enclose them in two character '@'
and pass on\true for the filter: parameter
[filter:<boolean>]
if true, converts '@' to quote in codeStr
eg:
rci.addHandler #myButton #pressed codeStr:"MessageBox @Hey@" filter:on
will add an event handler for button named "myButton". When the button is clicked, messagebox pops up with text "hey" in it.
.addText <string> [filter:<boolean>]
adds string to rollout definition. Typically used for function definitions.
[filter:<boolean>]
if true, converts '@' to quote in string
.end()
this function has to be called after all the required controls and their event handlers have been added. This function forms
the rollout string, evaluates it and returns the definition, which can be passed to the createDialog and addRollout functions.
Complete Example:
rci = rolloutCreator "myRollout" "My Rollout"
rci.begin()
rci.addControl #button #myButton "My Button"
rci.addHandler #myButton #pressed filter:on codeStr:"MessageBox @Isn't this cool@ title:@Wow@"
createDialog (rci.end())
*/
-------------------------------------------------------------------------------
if __rcCounter == undefined then global __rcCounter = 0
struct rolloutCreator
(
-- variables
name, caption, str, def, width, height, quote="\"",
-- functions
fn begin =
(
if name == undefined then
(
__rcCounter += 1
name = "rolloutCreator" + __rcCounter as string
)
if caption == undefined then caption = ""
str = ""
),
fn addLocal name init: =
(
local dStr = "\tlocal " + name as string
if init != unsupplied then append dStr (" = " + init as string)
append dStr "\n"
append str dStr
),
fn addControl type name caption paramStr:"" =
(
append str ("\t" + type as string + " " + name as string + " " + quote + caption + quote + paramStr + "\n")
),
fn strFilter codeStr =
(
local last_is_at = codeStr[codeStr.count] == "@"
local fltStr = filterString codeStr "@"
local rep = "\""
codeStr = (if (codeStr[1] == "@") then rep else "") + fltStr[1]
for i=2 to fltStr.count do
(
append codeStr (rep + fltStr[i])
)
if last_is_at then append codeStr rep
codeStr
),
fn addHandler ctrl event paramStr:"" filter:on codeStr:"" =
(
if filter do codeStr = (strFilter codeStr)
append str ("\non " + ctrl as string + " " + event as string + " " + paramStr as string + " do \n(\n" + codeStr + ";ok\n)\n")
),
fn addText txt filter:on =
(
if filter do txt = (strFilter txt )
append str ("\t " + txt + "\n")
),
fn end =
(
local dStr = "rollout " + name + " " + quote + caption + quote
if width != undefined then
append dStr (" width:" + width as string)
if height != undefined then
append dStr (" height:" + height as string)
append dStr "\n(\n"
append dStr str
append dStr "\n)\n"
str = dStr
def = execute str
)
)
/*-- Usage
-- Create an instance of the rolloutCreator passing the name and the caption
rfTest = rolloutCreator "rfTestN" "rfTestC" --width:300 height:100
-- Start creating the rollout
rfTest.begin()
rfTest.addControl #button #myButton "My Button" -- add a button
-- rfTest.addHandler #myButton #pressed filter:on codeStr:"MessageBox @Hey@"
rfTest.addHandler #myButton #pressed filter:on codeStr:"MessageBox @Look to the \@Light\@ thing@"
rfTest.end()
createDialog rfTest.def
*/


@@ -0,0 +1,248 @@
#
# The FreeType Project LICENSE
# ----------------------------
#
# Copyright 1996-1999 by
# David Turner, Robert Wilhelm, and Werner Lemberg
#
#
#
# Introduction
# ============
#
# The FreeType Project is distributed in several archive packages;
# some of them may contain, in addition to the FreeType font engine,
# various tools and contributions which rely on, or relate to, the
# FreeType Project.
#
# This license applies to all files found in such packages, and
# which do not fall under their own explicit license. The license
# affects thus the FreeType font engine, the test programs,
# documentation and makefiles, at the very least.
#
# This license was inspired by the BSD, Artistic, and IJG
# (Independent JPEG Group) licenses, which all encourage inclusion
# and use of free software in commercial and freeware products
# alike. As a consequence, its main points are that:
#
# o We don't promise that this software works. However, we are be
# interested in any kind of bug reports. (`as is' distribution)
#
# o You can use this software for whatever you want, in parts or
# full form, without having to pay us. (`royalty-free' usage)
#
# o You may not pretend that you wrote this software. If you use
# it, or only parts of it, in a program, you must acknowledge
# somewhere in your documentation that you've used the FreeType
# code. (`credits')
#
# We specifically permit and encourage the inclusion of this
# software, with or without modifications, in commercial products,
# provided that all warranty or liability claims are assumed by the
# product vendor.
#
#
# Legal Terms
# ===========
#
# 0. Definitions
# --------------
#
# Throughout this license, the terms `package', `FreeType Project',
# and `FreeType archive' refer to the set of files originally
# distributed by the authors (David Turner, Robert Wilhelm, and
# Werner Lemberg) as the `FreeType project', be they named as alpha,
# beta or final release.
#
# `You' refers to the licensee, or person using the project, where
# `using' is a generic term including compiling the project's source
# code as well as linking it to form a `program' or `executable'.
# This program is referred to as `a program using the FreeType
# engine'.
#
# This license applies to all files distributed in the original
# FreeType archive, including all source code, binaries and
# documentation, unless otherwise stated in the file in its
# original, unmodified form as distributed in the original archive.
# If you are unsure whether or not a particular file is covered by
# this license, you must contact us to verify this.
#
# The FreeType project is copyright (C) 1996-1999 by David Turner,
# Robert Wilhelm, and Werner Lemberg. All rights reserved except as
# specified below.
#
# 1. No Warranty
# --------------
#
# THE FREETYPE ARCHIVE IS PROVIDED `AS IS' WITHOUT WARRANTY OF ANY
# KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
# PURPOSE. IN NO EVENT WILL ANY OF THE AUTHORS OR COPYRIGHT HOLDERS
# BE LIABLE FOR ANY DAMAGES CAUSED BY THE USE OR THE INABILITY TO
# USE, OF THE FREETYPE PROJECT.
#
# As you have not signed this license, you are not required to
# accept it. However, as the FreeType project is copyrighted
# material, only this license, or another one contracted with the
# authors, grants you the right to use, distribute, and modify it.
# Therefore, by using, distributing, or modifying the FreeType
# project, you indicate that you understand and accept all the terms
# of this license.
#
# 2. Redistribution
# -----------------
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# o Redistribution of source code must retain this license file
# (`licence.txt') unaltered; any additions, deletions or changes
# to the original files must be clearly indicated in
# accompanying documentation. The copyright notices of the
# unaltered, original files must be preserved in all copies of
# source files.
#
# o Redistribution in binary form must provide a disclaimer that
# states that the software is based in part of the work of the
# FreeType Team, in the distribution documentation. We also
# encourage you to put an URL to the FreeType web page in your
# documentation, though this isn't mandatory.
#
# These conditions apply to any software derived from or based on
# the FreeType code, not just the unmodified files. If you use our
# work, you must acknowledge us. However, no fee need be paid to
# us.
#
# 3. Advertising
# --------------
#
# The names of FreeType's authors and contributors may not be used
# to endorse or promote products derived from this software without
# specific prior written permission.
#
# We suggest, but do not require, that you use one or more of the
# following phrases to refer to this software in your documentation
# or advertising materials: `FreeType Project', `FreeType Engine',
# `FreeType library', or `FreeType Distribution'.
#
# 4. Contacts
# -----------
#
# There are two mailing lists related to FreeType:
#
# o freetype@freetype.org
#
# Discusses general use and applications of FreeType, as well as
# future and wanted additions to the library and distribution.
# If you are looking for support, start in this list if you
# haven't found anything to help you in the documentation.
#
# o devel@freetype.org
#
# Discusses bugs, as well as engine internals, design issues,
# specific licenses, porting, etc.
#
# o http://www.freetype.org
#
# Holds the current FreeType web page, which will allow you to
# download our latest development version and read online
# documentation.
#
# You can also contact us individually at:
#
# David Turner <david.turner@freetype.org>
# Robert Wilhelm <robert.wilhelm@freetype.org>
# Werner Lemberg <werner.lemberg@freetype.org>
#
#
# --- end of license ---
#
# This file is part of the FreeType project.
#
# This builds the Watcom library with Watcom's wcc386 under OS/2.
#
# You'll need Watcom's wmake.
#
#
# Invoke by "wmake -f arch\os2\Makefile.wat" when in the "lib" directory
#
# This will build "freetype\lib\libttf.lib"
ARCH = arch\os2
FT_MAKEFILE = $(ARCH)\Makefile.wat
FT_MAKE = wmake -h
.EXTENSIONS:
.EXTENSIONS: .lib .obj .c .h
.obj:.;.\extend;.\$(ARCH)
.c:.;.\extend;.\$(ARCH)
.h:.;.\extend;.\$(ARCH)
CC = wcc386
CCFLAGS = /otexanl+ /s /w5 /zq -Iarch\os2 -I. -Iextend
TTFILE = .\ttfile.c
TTMEMORY = .\ttmemory.c
TTMUTEX = .\ttmutex.c
TTFILE_OBJ = ttfile.obj
TTMEMORY_OBJ = ttmemory.obj
TTMUTEX_OBJ = ttmutex.obj
PORT = $(TTFILE) $(TTMEMORY) $(TTMUTEX)
PORT_OBJS = $(TTFILE_OBJ) $(TTMEMORY_OBJ) $(TTMUTEX_OBJ)
SRC_X = extend\ftxgasp.c extend\ftxkern.c extend\ftxpost.c &
extend\ftxcmap.c extend\ftxwidth.c extend\ftxsbit.c &
extend\ftxgsub.c extend\ftxgpos.c extend\ftxopen.c &
extend\ftxgdef.c
OBJS_X = extend\ftxgasp.obj extend\ftxkern.obj extend\ftxpost.obj &
extend\ftxcmap.obj extend\ftxwidth.obj extend\ftxsbit.obj &
extend\ftxgsub.obj extend\ftxgpos.obj extend\ftxopen.obj &
extend\ftxgdef.obj
SRC_M = ttapi.c ttcache.c ttcalc.c ttcmap.c &
ttgload.c ttinterp.c ttload.c ttobjs.c &
ttraster.c ttextend.c $(PORT)
OBJS_M = ttapi.obj ttcache.obj ttcalc.obj ttcmap.obj &
ttgload.obj ttinterp.obj ttload.obj ttobjs.obj &
ttraster.obj ttextend.obj $(PORT_OBJS) $(OBJS_X)
SRC_S = freetype.c
OBJ_S = freetype.obj
OBJS_S = $(OBJ_S) $(OBJS_X)
.c.obj:
$(CC) $(CCFLAGS) $[* /fo=$[*.obj
all: .symbolic
$(FT_MAKE) -f $(FT_MAKEFILE) libttf.lib
debug: .symbolic
$(FT_MAKE) -f $(FT_MAKEFILE) LIB_FILES="$(OBJS_M)" libttf.lib
libttf.lib: $(OBJS_M)
wlib -q -n libttf.lib $(OBJS_M)
# is this correct? Know nothing about wmake and the Watcom compiler...
$(OBJ_S): $(SRC_S) $(SRC_M)
$(CC) $(CCFLAGS) $(SRC_S) /fo=$(OBJ_S)
clean: .symbolic
@-erase $(OBJS_M)
@-erase *.err
distclean: .symbolic clean
@-erase libttf.lib
new: .symbolic
@-wtouch *.c
# end of Makefile.wat


@@ -0,0 +1,20 @@
# Installation
You can install this bundle in TextMate by opening the preferences and going to the bundles tab. After installation it will be automatically updated for you.
# General
* [Bundle Styleguide](http://kb.textmate.org/bundle_styleguide) — _before you make changes_
* [Commit Styleguide](http://kb.textmate.org/commit_styleguide) — _before you send a pull request_
* [Writing Bug Reports](http://kb.textmate.org/writing_bug_reports) — _before you report an issue_
# License
If not otherwise specified (see below), files in this repository fall under the following license:
Permission to copy, use, modify, sell and distribute this
software is granted. This software is provided "as is" without
express or implied warranty, and with no claim as to its
suitability for any purpose.
An exception is made for files in readable text which contain their own license information, or files where an accompanying file exists (in the same directory) with a “-license” suffix added to the base name of the original file, and an extension of txt, html, or similar. For example “tidy” is accompanied by “tidy-license.txt”.


@@ -0,0 +1,192 @@
---
uti: com.xamarin.workbook
platforms:
- Console
---
# Using C# 6
Some examples from Xamarin's [intro to C# 6](https://developer.xamarin.com/guides/cross-platform/advanced/csharp_six/).
* Null-conditional operator
* String Interpolation
* Expression-bodied Function Members
* Auto-property Initialization
* Index Initializers
* using static
## Null-conditional operator
The `?.` operator automatically does a null-check before referencing the
specified member. The example string array below has a `null` entry:
```csharp
var names = new string[] { "Foo", null };
```
In C# 5, a null-check is required before accessing the `.Length` property:
```csharp
// C# 5
int secondLength = 0;
if (names[1] != null)
secondLength = names[1].Length;
```
C# 6 allows the length to be queried in a single line; the entire
expression evaluates to `null` if any object in the chain is null.
```csharp
var length0 = names[0]?.Length; // 3
var length1 = names[1]?.Length; // null
```
This can be used in conjunction with the `??` null coalescing operator
to set a default value (such as `0`) in the example below:
```csharp
var lengths = names.Select (name => name?.Length ?? 0); //[3, 0]
```
## String Interpolation
Previously strings were built in a number of different ways:
```csharp
var animal = "Monkeys";
var food = "bananas";
var out1 = String.Format ("{0} love to eat {1}", animal, food);
var out2 = animal + " love to eat " + food;
// or even StringBuilder
```
C# 6 provides a simple syntax where the fieldname can be
embedded directly in the string:
```csharp
$"{animal} love to eat {food}"
```
String-formatting can also be done with this syntax:
```csharp
var values = new int[] { 1, 2, 3, 4, 12, 123456 };
foreach (var s in values.Select (i => $"The value is {i,10:N2}.")) {
Console.WriteLine (s);
}
```
## Expression-bodied Function Members
The `ToString` override in the following class is an expression-bodied
function - a more succinct declaration syntax.
```csharp
class Person
{
public string FirstName { get; }
public string LastName { get; }
public Person (string firstname, string lastname)
{
FirstName = firstname;
LastName = lastname;
}
// note there is no explicit `return` keyword
public override string ToString () => $"{LastName}, {FirstName} {LastName}";
}
```
`void` expression-bodied functions are also allowed so long as
the expression is a statement expression:
```csharp
public void Log(string message) => System.Console.WriteLine($"{DateTime.Now.ToString ("s", System.Globalization.CultureInfo.InvariantCulture )}: {message}");
```
This simple example calls these two methods:
```csharp
Log(new Person("James", "Bond").ToString())
```
## Auto-property Initialization
Auto-properties (i.e. those declared with `{ get; set; }`) can be
initialized inline in C# 6:
```csharp
class Todo
{
public bool Done { get; set; } = false;
public DateTime Created { get; } = DateTime.Now;
public string Description { get; }
public Todo (string description)
{
this.Description = description; // can assign (only in constructor!)
}
public override string ToString () => $"'{Description}' was created on {Created}";
}
```
```csharp
new Todo("buy apples")
```
## Index Initializers
Dictionary-style data structures can be initialized with a simple
object-initializer-like syntax that assigns each value by key:
```csharp
var userInfo = new Dictionary<string,object> {
["Created"] = DateTime.Now,
["Due"] = DateTime.Now.AddSeconds(60 * 60 * 24),
["Task"] = "buy lettuce"
};
```
## using static
Enumerations, and certain classes such as `System.Math`, are primarily
holders of static values and functions. In C# 6, you can import all
static members of a type with a single `using static` directive:
```csharp
using static System.Math;
```
C# 6 code can then reference the static members directly, avoiding
repetition of the class name (e.g. `Math.PI` becomes `PI`):
```csharp
public class Location
{
public Location (double lat, double @long) {Latitude = lat; Longitude = @long;}
public double Latitude = 0; public double Longitude = 0;
}
static public double MilesBetween(Location loc1, Location loc2)
{
double rlat1 = PI * loc1.Latitude / 180;
double rlat2 = PI * loc2.Latitude / 180;
double theta = loc1.Longitude - loc2.Longitude;
double rtheta = PI * theta / 180;
double dist =
Sin(rlat1) * Sin(rlat2) + Cos(rlat1) *
Cos(rlat2) * Cos(rtheta);
dist = Acos(dist);
dist = dist*180/PI;
dist = dist*60*1.1515;
return dist; //miles
}
```
```csharp
MilesBetween (new Location(-12,22), new Location(-13,33))
```
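The same directive also works for enumerations, as the opening sentence notes; a minimal sketch (not part of the original example) using the built-in `System.DayOfWeek` enum:
```csharp
using static System.DayOfWeek;

// With the static import, enum members need no type prefix.
var payday = Friday; // instead of DayOfWeek.Friday
```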


@@ -0,0 +1 @@
_This_ is a **Markdown** readme.


@@ -0,0 +1,26 @@
class {
constructor() {
this.state = { count:0 };
}
increment() {
this.state.count++;
}
}
style {
.count {
color:#09c;
font-size:3em;
}
.example-button {
font-size:1em;
padding:0.5em;
}
}
<div.count>
${state.count}
</div>
<button.example-button on-click('increment')>
Click me!
</button>

samples/Marko/hello.marko Normal file

@@ -0,0 +1,15 @@
$ var name = 'Frank';
$ var colors = ['red', 'green', 'blue'];
<h1>
Hello ${name}!
</h1>
<ul if(colors.length)>
<li style={color: color} for(color in colors)>
${color}
</li>
</ul>
<div else>
No colors!
</div>


@@ -0,0 +1,36 @@
static const colors = ['red', 'green', 'blue'];
static const defaultColor = [255, 0, 0];
class {
onInput(input) {
this.state = { color: input.color || defaultColor };
}
updateColor() {
this.state.color = colors.map((color) => {
return parseInt(this.getEl(color + 'Input').value, 10);
});
}
getStyleColor() {
return 'rgb(' + this.state.color.join(',') + ')';
}
}
<div.rgb-sliders>
<div.inputs>
<for(i, color in colors)>
<div>
<label for-key=color+"Input">
${color}:
</label>
<input type="range" max="255"
key=color+"Input"
on-input('updateColor')
value=state.color[i] >
</div>
</for>
</div>
<div.color style={backgroundColor: component.getStyleColor()}>
</div>
</div>


@@ -0,0 +1,51 @@
project('test', ['c'],
version: '0.1.0'
)
# This is a comment test('foo')
add_global_arguments(['-foo'])
add_global_link_arguments(['-foo'])
gnome = import('gnome') # As is this
gnome.do_something('test')
meson.source_root()
foreach foo: bar
foreach baz : foo
message(baz)
endforeach
endforeach
blah = '''
afjoakjflajf # Test
lflkasjf
test\'test
test\\\\test
test\ntest
'''
foo = ''
foo = ''''''
foo = 'string'
foo = '''string2'''
foo = 12314
foo = 1231.1231
foo = true
foo = false
foo = ['te\'st', 1, 3.3, '''test''']
foo += 1231
foo = '@0@'.format('test')
foo = include_directories('foo', kwarg: 'bar', include_directories: 'foo')
foo = true ? 'true' : 'false'
foo = 2 - 1 + 3 % 8 / 4 * 3
if true and false
elif false or true
elif true not false
elif foo == 12
elif (foo != 124) and (foo <= 200)
else
endif


@@ -0,0 +1,3 @@
option('with-something', type: 'boolean',
value: true,
)

samples/P4/l2.p4 Normal file

@@ -0,0 +1,329 @@
/*
Copyright 2013-present Barefoot Networks, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
/*
* Layer-2 processing
*/
header_type l2_metadata_t {
fields {
lkp_pkt_type : 3;
lkp_mac_sa : 48;
lkp_mac_da : 48;
lkp_mac_type : 16;
l2_nexthop : 16; /* next hop from l2 */
l2_nexthop_type : 1; /* ecmp or nexthop */
l2_redirect : 1; /* l2 redirect action */
l2_src_miss : 1; /* l2 source miss */
l2_src_move : IFINDEX_BIT_WIDTH; /* l2 source interface mis-match */
stp_group: 10; /* spanning tree group id */
stp_state : 3; /* spanning tree port state */
bd_stats_idx : 16; /* ingress BD stats index */
learning_enabled : 1; /* is learning enabled */
port_vlan_mapping_miss : 1; /* port vlan mapping miss */
same_if_check : IFINDEX_BIT_WIDTH; /* same interface check */
}
}
metadata l2_metadata_t l2_metadata;
#ifndef L2_DISABLE
/*****************************************************************************/
/* Spanning tree lookup */
/*****************************************************************************/
action set_stp_state(stp_state) {
modify_field(l2_metadata.stp_state, stp_state);
}
table spanning_tree {
reads {
ingress_metadata.ifindex : exact;
l2_metadata.stp_group: exact;
}
actions {
set_stp_state;
}
size : SPANNING_TREE_TABLE_SIZE;
}
#endif /* L2_DISABLE */
control process_spanning_tree {
#ifndef L2_DISABLE
if (l2_metadata.stp_group != STP_GROUP_NONE) {
apply(spanning_tree);
}
#endif /* L2_DISABLE */
}
#ifndef L2_DISABLE
/*****************************************************************************/
/* Source MAC lookup */
/*****************************************************************************/
action smac_miss() {
modify_field(l2_metadata.l2_src_miss, TRUE);
}
action smac_hit(ifindex) {
bit_xor(l2_metadata.l2_src_move, ingress_metadata.ifindex, ifindex);
}
table smac {
reads {
ingress_metadata.bd : exact;
l2_metadata.lkp_mac_sa : exact;
}
actions {
nop;
smac_miss;
smac_hit;
}
size : MAC_TABLE_SIZE;
}
/*****************************************************************************/
/* Destination MAC lookup */
/*****************************************************************************/
action dmac_hit(ifindex) {
modify_field(ingress_metadata.egress_ifindex, ifindex);
bit_xor(l2_metadata.same_if_check, l2_metadata.same_if_check, ifindex);
}
action dmac_multicast_hit(mc_index) {
modify_field(intrinsic_metadata.mcast_grp, mc_index);
#ifdef FABRIC_ENABLE
modify_field(fabric_metadata.dst_device, FABRIC_DEVICE_MULTICAST);
#endif /* FABRIC_ENABLE */
}
action dmac_miss() {
modify_field(ingress_metadata.egress_ifindex, IFINDEX_FLOOD);
#ifdef FABRIC_ENABLE
modify_field(fabric_metadata.dst_device, FABRIC_DEVICE_MULTICAST);
#endif /* FABRIC_ENABLE */
}
action dmac_redirect_nexthop(nexthop_index) {
modify_field(l2_metadata.l2_redirect, TRUE);
modify_field(l2_metadata.l2_nexthop, nexthop_index);
modify_field(l2_metadata.l2_nexthop_type, NEXTHOP_TYPE_SIMPLE);
}
action dmac_redirect_ecmp(ecmp_index) {
modify_field(l2_metadata.l2_redirect, TRUE);
modify_field(l2_metadata.l2_nexthop, ecmp_index);
modify_field(l2_metadata.l2_nexthop_type, NEXTHOP_TYPE_ECMP);
}
action dmac_drop() {
drop();
}
table dmac {
reads {
ingress_metadata.bd : exact;
l2_metadata.lkp_mac_da : exact;
}
actions {
#ifdef OPENFLOW_ENABLE
openflow_apply;
openflow_miss;
#endif /* OPENFLOW_ENABLE */
nop;
dmac_hit;
dmac_multicast_hit;
dmac_miss;
dmac_redirect_nexthop;
dmac_redirect_ecmp;
dmac_drop;
}
size : MAC_TABLE_SIZE;
support_timeout: true;
}
#endif /* L2_DISABLE */
control process_mac {
#ifndef L2_DISABLE
apply(smac);
apply(dmac);
#endif /* L2_DISABLE */
}
#ifndef L2_DISABLE
/*****************************************************************************/
/* MAC learn notification */
/*****************************************************************************/
field_list mac_learn_digest {
ingress_metadata.bd;
l2_metadata.lkp_mac_sa;
ingress_metadata.ifindex;
}
action generate_learn_notify() {
generate_digest(MAC_LEARN_RECEIVER, mac_learn_digest);
}
table learn_notify {
reads {
l2_metadata.l2_src_miss : ternary;
l2_metadata.l2_src_move : ternary;
l2_metadata.stp_state : ternary;
}
actions {
nop;
generate_learn_notify;
}
size : LEARN_NOTIFY_TABLE_SIZE;
}
#endif /* L2_DISABLE */
control process_mac_learning {
#ifndef L2_DISABLE
if (l2_metadata.learning_enabled == TRUE) {
apply(learn_notify);
}
#endif /* L2_DISABLE */
}
/*****************************************************************************/
/* Validate packet */
/*****************************************************************************/
action set_unicast() {
modify_field(l2_metadata.lkp_pkt_type, L2_UNICAST);
}
action set_unicast_and_ipv6_src_is_link_local() {
modify_field(l2_metadata.lkp_pkt_type, L2_UNICAST);
modify_field(ipv6_metadata.ipv6_src_is_link_local, TRUE);
}
action set_multicast() {
modify_field(l2_metadata.lkp_pkt_type, L2_MULTICAST);
add_to_field(l2_metadata.bd_stats_idx, 1);
}
action set_multicast_and_ipv6_src_is_link_local() {
modify_field(l2_metadata.lkp_pkt_type, L2_MULTICAST);
modify_field(ipv6_metadata.ipv6_src_is_link_local, TRUE);
add_to_field(l2_metadata.bd_stats_idx, 1);
}
action set_broadcast() {
modify_field(l2_metadata.lkp_pkt_type, L2_BROADCAST);
add_to_field(l2_metadata.bd_stats_idx, 2);
}
action set_malformed_packet(drop_reason) {
modify_field(ingress_metadata.drop_flag, TRUE);
modify_field(ingress_metadata.drop_reason, drop_reason);
}
table validate_packet {
reads {
#ifndef __TARGET_BMV2__
l2_metadata.lkp_mac_sa mask 0x010000000000 : ternary;
#else
l2_metadata.lkp_mac_sa : ternary;
#endif
l2_metadata.lkp_mac_da : ternary;
l3_metadata.lkp_ip_type : ternary;
l3_metadata.lkp_ip_ttl : ternary;
l3_metadata.lkp_ip_version : ternary;
#ifndef __TARGET_BMV2__
ipv4_metadata.lkp_ipv4_sa mask 0xFF000000 : ternary;
#else
ipv4_metadata.lkp_ipv4_sa : ternary;
#endif
#ifndef IPV6_DISABLE
#ifndef __TARGET_BMV2__
ipv6_metadata.lkp_ipv6_sa mask 0xFFFF0000000000000000000000000000 : ternary;
#else
ipv6_metadata.lkp_ipv6_sa : ternary;
#endif
#endif /* IPV6_DISABLE */
}
actions {
nop;
set_unicast;
set_unicast_and_ipv6_src_is_link_local;
set_multicast;
set_multicast_and_ipv6_src_is_link_local;
set_broadcast;
set_malformed_packet;
}
size : VALIDATE_PACKET_TABLE_SIZE;
}
control process_validate_packet {
if (ingress_metadata.drop_flag == FALSE) {
apply(validate_packet);
}
}
/*****************************************************************************/
/* Egress BD lookup */
/*****************************************************************************/
action set_egress_bd_properties() {
}
table egress_bd_map {
reads {
egress_metadata.bd : exact;
}
actions {
nop;
set_egress_bd_properties;
}
size : EGRESS_BD_MAPPING_TABLE_SIZE;
}
control process_egress_bd {
apply(egress_bd_map);
}
/*****************************************************************************/
/* Egress VLAN decap */
/*****************************************************************************/
action remove_vlan_single_tagged() {
modify_field(ethernet.etherType, vlan_tag_[0].etherType);
remove_header(vlan_tag_[0]);
}
action remove_vlan_double_tagged() {
modify_field(ethernet.etherType, vlan_tag_[1].etherType);
remove_header(vlan_tag_[0]);
remove_header(vlan_tag_[1]);
}
table vlan_decap {
reads {
vlan_tag_[0] : valid;
vlan_tag_[1] : valid;
}
actions {
nop;
remove_vlan_single_tagged;
remove_vlan_double_tagged;
}
size: VLAN_DECAP_TABLE_SIZE;
}
control process_vlan_decap {
apply(vlan_decap);
}

samples/P4/mirror_acl.p4 Normal file

@@ -0,0 +1,39 @@
// Copyright 2015, Barefoot Networks, Inc.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
action set_mirror_id(session_id) {
clone_ingress_pkt_to_egress(session_id);
}
table mirror_acl {
reads {
ingress_metadata.if_label : ternary;
ingress_metadata.bd_label : ternary;
/* ip acl */
ingress_metadata.lkp_ipv4_sa : ternary;
ingress_metadata.lkp_ipv4_da : ternary;
ingress_metadata.lkp_ip_proto : ternary;
/* mac acl */
ingress_metadata.lkp_mac_sa : ternary;
ingress_metadata.lkp_mac_da : ternary;
ingress_metadata.lkp_mac_type : ternary;
}
actions {
nop;
set_mirror_id;
}
size : INGRESS_MIRROR_ACL_TABLE_SIZE;
}


@@ -0,0 +1,12 @@
create or replace procedure print_bool(
p_bool in BOOLEAN,
p_true_value in varchar2 default 'TRUE',
p_false_value in varchar2 := 'FALSE'
)
as
begin
dbms_output.put_line(case when p_bool then p_true_value else p_false_value end);
end print_bool;
/

samples/PLSQL/videodb.ddl Normal file

@@ -0,0 +1,48 @@
CREATE TABLE users (
user_name varchar2(40),
first_name varchar2(40),
last_name varchar2(40),
email varchar2(40),
password varchar2(40),
created_date DATE,
total_credits NUMBER,
credit_change_date DATE,
PRIMARY KEY (user_name)
);
/
CREATE TABLE users_videos (
video_id NUMBER,
video_name varchar2(40),
user_name varchar2(40),
description varchar2(512),
upload_date DATE,
PRIMARY KEY (video_id),
CONSTRAINT "USERS_VIDEOS_FK1" FOREIGN KEY ("USER_NAME") REFERENCES "USERS"("USER_NAME")
);
/
create or replace procedure print_user_videos(
p_user_name in users.user_name%type
)
AUTHID DEFINER
as
type t_user_videos is table of users_videos%rowtype
index by pls_integer;
l_videos t_user_videos;
begin
select *
bulk collect into l_videos
from users_videos
where user_name = p_user_name;
for i in 1..l_videos.COUNT
loop
dbms_output.put_line(l_videos(i).video_name);
end loop;
end print_user_videos;
/

samples/Pan/ceph-raid.pan Normal file

@@ -0,0 +1,59 @@
unique template site/filesystems/ceph-raid;
prefix '/system/blockdevices';
variable CEPH_OSD_DISKS = {
# SAS disks partitions
disks = list();
foreach (disk; data; value('/hardware/harddisks')) {
if (data['capacity'] > 1000 * GB) {
append(disks, disk);
};
};
disks;
};
'partitions' = {
foreach (idx; disk; CEPH_OSD_DISKS) {
partitions_add(
disk, dict(
format('%s1', disk), 10 * GB,
format('%s2', disk), 5 * GB,
format('%s3', disk), -1));
SELF[format('%s1', disk)]['offset'] = 1;
};
SELF;
};
#raid for data
'md' = {
for (i = 0; i < length(CEPH_OSD_DISKS); i = i + 2) {
for (j = 2; j <= 3; j = j + 1) {
SELF[escape(format('md/%s0%s%d', CEPH_OSD_DISKS[i], CEPH_OSD_DISKS[i+1], j ))] = dict(
"device_list", list(format('partitions/%s%d', CEPH_OSD_DISKS[i], j), format('partitions/%s%d', CEPH_OSD_DISKS[i+1], j)),
"raid_level", 'RAID0',
"metadata", '1.2',
);
};
};
SELF;
};
# ceph OSD and journal fs
'/system/filesystems' = {
# ga over software raids..
foreach (disk; data; value('/system/blockdevices/md')) { #check for data part/disk
if (match(unescape(disk), '^md/.+0.+3$')) {
append(merge(CEPH_FSOPTS_BASE, CEPH_DISK_OPTIONS[CEPH_FS], dict(
'mountpoint', format('/var/lib/ceph/osd/%s', replace('md/([a-z0A-Z]+)[0-9]*$', '$1', unescape(disk))),
'block_device', format('md/%s', disk),
)));
} else if (match(unescape(disk), '^md/.+0.+2$')) {
append(merge(CEPH_FSOPTS_DUMMY, dict(
'mountpoint', format('/dummy/%s', unescape(disk)),
'block_device', format('md/%s', disk)
)));
};
};
SELF;
};

samples/Pan/cluster-A.pan Normal file

@@ -0,0 +1,11 @@
structure template site/nagios/hosts/cluster-A;
# let Nagios server A monitor B
# just an example to make the templates compile
"nagios-slave-B.example.org" = create (NAGIOS_QUATTOR_HOST);
"nagios-slave-B.example.org/alias" = "slave B";
"nagios-slave-B.example.org/hostgroups" = list( "quattor-nodes" );
# "another-host-in-A.example.org" = create (NAGIOS_QUATTOR_HOST);
# "another-host-in-A.example.org/alias" = "another monitored host in cluster A";

samples/Pan/databases.pan Normal file

@@ -0,0 +1,18 @@
template site/databases;
# Defines the mapping between the full hostname and the IP
# address.
final variable DB_IP = dict(
escape("one"), "192.168.0.24",
escape("hyp01"), "192.168.0.25",
escape("vm"), "192.168.0.26",
);
# Defines the mapping between the full hostname and the
# physical machine.
# A different hardware template must be used for each machine
final variable DB_MACHINE = dict(
escape("one"), "hardware/machine/ibm/x3550/x_KDXXXX",
escape("hyp01"), "hardware/machine/ibm/hs21xm/blade_99HXXXX",
escape("vm"), "hardware/machine/one/example",
);

samples/Pan/functions.pan Normal file

@@ -0,0 +1,56 @@
################################################################################
# This is 'namespaces/standard/pan/functions.tpl', a pan-templates's file
################################################################################
#
# VERSION: 3.2.7, 21/08/09 22:22
# AUTHOR: Martin Bock
# MAINTAINER: Example Maintainer <support@example.org>
# LICENSE: http://cern.ch/eu-datagrid/license.html
#
################################################################################
# Coding style: emulate <TAB> characters with 4 spaces, thanks!
################################################################################
#
# Function definitions
#
################################################################################
declaration template pan/functions;
include 'pan/types';
############################################################
##=
## @function push
## @# push zero or more values onto the end of a list.
##+If the list does not exist or is not defined a new list is
##+created.
## @syntax value:element
## @param:value... the values to push onto list
## @example
##+# "/data" will contain list (1,2,3,4)
##+"/data" = list(1,2);
##+"/data" = push(3,4);
##=
############################################################
function push = {
# Get the reference to SELF or create an empty list
# as necessary.
if (exists(SELF) && is_list(SELF)) {
v = SELF;
} else if (!exists(SELF) || !is_defined(SELF)) {
v = list();
} else {
error("push can only be applied to a list");
};
# Merge the arguments into the given array. Neither the
# first/next or merge functions can be used because the
# ARGV array cannot be directly referenced.
i = 0;
while (i < ARGC) {
v[length(v)] = ARGV[i];
i = i + 1;
};
v;
};


@@ -0,0 +1,22 @@
unique template site/ceph/server/infernalis;
include 'components/dirperm/config';
"/software/components/dirperm/paths" = {
foreach (idx; mp; value('/system/filesystems')) {
if (match(mp['mountpoint'], format('^%s', CEPH_OSD_MP_BASE))) {
append(SELF, dict(
"path", mp['mountpoint'],
"owner", "ceph:ceph",
"perm", "0755",
"type", "d",
));
};
};
SELF;
};
include 'common/sysctl/service';
prefix "/software/components/metaconfig/services/{/etc/sysctl.conf}/contents";
'kernel.pid_max' = 4194303;

samples/Pan/libvirt.pan Normal file

@@ -0,0 +1,20 @@
unique template site/ceph/client/libvirt;
include 'site/ceph/client/config';
variable CEPH_LIBVIRT_USER ?= 'oneadmin';
variable CEPH_LIBVIRT_GROUP ?= CEPH_LIBVIRT_USER;
prefix '/software/components/metaconfig/services/{/etc/ceph/ceph.client.libvirt.keyring}';
"contents" = if (is_defined(CEPH_LIBVIRT_SECRET)) {
dict("client.libvirt", dict(
"key", CEPH_LIBVIRT_SECRET,
)
);
} else {
dict();
};
'module' = 'tiny';
'mode' = 0600;
'owner' = CEPH_LIBVIRT_USER;
'group' = CEPH_LIBVIRT_GROUP;

19
samples/Pan/link.pan Normal file
View File

@@ -0,0 +1,19 @@
unique template site/dcache/link;
include 'components/dcache/config';
## links
## default preference value
"/software/components/dcache/link/def_pref" = "10";
## list of links that will be ignored during configuration
"/software/components/dcache/link/ignore_link" = list();
##
"/software/components/dcache/link/links" = dict(
## out_buf_write: all outside to write to the storage through this buffer
"out", dict("ugroup", list("all_net", "any_store"), "pgroup", list("out_buf"), "read", "10", "write", "10", "cache", "10"),
"in", dict("ugroup", list("in_net", "any_store"), "pgroup", list("priv"), "read", "20", "write", "20", "cache", "20"),
"dteam", dict("ugroup", list("dteam_store"), "pgroup", list("out_buf"), "read", "10", "write", "10", "cache", "10"),
"ops", dict("ugroup", list("ops_store"), "pgroup", list("out_buf"), "read", "10", "write", "10", "cache", "10"),
"cms", dict("ugroup", list("cms_store"), "pgroup", list("out_buf"), "read", "10", "write", "10", "cache", "10"),
"test", dict("ugroup", list("test_store"), "pgroup", list("behar_test"), "read", "10", "write", "10", "cache", "10"),
);

samples/Pan/mysql.pan Normal file

@@ -0,0 +1,29 @@
unique template common/opennebula/mysql;
prefix "/software/packages";
"{mysql-server}" = dict();
include 'components/mysql/config';
prefix "/software/components/mysql";
"serviceName" = {
if (RPM_BASE_FLAVOUR_VERSIONID == 7) {
"mariadb";
} else {
"mysqld";
};
};
prefix "/software/components/mysql/servers/one";
"host" = FULL_HOSTNAME; # localhost is added by component
"adminpwd" = OPENNEBULA_MYSQL_ADMIN;
"adminuser" = "root";
prefix "/software/components/mysql/databases/opennebula";
"server" = "one";
"users/oneadmin/password" = OPENNEBULA_MYSQL_ONEADMIN;
"users/oneadmin/rights" = list("ALL PRIVILEGES");
"createDb" = false; # if false, run script
"initScript/file" = "/dev/null";
prefix "/software/components/chkconfig/service";
"mysqld" = dict("on", "", "startstop", true);


@@ -0,0 +1,18 @@
template config/nodes_properties;
variable SITES ?= list('example');
#variable NEW_NODES_PROPS ?= {
variable NODES_PROPS = {
nodes_add = dict();
nodes_props = dict();
allsites = SITES;
ok = first(allsites, k, v);
while (ok) {
nodes_add = merge(create(format("config/%s_nodes_properties", v)), nodes_props);
nodes_props = merge(nodes_add[v], nodes_props);
ok = next(allsites, k, v);
};
nodes_props;
};

samples/Pan/onevm.pan Normal file

@@ -0,0 +1,14 @@
unique template site/one/onevm;
include 'components/chkconfig/config';
# set opennebula map
include 'quattor/aii/opennebula/schema';
bind "/system/opennebula" = opennebula_vmtemplate;
include 'site/config-vm';
include 'quattor/aii/opennebula/default';
"/software/packages/{acpid}" = dict();
"/software/components/chkconfig/service/acpid" = dict('on', '', 'startstop', true);

samples/Pan/osd-fetch.pan Normal file

@@ -0,0 +1,26 @@
unique template site/ceph/osdschemas/osd-fetch;
prefix '/software/components/ceph/clusters/ceph';
variable FETCHED_OSDS = {
t = dict();
rep = 2;
foreach(idx; host; CEPH_NODES) {
prof = replace('.data$', '.os', host);
d = value(format('%s:/software/components/ceph/localdaemons/osds', prof));
t[shorten_fqdn(host)] = dict(
'fqdn', host,
'osds', d
);
numosd = length(d);
if (numosd > rep){
rep = numosd;
};
};
all = dict('osdhosts', t, 'maxosd', rep);
};
'osdhosts' = FETCHED_OSDS['osdhosts'];
variable CEPH_OSD_DOWN_REPORTERS ?= FETCHED_OSDS['maxosd'] + 2;
variable CEPH_OSD_DOWN_REPORTS ?= CEPH_OSD_DOWN_REPORTERS + CEPH_OSD_DOWN_REPORTERS / 4 + 1;

samples/Pan/pakiti.pan Normal file

@@ -0,0 +1,45 @@
#
# Generated by RepositoryTask on 12/09/13 15:41
#
# name = pakiti
# owner = support@example.org
# url = http://quattor.web.lal.in2p3.fr/packages/pakiti
#
structure template repository/pakiti;
"name" = "pakiti";
"owner" = "support@example.org";
"protocols" = list(
dict("name", "http",
"url", "http://quattor.web.lal.in2p3.fr/packages/pakiti")
);
"contents" = dict(
# pkg = pakiti-client-2.1.4-1-noarch
escape("pakiti-client-2.1.4-1-noarch"), dict("name", "pakiti-client", "version", "2.1.4-1", "arch", "noarch"),
# pkg = pakiti-client-2.1.4-2-noarch
escape("pakiti-client-2.1.4-2-noarch"), dict("name", "pakiti-client", "version", "2.1.4-2", "arch", "noarch"),
# pkg = pakiti-client-2.1.4-3-noarch
escape("pakiti-client-2.1.4-3-noarch"), dict("name", "pakiti-client", "version", "2.1.4-3", "arch", "noarch"),
# pkg = pakiti-client-2.1.4-4-noarch
escape("pakiti-client-2.1.4-4-noarch"), dict("name", "pakiti-client", "version", "2.1.4-4", "arch", "noarch"),
# pkg = pakiti-client-2.1.5-0-noarch
escape("pakiti-client-2.1.5-0-noarch"), dict("name", "pakiti-client", "version", "2.1.5-0", "arch", "noarch"),
# pkg = pakiti-client-manual-2.1.4-2-noarch
escape("pakiti-client-manual-2.1.4-2-noarch"), dict("name", "pakiti-client-manual", "version", "2.1.4-2", "arch", "noarch"),
# pkg = pakiti-client-manual-2.1.4-3-noarch
escape("pakiti-client-manual-2.1.4-3-noarch"), dict("name", "pakiti-client-manual", "version", "2.1.4-3", "arch", "noarch"),
# pkg = pakiti-client-manual-2.1.4-4-noarch
escape("pakiti-client-manual-2.1.4-4-noarch"), dict("name", "pakiti-client-manual", "version", "2.1.4-4", "arch", "noarch"),
# pkg = pakiti-server-2.1.4-1-noarch
escape("pakiti-server-2.1.4-1-noarch"), dict("name", "pakiti-server", "version", "2.1.4-1", "arch", "noarch"),
# pkg = pakiti-server-2.1.4-2-noarch
escape("pakiti-server-2.1.4-2-noarch"), dict("name", "pakiti-server", "version", "2.1.4-2", "arch", "noarch"),
# pkg = pakiti-server-2.1.4-3-noarch
escape("pakiti-server-2.1.4-3-noarch"), dict("name", "pakiti-server", "version", "2.1.4-3", "arch", "noarch"),
# pkg = pakiti-server-2.1.4-4-noarch
escape("pakiti-server-2.1.4-4-noarch"), dict("name", "pakiti-server", "version", "2.1.4-4", "arch", "noarch"),
# pkg = pakiti-server-2.1.5-0-noarch
escape("pakiti-server-2.1.5-0-noarch"), dict("name", "pakiti-server", "version", "2.1.5-0", "arch", "noarch"),
);


@@ -0,0 +1,30 @@
# Template installing a script to remove all accounts with 'fqan' in
# their name. Used after fixing VOConfigTask in SCDB 2.3.2 to remove
# obsolete accounts not removed by ncm-accounts.
#
# The script is added and executed only on nodes where NODE_VO_ACCOUNTS
# is true. It is intended to be run as GLITE_BASE_CONFIG_SITE (define
# this variable to the script namespace).
#
# Michel Jouvin - 13/9/09
unique template site/misc/purge_fqan_accounts;
variable LAL_PURGE_ACCOUNTS_SCRIPT = '/tmp/purge_fqan_accounts';
include 'components/filecopy/config';
'/software/components/filecopy/services' = {
if ( is_defined(NODE_VO_ACCOUNTS) && NODE_VO_ACCOUNTS ) {
debug('Adding purge_fqan_accounts');
SELF[escape(LAL_PURGE_ACCOUNTS_SCRIPT)] = dict(
'config', file_contents('site/misc/purge_fqan_accounts.sh'),
'owner', 'root:root',
'perms', '0755',
'restart', LAL_PURGE_ACCOUNTS_SCRIPT,
);
} else {
debug(format('VO accounts disabled (NODE_VO_ACCOUNTS=%s', NODE_VO_ACCOUNTS));
};
SELF;
};

samples/Pan/resources.pan Normal file

@@ -0,0 +1,30 @@
unique template site/one/resources;
# datastores templates
prefix "/software/components/opennebula/datastores/0";
"name" = "ceph.example";
"bridge_list" = list(FULL_HOSTNAME); # for now, do this from the headnode
"ceph_host" = CEPH_MON_HOSTS;
"ceph_secret" = CEPH_LIBVIRT_UUID;
"ceph_user" = "libvirt";
"ceph_user_key" = CEPH_LIBVIRT_SECRET;
"datastore_capacity_check" = true;
"pool_name" = "one";
"type" = "IMAGE_DS";
"rbd_format" = 2;
prefix "/software/components/opennebula/datastores/1";
"name" = "nfs.example";
"datastore_capacity_check" = true;
"ds_mad" = "fs";
"tm_mad" = "shared";
"type" = "IMAGE_DS";
# untouchables resources
prefix "/software/components/opennebula/untouchables";
"datastores" = list('system');
# extra conf
prefix "/software/components/opennebula";
"ssh_multiplex" = true;
"tm_system_ds" = "ssh";

samples/Pan/simple.pan Normal file

@@ -0,0 +1,20 @@
unique template site/ceph/osdlocal/simple;
variable CEPH_JOURNAL_PART ?= dict();
prefix '/software/components/ceph';
'localdaemons/osds' = {
d = dict();
foreach(idx; osdmnt; value('/system/filesystems')) {
part = osdmnt['block_device'];
disk = replace('\S+/([a-zA-Z]+)[0-9]*$', '$1', part);
if (match(osdmnt['mountpoint'], '/var/lib/ceph/osd/\w+')){
d[escape(osdmnt['mountpoint'])] = dict(
'journal_path', format('/dev/%s%d', disk, CEPH_JOURNAL_PART['data']),
'crush_weight', weight_of(part),
);
};
};
d;
};

samples/Pan/types.pan Normal file

@@ -0,0 +1,151 @@
@contributor{
name = First Contributor
email = first@example.org
}
@contributor{
name = Second Contributor
email = second@example.org
}
@documentation{
Data type and function definitions for basic types
}
declaration template pan/types;
include 'pan/legacy';
@documentation{
This type implements a date/time format consistent with
ASN.1 typically used by LDAP. The actual specification is the
"GeneralizedTime" format as specified on page 38 of the X.208
ITU-T recommendation and references within.
Ex: 20040825120123Z
20040825120123+0100
20040825120123,5
20040825120123.5
20040825120123.5-0123
}
function is_asndate = {
# Check cardinality and type of argument.
if (ARGC != 1 || !is_string(ARGV[0]))
error("usage: is_asndate(string)");
# Match the datetime pattern, extracting interesting fields.
result = matches(ARGV[0],
'^(\d{4})(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})(?:[,\.](\d+))?([Zz]|(?:[-+]\d{2}\d{2}))?$');
if (length(result) >= 7) {
# Do further tests on various components of the date.
# NOTE: the to_long(to_double(x)) construct below is to avoid having
# the to_long function treat strings with leading zeros as octal
# numbers. E.g. to_long("09") will throw an exception because '9' is
# not a valid octal digit.
year = to_long(result[1]);
month = to_long(to_double(result[2]));
day = to_long(to_double(result[3]));
hour = to_long(to_double(result[4]));
minute = to_long(to_double(result[5]));
second = to_long(to_double(result[6]));
frac = 0;
if (length(result) > 7) {
frac = to_long(to_double(result[7]));
};
zone = '+0000';
if (length(result) > 8) {
zone = result[8];
};
# Check the range of months.
if (month < 1 || month > 12) {
error("is_asndate: invalid month");
return(false);
};
# Check the range of days.
if (day < 1 || day > 31) {
error("is_asndate: invalid day");
return(false);
};
# Be more specific on the days in each month.
if (month == 4 || month == 6 || month == 9 || month == 11) {
if (day > 30) {
error("is_asndate: invalid day");
};
};
# February is always a bother. Too lazy to check that the leap
# years have been specified correctly.
if (month == 2 && day > 29) {
error("is_asndate: invalid day");
};
# Check the time.
if (hour > 23) {
error("is_asndate: invalid hour");
return(false);
};
if (minute > 59) {
error("is_asndate: invalid minute");
return(false);
};
# Allow for leap seconds here (since it is easy).
if (second > 60) {
error("is_asndate: invalid minute");
return(false);
};
# Check the time zone format.
if (zone != "Z" && zone != "z") {
tz = matches(zone, '^[-+](\d{2})(\d{2})$');
hoffset = to_long(to_double(tz[1]));
moffset = to_long(to_double(tz[2]));
if (hoffset >= 12) {
error("is_asndate: invalid hour offset in time zone");
return(false);
};
if (moffset > 59) {
error("is_asndate: invalid minute offset in time zone");
return(false);
};
};
} else {
error("is_asndate: invalid format for time");
return(false);
};
# If it gets to this point, then the date must be OK.
true;
};
type type_asndate = string with {
is_asndate(SELF);
};
@documentation{
desc = Type that enforces the existence of a named interface.
}
type valid_interface = string with {
if (exists(format('/system/network/interfaces/%s', SELF))) {
return(true);
};
foreach(ifc; attr; value('/system/network/interfaces')) {
if (attr['device'] == SELF){
return(true);
};
};
false;
};
@documentation{
desc = CPU architectures understood by Quattor
}
type cpu_architecture = string with match (SELF, '^(i386|ia64|x86_64|sparc|aarch64|ppc64(le)?)$');
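For readers unfamiliar with Pan, the range checks in `is_asndate` above translate almost one-for-one into any language. A rough Python sketch of the same validation (the exact parse pattern sits above this excerpt, so an ISO-8601-like layout is assumed here for illustration; the function name is mine):

```python
import re

def is_asndate_like(date):
    """Mirror the range checks from the Pan is_asndate function:
    month/day ranges, leap-second allowance, and a +/-HHMM zone."""
    m = re.match(r'^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})'
                 r'(?:\.(\d+))?(Z|z|[-+]\d{4})?$', date)
    if not m:
        return False
    year, month, day, hour, minute, second = (int(m.group(i)) for i in range(1, 7))
    zone = m.group(8) or '+0000'
    if not 1 <= month <= 12:
        return False
    if not 1 <= day <= 31:
        return False
    if month in (4, 6, 9, 11) and day > 30:
        return False
    if month == 2 and day > 29:      # leap years not checked, as in the original
        return False
    if hour > 23 or minute > 59 or second > 60:  # 60 allows leap seconds
        return False
    if zone not in ('Z', 'z'):
        hoffset, moffset = int(zone[1:3]), int(zone[3:5])
        if hoffset >= 12 or moffset > 59:
            return False
    return True
```

Note the same `hoffset >= 12` cutoff as the Pan original, so `+1200` is rejected there too.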

32
samples/Pan/unit.pan Normal file

@@ -0,0 +1,32 @@
unique template site/dcache/unit;
include 'components/dcache/config';
## unit/ugroups
## list of ugroups that will be ignored during configuration
"/software/components/dcache/unit/ignore_ugroup" = list();
"/software/components/dcache/unit/units" = dict(
"protocol", list(
dict("cond", "*/*", "ugroup", list("default_protocol"))
),
"net", list(
dict("cond", "192.168.0.0/255.255.0.0", "ugroup", list("in_net", "all_net")),
dict("cond", "192.168.10.0/255.255.255.0", "ugroup", list("in_server", "in_net", "all_net")),
dict("cond", "192.168.11.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "192.168.12.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "192.168.13.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "192.168.14.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "192.168.15.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "192.168.16.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "192.168.17.0/255.255.255.0", "ugroup", list("in_wn", "in_net", "all_net")),
dict("cond", "0.0.0.0/0.0.0.0", "ugroup", list("all_net")),
),
"store", list(
dict("cond", "*@*", "ugroup", list("any_store")),
dict("cond", "myStore:STRING@osm", "ugroup", list("default_store", "any_store")),
dict("cond", "dteam:dteam-base@osm", "ugroup", list("dteam_store", "any_store")),
dict("cond", "ops:ops-base@osm", "ugroup", list("ops_store", "any_store")),
dict("cond", "cms:cms-base@osm", "ugroup", list("cms_store", "any_store")),
dict("cond", "test:cms-test@osm", "ugroup", list("test_store")),
),
);

34
samples/Pep8/div.pep Normal file

@@ -0,0 +1,34 @@
main: SUBSP 8, i
DECI 0, s
DECI 2, s
CALL div
DECO 4, s
CHARO '\n', i
DECO 6, s
CHARO '\n', i
STOP
; Divides two numbers following the Euclidean method
;
; Parameters:
; SP + 2: Dividend
; SP + 4: Divider
; Returns:
; SP + 6: Quotient
; SP + 8: Remainder
div: LDX 0, i
LDA dividend, s
divlp: CPA divider, s
BRLT divout
ADDX 1, i
SUBA divider, s
BR divlp
divout: STX quot, s
STA rem, s
RET0
dividend: .EQUATE 2
divider: .EQUATE 4
quot: .EQUATE 6
rem: .EQUATE 8
.END
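The `div` routine above divides by repeated subtraction, with register X accumulating the quotient. A minimal Python equivalent of the same loop (illustrative; assumes non-negative operands, as the assembly does):

```python
def euclid_div(dividend, divider):
    """Division by repeated subtraction, as in the div routine."""
    quotient = 0                  # X register in the Pep/8 version
    while dividend >= divider:    # CPA divider, s / BRLT divout
        dividend -= divider       # SUBA divider, s
        quotient += 1             # ADDX 1, i
    return quotient, dividend     # quotient, remainder
```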

23
samples/Pep8/flag.pep Normal file

@@ -0,0 +1,23 @@
_start: LDA 0,i
LDX 0,i
LDA 20, i
ADDA 51, i
CPA 0,i
BRLT s3
BR s4
s1: LDBYTEA s3, x
NOTA
STBYTEA s3, x
ADDX 1,i
CPX 12, i
BRNE s1
s2: STOP
s4: LDA 31, d
LDX 50, d
RET0
STOP
s3: CPX -27746, d
ANDX -8241, i
SUBA -12337, sxf
LDX -12289, sx
.END

675
samples/Pep8/linked.pep Normal file

@@ -0,0 +1,675 @@
; Linked list of integers API
;
; Contains the basis of the structure and a
; variety of available functions to call on it.
;
; Calling conventions:
;
; - When the number of arguments is <= 2, the fastcall convention will be used:
; Arguments will be passed by registers, no assumption is made concerning the
; state of the registers during execution, they will need to be saved.
;
; - When the number of arguments exceeds 2, the cdecl convention will be used:
; Arguments will be passed on the stack, no assumption is made concerning the
; state of the registers during execution, they will need to be saved.
; Simple test program, do not include when using the library
main: SUBSP 4, i
DECI mnelmt, s
CALL newlst
LDX mnlst, s
CALL lstgetst
LDX mnlst, s
CALL lstsetst
LDX mnlst, s
CALL lstgetst
LDX mnlst, s
CALL shftest
LDX mnlst, s
CALL ushftest
LDX mnlst, s
CALL shftest
ADDSP 4, i
STOP
; Pointer to the list
mnlst: .EQUATE 0
; Element read
mnelmt: .EQUATE 2
; TESTS
; Simple test for the get operation
; Gets the first element of the list and prints it
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
lstgetst: SUBSP 2, i
LDA 0, i
CALL lstget
STA 0, s
DECO 0, s
CHARO '\n', i
ADDSP 2, i
RET0
; Test for the set operation
; Sets the first element of the list to a given value
; The value is read from stdin
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
lstsetst: SUBSP 6, i
STX 0, s
DECI 4, s
LDA 0, i
STA 2, s
CALL lstset
ADDSP 6, i
RET0
; Tests shift operation on a list
; Gets the last element of the list and prints it
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
shftest: SUBSP 2, i
CALL lstshft
STA 0, s
DECO 0, s
CHARO '\n', i
ADDSP 2, i
RET0
; Tests unshift operation on a list
; Unshifts a new element read from keyboard
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; void
ushftest: SUBSP 2, i
DECI 0, s
LDA 0, s
CALL lstunshf
ADDSP 2, i
RET0
; LIBRARY
; Creates a new list with `element` as head
;
; Parameters:
; SP + 4: Element
;
; Returns:
; SP + 2: Pointer to the list
newlst: LDA lstlen, i
CALL new
STX 2, s
CALL newnode
SUBSP 2, i
STX 0, s
LDX nodeelmt, i
LDA 6, s
STA 0, sxf
LDA 0, s
LDX lsthead, i
STA 4, sxf
ADDSP 2, i
RET0
; Gets a node at specified index in a list
;
; Parameters:
; - A: Index
; - X: Pointer to the list
;
; Returns:
; - A: Error code (0 if no error was produced)
; - X: Pointer to the node
;
; Errors:
; -1: Index < 0
; -2: Index >= list.length
nodeat: SUBSP 10, i
STA ndaind, s
STX ndalst, s
LDX lsthead, i
LDA ndalst, sxf
STA ndanode, s
LDA ndaind, s
CPA 0, i
LDA 0, i
STA ndacurri, s
BRGE ndagez
LDA -1, i
ADDSP 10, i
RET0
ndagez: LDX ndalst, s
CALL listlen
STA ndalstln, s
LDA ndaind, s
CPA ndalstln, s
BRLT ndalp
LDA -2, i
ADDSP 10, i
RET0
ndalp: LDA ndacurri, s
CPA ndaind, s
BREQ ndaout
LDX nodenxt, i
LDA ndanode, sxf
STA ndanode, s
LDA ndacurri, s
ADDA 1, i
STA ndacurri, s
BR ndalp
ndaout: LDX ndanode, s
LDA 0, i
ADDSP 10, i
RET0
ndaind: .EQUATE 0
ndanode: .EQUATE 2
ndalst: .EQUATE 4
ndalstln: .EQUATE 6
ndacurri: .EQUATE 8
; Length of the list passed as a parameter
;
; Parameters:
; - X: List
;
; Returns:
; - A: Length
listlen: SUBSP 4, i
STX lenode, s
LDX lenode, sf
STX lenode, s
LDA 0, i
STA lencpt, s
llenlp: LDA lenode, s
CPA 0, i
BREQ lenout
LDA lencpt, s
ADDA 1, i
STA lencpt, s
LDX nodenxt, i
LDA lenode, sxf
STA lenode, s
BR llenlp
lenout: LDA lencpt, s
ADDSP 4, i
RET0
lenode: .EQUATE 0
lencpt: .EQUATE 2
; Gets an element in a list at a specified index
;
; Parameters:
; - A: Index
; - X: Address of the list
;
; Returns:
; - A: Element value
;
; Error:
; If out of bounds, prints an error message and stops the program
lstget: SUBSP 2, i
STA 0, s
CALL nodeat
CPA 0, i
BRNE getoob
LDA 0, x
ADDSP 2, i
RET0
; Out of bounds
getoob: STRO getstrob, d
DECO 0, s
CHARO '\n', i
STOP
; String for out of bounds error
getstrob: .ASCII "Invalid index on get, index = \x00"
; Sets an element in a list at a specified index to a new value
;
; Parameters:
; - SP + 2: Pointer to the list
; - SP + 4: Index
; - SP + 6: Element
;
; Returns:
; - A: 0 if all went well, an error code otherwise (analogous to the error codes in nodeat)
lstset: CHARO '\n', i
DECO lstsetlp, s
CHARO ' ', i
DECO lstsetin, s
CHARO ' ', i
DECO lstsetel, s
CHARO '\n', i
SUBSP 2, i
LDX lstsetlp, s
LDA lstsetin, s
CALL nodeat
CPA 0, i
BRNE lstsetrt
STX lstsetnp, s
LDA lstsetel, s
LDX nodeelmt, i
STA lstsetnp, sxf
LDA 0, i
lstsetrt: ADDSP 2, i
RET0
; Pointer to the list
lstsetlp: .EQUATE 4
; Element to set the value at
lstsetel: .EQUATE 8
; Index of the node
lstsetin: .EQUATE 6
; Pointer to the node
lstsetnp: .EQUATE 0
; Removes the first element of the list in parameter and returns its value
;
; REQUIRES: Non-empty list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; - A: Element removed
lstshft: SUBSP 8, i
STX lshflp, s
LDX lsthead, i
LDA lshflp, sxf
CPA 0, i
BREQ shfterr
STA lshfohd, s
LDX nodenxt, i
LDA lshfohd, sxf
STA lshfnhd, s
LDX lsthead, i
STA lshflp, sxf
LDX nodeelmt, i
LDA lshfohd, sxf
ADDSP 8, i
RET0
shfterr: STRO shfterrm, d
STOP
; Pointer to the list
lshflp: .EQUATE 0
; Pointer to the old head
lshfohd: .EQUATE 2
; Old head's element
lshfhdel: .EQUATE 4
; Pointer to the new head
lshfnhd: .EQUATE 6
; Error message on shift
shfterrm: .ASCII "Cannot do shift on empty list.\n\x00"
; Inserts a new element at the beginning of a list
;
; Parameters:
; - X: Pointer to the list
; - A: Element to add to the list
;
; Returns:
; - A: Error code, 0 if all right, a code otherwise
lstunshf: SUBSP 8, i
STA lunshelm, s
STX lunslp, s
CALL newnode
STX lunsnhd, s
LDX lsthead, i
LDA lunslp, sxf
STA lunsohd, s
LDX nodenxt, i
LDA lunsohd, s
STA lunsnhd, sxf
LDA lunshelm, s
LDX nodeelmt, i
STA lunsohd, sxf
LDX lsthead, i
LDA lunsnhd, s
STA lunslp, sxf
ADDSP 8, i
RET0
; Pointer to the list
lunslp: .EQUATE 0
; Pointer to the old head
lunsohd: .EQUATE 2
; Pointer to the new head
lunsnhd: .EQUATE 4
; Element to add
lunshelm: .EQUATE 6
; Finds whether or not an element is present in a list
;
; Parameters:
; - X: Pointer to the list
; - A: Element to be found
;
; Returns:
; - A: 0 if element was not found, 1 if it was
lstfnd: SUBSP 6, i
STX lstfndlp, s
STA lstfndel, s
LDX lsthead, i
LDA lstfndlp, sxf
STA lstfndnd, s
fndloop: CPA 0, i
BREQ notfnd
LDX nodeelmt, i
LDA lstfndnd, sxf
CPA lstfndel, s
BREQ found
LDX nodenxt, i
LDA lstfndnd, sxf
STA lstfndnd, s
BR fndloop
notfnd: LDA 0, i
ADDSP 6, i
RET0
found: LDA 1, i
ADDSP 6, i
RET0
; Pointer to the list
lstfndlp: .EQUATE 0
; Element to search
lstfndel: .EQUATE 2
; Current node
lstfndnd: .EQUATE 4
; Pushes a new element at the end of the list
;
; Parameters:
; - X: Pointer to the list
; - A: Element to push
;
; Returns:
; - A: 0 if all went well, an error code otherwise
lstpsh: SUBSP 8, i
STX lpshlp, s
STA lpshel, s
CALL newnode
STX lpshnd, s
LDX lpshlp, s
CALL listlen
CPA 0, i
BREQ lpshshft
SUBA 1, i
LDX lpshlp, s
CALL nodeat
STX lpshlnd, s
LDX nodenxt, i
LDA lpshnd, s
STA lpshlnd, sxf
LDA lpshel, s
LDX nodeelmt, i
STA lpshnd, sxf
ADDSP 8, i
RET0
lpshshft: LDX lpshlp, s
LDA lpshel, s
CALL lstunshf
ADDSP 8, i
RET0
; Pointer to the list
lpshlp: .EQUATE 0
; Element to add to the list
lpshel: .EQUATE 2
; Node to add to the list
lpshnd: .EQUATE 4
; Node to append
lpshlnd: .EQUATE 6
; Pops the last element of a list
;
; Parameters:
; - X: Pointer to the list
;
; Returns:
; - A: Element removed from the list
lstpop: SUBSP 6, i
STX lpoplp, s
CALL listlen
CPA 0, i
BRNE poperrem
CPA 1, i
BREQ popshft
SUBA 2, i
LDX lpoplp, s
CALL nodeat
STX lpopndpr, s
LDX nodenxt, i
LDA lpopndpr, sxf
LDA 0, i
LDX nodenxt, i
STA lpopndpr, sxf
STA lpoplnd, s
LDX nodeelmt, i
LDA lpoplnd, s
ADDSP 6, i
RET0
poperrem: STRO poperrsm, d
STOP
popshft: LDX lpoplp, s
CALL lstshft
ADDSP 6, i
RET0
; Pointer to the list
lpoplp: .EQUATE 0
; Node to remove
lpoplnd: .EQUATE 2
; New last node
lpopndpr: .EQUATE 4
; Message to print when popping an empty list
poperrsm: .ASCII "Error: cannot pop an empty list.\n\x00"
; Inserts an element in a list at a given position
;
; REQUIRES: Non-empty list
;
; Parameters:
; - SP + 2: Pointer to the list
; - SP + 4: Index to insert at
; - SP + 6: Element to add
;
; Returns:
; - A: Error code: 0 if all went well, -1 if index < 0, -2 if index > list.length
lstinsat: SUBSP 6, i
LDA lstinsid, s
CPA 0, i
BRLT lstinslz
BREQ lstinush
LDX lstinslp, s
CALL listlen
CPA lstinsel, s
BRLT lstinsgl
BREQ lstinpsh
LDX lstinslp, s
LDA lstinsel, s
SUBA 1, i
CALL nodeat
STX lstinsnd, s
LDX nodenxt, i
LDA lstinsnd, sxf
STA lstinscx, s
CALL newnode
STX lstinscn, s
LDX nodeelmt, i
LDA lstinsel, s
STA lstinscn, sxf
LDX nodenxt, i
LDA lstinscx, s
STA lstinscn, sxf
LDA lstinscn, s
LDX nodenxt, i
STA lstinsnd, sxf
ADDSP 6, i
RET0
lstinush: LDX lstinslp, s
LDA lstinsel, s
CALL lstunshf
ADDSP 6, i
RET0
lstinpsh: LDX lstinslp, s
LDA lstinsel, s
CALL lstpsh
ADDSP 6, i
RET0
; Insert with index < 0
lstinslz: LDA -1, i
ADDSP 6, i
RET0
; Insert with index > list.length
lstinsgl: LDA -2, i
ADDSP 6, i
RET0
; List pointer
lstinslp: .EQUATE 8
; Index of the newly created node
lstinsid: .EQUATE 10
; Element to add
lstinsel: .EQUATE 12
; Node to change the pointer to the next
lstinsnd: .EQUATE 0
; Node to insert
lstinscn: .EQUATE 2
; Pointer to the node after the created one (might be null)
lstinscx: .EQUATE 4
; Removes a node at a given index in a list,
; returns the element previously contained
;
; Parameters:
; - X: Pointer to the list
; - A: Index of the element
;
; Returns:
; - A: Element removed
;
; Error:
; In case of error, the program aborts with an error message
lstremat: SUBSP 8, i
STX lremlp, s
STA lremid, s
CPA 0, i
BRLT lstremob
BREQ lstremz
CALL listlen
CPA lremid, s
BRGE lstremob
SUBA 1, i
CPA lremid, s
BREQ lrempop
LDA lremid, s
LDX lremlp, s
CALL nodeat
STX lremnd, s
LDA lremid, s
SUBA 1, i
LDX lremlp, s
CALL nodeat
STX lrempnd, s
LDX nodenxt, i
LDA lremnd, sxf
STA lrempnd, sxf
LDX nodeelmt, i
LDA lremnd, sxf
ADDSP 8, i
RET0
lstremz: LDX lremlp, s
CALL lstshft
ADDSP 8, i
RET0
lrempop: LDX lremlp, s
CALL lstpop
ADDSP 8, i
RET0
lstremob: STRO lremobst, d
DECO lremid, s
CHARO '\n', i
STOP
; Pointer to the list
lremlp: .EQUATE 0
; Index to remove an element at
lremid: .EQUATE 2
; Pointer to the node before the removed element
lrempnd: .EQUATE 4
; Pointer to the node to remove
lremnd: .EQUATE 6
; Error out of bounds string for remove_at
lremobst: .ASCII "Error: Out of bounds in remove_at, index = \x00"
; Creates a new node from scratch
; Sets its content to 0/NULL
;
; Parameters:
; void
;
; Return:
; - X: Address of the node
newnode: LDA nodeln, i
SUBSP 2, i
CALL new
STX 0, s
LDA 0, i
LDX nodenxt, i
STA 0, sxf
LDX nodeelmt, i
STA 0, sxf
LDX 0, s
ADDSP 2, i
RET0
; Allocates a new structure in the heap
;
; Parameters:
; - A: Length of the structure to allocate (bytes)
;
; Returns:
; - X: Address of the allocated structure
new: ADDA hpptr, d
LDX hpptr, d
STA hpptr, d
RET0
; Node in a linked list
;
; Contains two fields:
; - Element: Offset 0
; - Next: Offset 2
;
nodeln: .EQUATE 4
nodeelmt: .EQUATE 0
nodenxt: .EQUATE 2
; Linked list capsule
;
; Contains one field:
; - Head: Offset 0
;
lstlen: .EQUATE 2
lsthead: .EQUATE 0
; Pointer to the next available byte on the heap
hpptr: .ADDRSS heap
; Start of the heap
heap: .BLOCK 1
.END
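The library above is a head-only linked list: a list capsule holds one `head` field (`lsthead`), and each node holds an element and a next pointer (`nodeelmt`/`nodenxt`). The core shift/unshift pair can be sketched in Python as follows (names are mine, mirroring the assembly labels; not a full port):

```python
class Node:
    __slots__ = ("elmt", "nxt")          # nodeelmt / nodenxt offsets
    def __init__(self, elmt=0, nxt=None):
        self.elmt, self.nxt = elmt, nxt

class LinkedList:
    """Head-only capsule, matching lstlen/lsthead in the Pep/8 library."""
    def __init__(self, elmt):
        self.head = Node(elmt)           # newlst

    def unshift(self, elmt):
        """lstunshf: insert a new element at the front."""
        self.head = Node(elmt, self.head)

    def shift(self):
        """lstshft: remove and return the front element."""
        if self.head is None:
            raise RuntimeError("Cannot do shift on empty list.")
        node, self.head = self.head, self.head.nxt
        return node.elmt
```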

434
samples/Pep8/msq.pep Normal file

@@ -0,0 +1,434 @@
; Reads a square from stdin, then computes whether it is a magic square or not.
;
; A Magic Square is a square following a specific set of rules, namely:
; - The sum of each row must be the same as the sum of the diagonal
; - The sum of the anti-diagonal must be the same as the sum of the diagonal
; - The sum of each column must be the same as the sum of the diagonal
;
; If any column, row, or anti-diagonal does not follow the aforementioned rules,
; the program will output its number to stdout.
;
; Columns are identified by a negative digit, ranging from -1 to -n
; The anti-diagonal is identified by the number 0.
; Finally, rows are identified by a positive integer, ranging from 1 to n.
;
; Formatting:
; First a number `n` is read from stdin; it will determine the size of the square
; Then, enter the data for the square; `n` * `n` entries will be read
; The data is sequentially added to the square in memory, from the upper-left corner
; to the lower-right corner, in a zig-zag pattern
;
; Example:
; 3
; 4 9 3
; 3 5 7
; 8 1 6
;
; Limitation: Since there is no dynamic allocation, the size
; of the square is capped at a maximum of 32*32.
; Any size lower than 1 or higher than 32 will produce
; an error and the termination of the program.
; _start
DECI sidelen, d
LDA sidelen, d
CPA 1, i
BRLT sderror
CPA 32, i
BRGT sderror
LDX sidelen, d
CALL mult
STA sqlen, d
CALL fillsq
LDA sidelen, d
LDX square, i
CALL diagsum
STA dgsm, d
CALL colsums
LDA sidelen, d
LDX square, i
CALL cdiagsum
CPA dgsm, d
BREQ cnt
DECO 0, i
CHARO '\n', i
cnt: STA cdsm, d
CALL rowsums
STOP
el: .BLOCK 2
; Length of a side of the square
sidelen: .WORD 0
; Total length of the square
sqlen: .BLOCK 2
; 32 * 32 square of integers
square: .BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 255
.BLOCK 8
; Prints an error and terminates the program
sderror: STRO stderr, d
STOP
; Computes the sum of each column
; If the sum is not the same as dgsm, its index will be printed (in negative form)
;
; Parameters: A: Size of a side of the square
; X: Address of the square
;
; Return: void
colsums: STA clsmsqsz, d
STX clsmsqad, d
SUBA 1, i
STA clsmyp, d
clssmlp: CPA 0, i
BRLT clsmout
STA cscolid, d
LDA clsmsqsz, d
LDX clsmsqad, d
CALL colsum
CPA dgsm, d
BREQ clsdecpt
LDX clsmyp, d
NEGX
STX clsmyp, d
DECO clsmyp, d
CHARO '\n', i
LDX clsmyp, d
NEGX
STX clsmyp, d
clsdecpt: LDA clsmyp, d
SUBA 1, i
STA clsmyp, d
BR clssmlp
clsmout: RET0
clsmsqad: .BLOCK 2
clsmsqsz: .BLOCK 2
clsmyp: .BLOCK 2
; Compute the sum of each row
; Prints its index if the value does not match dgsum
;
; Parameters: A: Size of a side of the square
; X: Address of the square
;
; Returns: void
rowsums: STA maxrows, d
STX rowssqad, d
LDA 0, i
STA tmprwsm, d
STA rowid, d
rwsmslp: CPA maxrows, d
BRGE rwsmsout
STA rwxpos, d
LDA maxrows, d
LDX rowssqad, d
CALL rowsum
CPA dgsm, d
STA tmprwsm, d
BREQ rwinccpt
DECO rowid, d
CHARO '\n', i
rwinccpt: LDA rowid, d
ADDA 1, i
STA rowid, d
BR rwsmslp
rwsmsout: RET0
; Number of rows to compute
maxrows: .BLOCK 2
; Square address
rowssqad: .BLOCK 2
; Current rowid
rowid: .BLOCK 2
; Current rowsum
tmprwsm: .BLOCK 2
; Gets an element at the indexes given as parameter
; The square is supposed to contain only integers
; No check will be made on the correctness of the indexes
;
; Parameters: A: Size of a side of the square (in elements)
; X: Base address of the square
; xpos: Position in X for the element (0-indexed)
; ypos: Position in Y for the element (0-indexed)
;
; Return: A will contain the element
;
; Side-effects: Registers A and X will neither be saved nor restored upon call
; ypos will be altered
elemat: STX elsqaddr, d
ASLA
LDX xpos, d
CALL mult
STA xpos, d
LDX ypos, d
ASLX
STX ypos, d
ADDA ypos, d
ADDA elsqaddr, d
STA elsqaddr, d
LDX elsqaddr, d
LDA 0, x
RET0
; X-index in square (in elements)
xpos: .BLOCK 2
; Y-index in square (in elements)
ypos: .BLOCK 2
; Address to fetch elements at
elsqaddr: .BLOCK 2
; Fills the square with input from the user
;
; Pass via register A the number of inputs to be read
fillsq: LDX 0, i
filloop: SUBA 1, i
CPA 0, i
BRLT fillout
DECI square, x
ADDX 2, i
BR filloop
fillout: RET0
; Computes the sum of the digits of a column
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square
; X: Base address of the square
; cscolid: Identifier of the column (0-based)
;
; Return: A: Sum of the digits of the column
colsum: STA csclsqsz, d
STX csclsqad, d
LDA 0, i
STA csclsum, d
STA csclxpos, d
clsmloop: CPA csclsqsz, d
BRGE colout
LDA cscolid, d
STA ypos, d
LDA csclxpos, d
STA xpos, d
LDA csclsqsz, d
LDX csclsqad, d
CALL elemat
ADDA csclsum, d
STA csclsum, d
LDA csclxpos, d
ADDA 1, i
STA csclxpos, d
BR clsmloop
colout: LDA csclsum, d
RET0
; Identifier of the column which sum is to be computed
cscolid: .BLOCK 2
; Temporary for x position
csclxpos: .BLOCK 2
; Base address of the square
csclsqad: .BLOCK 2
; Size of a side of the square
csclsqsz: .BLOCK 2
; Sum of the column
csclsum: .BLOCK 2
; Computes the sum of the digits of a row
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square
; X: Base address of the square
; rwxpos: Row index (0-based)
;
; Returns: A: Sum of the digits of the row
rowsum: STA rwsqsz, d
STX rwbsqadr, d
LDA 0,i
STA rwsum, d
STA rwypos, d
rwsumlp: LDA rwypos, d
CPA rwsqsz, d
BRGE rwsumout
STA ypos, d
LDA rwxpos, d
STA xpos, d
LDA rwsqsz, d
LDX rwbsqadr, d
CALL elemat
ADDA rwsum, d
STA rwsum, d
LDA rwypos, d
ADDA 1, i
STA rwypos, d
BR rwsumlp
rwsumout: LDA rwsum, d
RET0
; Square size (in elements)
rwsqsz: .BLOCK 2
; Square base address
rwbsqadr: .BLOCK 2
; Position of the row to compute
rwxpos: .BLOCK 2
; Current column visited
rwypos: .BLOCK 2
; Sum of the row
rwsum: .BLOCK 2
; Computes the sum for the antidiagonal of a square
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square (elements)
; X: Base address of the square
;
; Returns: A: Sum of the antidiagonal
cdiagsum: STA cdsqsz, d
SUBA 1,i
STA cdtmpy, d
LDA 0, i
STA cdtmpx, d
STA cdsum, d
STX cdsqaddr, d
cdiaglp: LDA cdtmpx, d
STA xpos, d
LDA cdtmpy, d
STA ypos, d
CPA 0, i
BRLT cdout
LDA cdsqsz, d
LDX cdsqaddr, d
CALL elemat
ADDA cdsum, d
STA cdsum,d
LDA cdtmpx, d
ADDA 1, i
STA cdtmpx, d
LDA cdtmpy, d
SUBA 1, i
STA cdtmpy, d
BR cdiaglp
cdout: LDA cdsum, d
RET0
; Temporary handle for square size (elements)
cdsqsz: .BLOCK 2
; Square address
cdsqaddr: .BLOCK 2
; Keep x address
cdtmpx: .BLOCK 2
; Keep y address
cdtmpy: .BLOCK 2
; Sum of antidiagonal
cdsum: .BLOCK 2
; Computes the sum for the diagonal of a square
; The square is supposed to contain integers only
;
; Parameters: A: Size of a side of the square (elements)
; X: Base address of the square
;
; Returns: A: Sum of the diagonal
;
diagsum: STA dsqsz, d
STX dsqaddr, d
LDA 0, i
STA tmpsum, d
STA curra, d
dglp: CPA dsqsz, d
BRGE dglpout
STA xpos, d
STA ypos, d
LDA dsqsz, d
LDX dsqaddr, d
CALL elemat
ADDA tmpsum, d
STA tmpsum, d
LDA curra, d
ADDA 1, i
STA curra, d
BR dglp
dglpout: LDA tmpsum, d
RET0
; Address of the square
dsqaddr: .BLOCK 2
; Size of a side of the square (elements)
dsqsz: .BLOCK 2
; Current value of the x and y indexes
curra: .BLOCK 2
; Sum of the values
tmpsum: .BLOCK 2
; Multiplies two integers
;
; Parameters:
; Register A : Left part of the multiplication
; Register X : Right part of the multiplication
;
; Return:
; Register A : Result of the multiplication
;
; Side-effects:
; Uses multmp as a temporary value
mult: STA multmp, d
LDA 0, i
muloop: CPX 0, i
BRLE mulout
ADDA multmp, d
SUBX 1, i
BR muloop
mulout: RET0
; Temporary variable for mult function
; Holds the initial value of A
multmp: .WORD 0
; For debugging purposes
; Prints the content of the square to stdout
;
; Parameters: A: Size of a side
; X: Base address of square
;
; Side-effects:
; Consider variables sidesz, sqaddr, sqmaxa as local, they will be written
; Registers A and X will not be saved nor restored upon call
printsq: STA sidesz, d
STX sqaddr, d
LDX sidesz, d
CALL mult
ASLA
ADDA sqaddr, d
STA sqmaxa, d
LDX sqaddr, d
LDA 0, i
priloop: DECO 0, x
CHARO ' ', i
ADDX 2, i
CPX sqmaxa, d
BREQ priout
ADDA 1, i
CPA sidesz, d
BRLT priloop
LDA 0, i
CHARO '\n', i
BR priloop
priout: RET0
; Size of a side of the square
sidesz: .BLOCK 2
; Address of the square
sqaddr: .BLOCK 2
; Maximum address to iterate upon
sqmaxa: .BLOCK 2
; ------------------ GLOBALLY ACCESSIBLE SYMBOLS -------------------- ;
;
; Sum of the diagonal for the square
; Reference value for magic-square
dgsm: .WORD 0
; Sum of the counter-diagonal
cdsm: .WORD 0
; Input error string
stderr: .ASCII "A number between 1 and 32 (both inclusive) must be entered as value for the size of the square for the program to work.\n\x00"
.END
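The checking logic in this sample (main diagonal as the reference sum, anti-diagonal reported as 0, rows as 1..n, columns as -1..-n) fits in a few lines of Python. An illustrative sketch of the same rules, not a port of the assembly (the function name is mine):

```python
def magic_square_report(square):
    """Return the identifiers of lines whose sum differs from the main
    diagonal: 0 = anti-diagonal, 1..n = rows, -1..-n = columns."""
    n = len(square)
    ref = sum(square[i][i] for i in range(n))                # diagsum
    bad = []
    if sum(square[i][n - 1 - i] for i in range(n)) != ref:   # cdiagsum
        bad.append(0)
    for r in range(n):                                       # rowsums
        if sum(square[r]) != ref:
            bad.append(r + 1)
    for c in range(n):                                       # colsums
        if sum(square[r][c] for r in range(n)) != ref:
            bad.append(-(c + 1))
    return bad                                               # empty => magic
```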

227
samples/Pep8/qsort.pep Normal file

@@ -0,0 +1,227 @@
; Sorts a statically defined array using the recursive implementation
; of the quicksort algorithm.
;
; In this implementation, the pivot is supposed to be the rightmost
; value of the slice of the array being sorted.
;
; Note that the code presented below should work on any array,
; whether defined statically or dynamically.
;
; Calling conventions:
; Except when mentioned otherwise, every parameter is to be passed on the stack.
; The return values are also on the stack.
; No assumption is to be made on the content of a register on a function call.
; The values of the registers are to be locally saved for further use if necessary.
main: SUBSP 4, i
LDA 11, i
ASLA
STA 2, s
LDA arr, i
STA 0, s
CALL printarr
SUBSP 2, i
LDA arr, i
STA 0, s
LDA 0, i
STA 2, s
LDA 10, i
STA 4, s
CALL qsort
ADDSP 2, i
CHARO '\n', i
LDA 11, i
ASLA
STA 2, s
LDA arr, i
STA 0, s
CALL printarr
STOP
; Sorts an array using the quicksort algorithm
;
; Parameters:
; - SP + 2: Address of the array
; - SP + 4: Left bound
; - SP + 6: Right bound
; Returns:
; void
qsort: SUBSP 2, i
LDA qsarrlb, s
CPA qsarrrb, s
BRGE qsortout
SUBSP 6, i
LDA 10, s
STA 0, s
LDA 12, s
STA 2, s
LDA 14, s
STA 4, s
CALL part
LDA 10, s
STA 0, s
LDA 12, s
STA 2, s
LDA 6, s
SUBA 1, i
STA 4, s
CALL qsort
LDA 10, s
STA 0, s
LDA 6, s
ADDA 1, i
STA 2, s
LDA 14, s
STA 4, s
CALL qsort
ADDSP 6, i
qsortout: ADDSP 2, i
RET0
; Address of the array
qsarradd: .EQUATE 4
; Left bound
qsarrlb: .EQUATE 6
; Right bound
qsarrrb: .EQUATE 8
; Pivot value returned by the part command
qsortp: .EQUATE 0
; Partitions an array in two following the quicksort rules.
;
; All the lower values compared to the pivot will be on the left
; All the upper values compared to the pivot will be on the right
; The pivot's final index is then returned
;
; Parameters:
; - SP + 2: Address of the array
; - SP + 4: Left bound
; - SP + 6: Right bound
;
; Returns:
; - SP + 8: Pivot final index
part: SUBSP 8, i
LDA parrrb, s
STA partpiv, s
LDA parrlb, s
STA pstind, s
STA piter, s
partflp: CPA parrrb, s
BRGE partout
LDX piter, s
ASLX
LDA paraddr, sxf
STA parrival, s
LDX partpiv, s
ASLX
LDA paraddr, sxf
CPA parrival, s
BRLT parlpinc
SUBSP 6, i ; Call swap(arr, i, st_index)
LDA 16, s
STA 0, s
LDA 8, s
STA 2, s
LDA 10, s
STA 4, s
CALL swap
ADDSP 6, i
LDA pstind, s
ADDA 1, i
STA pstind, s
parlpinc: LDA piter, s
ADDA 1, i
STA piter, s
BR partflp
partout: SUBSP 6, i ; Call swap(arr, piv, st_index)
LDA 16, s
STA 0, s
LDA 12, s
STA 2, s
LDA 10, s
STA 4, s
CALL swap
ADDSP 6, i
LDA pstind, s
ADDSP 8, i
STA 8, s
RET0
; Address of the array
paraddr: .EQUATE 10
; Left bound
parrlb: .EQUATE 12
; Right bound
parrrb: .EQUATE 14
; Pivot value
partpiv: .EQUATE 6
; st_index
pstind: .EQUATE 4
; For iterator value
piter: .EQUATE 2
; arr[i] value
parrival: .EQUATE 0
; Swaps the value of two elements of an array of integers
;
; Parameters:
; - SP + 2: Address of the array
; - SP + 4: Index of the 1st element to swap
; - SP + 6: Index of the 2nd element to swap
;
; Returns:
; void
swap: SUBSP 2, i
LDX fstelind, s
ASLX
LDA arraddr, sxf
STA swaptmp, s
LDX secelind, s
ASLX
LDA arraddr, sxf
LDX fstelind, s
ASLX
STA arraddr, sxf
LDA swaptmp, s
LDX secelind, s
ASLX
STA arraddr, sxf
ADDSP 2, i
RET0
; Temporary value for the swap
swaptmp: .EQUATE 0
; Address of the array on which the swap is done
arraddr: .EQUATE 4
; Index of the first element
fstelind: .EQUATE 6
; Index of the second element
secelind: .EQUATE 8
; Prints the content of an array
;
; Parameters:
; SP + 2: Address of the array
; SP + 4: Length of the array
;
; Returns:
; void
printarr: LDX 0, i
parrlp: CPX 4, s
BRGE parrout
DECO 2, sxf
CHARO ' ', i
ADDX 2, i
BR parrlp
parrout: RET0
; Unsorted array for testing purposes
arr: .WORD 9
.WORD 5
.WORD 8
.WORD 10
.WORD 4
.WORD 7
.WORD 0
.WORD 3
.WORD 2
.WORD 1
.WORD 6
.END
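For readers tracing the `part` routine: it is the classic rightmost-pivot (Lomuto-style) partition, with `pstind` marking the pivot's final slot and `piter` sweeping the slice. A minimal Python sketch of the same scheme (illustrative; names mirror the assembly labels):

```python
def partition(arr, lb, rb):
    """Rightmost-pivot partition, mirroring the part routine."""
    pivot = arr[rb]
    st = lb                        # pstind: final slot for the pivot
    for i in range(lb, rb):        # piter
        if arr[i] <= pivot:
            arr[i], arr[st] = arr[st], arr[i]   # swap(arr, i, st_index)
            st += 1
    arr[rb], arr[st] = arr[st], arr[rb]         # swap(arr, piv, st_index)
    return st

def qsort(arr, lb, rb):
    """Recursive quicksort over arr[lb..rb], as in the sample's qsort."""
    if lb < rb:
        p = partition(arr, lb, rb)
        qsort(arr, lb, p - 1)
        qsort(arr, p + 1, rb)
```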

61
samples/Pep8/stri_buf.pep Normal file

@@ -0,0 +1,61 @@
main:
; Reads a string in stdin, returns the buffer it was read in
; Stops reading at the first encounter of a \n character.
;
; Parameters:
; void
;
; Returns:
; - X: Address of the buffer
stri: SUBSP 2, i
LDA 32, i
CALL new
CPX buflen, s
BRGE strinlrg
strinlrg: LDA buflen, d
LDX 2, i
CALL mult
STA buflen, s
CALL new
buflen: .EQUATE 0
; Copies the content of a buffer to another one
;
; Parameters:
; - SP + 2: Destination buffer
; - SP + 4: Source buffer
; - SP + 6: Length to copy
memcpy: LDX 0, i
memcplp: CPX cpylen, s
BREQ memcpout
LDBYTEA srcbuf, sxf
STBYTEA dstbuf, sxf
ADDX 1, i
BR memcplp
memcpout: RET0
; Destination buffer
dstbuf: .EQUATE 2
; Source buffer
srcbuf: .EQUATE 4
; Copy length
cpylen: .EQUATE 6
; Allocates a new structure in the heap
;
; Parameters:
; - A: Length of the structure to allocate (bytes)
;
; Returns:
; - X: Address of the allocated structure
new: ADDA hpptr, d
LDX hpptr, d
STA hpptr, d
RET0
; Pointer to the next available byte on the heap
hpptr: .ADDRSS heap
; Start of the heap
heap: .BLOCK 1
.END
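The `new` routine here (and in linked.pep) is a bump allocator: `hpptr` points at the next free byte and only ever moves forward, so nothing is ever freed. A tiny Python model of that behaviour (class name is mine):

```python
class Heap:
    """Bump allocator matching the new routine: hpptr advances by the
    requested length and the old value is returned as the address."""
    def __init__(self):
        self.hpptr = 0           # .ADDRSS heap

    def new(self, length):
        addr = self.hpptr        # LDX hpptr, d
        self.hpptr += length     # ADDA hpptr, d / STA hpptr, d
        return addr
```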


@@ -0,0 +1,50 @@
main: SUBSP 34, i
LDA 31, i
STA 0, s
CALL fgets
ADDSP 2, i
CALL ststro
STOP
; Reads a string from stdin, stops reading when one of the following is true:
; - Read a \n
; - Read a maximum of `max` chars
;
; Parameters:
; - SP + 2: `max`, the maximum number of chars to read
; - SP + 4: `buffer` of length `max` + 1
; Returns:
; void
fgets: LDX 0, i
LDA 0, i
fgetslp: CHARI buffer, sx
LDBYTEA buffer, sx
CPA '\n', i
BREQ fout
CPX max, s
BREQ fout
ADDX 1, i
BR fgetslp
fout: LDA '\x00', i
STBYTEA buffer, sx
RET0
max: .EQUATE 2
buffer: .EQUATE 4
; Prints a string stored in stack
;
; Parameters:
; SP + 2: `string`
; Returns:
; void
ststro: LDX 0, i
LDA 0, i
strolp: LDBYTEA string, sx
CPA '\x00', i
BREQ strout
CHARO string, sx
ADDX 1, i
BR strolp
strout: RET0
string: .EQUATE 2
.END
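The `fgets` routine above stops on a newline or after `max` stored characters, then NUL-terminates the buffer. A loose Python sketch of that loop (`read_char` stands in for the CHARI instruction, an assumption made here for testability):

```python
def fgets(max_chars, read_char):
    """Read characters until '\n' or until max_chars are stored,
    then NUL-terminate, like the Pep/8 fgets routine."""
    buf = []
    while True:
        c = read_char()          # CHARI buffer, sx
        if c == '\n' or len(buf) >= max_chars:
            break                # fout: overwrite with '\x00'
        buf.append(c)
    return ''.join(buf) + '\x00'
```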

Some files were not shown because too many files have changed in this diff.