Compare commits

...

66 Commits

Author SHA1 Message Date
Theodore Dubois
14a7cb2d1b Add calculuswhiz/Assembly-Syntax-Definition grammar and use it for Unix Assembly (#4096) 2018-04-18 15:29:33 +02:00
Paul Chaignon
54ae7e7b4d Strategies take result from previous strategy into account (#4099)
Each strategy now takes as candidates the languages returned by the
previous strategy, if any. This was already the case for the
Classifier and Heuristics strategies, as these cannot generate new
candidate languages (as opposed to the Modeline, Filename, Shebang,
and Extension strategies).

In practice, this means that if, for example, the Shebang strategy
finds two possible languages for a given file (as is currently
possible with the perl interpreter), the next strategy, the Extension
strategy, will use this information and further reduce the set of
possible languages.
Without this commit, the Extension strategy would discard the results
from the previous strategy and start anew, possibly returning a
different language from those returned by the Shebang strategy.
2018-04-17 10:02:57 +02:00
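
A minimal sketch of the chaining described above — not Linguist's actual implementation, just the shape of it. Each strategy exposes `call(blob, candidates)` (as the heuristics diff further down shows) and may only narrow the candidates it receives; an empty return is treated here as "no opinion", so the previous candidates are kept.

```ruby
# Sketch: run strategies in order, letting each one narrow the candidate set.
def detect(blob, strategies)
  candidates = []
  strategies.each do |strategy|
    narrowed = strategy.call(blob, candidates)
    candidates = narrowed unless narrowed.empty?
    return candidates.first if candidates.size == 1
  end
  candidates.first
end

# e.g. detect(blob, [Modeline, Filename, Shebang, Extension, Heuristics, Classifier])
```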
Brayden Banks
5363e045bb Teach Generated about Cargo lock files (#4100) 2018-04-15 12:04:19 +02:00
Colin Seymour
cc4da98616 Fix adding/replacing a grammar (#4097)
* Licensed needs a full path now

* Add docker installed/running guard

* Docker is required for adding/replacing grammars

* Use more elegant method

Hat-tip to @Alhadis 🎩
2018-04-12 12:14:25 +02:00
Juan Julián Merelo Guervós
a9ff59aef5 Additions to the Perl family of languages (#4066)
* Mainly fixing problems with Perl heuristics

And also adding a little bit of text to the README file to help with local use and testing.

* Adds new sample

* Adds a couple more samples, not represented before

* Moves installation instructions to CONTRIBUTING.md

Refs #2309 and also changes github.com to a uniform capitalization.

* Correcting error. Great job, CI

* Moving another file

* Adds samples and new checks for perl/perl6

* Stupid mistake

* Changing regex for perl5 vs perl6

Initial suggestion by @pchaigno, slightly changed to eliminate false positives such as "classes" or "modules" at the beginning of a line in the =pod

BTW, it would be interesting to just eliminate these areas for language detection.

* Eliminates Rexfile from Perl6

And adds .pod6

* Followup to #2709

I just found I had this sitting here, so I might as well follow
instructions to fix it.

* Adds example for pod6

* Eliminates .pod because it's its own language

* Removes bad directory

* Reverting changes that were already there

* Restored CONTRIBUTING.md from head

I see installation of cmake is advised in README.md

* Eliminates `.pod6`

To leave way for #3366 or succeeding PRs.

* Removed by request, since we're no longer adding this extension

* Sorting filenames in alphabetical order

* Moved from sample to test fixtures
2018-04-11 17:32:26 +02:00
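
The "Changing regex for perl5 vs perl6" item above refers to the extension heuristics in lib/linguist/heuristics.rb. Below is an illustrative stand-in for that kind of rule — the regexes are hypothetical and are not the ones merged in #4066, which also have to cope with false positives inside =pod blocks.

```ruby
# Hypothetical Perl 5 vs Perl 6 disambiguation; illustration only.
def perl5_or_perl6(data)
  if data =~ /^\s*(?:use\s+v6\b|unit\s+(?:module|class)\b)/
    "Perl 6"
  elsif data =~ /^\s*(?:use\s+strict\b|package\s+[\w:]+\s*;)/
    "Perl"
  end
end

puts perl5_or_perl6("use v6;\nsay 'hi';")            # => Perl 6
puts perl5_or_perl6("use strict;\nprint \"hi\\n\";") # => Perl
```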
John Gardner
7b9ec3d1b3 Add support for sed as a programming language (#4093) 2018-04-11 16:09:54 +02:00
Cyrille Le Clerc
51d3711faf Detect Maven wrapper "mvnw" (#4042)
* Detect Maven wrapper "mvnw"

* Fix build: filenames must be sorted in the "filenames" section of languages.yml and cannot be grouped by topic

* Remove `mvnw` file from languages/Shell/filenames per @Alhadis's recommendation, as we are sure that `mvnw` always starts with the shebang `#!/bin/sh`.

* Remove space chars added by mistake
2018-04-08 16:04:34 +01:00
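
The removal of the `mvnw` filename entry relies on the Shebang strategy picking the script up anyway. A rough check of that reasoning, assuming the public `Linguist.detect` entry point and the in-memory `Linguist::Blob` class that appears in the diff further down:

```ruby
require "linguist"

# mvnw is a POSIX shell wrapper, so its shebang alone should identify it.
blob = Linguist::Blob.new("mvnw", "#!/bin/sh\nexec \"$JAVACMD\" \"$@\"\n")
puts Linguist.detect(blob)&.name  # expected: "Shell"
```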
Jens Fischer
14fcd75773 Add HXML support (#4092)
* Add HXML support

* Fix test failures
2018-04-08 11:23:00 +01:00
Jens Fischer
34c623eaba Replace haxe-sublime-bundle with haxe-TmLanguage (#4079)
* Replace haxe-sublime-bundle with haxe-TmLanguage

* Add missing changes

* Typo fix
2018-04-06 17:25:59 +01:00
John Gardner
324bc83914 Register "cperl" as an alias of Perl 5 (#4067)
* Register `cperl` as an alias of Perl 5

Emacs ships with an enhanced major-mode for editing Perl with embedded C
sections (called `cperl-mode`). This commit enables Linguist to identify
Perl files containing cperl modelines.

* Add `cperl` to list of Perl 5 interpreters
2018-04-07 01:43:51 +10:00
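
A quick way to exercise the new alias from this revision of the gem (a sketch; it assumes the `Linguist::Language.find_by_alias` lookup and that Perl 5's canonical name in languages.yml is "Perl"):

```ruby
require "linguist"

# The Emacs modeline name now resolves to Perl 5 through the alias list.
puts Linguist::Language.find_by_alias("cperl").name  # expected: "Perl"
```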
kasper3
ecc62784ca Add pwsh PowerShell interpreter (#4073)
* Register pwsh as interpreter for PowerShell

* Add pwsh sample

* Remove pwsh from Shell section
2018-04-06 15:25:09 +01:00
Colin Seymour
f452612666 Update Licensee and Licensed gems (#3982)
* Update licensee version

This pulls in Licensed 0.10.0 too.

* Use a full path to the grammars

Licensed now enforces this as it's easier than guessing.

* Ensure full path

* Use new path for FSProject

* Starting to adjust tests

* require licensee again

* Fix grammar tests

* verify -> status

* whitelist -> allowed

* explicitly set cache_path in configuration

default for licensed v1.0 changed from `vendor/licenses` to `.licenses`

* load configuration from file location

default configuration file location changed from `vendor/licenses/config.yml` to `.licensed.yml`

* update gemspec for licensed 1.0.0

* Remove unused license hash
2018-04-03 16:35:24 +01:00
Paul Chaignon
0bf4b8a482 Remove samples/LANG/filenames as a source of truth (#4078)
All filenames must now be explicitly listed in languages.yml. A test
makes sure they are.
2018-04-02 11:09:06 +02:00
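
A rough sketch of the kind of guard test described above — not the exact test added in #4078, and the `samples/<Language>/filenames/*` layout is assumed from Linguist's samples convention:

```ruby
require "yaml"

# Every filename-based sample must have an explicit "filenames" entry in
# languages.yml; the samples directory alone is no longer a source of truth.
languages = YAML.load_file("lib/linguist/languages.yml")
listed    = languages.values.flat_map { |attrs| attrs["filenames"] || [] }

Dir.glob("samples/*/filenames/*").each do |path|
  name = File.basename(path)
  abort "#{name} is missing from languages.yml" unless listed.include?(name)
end
```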
Stan Hu
718c9efaca Bump Charlock Holmes to 0.7.6 (#4086) 2018-04-01 16:16:00 +01:00
Colin Seymour
49593a6a6d Update Travis Config (#4081)
* Switch to trusty

Precise is likely to die sometime after March 2018 -
https://blog.travis-ci.com/2017-08-31-trusty-as-default-status

* Remove Ruby 2.1 and 2.2

Ruby 2.1 is already EOL; 2.2 is EOL at the end of March 2018.

* Need libicu52 on trusty
2018-03-29 09:10:13 +01:00
Nish Tahir
ba1cf12776 Replace Kotlin language grammar (#4065) 2018-03-28 16:57:17 +01:00
Joseph Stachelek
03f394626d Exclude R package doc directory (#4068)
* Exclude R package doc directory

* Match starting slash of R doc directory
2018-03-28 16:49:13 +01:00
Ben Wiley
cf385d9e77 Support for PICO-8 as Lua variant (#4072)
* Support for PICO-8 as Lua variant

Treats [PICO-8 fantasy console](https://www.lexaloffle.com/pico-8.php) files (.p8) as Lua.

A search for `pico8 extension:p8` on GitHub shows [hundreds of repositories](https://github.com/search?utf8=%E2%9C%93&q=pico8+extension%3Ap8&type=Repositories&ref=advsearch&l=&l=) using this extension. Pico-8 stores information describing the code, graphics and sound for a game in a single .p8 text file. Due to the nature of Pico-8, this information is relatively terse and the main feature of the file is Lua code. Therefore it makes sense to give .p8 files Lua syntax highlighting.

* Create treegen.p8

https://github.com/lvictorino/pico8/blob/master/treegen.p8
2018-03-28 16:47:55 +01:00
Steve Pike
dd3b1eec91 enable syntax highlighting for .jinja2 ext (#4051)
* enable syntax highlighting for .jinja2 ext

This is a commonly used jinja (2!) extension

* add sample jinja2 file

* move jinja to django samples dir

* added a link to the jinja docs in the sample file

* change sample jinja2 file to one that exists on GH
2018-03-06 13:53:24 +00:00
John Gardner
2b65318a61 Register 12 new JSON filenames/extensions (#4053)
Added:
* .avsc
* .gltf
* .htmlhintrc
* .jscsrc
* .jslintrc
* .jsonl
* .tern-config
* .tern-project
* .tfstate
* .tfstate.backup
* .webapp
* .webmanifest
2018-03-02 20:33:47 +11:00
John Gardner
1b3cdda4f7 Add script to alphabetise submodule list (#4054) 2018-03-02 20:33:09 +11:00
Nicolas Stucki
50d46eed38 Update Scala syntax grammar (#4044)
* Add vscode-scala

* Use vscode-scala template for scala sources

* Add grammar licence for vscode-scala

* Fix vendor in README

* Sort vendors in grammars.yml

* Fix license file name
2018-02-28 12:03:31 +00:00
bruno cuconato
1bbcfa5683 add CoNLL-U format (#4029)
* add CoNLL-U format
- add to languages.yml
- add textmate grammar
  - add to vendor/README
  - add to grammars.yml
- add samples

* rm other extensions as I couldn't find properly licensed examples of them in the wild

* substitute samples for something with an appropriate license

* update grammar submodule so it finds the LICENSE

* add license to grammar

* conllu
- readd other extensions
- abridge samples and a new one
- update grammar submodule: correct extension of grammar file

* rm .conllx extension
2018-02-21 15:27:32 +00:00
Jason Malinowski
c2d3170064 Support VB.NET *.Generated.vb along with *.Generated.cs files (#4027) 2018-02-21 11:56:55 +00:00
Paul Chaignon
fe3981ff03 Whitelist 4 new TextMate grammar fields (#4039)
swallow, foregroundColor, and backgroundColor are older fields for TextMate 1.
2018-02-19 16:07:15 +01:00
Paul Chaignon
3769216c7a Associate .x extension to Linker Script language (#4040) 2018-02-19 10:50:05 +01:00
Colin Seymour
052c048fb5 Update sublime-netlinx ref (#4037) 2018-02-16 11:40:41 +00:00
Damien Guard
cec3a26496 Remove NANT grammar from CSharp (#4034)
We no longer include this grammar in atom/language-csharp as it caused issues with highlighting bash etc. and was unmaintained.
2018-02-16 11:40:11 +00:00
Mike McQuaid
4f0f9bd51d CONTRIBUTING: note license. (#4036)
This is to be more explicit about the contribution process and license.
2018-02-16 09:27:12 +00:00
Paul Chaignon
04e7956407 Whitelist injectionSelector in grammars (#4032) 2018-02-13 12:45:21 +01:00
Nathaniel J. Smith
2abf488e65 Treat "python3" as an alias for "python" (#4026)
Pygments has separate highlighters for "python" (meaning Python 2) and "python3" (meaning Python 3). As a result, there are lots of files out there (especially reStructuredText) that contain code blocks whose language is explicitly given as "python3" or "py3". Currently these are unrecognized by Linguist. Instead, we should use our Python highlighter for them (which works for both Python 2 and Python 3).

References:
  http://pygments.org/docs/lexers/#pygments.lexers.python.Python3Lexer
  https://github.com/github/markup/issues/1019
  https://github.com/python-trio/async_generator/pull/12
2018-02-08 09:52:21 +00:00
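
The effect of the alias can be checked through the public lookup (a sketch; `find_by_alias` is assumed to be the same lookup used when resolving fenced-code-block names):

```ruby
require "linguist"

# "python3" now resolves to the same language, and therefore the same
# highlighting grammar, as "python".
puts Linguist::Language.find_by_alias("python3").name  # expected: "Python"
puts Linguist::Language.find_by_alias("python").name   # => "Python"
```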
Tobias V. Langhoff
812797b51d Add "asm" as alias for assembly (#4019) 2018-01-31 11:56:47 +00:00
Tobias V. Langhoff
dc32876113 Fix anchor link in README.md (#4018) 2018-01-30 21:41:45 +01:00
Colin Seymour
a18ad1d489 Release v6.0.1 (#4016)
* Update grammar submodule refs

* Bump version to v.6.0.1
2018-01-30 15:17:58 +00:00
Vicent Martí
25ac140d58 compiler: Allow loading grammars from syntaxes folder (#4015) 2018-01-30 12:49:28 +01:00
Vicent Martí
f7835f7119 compiler: Do not load TreeSitter grammars (#4013) 2018-01-29 18:14:14 +01:00
Vicent Martí
a7f835a653 compiler: Prefer specific grammar formats (#4012)
* compiler: Simplify the Dockerfile

* compiler: Prefer grammar extensions based on type
2018-01-29 14:40:23 +01:00
Ilias Van Peer
6220286f42 Re-enable Elm support (#4007)
The submodule was pointing to a repo which no longer has a tmLanguage
definition. We've now reinstated such a repo, with a tmLanguage file
existing in the master branch.

Since we reinstated this repo with the same name (as the old name
redirected to a new repo), the commit used is the only change.
2018-01-28 09:22:47 +00:00
Colin Seymour
15e2b74dec Release v6.0.0 (#4002)
* Update all submodules

* Ensure always using the latest Docker image

* Allow passing in GEM_VERSION from env

This is useful for building test gems in a cache-friendly way using:
`GEM_VERSION=$(git describe --tags 2>/dev/null | sed 's/-/./g' | sed
's/v//') bundle exec rake build_gem`

* Update submodules one last time

* Set version 6.0.0
2018-01-26 13:12:12 +00:00
gauravsavadi
969333610c Replacing MATLAB Language bundle (#4000)
* Updated readme to new MATLAB language grammar

* added MATLAB language grammar

* Update .gitmodules

* Update .gitmodules

* Update grammars.yml
2018-01-26 07:10:21 +00:00
Vicent Martí
8438c6cd3e compiler: Do not output empty grammars (#4001) 2018-01-25 14:49:03 +01:00
Dylan Simon
60f748d47b Add .x as XDR/RPCGEN (#3472)
* Add .x as XDR/RPCGEN

XDR/RPC language as documented in RFC5531, RFC4506.
Samples are from glibc and RFCs.

* Add Logos samples

https://github.com/JonasGessner/NoCarrier/blob/master/NoCarrier.x - MIT
cf31f4e466/llvm-gcc-R3/gcc/testsuite/objc/execute/string1.x - GPL2
f6415578fa/perapp-plugin/Tweak.x - GPL3
d1b3e83888/NCHax.x - Apache

* Add disambiguate heuristics for .x

* Add RPC to vendor/README.md
2018-01-25 09:15:09 +00:00
Seppe Stas
8da6ddf9d9 Override languages being included by language statistics (#3807)
* Add detectable key to languages

This key allows overriding whether a language is included in the
language stats of a repository.

* Make detectable override-able using .gitattributes

* Mention `linguist-detectable` in README

* Remove detectable key from languages

Reverts changes in 0f7c0df5.

* Update commit hash to the one that was merged

PR #3806 changed the commit hash. The original commit was not
actually merged into the test/attributes branch.

* Fix check to ensure detectable is defined

* Add tests for inclusion in language stats when detectable is set

* Ignore detectable when vendored, documentation or overridden

* Add documentation on detectable override in README

* Improve documentation on detectable override in README
2018-01-23 12:17:48 +00:00
Rajendra arora
fef7a12c85 Added Travis-ci build passing markup icon (#3995) 2018-01-20 18:52:26 +00:00
Vicent Martí
b80ca35b75 Update release docs with grammars tarball instructions (#3994) 2018-01-19 12:12:02 +01:00
Vicent Martí
c8171322f5 script: Add build-grammars-tarball (#3992) 2018-01-18 18:23:55 +01:00
Colin Seymour
4c1e61892a Bump escape_utils to ~> 1.2.0 (#3981)
There are very few changes between 1.1.x and 1.2.x, and I can't see any
that would indicate this would break anything. It does however fix
https://github.com/github/linguist/issues/3797 and
https://github.com/github/linguist/issues/3649 thus allowing peeps to
install Linguist on Windows using rubyinstaller2.
2018-01-18 09:53:50 +00:00
Paul Chaignon
4db659dede Whitelist hideFromUser key in grammars (#3989)
hideFromUser is an undocumented TextMate key used to hide some
grammars from the user.
2018-01-16 10:49:33 +01:00
Colin Seymour
ed73a72cbe Add issue and pull request templates (#3972)
* Add issue and pull request templates

* Implement feedback

* Request new and old grammar refs

* Add note about vendor, documentation, and generated lists

* Implement @Alhadis's suggestions
2018-01-15 10:11:53 +00:00
Brandon Elam Barker
512f077da8 adding the .kojo extension for Scala (#3960) 2018-01-13 09:38:34 +00:00
Josh Padnick
3260b06241 Format .tfvars file as HashiCorp Config Language. (#3885)
* Format .tfvars file as HashiCorp Config Language.

* Add sample terraform.tfvars file to demonstrate HCL rendering.
2018-01-12 17:27:41 +00:00
BRAMILLE Sébastien
ef3b0b6af3 Add solidity language (#3973)
* add solidity language

* add solidity color

* move samples to test fixtures

they're not used by the Bayesian classifier

* Update languages.yml

* Rename RefundVault.sol to RefundVault.solidity

* Rename pygments-example.sol to pygments-example.solidity

* Change color from #383838 to #AA6746

`Color #383838 is too close to ["3F3F3F", "383838"]`

* Fix test

* Remove test/fixtures and add samples

* Remove extension

* Remove sample file
2018-01-12 17:26:51 +00:00
Colin Seymour
434023460e Revert "Check generated Jest snap file" (#3984)
* Revert "Remove Arduino as a language (#3933)"

This reverts commit 8e628ecc36.

* Revert "Check generated Jest snap file (#3874)"

This reverts commit ca714340e8.
2018-01-12 11:49:02 +00:00
oldmud0
8e628ecc36 Remove Arduino as a language (#3933)
* Remove Arduino as a language

* Move Arduino samples to C++

* Move .ino entry to its correct place
2018-01-11 10:48:19 +00:00
Yuya Takeyama
ca714340e8 Check generated Jest snap file (#3874)
* Check generated Jest snap file

* Check file name rule first

ref: https://github.com/github/linguist/pull/3874/files#r146168309

* Check extension first

It must be cheaper
ref: https://github.com/github/linguist/pull/3874/files#r146168426
2018-01-11 09:25:13 +00:00
DoctorWhoof
a4e6fc78c8 Added a few Monkey2 examples (#3811)
* Added Monkey2 (extension .monkey2) example

This compiles with the most up-to-date Monkey2 release (V1.1.06).

* Sorting example in Monkey2

* Add files via upload

* GUI example using the MojoX module
2018-01-11 09:23:54 +00:00
Egor Zhdan
db1d4f7893 Add Materialize.css to the vendor list (#3943) 2018-01-11 09:48:49 +01:00
Paolo Di Tommaso
bee7e55618 Add Nextflow language support (#3870)
* Added nextflow language
* Added main.nf to list of filenames
* Fixed duplicate groovy scope
* Removed hello-world example
* Update grammar submodule
* Removed main.nf from filenames
* Added nextflow.config example
2018-01-09 12:47:59 +01:00
Ashe Connor
5fbe9c0902 Allow classifier to run on symlinks as usual (#3948)
* Fixups for symlink detection, incl. test

* assert the heuristics return none for symlink
2018-01-08 09:01:16 +11:00
Paul Chaignon
a840668599 perl6 alias for Perl 6 (#3977)
Many repositories rely on `perl6` as a Markdown key for code snippet
highlighting. The new Perl 6 name breaks this behavior as it requires
`perl-6` as the Markdown key.
2018-01-07 21:32:55 +01:00
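
Same pattern as the `python3` alias above — a one-line sketch to confirm the mapping, assuming the `find_by_alias` lookup:

```ruby
require "linguist"

puts Linguist::Language.find_by_alias("perl6").name  # expected: "Perl 6"
```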
Paul Chaignon
38cb8871ba Update URL for HTTP grammar (#3966) 2018-01-01 21:57:13 +01:00
Colin Seymour
d0b906f128 Update README and CONTRIBUTING documentation (#3955)
* Add more troubleshooting info

* Add more updates

* A lot more words and reformatting

* Few more tweaks

* Add how it works on GitHub.com

* More clarifications

* Feedback tweaks

* Add missing run

* Learn grammar
2017-12-21 13:17:20 +00:00
Ashe Connor
d4c2d83af9 Do not traverse symlinks in heuristics (#3946) 2017-12-12 21:53:36 +11:00
Ashe Connor
0b81b21983 Grammar compiler invocation fix (#3945)
* Correct grammar-compiler invocation in build_gem

/cc @vmg

* || true so we can release with broken grammars
2017-12-12 09:41:21 +01:00
Colin Seymour
1a769c4665 Handle repo cleanup race more elegantly (#3930)
* Don't attempt to get pwd for error message

* Print error instead of raising exception

This is more user-friendly too.

* Switch back to raise, but rescue it too

* Refactor
2017-12-11 12:37:21 +00:00
Vicent Martí
e7e64bf39a compiler: Add error output to the compiler (#3935) 2017-12-04 19:20:38 +01:00
177 changed files with 6896 additions and 1138 deletions

.github/ISSUE_TEMPLATE.md (new file, 26 lines)

@@ -0,0 +1,26 @@
<!--- Provide a general summary of the issue in the Title above -->
## Preliminary Steps
Please confirm you have...
- [ ] reviewed [How Linguist Works](https://github.com/github/linguist#how-linguist-works),
- [ ] reviewed the [Troubleshooting](https://github.com/github/linguist#troubleshooting) docs,
- [ ] considered implementing an [override](https://github.com/github/linguist#overrides),
- [ ] verified an issue has not already been logged for your issue ([linguist issues](https://github.com/issues?utf8=%E2%9C%93&q=is%3Aissue+repo%3Agithub/linguist)).
<!-- Please review these preliminary steps before logging your issue. You may find the information referenced may answer or explain the behaviour you are seeing. It'll help us to know you've reviewed this information. -->
## Problem Description
<!--- Provide a more detailed introduction to the issue itself, and why you consider it to be a bug -->
### URL of the affected repository:
### Last modified on:
<!-- YYYY-MM-DD -->
### Expected language:
<!-- expected language -->
### Detected language:
<!-- detected language -->

.github/PULL_REQUEST_TEMPLATE.md (new file, 46 lines)

@@ -0,0 +1,46 @@
<!--- Briefly describe what you're changing. -->
## Description
<!--- If necessary, go into depth of what this pull request is doing. -->
## Checklist:
<!--- Go over all the following points, and put an `x` in all the boxes that apply. -->
<!--- If you're unsure about any of these, don't hesitate to ask. We're here to help! -->
- [ ] **I am associating a language with a new file extension.**
- [ ] The new extension is used in hundreds of repositories on GitHub.com
- Search results for each extension:
<!-- Replace FOOBAR with the new extension, and KEYWORDS with keywords unique to the language. Repeat for each extension added. -->
- https://github.com/search?utf8=%E2%9C%93&type=Code&ref=searchresults&q=extension%3AFOOBAR+KEYWORDS+NOT+nothack
- [ ] I have included a real-world usage sample for all extensions added in this PR:
- Sample source(s):
- [URL to each sample source, if applicable]
- Sample license(s):
- [ ] I have included a change to the heuristics to distinguish my language from others using the same extension.
- [ ] **I am adding a new language.**
- [ ] The extension of the new language is used in hundreds of repositories on GitHub.com.
- Search results for each extension:
<!-- Replace FOOBAR with the new extension, and KEYWORDS with keywords unique to the language. Repeat for each extension added. -->
- https://github.com/search?utf8=%E2%9C%93&type=Code&ref=searchresults&q=extension%3AFOOBAR+KEYWORDS+NOT+nothack
- [ ] I have included a real-world usage sample for all extensions added in this PR:
- Sample source(s):
- [URL to each sample source, if applicable]
- Sample license(s):
- [ ] I have included a syntax highlighting grammar.
- [ ] I have included a change to the heuristics to distinguish my language from others using the same extension.
- [ ] **I am fixing a misclassified language**
- [ ] I have included a new sample for the misclassified language:
- Sample source(s):
- [URL to each sample source, if applicable]
- Sample license(s):
- [ ] I have included a change to the heuristics to distinguish my language from others using the same extension.
- [ ] **I am changing the source of a syntax highlighting grammar**
<!-- Update the Lightshow URLs below to show the new and old grammars in action. -->
- Old: https://github-lightshow.herokuapp.com/
- New: https://github-lightshow.herokuapp.com/
- [ ] **I am adding new or changing current functionality**
<!-- This includes modifying the vendor, documentation, and generated lists. -->
- [ ] I have added or updated the tests for the new or changed functionality.

.gitmodules (1455 lines changed; diff suppressed because it is too large)

.travis.yml

@@ -5,19 +5,18 @@ addons:
apt:
packages:
- libicu-dev
- libicu48
- libicu52
before_install: script/travis/before_install
script:
- bundle exec rake
- script/licensed verify
- script/licensed status
rvm:
- 2.1
- 2.2
- 2.3.3
- 2.4.0
- 2.5.0
notifications:
disabled: true
@@ -27,6 +26,6 @@ git:
depth: 3
cache: bundler
dist: precise
dist: trusty
bundler_args: --without debug

CONTRIBUTING.md

@@ -1,67 +1,16 @@
# Contributing
Hi there! We're thrilled that you'd like to contribute to this project. Your help is essential for keeping it great. This project adheres to the [Contributor Covenant Code of Conduct](http://contributor-covenant.org/). By participating, you are expected to uphold this code.
Hi there! We're thrilled that you'd like to contribute to this project. Your help is essential for keeping it great.
Contributions to this project are [released](https://help.github.com/articles/github-terms-of-service/#6-contributions-under-repository-license) to the public under the [project's open source license](LICENSE).
This project adheres to the [Contributor Covenant Code of Conduct](http://contributor-covenant.org/). By participating, you are expected to uphold this code.
The majority of contributions won't need to touch any Ruby code at all.
## Adding an extension to a language
## Getting started
We try only to add new extensions once they have some usage on GitHub. In most cases we prefer that extensions be in use in hundreds of repositories before supporting them in Linguist.
To add support for a new extension:
1. Add your extension to the language entry in [`languages.yml`][languages], keeping the extensions in alphabetical order.
1. Add at least one sample for your extension to the [samples directory][samples] in the correct subdirectory.
1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
In addition, if this extension is already listed in [`languages.yml`][languages] then sometimes a few more steps will need to be taken:
1. Make sure that example `.yourextension` files are present in the [samples directory][samples] for each language that uses `.yourextension`.
1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.yourextension` files. (ping **@lildude** to help with this) to ensure we're not misclassifying files.
1. If the Bayesian classifier does a bad job with the sample `.yourextension` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
## Adding a language
We try only to add languages once they have some usage on GitHub. In most cases we prefer that each new file extension be in use in hundreds of repositories before supporting them in Linguist.
To add support for a new language:
1. Add an entry for your language to [`languages.yml`][languages]. Omit the `language_id` field for now.
1. Add a grammar for your language: `script/add-grammar https://github.com/JaneSmith/MyGrammar`. Please only add grammars that have [one of these licenses][licenses].
1. Add samples for your language to the [samples directory][samples] in the correct subdirectory.
1. Add a `language_id` for your language using `script/set-language-ids`. **You should only ever need to run `script/set-language-ids --update`. Anything other than this risks breaking GitHub search :cry:**
1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
In addition, if your new language defines an extension that's already listed in [`languages.yml`][languages] (such as `.foo`) then sometimes a few more steps will need to be taken:
1. Make sure that example `.foo` files are present in the [samples directory][samples] for each language that uses `.foo`.
1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.foo` files. (ping **@lildude** to help with this) to ensure we're not misclassifying files.
1. If the Bayesian classifier does a bad job with the sample `.foo` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
Remember, the goal here is to try and avoid false positives!
## Fixing a misclassified language
Most languages are detected by their file extension defined in [languages.yml][languages]. For disambiguating between files with common extensions, linguist applies some [heuristics](/lib/linguist/heuristics.rb) and a [statistical classifier](lib/linguist/classifier.rb). This process can help differentiate between, for example, `.h` files which could be either C, C++, or Obj-C.
Misclassifications can often be solved by either adding a new filename or extension for the language or adding more [samples][samples] to make the classifier smarter.
## Fixing syntax highlighting
Syntax highlighting in GitHub is performed using TextMate-compatible grammars. These are the same grammars that TextMate, Sublime Text and Atom use. Every language in [languages.yml][languages] is mapped to its corresponding TM `scope`. This scope will be used when picking up a grammar for highlighting.
Assuming your code is being detected as the right language, in most cases this is due to a bug in the language grammar rather than a bug in Linguist. [`grammars.yml`][grammars] lists all the grammars we use for syntax highlighting on github.com. Find the one corresponding to your code's programming language and submit a bug report upstream. If you can, try to reproduce the highlighting problem in the text editor that the grammar is designed for (TextMate, Sublime Text, or Atom) and include that information in your bug report.
You can also try to fix the bug yourself and submit a Pull Request. [TextMate's documentation](https://manual.macromates.com/en/language_grammars) offers a good introduction on how to work with TextMate-compatible grammars. You can test grammars using [Lightshow](https://github-lightshow.herokuapp.com).
Once the bug has been fixed upstream, we'll pick it up for GitHub in the next release of Linguist.
## Testing
For development you are going to want to checkout out the source. To get it, clone the repo and run [Bundler](http://gembundler.com/) to install its dependencies.
Before you can start contributing to Linguist, you'll need to set up your environment first. Clone the repo and run `script/bootstrap` to install its dependencies.
git clone https://github.com/github/linguist.git
cd linguist/
@@ -77,7 +26,91 @@ To run Linguist from the cloned repository:
bundle exec bin/linguist --breakdown
To run the tests:
### Dependencies
Linguist uses the [`charlock_holmes`](https://github.com/brianmario/charlock_holmes) character encoding detection library which in turn uses [ICU](http://site.icu-project.org/), and the libgit2 bindings for Ruby provided by [`rugged`](https://github.com/libgit2/rugged). [Docker](https://www.docker.com/) is also required when adding or updating grammars. These components have their own dependencies - `icu4c`, and `cmake` and `pkg-config` respectively - which you may need to install before you can install Linguist.
For example, on macOS with [Homebrew](http://brew.sh/): `brew install cmake pkg-config icu4c docker` and on Ubuntu: `apt-get install cmake pkg-config libicu-dev docker-ce`.
## Adding an extension to a language
We try only to add new extensions once they have some usage on GitHub. In most cases we prefer that extensions be in use in hundreds of repositories before supporting them in Linguist.
To add support for a new extension:
1. Add your extension to the language entry in [`languages.yml`][languages], keeping the extensions in alphabetical and case-sensitive (uppercase before lowercase) order, with the exception of the primary extension; the primary extension should be first.
1. Add at least one sample for your extension to the [samples directory][samples] in the correct subdirectory. We'd prefer examples of real-world code showing common usage. The more representative of the structure of the language, the better.
1. Open a pull request, linking to a [GitHub search result](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
If you are adding a sample, please state clearly the license covering the code in the sample, and if possible, link to the original source of the sample.
Additionally, if this extension is already listed in [`languages.yml`][languages] and associated with another language, then sometimes a few more steps will need to be taken:
1. Make sure that example `.yourextension` files are present in the [samples directory][samples] for each language that uses `.yourextension`.
1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.yourextension` files. (ping **@lildude** to help with this) to ensure we're not misclassifying files.
1. If the Bayesian classifier does a bad job with the sample `.yourextension` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
## Adding a language
We try only to add languages once they have some usage on GitHub. In most cases we prefer that each new file extension be in use in hundreds of repositories before supporting them in Linguist.
To add support for a new language:
1. Add an entry for your language to [`languages.yml`][languages]. Omit the `language_id` field for now.
1. Add a syntax-highlighting grammar for your language using: `script/add-grammar https://github.com/JaneSmith/MyGrammar`
This command will analyze the grammar and, if no problems are found, add it to the repository. If problems are found, please report them to the grammar maintainer as you will not be able to add the grammar if problems are found.
**Please only add grammars that have [one of these licenses][licenses].**
1. Add samples for your language to the [samples directory][samples] in the correct subdirectory.
1. Add a `language_id` for your language using `script/set-language-ids`.
**You should only ever need to run `script/set-language-ids --update`. Anything other than this risks breaking GitHub search :cry:**
1. Open a pull request, linking to a [GitHub search results](https://github.com/search?utf8=%E2%9C%93&q=extension%3Aboot+NOT+nothack&type=Code&ref=searchresults) showing in-the-wild usage.
Please state clearly the license covering the code in the samples. Link directly to the original source if possible.
In addition, if your new language defines an extension that's already listed in [`languages.yml`][languages] (such as `.foo`) then sometimes a few more steps will need to be taken:
1. Make sure that example `.foo` files are present in the [samples directory][samples] for each language that uses `.foo`.
1. Test the performance of the Bayesian classifier with a relatively large number (1000s) of sample `.foo` files. (ping **@lildude** to help with this) to ensure we're not misclassifying files.
1. If the Bayesian classifier does a bad job with the sample `.foo` files then a [heuristic](https://github.com/github/linguist/blob/master/lib/linguist/heuristics.rb) may need to be written to help.
Remember, the goal here is to try and avoid false positives!
## Fixing a misclassified language
Most languages are detected by their file extension defined in [`languages.yml`][languages]. For disambiguating between files with common extensions, Linguist applies some [heuristics](/lib/linguist/heuristics.rb) and a [statistical classifier](lib/linguist/classifier.rb). This process can help differentiate between, for example, `.h` files which could be either C, C++, or Obj-C.
Misclassifications can often be solved by either adding a new filename or extension for the language or adding more [samples][samples] to make the classifier smarter.
## Fixing syntax highlighting
Syntax highlighting in GitHub is performed using TextMate-compatible grammars. These are the same grammars that TextMate, Sublime Text and Atom use. Every language in [`languages.yml`][languages] is mapped to its corresponding TextMate `scopeName`. This scope name will be used when picking up a grammar for highlighting.
Assuming your code is being detected as the right language, in most cases syntax highlighting problems are due to a bug in the language grammar rather than a bug in Linguist. [`vendor/README.md`][grammars] lists all the grammars we use for syntax highlighting on GitHub.com. Find the one corresponding to your code's programming language and submit a bug report upstream. If you can, try to reproduce the highlighting problem in the text editor that the grammar is designed for (TextMate, Sublime Text, or Atom) and include that information in your bug report.
You can also try to fix the bug yourself and submit a Pull Request. [TextMate's documentation](https://manual.macromates.com/en/language_grammars) offers a good introduction on how to work with TextMate-compatible grammars. You can test grammars using [Lightshow](https://github-lightshow.herokuapp.com).
Once the bug has been fixed upstream, we'll pick it up for GitHub in the next release of Linguist.
## Changing the source of a syntax highlighting grammar
We'd like to ensure Linguist and GitHub.com are using the latest and greatest grammars that are consistent with the current usage but understand that sometimes a grammar can lag behind the evolution of a language or even stop being developed. This often results in someone grasping the opportunity to create a newer and better and more actively maintained grammar, and we'd love to use it and pass on it's functionality to our users.
Switching the source of a grammar is really easy:
script/add-grammar --replace MyGrammar https://github.com/PeterPan/MyGrammar
This command will analyze the grammar and, if no problems are found, add it to the repository. If problems are found, please report these problems to the grammar maintainer as you will not be able to add the grammar if problems are found.
**Please only add grammars that have [one of these licenses][licenses].**
Please then open a pull request for the updated grammar.
## Testing
You can run the tests locally with:
bundle exec rake test
@@ -102,7 +135,7 @@ Linguist is maintained with :heart: by:
As Linguist is a production dependency for GitHub we have a couple of workflow restrictions:
- Anyone with commit rights can merge Pull Requests provided that there is a :+1: from a GitHub member of staff
- Anyone with commit rights can merge Pull Requests provided that there is a :+1: from a GitHub staff member.
- Releases are performed by GitHub staff so we can ensure GitHub.com always stays up to date with the latest release of Linguist and there are no regressions in production.
### Releasing
@@ -123,9 +156,11 @@ If you are the current maintainer of this gem:
1. Test behavior locally, branch deploy, whatever needs to happen
1. Merge github/linguist PR
1. Tag and push: `git tag vx.xx.xx; git push --tags`
1. Create a GitHub release with the pushed tag (https://github.com/github/linguist/releases/new)
1. Build a grammars tarball (`./script/build-grammars-tarball`) and attach it to the GitHub release
1. Push to rubygems.org -- `gem push github-linguist-3.0.0.gem`
[grammars]: /grammars.yml
[grammars]: /vendor/README.md
[languages]: /lib/linguist/languages.yml
[licenses]: https://github.com/github/linguist/blob/257425141d4e2a5232786bf0b13c901ada075f93/vendor/licenses/config.yml#L2-L11
[samples]: /samples

README.md (176 lines changed)

@@ -1,38 +1,130 @@
# Linguist
[![Build Status](https://travis-ci.org/github/linguist.svg?branch=master)](https://travis-ci.org/github/linguist)
[issues]: https://github.com/github/linguist/issues
[new-issue]: https://github.com/github/linguist/issues/new
This library is used on GitHub.com to detect blob languages, ignore binary or vendored files, suppress generated files in diffs, and generate language breakdown graphs.
See [Troubleshooting](#troubleshooting) and [`CONTRIBUTING.md`](/CONTRIBUTING.md) before filing an issue or creating a pull request.
See [Troubleshooting](#troubleshooting) and [`CONTRIBUTING.md`](CONTRIBUTING.md) before filing an issue or creating a pull request.
## How Linguist works
Linguist takes the list of languages it knows from [`languages.yml`](/lib/linguist/languages.yml) and uses a number of methods to try and determine the language used by each file, and the overall repository breakdown.
Linguist starts by going through all the files in a repository and excludes all files that it determines to be binary data, [vendored code](#vendored-code), [generated code](#generated-code), [documentation](#documentation), or are defined as `data` (e.g. SQL) or `prose` (e.g. Markdown) languages, whilst taking into account any [overrides](#overrides).
If an [explicit language override](#using-gitattributes) has been used, that language is used for the matching files. The language of each remaining file is then determined using the following strategies, in order, with each step either identifying the precise language or reducing the number of likely languages passed down to the next strategy:
- Vim or Emacs modeline,
- commonly used filename,
- shell shebang,
- file extension,
- heuristics,
- naïve Bayesian classification
The result of this analysis is used to produce the language stats bar which displays the languages percentages for the files in the repository. The percentages are calculated based on the bytes of code for each language as reported by the [List Languages](https://developer.github.com/v3/repos/#list-languages) API.
![language stats bar](https://cloud.githubusercontent.com/assets/173/5562290/48e24654-8ddf-11e4-8fe7-735b0ce3a0d3.png)
### How Linguist works on GitHub.com
When you push changes to a repository on GitHub.com, a low priority background job is enqueued to analyze your repository as explained above. The results of this analysis are cached for the lifetime of your repository and are only updated when the repository is updated. As this analysis is performed by a low priority background job, it can take a while, particularly during busy periods, for your language statistics bar to reflect your changes.
## Usage
Install the gem:
$ gem install github-linguist
#### Dependencies
Linguist uses the [`charlock_holmes`](https://github.com/brianmario/charlock_holmes) character encoding detection library which in turn uses [ICU](http://site.icu-project.org/), and the libgit2 bindings for Ruby provided by [`rugged`](https://github.com/libgit2/rugged). These components have their own dependencies - `icu4c`, and `cmake` and `pkg-config` respectively - which you may need to install before you can install Linguist.
For example, on macOS with [Homebrew](http://brew.sh/): `brew install cmake pkg-config icu4c` and on Ubuntu: `apt-get install cmake pkg-config libicu-dev`.
### Application usage
Linguist can be used in your application as follows:
```ruby
require 'rugged'
require 'linguist'
repo = Rugged::Repository.new('.')
project = Linguist::Repository.new(repo, repo.head.target_id)
project.language #=> "Ruby"
project.languages #=> { "Ruby" => 119387 }
```
### Command line usage
A repository's languages stats can also be assessed from the command line using the `linguist` executable. Without any options, `linguist` will output the breakdown that correlates to what is shown in the language stats bar. The `--breakdown` flag will additionally show the breakdown of files by language.
You can try running `linguist` on the root directory in this repository itself:
```console
$ bundle exec bin/linguist --breakdown
68.57% Ruby
22.90% C
6.93% Go
1.21% Lex
0.39% Shell
Ruby:
Gemfile
Rakefile
bin/git-linguist
bin/linguist
ext/linguist/extconf.rb
github-linguist.gemspec
lib/linguist.rb
```
## Troubleshooting
### My repository is detected as the wrong language
![language stats bar](https://cloud.githubusercontent.com/assets/173/5562290/48e24654-8ddf-11e4-8fe7-735b0ce3a0d3.png)
If the language stats bar is reporting a language that you don't expect:
The Language stats bar displays languages percentages for the files in the repository. The percentages are calculated based on the bytes of code for each language as reported by the [List Languages](https://developer.github.com/v3/repos/#list-languages) API. If the bar is reporting a language that you don't expect:
1. Click on the name of the language in the stats bar to see a list of the files that are identified as that language.
1. If you see files that you didn't write, consider moving the files into one of the [paths for vendored code](/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
1. If the files are being misclassified, search for [open issues][issues] to see if anyone else has already reported the issue. Any information you can add, especially links to public repositories, is helpful.
1. Click on the name of the language in the stats bar to see a list of the files that are identified as that language.
Keep in mind this performs a search so the [code search restrictions](https://help.github.com/articles/searching-code/#considerations-for-code-search) may result in files identified in the language statistics not appearing in the search results. [Installing Linguist locally](#usage) and running it from the [command line](#command-line-usage) will give you accurate results.
1. If you see files that you didn't write in the search results, consider moving the files into one of the [paths for vendored code](/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
1. If the files are misclassified, search for [open issues][issues] to see if anyone else has already reported the issue. Any information you can add, especially links to public repositories, is helpful. You can also use the [manual overrides](#overrides) feature to correctly classify them in your repository.
1. If there are no reported issues of this misclassification, [open an issue][new-issue] and include a link to the repository or a sample of the code that is being misclassified.
Keep in mind that the repository language stats are only [updated when you push changes](#how-linguist-works-on-githubcom), and the results are cached for the lifetime of your repository. If you have not made any changes to your repository in a while, you may find pushing another change will correct the stats.
### My repository isn't showing my language
Linguist does not consider [vendored code](#vendored-code), [generated code](#generated-code), [documentation](#documentation), or `data` (e.g. SQL) or `prose` (e.g. Markdown) languages (as defined by the `type` attribute in [`languages.yml`](/lib/linguist/languages.yml)) when calculating the repository language statistics.
If the language statistics bar is not showing your language at all, it could be for a few reasons:
1. Linguist doesn't know about your language.
1. The extension you have chosen is not associated with your language in [`languages.yml`](/lib/linguist/languages.yml).
1. All the files in your repository fall into one of the categories listed above that Linguist excludes by default.
If Linguist doesn't know about the language or the extension you're using, consider [contributing](CONTRIBUTING.md) to Linguist by opening a pull request to add support for your language or extension. For everything else, you can use the [manual overrides](#overrides) feature to tell Linguist to include your files in the language statistics.
### There's a problem with the syntax highlighting of a file
Linguist detects the language of a file but the actual syntax-highlighting is powered by a set of language grammars which are included in this project as a set of submodules [and may be found here](https://github.com/github/linguist/blob/master/vendor/README.md).
Linguist detects the language of a file but the actual syntax-highlighting is powered by a set of language grammars which are included in this project as a set of submodules [as listed here](/vendor/README.md).
If you experience an issue with the syntax-highlighting on GitHub, **please report the issue to the upstream grammar repository, not here.** Grammars are updated every time we build the Linguist gem so upstream bug fixes are automatically incorporated as they are fixed.
If you experience an issue with the syntax-highlighting on GitHub, **please report the issue to the upstream grammar repository, not here.** Grammars are updated every time we build the Linguist gem and so upstream bug fixes are automatically incorporated as they are fixed.
## Overrides
Linguist supports a number of different custom overrides strategies for language definitions and vendored paths.
Linguist supports a number of different custom override strategies for language definitions and file paths.
### Using gitattributes
Add a `.gitattributes` file to your project and use standard git-style path matchers for the files you want to override to set `linguist-documentation`, `linguist-language`, `linguist-vendored`, and `linguist-generated`. `.gitattributes` will be used to determine language statistics and will be used to syntax highlight files. You can also manually set syntax highlighting using [Vim or Emacs modelines](#using-emacs-or-vim-modelines).
Add a `.gitattributes` file to your project and use standard git-style path matchers for the files you want to override using the `linguist-documentation`, `linguist-language`, `linguist-vendored`, `linguist-generated` and `linguist-detectable` attributes. `.gitattributes` will be used to determine language statistics and will be used to syntax highlight files. You can also manually set syntax highlighting using [Vim or Emacs modelines](#using-emacs-or-vim-modelines).
```
$ cat .gitattributes
@@ -41,7 +133,7 @@ $ cat .gitattributes
#### Vendored code
Checking code you didn't write, such as JavaScript libraries, into your git repo is a common practice, but this often inflates your project's language stats and may even cause your project to be labeled as another language. By default, Linguist treats all of the paths defined in [lib/linguist/vendor.yml](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml) as vendored and therefore doesn't include them in the language statistics for a repository.
Checking code you didn't write, such as JavaScript libraries, into your git repo is a common practice, but this often inflates your project's language stats and may even cause your project to be labeled as another language. By default, Linguist treats all of the paths defined in [`vendor.yml`](/lib/linguist/vendor.yml) as vendored and therefore doesn't include them in the language statistics for a repository.
Use the `linguist-vendored` attribute to vendor or un-vendor paths.
@@ -53,7 +145,7 @@ jquery.js linguist-vendored=false
#### Documentation
Just like vendored files, Linguist excludes documentation files from your project's language stats. [lib/linguist/documentation.yml](lib/linguist/documentation.yml) lists common documentation paths and excludes them from the language statistics for your repository.
Just like vendored files, Linguist excludes documentation files from your project's language stats. [`documentation.yml`](/lib/linguist/documentation.yml) lists common documentation paths and excludes them from the language statistics for your repository.
Use the `linguist-documentation` attribute to mark or unmark paths as documentation.
@@ -65,13 +157,28 @@ docs/formatter.rb linguist-documentation=false
#### Generated code
Not all plain text files are true source files. Generated files like minified js and compiled CoffeeScript can be detected and excluded from language stats. As an added bonus, unlike vendored and documentation files, these files are suppressed in diffs.
Not all plain text files are true source files. Generated files like minified JavaScript and compiled CoffeeScript can be detected and excluded from language stats. As an added bonus, unlike vendored and documentation files, these files are suppressed in diffs. [`generated.rb`](/lib/linguist/generated.rb) lists common generated paths and excludes them from the language statistics of your repository.
Use the `linguist-generated` attribute to mark or unmark paths as generated.
```
$ cat .gitattributes
Api.elm linguist-generated=true
```
#### Detectable
Only programming languages are included in the language statistics. Languages of a different type (as defined in [`languages.yml`](/lib/linguist/languages.yml)) are not "detectable" causing them not to be included in the language statistics.
Use the `linguist-detectable` attribute to mark or unmark paths as detectable.
```
$ cat .gitattributes
*.kicad_pcb linguist-detectable=true
*.sch linguist-detectable=true
tools/export_bom.py linguist-detectable=false
```
### Using Emacs or Vim modelines
If you do not want to use `.gitattributes` to override the syntax highlighting used on GitHub.com, you can use Vim or Emacs style modelines to set the language for a single file. Modelines can be placed anywhere within a file and are respected when determining how to syntax-highlight a file on GitHub.com
@@ -90,52 +197,15 @@ vim: set ft=cpp:
-*- mode: php;-*-
```
## Usage
Install the gem:
```
$ gem install github-linguist
```
Then use it in your application:
```ruby
require 'rugged'
require 'linguist'
repo = Rugged::Repository.new('.')
project = Linguist::Repository.new(repo, repo.head.target_id)
project.language #=> "Ruby"
project.languages #=> { "Ruby" => 119387 }
```
These stats are also printed out by the `linguist` executable. You can use the
`--breakdown` flag, and the binary will also output the breakdown of files by language.
You can try running `linguist` on the root directory in this repository itself:
```
$ bundle exec linguist --breakdown
100.00% Ruby
Ruby:
Gemfile
Rakefile
bin/linguist
github-linguist.gemspec
lib/linguist.rb
```
## Contributing
Please check out our [contributing guidelines](CONTRIBUTING.md).
## License
The language grammars included in this gem are covered by their repositories'
respective licenses. `grammars.yml` specifies the repository for each grammar.
respective licenses. [`vendor/README.md`](/vendor/README.md) lists the repository for each grammar.
All other files are covered by the MIT license, see `LICENSE`.

Rakefile

@@ -56,7 +56,7 @@ end
task :build_gem => :samples do
rm_rf "grammars"
sh "script/convert-grammars"
sh "script/grammar-compiler compile -o grammars || true"
languages = YAML.load_file("lib/linguist/languages.yml")
File.write("lib/linguist/languages.json", Yajl.dump(languages))
`gem build github-linguist.gemspec`

bin/git-linguist

@@ -117,9 +117,8 @@ def git_linguist(args)
end
parser.parse!(args)
git_dir = `git rev-parse --git-dir`.strip
raise "git-linguist must be run in a Git repository (#{Dir.pwd})" unless $?.success?
raise "git-linguist must be run in a Git repository" unless $?.success?
wrapper = GitLinguist.new(git_dir, commit, incremental)
case args.pop
@@ -141,6 +140,10 @@ def git_linguist(args)
$stderr.print(parser.help)
exit 1
end
rescue Exception => e
$stderr.puts e.message
$stderr.puts e.backtrace
exit 1
end
git_linguist(ARGV)

github-linguist.gemspec

@@ -2,7 +2,7 @@ require File.expand_path('../lib/linguist/version', __FILE__)
Gem::Specification.new do |s|
s.name = 'github-linguist'
s.version = Linguist::VERSION
s.version = ENV['GEM_VERSION'] || Linguist::VERSION
s.summary = "GitHub Language detection"
s.description = 'We use this library at GitHub to detect blob languages, highlight code, ignore binary files, suppress generated files in diffs, and generate language breakdown graphs.'
@@ -14,8 +14,8 @@ Gem::Specification.new do |s|
s.executables = ['linguist', 'git-linguist']
s.extensions = ['ext/linguist/extconf.rb']
s.add_dependency 'charlock_holmes', '~> 0.7.5'
s.add_dependency 'escape_utils', '~> 1.1.0'
s.add_dependency 'charlock_holmes', '~> 0.7.6'
s.add_dependency 'escape_utils', '~> 1.2.0'
s.add_dependency 'mime-types', '>= 1.19'
s.add_dependency 'rugged', '>= 0.25.1'
@@ -27,6 +27,6 @@ Gem::Specification.new do |s|
s.add_development_dependency 'rake'
s.add_development_dependency 'yajl-ruby'
s.add_development_dependency 'color-proximity', '~> 0.2.1'
s.add_development_dependency 'licensed'
s.add_development_dependency 'licensee', '~> 8.8.0'
s.add_development_dependency 'licensed', '~> 1.0.0'
s.add_development_dependency 'licensee'
end

grammars.yml

@@ -9,6 +9,8 @@ vendor/grammars/Agda.tmbundle:
- source.agda
vendor/grammars/Alloy.tmbundle:
- source.alloy
vendor/grammars/Assembly-Syntax-Definition:
- source.assembly.unix
vendor/grammars/AutoHotkey:
- source.ahk
vendor/grammars/BrightScript.tmbundle:
@@ -48,6 +50,8 @@ vendor/grammars/Lean.tmbundle:
- source.lean
vendor/grammars/LiveScript.tmbundle:
- source.livescript
vendor/grammars/MATLAB-Language-grammar:
- source.matlab
vendor/grammars/MQL5-sublime:
- source.mql5
vendor/grammars/MagicPython:
@@ -195,6 +199,9 @@ vendor/grammars/atom-language-clean:
vendor/grammars/atom-language-julia:
- source.julia
- source.julia.console
vendor/grammars/atom-language-nextflow:
- source.nextflow
- source.nextflow-groovy
vendor/grammars/atom-language-p4:
- source.p4
vendor/grammars/atom-language-perl6:
@@ -243,6 +250,8 @@ vendor/grammars/chapel-tmbundle:
vendor/grammars/cmake.tmbundle:
- source.cache.cmake
- source.cmake
vendor/grammars/conllu-linguist-grammar:
- text.conllu
vendor/grammars/cool-tmbundle:
- source.cool
vendor/grammars/cpp-qt.tmbundle:
@@ -313,12 +322,9 @@ vendor/grammars/graphviz.tmbundle:
- source.dot
vendor/grammars/groovy.tmbundle:
- source.groovy
vendor/grammars/haxe-sublime-bundle:
- source.erazor
- source.haxe.2
- source.hss.1
vendor/grammars/haxe-TmLanguage:
- source.hx
- source.hxml
- source.nmml
vendor/grammars/html.tmbundle:
- text.html.basic
vendor/grammars/idl.tmbundle:
@@ -351,8 +357,6 @@ vendor/grammars/jflex.tmbundle:
- source.jflex
vendor/grammars/json.tmbundle:
- source.json
vendor/grammars/kotlin-sublime-package:
- source.Kotlin
vendor/grammars/language-agc:
- source.agc
vendor/grammars/language-apl:
@@ -384,7 +388,6 @@ vendor/grammars/language-csharp:
- source.cake
- source.cs
- source.csx
- source.nant-build
vendor/grammars/language-csound:
- source.csound
- source.csound-document
@@ -436,6 +439,8 @@ vendor/grammars/language-jolie:
vendor/grammars/language-jsoniq:
- source.jq
- source.xq
vendor/grammars/language-kotlin:
- source.kotlin
vendor/grammars/language-less:
- source.css.less
vendor/grammars/language-maxscript:
@@ -481,6 +486,8 @@ vendor/grammars/language-ruby:
- source.ruby
- source.ruby.gemfile
- text.html.erb
vendor/grammars/language-sed:
- source.sed
vendor/grammars/language-shellscript:
- source.shell
- text.shell-session
@@ -538,9 +545,6 @@ vendor/grammars/marko-tmbundle:
- text.marko
vendor/grammars/mathematica-tmbundle:
- source.mathematica
vendor/grammars/matlab.tmbundle:
- source.matlab
- source.octave
vendor/grammars/maven.tmbundle:
- text.xml.pom
vendor/grammars/mediawiki.tmbundle:
@@ -619,7 +623,6 @@ vendor/grammars/sass-textmate-bundle:
- source.sass
vendor/grammars/scala.tmbundle:
- source.sbt
- source.scala
vendor/grammars/scheme.tmbundle:
- source.scheme
vendor/grammars/scilab.tmbundle:
@@ -726,6 +729,8 @@ vendor/grammars/verilog.tmbundle:
- source.verilog
vendor/grammars/vhdl:
- source.vhdl
vendor/grammars/vscode-scala-syntax:
- source.scala
vendor/grammars/vue-syntax-highlight:
- text.html.vue
vendor/grammars/wdl-sublime-syntax-highlighter:

View File

@@ -11,11 +11,13 @@ module Linguist
#
# path - A path String (does not necessarily exist on the file system).
# content - Content of the file.
# symlink - Whether the file is a symlink.
#
# Returns a Blob.
def initialize(path, content)
def initialize(path, content, symlink: false)
@path = path
@content = content
@symlink = symlink
end
# Public: Filename
@@ -69,5 +71,12 @@ module Linguist
"." + segments[index..-1].join(".")
end
end
# Public: Is this a symlink?
#
# Returns true or false.
def symlink?
@symlink
end
end
end

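A note on the Blob change above: symlink: is a keyword argument that defaults to false, so existing Blob.new(path, content) callers are unaffected, and the detection strategies later in this diff call symlink? to skip symlinks. Minimal sketch, assuming the gem is loaded (the path and content are made up):

blob = Linguist::Blob.new("config/boot.rb", "require 'rails'", symlink: true)
blob.symlink?                                                    # => true
Linguist::Blob.new("config/boot.rb", "require 'rails'").symlink? # => false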
View File

@@ -383,7 +383,10 @@ module Linguist
!vendored? &&
!documentation? &&
!generated? &&
language && DETECTABLE_TYPES.include?(language.type)
language && ( defined?(detectable?) && !detectable?.nil? ?
detectable? :
DETECTABLE_TYPES.include?(language.type)
)
end
end
end

View File

@@ -15,6 +15,7 @@
- ^[Mm]an/
- ^[Ee]xamples/
- ^[Dd]emos?/
- (^|/)inst/doc/
## Documentation files ##

View File

@@ -26,6 +26,11 @@ module Linguist
@mode ||= File.stat(@fullpath).mode.to_s(8)
end
def symlink?
return @symlink if defined? @symlink
@symlink = (File.symlink?(@fullpath) rescue false)
end
# Public: Read file contents.
#
# Returns a String.

View File

@@ -57,6 +57,7 @@ module Linguist
generated_net_designer_file? ||
generated_net_specflow_feature_file? ||
composer_lock? ||
cargo_lock? ||
node_modules? ||
go_vendor? ||
npm_shrinkwrap_or_package_lock? ||
@@ -222,7 +223,7 @@ module Linguist
#
# Returns true or false
def generated_net_designer_file?
name.downcase =~ /\.designer\.cs$/
name.downcase =~ /\.designer\.(cs|vb)$/
end
# Internal: Is this a codegen file for Specflow feature file?
@@ -378,6 +379,13 @@ module Linguist
!!name.match(/.\.zep\.(?:c|h|php)$/)
end
# Internal: Is the blob a generated Rust Cargo lock file?
#
# Returns true or false.
def cargo_lock?
!!name.match(/Cargo\.lock/)
end
# Is the blob a VCR Cassette file?
#
# Returns true or false

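A note on cargo_lock? above: the pattern is unanchored, so any blob whose name contains "Cargo.lock" is flagged as generated (and therefore suppressed in diffs and excluded from stats), while the TOML filenames entry later in this diff still lets the file be syntax-highlighted as TOML. Illustrative checks in plain Ruby, mirroring the regex used by the method:

!!"Cargo.lock".match(/Cargo\.lock/)               # => true
!!"rust/vendor/Cargo.lock".match(/Cargo\.lock/)   # => true
!!"Cargo.toml".match(/Cargo\.lock/)               # => false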
View File

@@ -16,6 +16,8 @@ module Linguist
#
# Returns an Array of languages, or empty if none matched or were inconclusive.
def self.call(blob, candidates)
return [] if blob.symlink?
data = blob.data[0...HEURISTICS_CONSIDER_BYTES]
@heuristics.each do |heuristic|
@@ -511,5 +513,15 @@ module Linguist
end
end
disambiguate ".x" do |data|
if /\b(program|version)\s+\w+\s*{|\bunion\s+\w+\s+switch\s*\(/.match(data)
Language["RPC"]
elsif /^%(end|ctor|hook|group)\b/.match(data)
Language["Logos"]
elsif /OUTPUT_ARCH\(|OUTPUT_FORMAT\(|SECTIONS/.match(data)
Language["Linker Script"]
end
end
end
end

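A note on the new ".x" disambiguation above: the three branches look for XDR/rpcgen constructs, Logos preprocessor directives, and linker-script keywords respectively. A hedged sketch with a made-up XDR snippet (only the regexes come from this diff):

data = "program PINGPROG {\n  version PINGVERS {\n  } = 1;\n} = 200000;\n"
/\b(program|version)\s+\w+\s*{|\bunion\s+\w+\s+switch\s*\(/.match?(data)   # => true, so Language["RPC"] is chosen
/^%(end|ctor|hook|group)\b/.match?(data)                                   # => false (this one would indicate Logos)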
View File

@@ -539,14 +539,6 @@ module Linguist
end
end
if fns = filenames[name]
fns.each do |filename|
if !options['filenames'].include?(filename)
options['filenames'] << filename
end
end
end
Language.create(
:name => name,
:color => options['color'],

View File

@@ -239,6 +239,10 @@ ApacheConf:
extensions:
- ".apacheconf"
- ".vhost"
filenames:
- ".htaccess"
- apache2.conf
- httpd.conf
tm_scope: source.apache-config
ace_mode: apache_conf
language_id: 16
@@ -279,16 +283,6 @@ Arc:
tm_scope: none
ace_mode: text
language_id: 20
Arduino:
type: programming
color: "#bd79d1"
extensions:
- ".ino"
tm_scope: source.c++
ace_mode: c_cpp
codemirror_mode: clike
codemirror_mime_type: text/x-c++src
language_id: 21
AsciiDoc:
type: prose
ace_mode: asciidoc
@@ -311,6 +305,7 @@ Assembly:
type: programming
color: "#6E4C13"
aliases:
- asm
- nasm
extensions:
- ".asm"
@@ -528,6 +523,7 @@ C++:
- ".hxx"
- ".inc"
- ".inl"
- ".ino"
- ".ipp"
- ".re"
- ".tcc"
@@ -738,6 +734,17 @@ Closure Templates:
- ".soy"
tm_scope: text.html.soy
language_id: 357046146
CoNLL-U:
type: data
extensions:
- ".conllu"
- ".conll"
tm_scope: text.conllu
ace_mode: text
aliases:
- CoNLL
- CoNLL-X
language_id: 421026389
CoffeeScript:
type: programming
tm_scope: source.coffee
@@ -1464,6 +1471,8 @@ GN:
- ".gni"
interpreters:
- gn
filenames:
- ".gn"
tm_scope: source.gn
ace_mode: python
codemirror_mode: python
@@ -1701,6 +1710,7 @@ HCL:
extensions:
- ".hcl"
- ".tf"
- ".tfvars"
ace_mode: ruby
codemirror_mode: ruby
codemirror_mime_type: text/x-ruby
@@ -1741,6 +1751,7 @@ HTML+Django:
group: HTML
extensions:
- ".jinja"
- ".jinja2"
- ".mustache"
- ".njk"
aliases:
@@ -1810,6 +1821,13 @@ HTTP:
codemirror_mode: http
codemirror_mime_type: message/http
language_id: 152
HXML:
type: data
ace_mode: text
extensions:
- ".hxml"
tm_scope: source.hxml
language_id: 786683730
Hack:
type: programming
ace_mode: php
@@ -1872,7 +1890,7 @@ Haxe:
extensions:
- ".hx"
- ".hxsl"
tm_scope: source.haxe.2
tm_scope: source.hx
language_id: 158
Hy:
type: programming
@@ -1920,6 +1938,8 @@ INI:
- ".pro"
- ".properties"
filenames:
- ".editorconfig"
- ".gitconfig"
- buildozer.spec
tm_scope: source.ini
aliases:
@@ -2032,12 +2052,23 @@ JSON:
searchable: false
extensions:
- ".json"
- ".avsc"
- ".geojson"
- ".gltf"
- ".JSON-tmLanguage"
- ".jsonl"
- ".tfstate"
- ".tfstate.backup"
- ".topojson"
- ".webapp"
- ".webmanifest"
filenames:
- ".arcconfig"
- ".htmlhintrc"
- ".jscsrc"
- ".jshintrc"
- ".tern-config"
- ".tern-project"
- composer.lock
- mcmod.info
language_id: 174
@@ -2047,6 +2078,7 @@ JSON5:
- ".json5"
filenames:
- ".babelrc"
- ".jslintrc"
tm_scope: source.js
ace_mode: javascript
codemirror_mode: javascript
@@ -2251,7 +2283,7 @@ Kotlin:
- ".kt"
- ".ktm"
- ".kts"
tm_scope: source.Kotlin
tm_scope: source.kotlin
ace_mode: text
codemirror_mode: clike
codemirror_mime_type: text/x-kotlin
@@ -2372,6 +2404,7 @@ Linker Script:
extensions:
- ".ld"
- ".lds"
- ".x"
filenames:
- ld.script
tm_scope: none
@@ -2483,6 +2516,7 @@ Lua:
- ".lua"
- ".fcgi"
- ".nse"
- ".p8"
- ".pd_lua"
- ".rbxs"
- ".wlua"
@@ -2905,6 +2939,18 @@ NewLisp:
codemirror_mode: commonlisp
codemirror_mime_type: text/x-common-lisp
language_id: 247
Nextflow:
type: programming
ace_mode: groovy
tm_scope: source.nextflow
color: "#3ac486"
extensions:
- ".nf"
filenames:
- nextflow.config
interpreters:
- nextflow
language_id: 506780613
Nginx:
type: data
extensions:
@@ -3195,6 +3241,7 @@ PHP:
- ".phps"
- ".phpt"
filenames:
- ".php"
- ".php_cs"
- ".php_cs.dist"
- Phakefile
@@ -3339,9 +3386,15 @@ Perl:
- ".psgi"
- ".t"
filenames:
- Makefile.PL
- Rexfile
- ack
- cpanfile
interpreters:
- cperl
- perl
aliases:
- cperl
language_id: 282
Perl 6:
type: programming
@@ -3358,10 +3411,10 @@ Perl 6:
- ".pm"
- ".pm6"
- ".t"
filenames:
- Rexfile
interpreters:
- perl6
aliases:
- perl6
tm_scope: source.perl6fe
ace_mode: perl
codemirror_mode: perl
@@ -3483,6 +3536,8 @@ PowerShell:
- ".ps1"
- ".psd1"
- ".psm1"
interpreters:
- pwsh
language_id: 293
Processing:
type: programming
@@ -3625,6 +3680,7 @@ Python:
- python3
aliases:
- rusthon
- python3
language_id: 303
Python console:
type: programming
@@ -3675,6 +3731,7 @@ R:
- ".rsx"
filenames:
- ".Rprofile"
- expr-dist
interpreters:
- Rscript
ace_mode: r
@@ -3747,6 +3804,17 @@ RMarkdown:
- ".rmd"
tm_scope: source.gfm
language_id: 313
RPC:
type: programming
aliases:
- rpcgen
- oncrpc
- xdr
ace_mode: c_cpp
extensions:
- ".x"
tm_scope: source.c
language_id: 1031374237
RPM Spec:
type: data
tm_scope: source.rpm-spec
@@ -3990,6 +4058,7 @@ Ruby:
- Berksfile
- Brewfile
- Buildfile
- Capfile
- Dangerfile
- Deliverfile
- Fastfile
@@ -4170,6 +4239,7 @@ Scala:
color: "#c22d40"
extensions:
- ".scala"
- ".kojo"
- ".sbt"
- ".sc"
interpreters:
@@ -4252,8 +4322,29 @@ Shell:
- ".bash_logout"
- ".bash_profile"
- ".bashrc"
- ".cshrc"
- ".login"
- ".profile"
- ".zlogin"
- ".zlogout"
- ".zprofile"
- ".zshenv"
- ".zshrc"
- 9fs
- PKGBUILD
- bash_logout
- bash_profile
- bashrc
- cshrc
- gradlew
- login
- man
- profile
- zlogin
- zlogout
- zprofile
- zshenv
- zshrc
interpreters:
- ash
- bash
@@ -4334,6 +4425,12 @@ Smarty:
codemirror_mime_type: text/x-smarty
tm_scope: text.html.smarty
language_id: 353
Solidity:
type: programming
color: "#AA6746"
ace_mode: text
tm_scope: source.solidity
language_id: 237469032
SourcePawn:
type: programming
color: "#5c7611"
@@ -4497,6 +4594,8 @@ TOML:
type: data
extensions:
- ".toml"
filenames:
- Cargo.lock
tm_scope: source.toml
ace_mode: toml
codemirror_mode: toml
@@ -4516,6 +4615,9 @@ Tcl:
- ".tcl"
- ".adp"
- ".tm"
filenames:
- owh
- starfield
interpreters:
- tclsh
- wish
@@ -4591,6 +4693,7 @@ Text:
- ".no"
filenames:
- COPYING
- COPYING.regex
- COPYRIGHT.regex
- FONTLOG
- INSTALL
@@ -4605,6 +4708,7 @@ Text:
- delete.me
- keep.me
- read.me
- readme.1st
- test.me
tm_scope: none
ace_mode: text
@@ -4706,7 +4810,7 @@ Unix Assembly:
extensions:
- ".s"
- ".ms"
tm_scope: source.assembly
tm_scope: source.assembly.unix
ace_mode: assembly_x86
language_id: 120
Uno:
@@ -4792,6 +4896,7 @@ Vim script:
extensions:
- ".vim"
filenames:
- ".gvimrc"
- ".nvimrc"
- ".vimrc"
- _vimrc
@@ -5030,6 +5135,7 @@ XML:
- ".zcml"
filenames:
- ".classpath"
- ".cproject"
- ".project"
- App.config
- NuGet.config
@@ -5138,6 +5244,7 @@ YAML:
filenames:
- ".clang-format"
- ".clang-tidy"
- ".gemrc"
ace_mode: yaml
codemirror_mode: yaml
codemirror_mime_type: text/x-yaml
@@ -5255,6 +5362,19 @@ reStructuredText:
codemirror_mode: rst
codemirror_mime_type: text/x-rst
language_id: 419
sed:
type: programming
color: "#64b970"
extensions:
- ".sed"
interpreters:
- gsed
- minised
- sed
- ssed
ace_mode: text
tm_scope: source.sed
language_id: 847830017
wdl:
type: programming
color: "#42f1f4"

View File

@@ -7,7 +7,8 @@ module Linguist
GIT_ATTR = ['linguist-documentation',
'linguist-language',
'linguist-vendored',
'linguist-generated']
'linguist-generated',
'linguist-detectable']
GIT_ATTR_OPTS = { :priority => [:index], :skip_system => true }
GIT_ATTR_FLAGS = Rugged::Repository::Attributes.parse_opts(GIT_ATTR_OPTS)
@@ -70,6 +71,14 @@ module Linguist
end
end
def detectable?
if attr = git_attributes['linguist-detectable']
return boolean_attribute(attr)
else
nil
end
end
def data
load_blob!
@data
@@ -80,6 +89,11 @@ module Linguist
@size
end
def symlink?
# We don't create LazyBlobs for symlinks.
false
end
def cleanup!
@data.clear if @data
end

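A note on the new linguist-detectable attribute above: combined with the include_in_language_stats? change earlier in this diff, it lets a repository force individual paths into or out of the language statistics regardless of the language's type. As an illustration (the paths are made up, and the syntax is assumed to follow the same .gitattributes boolean-override convention already used for linguist-vendored and linguist-generated): `*.sql linguist-detectable=true` would count SQL files, which are type data and ignored by default, while `tools/*.rb linguist-detectable=false` would hide helper scripts from the stats.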
View File

@@ -3,15 +3,20 @@ module Linguist
# Public: Use shebang to detect language of the blob.
#
# blob - An object that quacks like a blob.
# candidates - A list of candidate languages.
#
# Examples
#
# Shebang.call(FileBlob.new("path/to/file"))
#
# Returns an Array with one Language if the blob has a shebang with a valid
# interpreter, or empty if there is no shebang.
def self.call(blob, _ = nil)
Language.find_by_interpreter interpreter(blob.data)
# Returns an array of languages from the candidate list for which the
# blob's shebang is valid. Returns an empty list if there is no shebang.
# If the candidate list is empty, any language is a valid candidate.
def self.call(blob, candidates)
return [] if blob.symlink?
languages = Language.find_by_interpreter interpreter(blob.data)
candidates.any? ? candidates & languages : languages
end
# Public: Get the interpreter from the shebang

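A note on the Shebang change above: instead of discarding what earlier strategies found, it now intersects the interpreter lookup with the candidate list (returning the lookup unchanged when the list is empty), and symlinks are skipped outright. Minimal sketch, assuming the gem is loaded (the file name and script body are made up):

blob  = Linguist::Blob.new("deploy", "#!/usr/bin/env perl\nprint 42;\n")
langs = Linguist::Shebang.call(blob, [])
# => Language.find_by_interpreter("perl"), i.e. every language listing the perl interpreter
Linguist::Shebang.call(blob, [Linguist::Language["Perl"]])
# => the intersection of that candidate list with langs; the previous strategy's result is kept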
View File

@@ -2,8 +2,21 @@ module Linguist
module Strategy
# Detects language based on extension
class Extension
def self.call(blob, _)
Language.find_by_extension(blob.name.to_s)
# Public: Use the file extension to detect the blob's language.
#
# blob - An object that quacks like a blob.
# candidates - A list of candidate languages.
#
# Examples
#
# Extension.call(FileBlob.new("path/to/file"))
#
# Returns an array of languages associated with a blob's file extension.
# Selected languages must be in the candidate list, except if it's empty,
# in which case any language is a valid candidate.
def self.call(blob, candidates)
languages = Language.find_by_extension(blob.name.to_s)
candidates.any? ? candidates & languages : languages
end
end
end

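A note on the Extension change above: it applies the same candidate-intersection rule as Shebang. Sketch using the ambiguous ".x" extension that this diff also registers for RPC and Linker Script (the file name is made up):

blob = Linguist::FileBlob.new("ping_prot.x")
Linguist::Strategy::Extension.call(blob, [])
# => the languages registered for ".x"; the ".x" heuristic shown earlier in this diff
#    then disambiguates RPC, Logos and Linker Script among them
Linguist::Strategy::Extension.call(blob, [Linguist::Language["RPC"]])
# => [Linguist::Language["RPC"]]; earlier candidates are narrowed, not replaced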
View File

@@ -2,9 +2,22 @@ module Linguist
module Strategy
# Detects language based on filename
class Filename
def self.call(blob, _)
# Public: Use the filename to detect the blob's language.
#
# blob - An object that quacks like a blob.
# candidates - A list of candidate languages.
#
# Examples
#
# Filename.call(FileBlob.new("path/to/file"))
#
# Returns an array of languages associated with the blob's filename.
# Selected languages must be in the candidate list, except if it's empty,
# in which case any language is a valid candidate.
def self.call(blob, candidates)
name = blob.name.to_s
Language.find_by_filename(name)
languages = Language.find_by_filename(name)
candidates.any? ? candidates & languages : languages
end
end
end

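A note on the Filename change above: the same intersection rule again. Sketch tied to the languages.yml change further down, where Rexfile moves from Perl 6 to Perl (the candidate list is made up):

blob = Linguist::FileBlob.new("Rexfile")
Linguist::Strategy::Filename.call(blob, [])
# => Language.find_by_filename("Rexfile"), i.e. [Linguist::Language["Perl"]] after this diff
Linguist::Strategy::Filename.call(blob, [Linguist::Language["Ruby"], Linguist::Language["Perl"]])
# => [Linguist::Language["Perl"]]; only candidates that also match the filename survive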
View File

@@ -109,6 +109,8 @@ module Linguist
# Returns an Array with one Language if the blob has a Vim or Emacs modeline
# that matches a Language name or alias. Returns an empty array if no match.
def self.call(blob, _ = nil)
return [] if blob.symlink?
header = blob.first_lines(SEARCH_SCOPE).join("\n")
footer = blob.last_lines(SEARCH_SCOPE).join("\n")
Array(Language.find_by_alias(modeline(header + footer)))

View File

@@ -80,6 +80,9 @@
# Animate.css
- (^|/)animate\.(css|less|scss|styl)$
# Materialize.css
- (^|/)materialize\.(css|less|scss|styl|js)$
# Select2
- (^|/)select2/.*\.(css|scss|js)$
@@ -273,6 +276,13 @@
- (^|/)gradlew\.bat$
- (^|/)gradle/wrapper/
## Java ##
# Maven
- (^|/)mvnw$
- (^|/)mvnw\.cmd$
- (^|/)\.mvn/wrapper/
## .NET ##
# Visual Studio IntelliSense

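A note on the Maven wrapper patterns above: vendor.yml entries are applied as regular expressions against the file path, so the wrapper script, its Windows counterpart and the .mvn/wrapper/ directory are treated as vendored at any depth. Illustrative checks in plain Ruby (the paths are made up):

Regexp.new('(^|/)mvnw$').match?("mvnw")                                               # => true
Regexp.new('(^|/)mvnw$').match?("service-a/mvnw")                                     # => true
Regexp.new('(^|/)mvnw$').match?("mvnwrapper")                                         # => false
Regexp.new('(^|/)\.mvn/wrapper/').match?("service-a/.mvn/wrapper/maven-wrapper.jar")  # => true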
View File

@@ -1,3 +1,3 @@
module Linguist
VERSION = "5.3.3"
VERSION = "6.0.1"
end

samples/CoNLL-U/CF1.conllu (normal file, 159 lines added)
View File

@@ -0,0 +1,159 @@
# text = PT no governo
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a
# sent_id = CF1-1
# id = 1
1 PT PT PROPN PROP|M|S|@NPHR Gender=Masc|Number=Sing 0 root _ _
2-3 no _ _ _ _ _ _ _ _
2 em em ADP <sam->|PRP|@N< _ 4 case _ _
3 o o DET <-sam>|<artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 4 det _ _
4 governo governo NOUN <np-def>|N|M|S|@P< Gender=Masc|Number=Sing 1 nmod _ _
# text = BRASÍLIA Pesquisa Datafolha publicada hoje revela um dado supreendente: recusando uma postura radical, a esmagadora maioria (77%) dos eleitores quer o PT participando do Governo Fernando Henrique Cardoso.
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a &W
# sent_id = CF1-3
# id = 2
1 BRASÍLIA Brasília PROPN PROP|F|S|@ADVL> Gender=Fem|Number=Sing 6 dep _ _
2 Pesquisa Pesquisa PROPN _ Gender=Fem|Number=Sing 6 nsubj _ ChangedBy=Issue119|MWE=Pesquisa_Datafolha|MWEPOS=PROPN
3 Datafolha Datafolha PROPN _ Number=Sing 2 flat:name _ ChangedBy=Issue119
4 publicada publicar VERB <mv>|V|PCP|F|S|@ICL-N< Gender=Fem|Number=Sing|VerbForm=Part 2 acl _ _
5 hoje hoje ADV ADV|@<ADVL _ 4 advmod _ _
6 revela revelar VERB <mv>|V|PR|3S|IND|@FS-STA Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 0 root _ _
7 um um DET <arti>|ART|M|S|@>N Definite=Ind|Gender=Masc|Number=Sing|PronType=Art 8 det _ _
8 dado dado NOUN <np-idf>|N|M|S|@<ACC Gender=Masc|Number=Sing 6 obj _ _
9 supreendente surpreendente ADJ ADJ|M|S|@N< Gender=Masc|Number=Sing 8 amod _ ChangedBy=Issue165|SpaceAfter=No
10 : : PUNCT PU|@PU _ 26 punct _ _
11 recusando recusar VERB <mv>|V|GER|@ICL-ADVL> VerbForm=Ger 26 advcl _ _
12 uma um DET <arti>|ART|F|S|@>N Definite=Ind|Gender=Fem|Number=Sing|PronType=Art 13 det _ _
13 postura postura NOUN <np-idf>|N|F|S|@<ACC Gender=Fem|Number=Sing 11 obj _ _
14 radical radical ADJ ADJ|F|S|@N< Gender=Fem|Number=Sing 13 amod _ ChangedBy=Issue165|SpaceAfter=No
15 , , PUNCT PU|@PU _ 26 punct _ _
16 a o DET <artd>|ART|F|S|@>N Definite=Def|Gender=Fem|Number=Sing|PronType=Art 18 det _ _
17 esmagadora esmagador ADJ ADJ|F|S|@>N Gender=Fem|Number=Sing 18 amod _ _
18 maioria maioria NOUN <np-def>|N|F|S|@SUBJ> Gender=Fem|Number=Sing 26 nsubj _ _
19 ( ( PUNCT PU|@PU _ 21 punct _ ChangedBy=Issue165|SpaceAfter=No
20 77 77 NUM <card>|NUM|M|P|@>N NumType=Card 21 nummod _ ChangedBy=Issue165|ChangedBy=Issue168|SpaceAfter=No
21 % % SYM <np-def>|N|M|P|@N<PRED Gender=Masc|Number=Plur 18 appos _ ChangedBy=Issue165|SpaceAfter=No
22 ) ) PUNCT PU|@PU _ 21 punct _ _
23-24 dos _ _ _ _ _ _ _ _
23 de de ADP <sam->|PRP|@N< _ 25 case _ _
24 os o DET <-sam>|<artd>|ART|M|P|@>N Definite=Def|Gender=Masc|Number=Plur|PronType=Art 25 det _ _
25 eleitores eleitor NOUN <np-def>|N|M|P|@P< Gender=Masc|Number=Plur 18 nmod _ _
26 quer querer VERB <mv>|V|PR|3S|IND|@FS-N<PRED Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 8 acl:relcl _ _
27 o o DET <artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 28 det _ _
28 PT PT PROPN PROP|M|S|@<ACC Gender=Masc|Number=Sing 26 obj _ _
29 participando participar VERB <mv>|V|GER|@ICL-<OC VerbForm=Ger 26 xcomp _ _
30-31 do _ _ _ _ _ _ _ _
30 de de ADP <sam->|PRP|@<PIV _ 32 case _ _
31 o o DET <-sam>|<artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 32 det _ _
32 Governo governo NOUN <prop>|<np-def>|N|M|S|@P< Gender=Masc|Number=Sing 29 obl _ _
33 Fernando Fernando PROPN _ Gender=Masc|Number=Sing 32 nmod _ ChangedBy=Issue119|MWE=Fernando_Henrique_Cardoso|MWEPOS=PROPN
34 Henrique Henrique PROPN _ Number=Sing 33 flat:name _ ChangedBy=Issue119
35 Cardoso Cardoso PROPN _ Number=Sing 33 flat:name _ SpaceAfter=No
36 . . PUNCT PU|@PU _ 6 punct _ _
# text = Tem sentido -- aliás, muitíssimo sentido.
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a &D
# sent_id = CF1-4
# id = 3
1 Tem ter VERB <mv>|V|PR|3S|IND|@FS-STA Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 0 root _ _
2 sentido sentido NOUN <np-idf>|N|M|S|@<ACC Gender=Masc|Number=Sing 1 obj _ _
3 -- -- PUNCT PU|@PU _ 1 punct _ _
4 aliás aliás ADV <kc>|ADV|@<ADVL _ 1 advmod _ ChangedBy=Issue165|SpaceAfter=No
5 , , PUNCT PU|@PU _ 7 punct _ _
6 muitíssimo muitíssimo DET <quant>|<SUP>|DET|M|S|@>N Gender=Masc|Number=Sing|PronType=Ind 7 det _ _
7 sentido sentido NOUN <np-idf>|N|M|S|@N<PRED Gender=Masc|Number=Sing 2 appos _ ChangedBy=Issue165|SpaceAfter=No
8 . . PUNCT PU|@PU _ 1 punct _ _
# text = Muito mais do que nos tempos na ditadura, a solidez do PT está, agora, ameaçada.
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a
# sent_id = CF1-5
# id = 4
1 Muito muito ADV <quant>|ADV|@>A _ 2 advmod _ _
2 mais mais ADV <quant>|<KOMP>|<COMP>|ADV|@ADVL> _ 22 advmod _ _
3-4 do _ _ _ _ _ _ _ _
3 de de ADP <sam->|PRP|@COM _ 8 case _ _
4 o o PRON <dem>|<-sam>|DET|M|S|@P< Gender=Masc|Number=Sing|PronType=Dem 3 fixed _ _
5 que que PRON <rel>|INDP|M|S|@N< Gender=Masc|Number=Sing|PronType=Rel 3 fixed _ _
6-7 nos _ _ _ _ _ _ _ _
6 em em ADP <sam->|<first-cjt>|PRP|@KOMP< _ 8 case _ _
7 os o DET <-sam>|<artd>|ART|M|P|@>N Definite=Def|Gender=Masc|Number=Plur|PronType=Art 8 det _ _
8 tempos tempo NOUN <first-cjt>|<np-def>|N|M|P|@P< Gender=Masc|Number=Plur 2 obl _ _
9-10 na _ _ _ _ _ _ _ _
9 em em ADP <sam->|PRP|@N< _ 11 case _ _
10 a o DET <-sam>|<artd>|ART|F|S|@>N Definite=Def|Gender=Fem|Number=Sing|PronType=Art 11 det _ _
11 ditadura ditadura NOUN <np-def>|N|F|S|@P< Gender=Fem|Number=Sing 8 nmod _ ChangedBy=Issue165|SpaceAfter=No
12 , , PUNCT PU|@PU _ 2 punct _ _
13 a o DET <artd>|ART|F|S|@>N Definite=Def|Gender=Fem|Number=Sing|PronType=Art 14 det _ _
14 solidez solidez NOUN <np-def>|N|F|S|@SUBJ> Gender=Fem|Number=Sing 22 nsubj _ _
15-16 do _ _ _ _ _ _ _ _
15 de de ADP <sam->|PRP|@N< _ 17 case _ _
16 o o DET <-sam>|<artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 17 det _ _
17 PT PT PROPN PROP|M|S|@P< Gender=Masc|Number=Sing 14 nmod _ _
18 está estar AUX <mv>|V|PR|3S|IND|@FS-STA Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 22 cop _ ChangedBy=Issue165|ChangedBy=Issue167|SpaceAfter=No
19 , , PUNCT PU|@PU _ 20 punct _ _
20 agora agora ADV <kc>|ADV|@<ADVL _ 22 advmod _ ChangedBy=Issue165|SpaceAfter=No
21 , , PUNCT PU|@PU _ 20 punct _ _
22 ameaçada ameaçar VERB <mv>|V|PCP|F|S|@ICL-<SC Gender=Fem|Number=Sing|VerbForm=Part 0 root _ ChangedBy=Issue165|SpaceAfter=No
23 . . PUNCT PU|@PU _ 22 punct _ _
# text = Nem Lula nem o partido ainda encontraram um discurso para se diferenciar.
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a
# sent_id = CF1-6
# id = 5
1 Nem nem CCONJ <parkc-1>|KC|@CO _ 2 cc _ _
2 Lula Lula PROPN <first-cjt>|PROP|M|S|@SUBJ> Gender=Masc|Number=Sing 7 nsubj _ _
3 nem nem CCONJ <co-subj>|<parkc-2>|KC|@CO _ 5 cc _ _
4 o o DET <artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 5 det _ _
5 partido partido NOUN <cjt>|<np-def>|N|M|S|@SUBJ> Gender=Masc|Number=Sing 2 conj _ _
6 ainda ainda ADV ADV|@ADVL> _ 7 advmod _ _
7 encontraram encontrar VERB <mv>|V|PS/MQP|3P|IND|@FS-STA Mood=Ind|Number=Plur|Person=3|VerbForm=Fin 0 root _ _
8 um um DET _ Definite=Ind|Gender=Masc|Number=Sing|PronType=Art 9 det _ _
9 discurso discurso NOUN <np-idf>|N|M|S|@<ACC Gender=Masc|Number=Sing 7 obj _ _
10 para para ADP _ _ 12 case _ _
11 se se PRON PERS|M|3S|ACC|@ACC>-PASS Case=Acc|Gender=Masc|Number=Sing|Person=3|PronType=Prs 12 expl _ ChangedBy=Issue135
12 diferenciar diferenciar VERB _ VerbForm=Inf 9 acl _ ChangedBy=Issue165|SpaceAfter=No
13 . . PUNCT PU|@PU _ 7 punct _ _
# text = Eles se dizem oposição, mas ainda não informaram o que vão combater.
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a
# sent_id = CF1-7
# id = 6
1 Eles eles PRON PERS|M|3P|NOM|@SUBJ> Case=Nom|Gender=Masc|Number=Plur|Person=3|PronType=Prs 3 nsubj _ _
2 se se PRON PERS|M|3P|ACC|@ACC>-PASS Case=Acc|Gender=Masc|Number=Plur|Person=3|PronType=Prs 3 expl _ ChangedBy=Issue135
3 dizem dizer VERB <first-cjt>|<mv>|<se-passive>|V|PR|3P|IND|@FS-STA Mood=Ind|Number=Plur|Person=3|Tense=Pres|VerbForm=Fin 0 root _ _
4 oposição oposição NOUN <np-idf>|N|F|S|@<OC Gender=Fem|Number=Sing 3 xcomp _ ChangedBy=Issue165|SpaceAfter=No
5 , , PUNCT PU|@PU _ 9 punct _ _
6 mas mas CCONJ <co-fcl>|KC|@CO _ 9 cc _ _
7 ainda ainda ADV ADV|@>A _ 8 advmod _ _
8 não não ADV _ Polarity=Neg 9 advmod _ _
9 informaram informar VERB <cjt>|<mv>|V|PS/MQP|3P|IND|@FS-STA Mood=Ind|Number=Plur|Person=3|VerbForm=Fin 3 conj _ _
10 o o PRON _ Gender=Masc|Number=Sing|PronType=Dem 11 det _ _
11 que que PRON <interr>|INDP|M|S|@ACC> Gender=Masc|Number=Sing|PronType=Int 13 obj _ _
12 vão ir AUX <aux>|V|PR|3P|IND|@FS-<ACC Mood=Ind|Number=Plur|Person=3|Tense=Pres|VerbForm=Fin 13 aux _ _
13 combater combater VERB <mv>|V|INF|@ICL-AUX< VerbForm=Inf 9 ccomp _ ChangedBy=Issue165|SpaceAfter=No
14 . . PUNCT PU|@PU _ 3 punct _ _
# text = Muitas das prioridades do novo governo coincidem com as prioridades do PT.
# source = CETENFolha n=1 cad=Opinião sec=opi sem=94a
# sent_id = CF1-8
# id = 7
1 Muitas muito PRON <quant>|DET|F|P|@SUBJ> Gender=Fem|Number=Plur|PronType=Ind 9 nsubj _ _
2-3 das _ _ _ _ _ _ _ _
2 de de ADP <sam->|PRP|@N< _ 4 case _ _
3 as o DET <-sam>|<artd>|ART|F|P|@>N Definite=Def|Gender=Fem|Number=Plur|PronType=Art 4 det _ _
4 prioridades prioridade NOUN <np-def>|N|F|P|@P< Gender=Fem|Number=Plur 1 nmod _ _
5-6 do _ _ _ _ _ _ _ _
5 de de ADP <sam->|PRP|@N< _ 8 case _ _
6 o o DET <-sam>|<artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 8 det _ _
7 novo novo ADJ ADJ|M|S|@>N Gender=Masc|Number=Sing 8 amod _ _
8 governo governo NOUN <np-def>|N|M|S|@P< Gender=Masc|Number=Sing 4 nmod _ _
9 coincidem coincidir VERB <mv>|V|PR|3P|IND|@FS-STA Mood=Ind|Number=Plur|Person=3|Tense=Pres|VerbForm=Fin 0 root _ _
10 com com ADP PRP|@<PIV _ 12 case _ _
11 as o DET <artd>|ART|F|P|@>N Definite=Def|Gender=Fem|Number=Plur|PronType=Art 12 det _ _
12 prioridades prioridade NOUN <np-def>|N|F|P|@P< Gender=Fem|Number=Plur 9 obj _ _
13-14 do _ _ _ _ _ _ _ _
13 de de ADP <sam->|PRP|@N< _ 15 case _ _
14 o o DET <-sam>|<artd>|ART|M|S|@>N Definite=Def|Gender=Masc|Number=Sing|PronType=Art 15 det _ _
15 PT PT PROPN PROP|M|S|@P< Gender=Masc|Number=Sing 12 nmod _ ChangedBy=Issue165|SpaceAfter=No
16 . . PUNCT PU|@PU _ 9 punct _ _

View File

@@ -0,0 +1,122 @@
# newdoc id = weblog-blogspot.com_zentelligence_20040423000200_ENG_20040423_000200
# sent_id = weblog-blogspot.com_zentelligence_20040423000200_ENG_20040423_000200-0001
# text = What if Google Morphed Into GoogleOS?
1 What what PRON WP PronType=Int 0 root 0:root _
2 if if SCONJ IN _ 4 mark 4:mark _
3 Google Google PROPN NNP Number=Sing 4 nsubj 4:nsubj _
4 Morphed morph VERB VBD Mood=Ind|Tense=Past|VerbForm=Fin 1 advcl 1:advcl _
5 Into into ADP IN _ 6 case 6:case _
6 GoogleOS GoogleOS PROPN NNP Number=Sing 4 obl 4:obl SpaceAfter=No
7 ? ? PUNCT . _ 4 punct 4:punct _
# sent_id = weblog-blogspot.com_zentelligence_20040423000200_ENG_20040423_000200-0002
# text = What if Google expanded on its search-engine (and now e-mail) wares into a full-fledged operating system?
1 What what PRON WP PronType=Int 0 root 0:root _
2 if if SCONJ IN _ 4 mark 4:mark _
3 Google Google PROPN NNP Number=Sing 4 nsubj 4:nsubj _
4 expanded expand VERB VBD Mood=Ind|Tense=Past|VerbForm=Fin 1 advcl 1:advcl _
5 on on ADP IN _ 15 case 15:case _
6 its its PRON PRP$ Gender=Neut|Number=Sing|Person=3|Poss=Yes|PronType=Prs 15 nmod:poss 15:nmod:poss _
7 search search NOUN NN Number=Sing 9 compound 9:compound SpaceAfter=No
8 - - PUNCT HYPH _ 9 punct 9:punct SpaceAfter=No
9 engine engine NOUN NN Number=Sing 15 compound 15:compound _
10 ( ( PUNCT -LRB- _ 9 punct 9:punct SpaceAfter=No
11 and and CCONJ CC _ 13 cc 13:cc _
12 now now ADV RB _ 13 advmod 13:advmod _
13 e-mail e-mail NOUN NN Number=Sing 9 conj 9:conj SpaceAfter=No
14 ) ) PUNCT -RRB- _ 15 punct 15:punct _
15 wares wares NOUN NNS Number=Plur 4 obl 4:obl _
16 into into ADP IN _ 22 case 22:case _
17 a a DET DT Definite=Ind|PronType=Art 22 det 22:det _
18 full full ADV RB _ 20 advmod 20:advmod SpaceAfter=No
19 - - PUNCT HYPH _ 20 punct 20:punct SpaceAfter=No
20 fledged fledged ADJ JJ Degree=Pos 22 amod 22:amod _
21 operating operating NOUN NN Number=Sing 22 compound 22:compound _
22 system system NOUN NN Number=Sing 4 obl 4:obl SpaceAfter=No
23 ? ? PUNCT . _ 4 punct 4:punct _
# sent_id = weblog-blogspot.com_zentelligence_20040423000200_ENG_20040423_000200-0003
# text = [via Microsoft Watch from Mary Jo Foley ]
1 [ [ PUNCT -LRB- _ 4 punct 4:punct SpaceAfter=No
2 via via ADP IN _ 4 case 4:case _
3 Microsoft Microsoft PROPN NNP Number=Sing 4 compound 4:compound _
4 Watch Watch PROPN NNP Number=Sing 0 root 0:root _
5 from from ADP IN _ 6 case 6:case _
6 Mary Mary PROPN NNP Number=Sing 4 nmod 4:nmod _
7 Jo Jo PROPN NNP Number=Sing 6 flat 6:flat _
8 Foley Foley PROPN NNP Number=Sing 6 flat 6:flat _
9 ] ] PUNCT -RRB- _ 4 punct 4:punct _
# newdoc id = weblog-blogspot.com_marketview_20050511222700_ENG_20050511_222700
# sent_id = weblog-blogspot.com_marketview_20050511222700_ENG_20050511_222700-0001
# text = (And, by the way, is anybody else just a little nostalgic for the days when that was a good thing?)
1 ( ( PUNCT -LRB- _ 14 punct 14:punct SpaceAfter=No
2 And and CCONJ CC _ 14 cc 14:cc SpaceAfter=No
3 , , PUNCT , _ 14 punct 14:punct _
4 by by ADP IN _ 6 case 6:case _
5 the the DET DT Definite=Def|PronType=Art 6 det 6:det _
6 way way NOUN NN Number=Sing 14 obl 14:obl SpaceAfter=No
7 , , PUNCT , _ 14 punct 14:punct _
8 is be AUX VBZ Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 14 cop 14:cop _
9 anybody anybody PRON NN Number=Sing 14 nsubj 14:nsubj _
10 else else ADJ JJ Degree=Pos 9 amod 9:amod _
11 just just ADV RB _ 13 advmod 13:advmod _
12 a a DET DT Definite=Ind|PronType=Art 13 det 13:det _
13 little little ADJ JJ Degree=Pos 14 obl:npmod 14:obl:npmod _
14 nostalgic nostalgic NOUN NN Number=Sing 0 root 0:root _
15 for for ADP IN _ 17 case 17:case _
16 the the DET DT Definite=Def|PronType=Art 17 det 17:det _
17 days day NOUN NNS Number=Plur 14 nmod 14:nmod _
18 when when ADV WRB PronType=Rel 23 advmod 23:advmod _
19 that that PRON DT Number=Sing|PronType=Dem 23 nsubj 23:nsubj _
20 was be AUX VBD Mood=Ind|Number=Sing|Person=3|Tense=Past|VerbForm=Fin 23 cop 23:cop _
21 a a DET DT Definite=Ind|PronType=Art 23 det 23:det _
22 good good ADJ JJ Degree=Pos 23 amod 23:amod _
23 thing thing NOUN NN Number=Sing 17 acl:relcl 17:acl:relcl SpaceAfter=No
24 ? ? PUNCT . _ 14 punct 14:punct SpaceAfter=No
25 ) ) PUNCT -RRB- _ 14 punct 14:punct _
# sent_id = weblog-blogspot.com_marketview_20050511222700_ENG_20050511_222700-0002
# text = This BuzzMachine post argues that Google's rush toward ubiquity might backfire -- which we've all heard before, but it's particularly well-put in this post.
1 This this DET DT Number=Sing|PronType=Dem 3 det 3:det _
2 BuzzMachine BuzzMachine PROPN NNP Number=Sing 3 compound 3:compound _
3 post post NOUN NN Number=Sing 4 nsubj 4:nsubj _
4 argues argue VERB VBZ Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 0 root 0:root _
5 that that SCONJ IN _ 12 mark 12:mark _
6 Google Google PROPN NNP Number=Sing 8 nmod:poss 8:nmod:poss SpaceAfter=No
7 's 's PART POS _ 6 case 6:case _
8 rush rush NOUN NN Number=Sing 12 nsubj 12:nsubj _
9 toward toward ADP IN _ 10 case 10:case _
10 ubiquity ubiquity NOUN NN Number=Sing 8 nmod 8:nmod _
11 might might AUX MD VerbForm=Fin 12 aux 12:aux _
12 backfire backfire VERB VB VerbForm=Inf 4 ccomp 4:ccomp _
13 -- -- PUNCT , _ 12 punct 12:punct _
14 which which PRON WDT PronType=Rel 18 obj 18:obj _
15 we we PRON PRP Case=Nom|Number=Plur|Person=1|PronType=Prs 18 nsubj 18:nsubj SpaceAfter=No
16 've have AUX VBP Mood=Ind|Tense=Pres|VerbForm=Fin 18 aux 18:aux _
17 all all ADV RB _ 18 advmod 18:advmod _
18 heard hear VERB VBN Tense=Past|VerbForm=Part 12 acl:relcl 12:acl:relcl _
19 before before ADV RB _ 18 advmod 18:advmod SpaceAfter=No
20 , , PUNCT , _ 27 punct 27:punct _
21 but but CCONJ CC _ 27 cc 27:cc _
22 it it PRON PRP Case=Nom|Gender=Neut|Number=Sing|Person=3|PronType=Prs 27 nsubj:pass 27:nsubj:pass SpaceAfter=No
23 's be VERB VBZ Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 27 aux:pass 27:aux:pass _
24 particularly particularly ADV RB _ 27 advmod 27:advmod _
25 well well ADV RB Degree=Pos 27 advmod 27:advmod SpaceAfter=No
26 - - PUNCT HYPH _ 27 punct 27:punct SpaceAfter=No
27 put put VERB VBN Tense=Past|VerbForm=Part 4 conj 4:conj _
28 in in ADP IN _ 30 case 30:case _
29 this this DET DT Number=Sing|PronType=Dem 30 det 30:det _
30 post post NOUN NN Number=Sing 27 obl 27:obl SpaceAfter=No
31 . . PUNCT . _ 4 punct 4:punct _
# sent_id = weblog-blogspot.com_marketview_20050511222700_ENG_20050511_222700-0003
# text = Google is a nice search engine.
1 Google Google PROPN NNP Number=Sing 6 nsubj 6:nsubj _
2 is be AUX VBZ Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin 6 cop 6:cop _
3 a a DET DT Definite=Ind|PronType=Art 6 det 6:det _
4 nice nice ADJ JJ Degree=Pos 6 amod 6:amod _
5 search search NOUN NN Number=Sing 6 compound 6:compound _
6 engine engine NOUN NN Number=Sing 0 root 0:root SpaceAfter=No
7 . . PUNCT . _ 6 punct 6:punct _

View File

@@ -0,0 +1,121 @@
# sent_id = s1
# text = ئاسماننى كۆپكۈك، دەريا، كۆل سۇلىرىنى سۈپسۈزۈك تۇرۇشقا، دەل - دەرەخلەرنى بۈك - باراقسان بولۇشقا، ھايۋانلارنى ئەركىن ئازادە ياشاشقا ئىگە قىلىش... بىزنىڭ ئورتاق ئارزۇيىمىز.
1 ئاسماننى _ NOUN N _ 30 csubj _ Translit=asmanni
2 كۆپكۈك _ VERB V _ 1 orphan _ SpaceAfter=No|Translit=köpkük
3 ، _ PUNCT Y _ 2 punct _ Translit=,
4 دەريا _ NOUN N _ 7 nmod:poss _ SpaceAfter=No|Translit=derya
5 ، _ PUNCT Y _ 4 punct _ Translit=,
6 كۆل _ NOUN N _ 4 conj _ Translit=köl
7 سۇلىرىنى _ NOUN N _ 9 obj _ Translit=sulirini
8 سۈپسۈزۈك _ ADJ A _ 9 advmod _ Translit=süpsüzük
9 تۇرۇشقا _ VERB V _ 1 conj _ SpaceAfter=No|Translit=turushqa
10 ، _ PUNCT Y _ 1 punct _ Translit=,
11 دەل _ ADV D _ 13 compound:redup _ Translit=del
12 - _ PUNCT Y _ 11 punct _ Translit=-
13 دەرەخلەرنى _ NOUN N _ 17 obj _ Translit=derexlerni
14 بۈك _ ADJ A _ 16 compound _ Translit=bük
15 - _ PUNCT Y _ 14 punct _ Translit=-
16 باراقسان _ ADJ A _ 17 advmod _ Translit=baraqsan
17 بولۇشقا _ VERB V _ 9 orphan _ SpaceAfter=No|Translit=bolushqa
18 ، _ PUNCT Y _ 17 punct _ Translit=,
19 ھايۋانلارنى _ NOUN N _ 24 obj _ Translit=haywanlarni
20 ئەركىن _ ADJ A _ 21 compound:redup _ Translit=erkin
21 ئازادە _ ADJ A _ 22 advmod _ Translit=azade
22 ياشاشقا _ NOUN N _ 24 advcl _ Translit=yashashqa
23 ئىگە _ NOUN N _ 24 compound _ Translit=ige
24 قىلىش _ VERB V _ 1 conj _ SpaceAfter=No|Translit=qilish
25 . _ PUNCT Y _ 1 punct _ SpaceAfter=No|Translit=.
26 . _ PUNCT Y _ 1 punct _ SpaceAfter=No|Translit=.
27 . _ PUNCT Y _ 1 punct _ Translit=.
28 بىزنىڭ _ PRON P _ 30 nmod:poss _ Translit=bizning
29 ئورتاق _ ADJ A _ 30 amod _ Translit=ortaq
30 ئارزۇيىمىز _ NOUN N _ 0 root _ SpaceAfter=No|Translit=arzuyimiz
31 . _ PUNCT Y _ 30 punct _ Translit=.
# sent_id = s2
# text = بۇ بۆلەكتىكى تېكىستلەرنى ئوقۇش ئارقىلىق، كىشىلەرنىڭ ھايۋانلار ۋە ئۆسۈملۈكلەرگە قانداق مۇئامىلە قىلغانلىقى، ئاقىۋىتىنىڭ قانداق بولغانلىقىنى كۆرۈپ باقايلى،
1 بۇ _ PRON P _ 2 det _ Translit=bu
2 بۆلەكتىكى _ NOUN N _ 3 nmod _ Translit=bölektiki
3 تېكىستلەرنى _ NOUN N _ 4 obj _ Translit=tëkistlerni
4 ئوقۇش _ VERB V _ 18 advcl _ Translit=oqush
5 ئارقىلىق _ ADP R _ 4 case _ SpaceAfter=No|Translit=arqiliq
6 ، _ PUNCT Y _ 5 punct _ Translit=,
7 كىشىلەرنىڭ _ NOUN N _ 13 nsubj _ Translit=kishilerning
8 ھايۋانلار _ NOUN N _ 13 obl _ Translit=haywanlar
9 ۋە _ CCONJ C _ 10 cc _ Translit=we
10 ئۆسۈملۈكلەرگە _ NOUN N _ 8 conj _ Translit=ösümlüklerge
11 قانداق _ PRON P _ 13 advmod _ Translit=qandaq
12 مۇئامىلە _ NOUN N _ 13 compound _ Translit=muamile
13 قىلغانلىقى _ VERB V _ 18 conj _ SpaceAfter=No|Translit=qilghanliqi
14 ، _ PUNCT Y _ 13 punct _ Translit=,
15 ئاقىۋىتىنىڭ _ NOUN N _ 17 nsubj _ Translit=aqiwitining
16 قانداق _ PRON P _ 17 advmod _ Translit=qandaq
17 بولغانلىقىنى _ VERB V _ 18 obj _ Translit=bolghanliqini
18 كۆرۈپ _ VERB V _ 0 root _ Translit=körüp
19 باقايلى _ VERB V _ 18 aux _ SpaceAfter=No|Translit=baqayli
20 ، _ PUNCT Y _ 19 punct _ Translit=,
# sent_id = s3
# text = يەنە ئەتراپىمىزدىكى مۇھىتنى ياخشى كۆزىتىپ، مۇھىتنى قوغداش ئۈچۈن نېمىلەرنى قىلالايدىغانلىقىمىز توغرۇلۇق ئويلىنىپ باقايلى.
1 يەنە _ ADV D _ 13 cc _ Translit=yene
2 ئەتراپىمىزدىكى _ NOUN N _ 3 amod _ Translit=etrapimizdiki
3 مۇھىتنى _ NOUN N _ 5 obj _ Translit=muhitni
4 ياخشى _ ADJ A _ 5 advmod _ Translit=yaxshi
5 كۆزىتىپ _ VERB V _ 13 advcl _ SpaceAfter=No|Translit=közitip
6 ، _ PUNCT Y _ 5 punct _ Translit=,
7 مۇھىتنى _ NOUN N _ 8 obj _ Translit=muhitni
8 قوغداش _ VERB V _ 11 advcl _ Translit=qoghdash
9 ئۈچۈن _ CCONJ C _ 8 case _ Translit=üchün
10 نېمىلەرنى _ PRON P _ 11 obj _ Translit=nëmilerni
11 قىلالايدىغانلىقىمىز _ VERB V _ 13 obj _ Translit=qilalaydighanliqimiz
12 توغرۇلۇق _ ADP R _ 11 case _ Translit=toghruluq
13 ئويلىنىپ _ VERB V _ 0 root _ Translit=oylinip
14 باقايلى _ VERB V _ 13 aux _ SpaceAfter=No|Translit=baqayli
15 . _ PUNCT Y _ 14 punct _ Translit=.
# sent_id = s4
# text = بىر يىلى باھار كۈنلىرىنىڭ بىرىدە، شىۋېتسارىيىنىڭ بىر ۋوگزالىدا ھاۋا تەڭشىگۈچ ئورنىتىلغان چىرايلىق، ئازادە بىر پويىز قوزغىلىش ئالدىدا تۇراتتى.
1 بىر _ NUM M _ 2 nummod _ Translit=bir
2 يىلى _ NOUN N _ 20 nmod:tmod _ Translit=yili
3 باھار _ NOUN N _ 4 nmod:poss _ Translit=bahar
4 كۈنلىرىنىڭ _ NOUN N _ 5 nmod:part _ Translit=künlirining
5 بىرىدە _ NUM M _ 20 nmod:tmod _ SpaceAfter=No|Translit=biride
6 ، _ PUNCT Y _ 5 punct _ Translit=,
7 شىۋېتسارىيىنىڭ _ NOUN N _ 9 nmod:poss _ Translit=shiwëtsariyining
8 بىر _ NUM M _ 9 det _ Translit=bir
9 ۋوگزالىدا _ NOUN N _ 20 obl _ Translit=wogzalida
10 ھاۋا _ NOUN N _ 11 compound _ Translit=hawa
11 تەڭشىگۈچ _ NOUN N _ 12 nsubj _ Translit=tengshigüch
12 ئورنىتىلغان _ NOUN N _ 17 amod _ Translit=ornitilghan
13 چىرايلىق _ ADJ A _ 17 amod _ SpaceAfter=No|Translit=chirayliq
14 ، _ PUNCT Y _ 13 punct _ Translit=,
15 ئازادە _ ADJ A _ 13 conj _ Translit=azade
16 بىر _ NUM M _ 17 det _ Translit=bir
17 پويىز _ NOUN N _ 20 nsubj _ Translit=poyiz
18 قوزغىلىش _ VERB V _ 19 nmod:poss _ Translit=qozghilish
19 ئالدىدا _ NOUN N _ 20 obl _ Translit=aldida
20 تۇراتتى _ VERB V _ 0 root _ SpaceAfter=No|Translit=turatti
21 . _ PUNCT Y _ 20 punct _ Translit=.
# sent_id = s5
# text = ۋوگزال سۇپىسى ئۇزاتقۇچىلار بىلەن تولۇپ كەتكەنىدى.
1 ۋوگزال _ NOUN N _ 2 nmod:poss _ Translit=wogzal
2 سۇپىسى _ NOUN N _ 5 nsubj _ Translit=supisi
3 ئۇزاتقۇچىلار _ NOUN N _ 5 obl _ Translit=uzatquchilar
4 بىلەن _ ADP R _ 3 case _ Translit=bilen
5 تولۇپ _ VERB V _ 0 root _ Translit=tolup
6 كەتكەنىدى _ VERB V _ 5 aux _ SpaceAfter=No|Translit=ketkenidi
7 . _ PUNCT Y _ 6 punct _ Translit=.
# sent_id = s6
# text = ئۇلارنىڭ ئۇزاتماقچى بولغىنى ئۆزگىچە مىھمان - قارلىغاچلار ئىدى.
1 ئۇلارنىڭ _ PRON P _ 2 nsubj _ Translit=ularning
2 ئۇزاتماقچى _ NOUN N _ 5 acl _ Translit=uzatmaqchi
3 بولغىنى _ AUX V _ 2 cop _ Translit=bolghini
4 ئۆزگىچە _ ADJ A _ 5 amod _ Translit=özgiche
5 مىھمان _ NOUN N _ 7 appos _ Translit=mihman
6 - _ PUNCT Y _ 5 punct _ Translit=-
7 قارلىغاچلار _ NOUN N _ 0 root _ Translit=qarlighachlar
8 ئىدى _ AUX V _ 7 cop _ SpaceAfter=No|Translit=idi
9 . _ PUNCT Y _ 8 punct _ Translit=.

View File

@@ -0,0 +1,58 @@
# Terragrunt is a thin wrapper for Terraform that provides extra tools for working with multiple Terraform modules,
# remote state, and locking: https://github.com/gruntwork-io/terragrunt
terragrunt = {
# Configure Terragrunt to automatically store tfstate files in an S3 bucket
remote_state {
backend = "s3"
config {
encrypt = true
bucket = "acme-main-terraform-state"
key = "${path_relative_to_include()}/terraform.tfstate"
region = "us-east-1"
dynamodb_table = "terraform-locks"
}
}
# Configure Terragrunt to use common var files to help you keep often-repeated variables (e.g., account ID) DRY.
# Note that even though Terraform automatically pulls in terraform.tfvars, we include it explicitly at the end of the
# list to make sure its variables override anything in the common var files.
terraform {
extra_arguments "common_vars" {
commands = ["${get_terraform_commands_that_need_vars()}"]
optional_var_files = [
"${get_tfvars_dir()}/${find_in_parent_folders("account.tfvars", "skip-account-if-does-not-exist")}",
"${get_tfvars_dir()}/${find_in_parent_folders("region.tfvars", "skip-region-if-does-not-exist")}",
"${get_tfvars_dir()}/${find_in_parent_folders("env.tfvars", "skip-env-if-does-not-exist")}",
"${get_tfvars_dir()}/terraform.tfvars"
]
}
}
}
key1 = "val1"
key2 = 0
key3 = 1
key4 = true
# Sample comments
key5 = false
key6 = ["hello", "from", "gruntwork.io"]
key7 = {
key1 = "hello"
key2 = "from"
key3 = "gruntwork.io"
}
key8 = [
{
keyA = "hello"
keyB = "there"
},
{
keyA = "hello"
keyB = "there"
}
]

View File

@@ -0,0 +1,38 @@
<h1>Workers</h1>
<table class="workers">
<tr>
<th>Job server</th>
<th>IP</th>
<th>File descriptor</th>
<th>Client ID</th>
<th>Functions</th>
</tr>
{% for server_info in server_infos %}
<tr {% if server_info['failed'] %} class="failure" {% endif %} >
<th>{{ server_info['hostport'][0] }}:{{ server_info['hostport'][1] }}</th>
<th>
{%- if server_info['failed'] -%} Not responding! {%- endif -%}
</th>
<th></th>
<th></th>
<th></th>
</tr>
{% if not server_info['failed'] %}
{% for worker in server_info['workers'] %}
<tr>
<td class="server"></td>
<td class="ip">{{ worker['ip'] }}</td>
<td class="file_descriptor">{{ worker['file_descriptor'] }}</td>
<td class="client_id">{{ worker['client_id'] }}</td>
<td class="functions">
{{ worker['tasks']|join(', ') }}
</td>
</tr>
{% endfor %}
{% endif %}
{% endfor %}
</table>

View File

@@ -0,0 +1,10 @@
buildGlobal.hxml
-lib mcover:2.1.1
-D unittest
-x TestMain
--macro mcover.MCover.coverage(['checkstyle'], ['src'], ['checkstyle.reporter', 'checkstyle.Main'])
--next
-cmd neko run -s src -s test -p resources/static-analysis.txt
-cmd neko run --default-config resources/default-config.json
-cmd neko run -c resources/default-config.json

samples/HXML/vshaxe.hxml (normal file, 31 lines added)
View File

@@ -0,0 +1,31 @@
# This file is generated with vshaxe-build - DO NOT EDIT MANUALLY!
-cp vscode-extern/src
-cp src-api
-cp src
-cp server/src
-cp server/protocol/src
-cp server/formatter/src
-cp server/test
-cp server/formatter/test
-cp syntaxes/src
-D analyzer-optimize
-D js_unflatten
-D hxnodejs-no-version-warning
-D JSTACK_MAIN=vshaxe.Main.main
-D JSTACK_ASYNC_ENTRY
-D JSTACK_FORMAT=vscode
-lib hxnodejs
-lib jstack
-lib haxe-hxparser
-lib compiletime
-lib mockatoo
-lib mconsole
-lib hx3compat
-lib hxargs
-lib json2object
-lib yaml
-lib plist
-debug
-js bin/build.js
--no-inline
-main Build

View File

@@ -0,0 +1,136 @@
{
"accessors": [
{
"bufferView": 0,
"componentType": 5126,
"count": 4,
"type": "VEC3",
"max": [
0.5,
0.5,
0.0
],
"min": [
-0.5,
-0.5,
0.0
],
"name": "Positions Accessor"
},
{
"bufferView": 1,
"componentType": 5126,
"count": 4,
"type": "VEC4",
"name": "Colors Accessor"
},
{
"bufferView": 2,
"componentType": 5126,
"count": 4,
"type": "VEC2",
"name": "UV Accessor 0"
},
{
"bufferView": 3,
"componentType": 5125,
"count": 6,
"type": "SCALAR",
"name": "Indices Accessor"
}
],
"asset": {
"generator": "glTF Asset Generator",
"version": "2.0",
"extras": {
"Attributes": "VertexColor_Vector4_Float - AlphaMode_Mask - AlphaCutoff - DoubleSided - BaseColorFactor - BaseColorTexture"
}
},
"buffers": [
{
"uri": "Material_Alpha_01.bin",
"byteLength": 168
}
],
"bufferViews": [
{
"buffer": 0,
"byteLength": 48,
"name": "Positions"
},
{
"buffer": 0,
"byteOffset": 48,
"byteLength": 64,
"name": "Colors"
},
{
"buffer": 0,
"byteOffset": 112,
"byteLength": 32,
"name": "Texture Coords 0"
},
{
"buffer": 0,
"byteOffset": 144,
"byteLength": 24,
"name": "Indices"
}
],
"images": [
{
"uri": "Texture_baseColor.png"
}
],
"materials": [
{
"pbrMetallicRoughness": {
"baseColorFactor": [
1.0,
1.0,
1.0,
0.6
],
"baseColorTexture": {
"index": 0
}
},
"alphaMode": "MASK",
"alphaCutoff": 0.7,
"doubleSided": true
}
],
"meshes": [
{
"primitives": [
{
"attributes": {
"POSITION": 0,
"COLOR_0": 1,
"TEXCOORD_0": 2
},
"indices": 3,
"material": 0
}
]
}
],
"nodes": [
{
"mesh": 0
}
],
"scene": 0,
"scenes": [
{
"nodes": [
0
]
}
],
"textures": [
{
"source": 0
}
]
}

View File

@@ -0,0 +1,25 @@
{
"alt-require": true,
"attr-lowercase": true,
"attr-no-duplication": true,
"attr-unsafe-chars": true,
"attr-value-double-quotes": true,
"attr-value-not-empty": false,
"doctype-first": true,
"doctype-html5": true,
"head-script-disabled": false,
"href-abs-or-rel": false,
"id-class-ad-disabled": true,
"id-class-value": false,
"id-unique": true,
"inline-script-disabled": true,
"inline-style-disabled": true,
"space-tab-mixed-disabled": "space",
"spec-char-escape": true,
"src-not-empty": true,
"style-disabled": false,
"tag-pair": true,
"tag-self-close": false,
"tagname-lowercase": true,
"title-require": true
}

View File

@@ -0,0 +1,88 @@
{
"requireCurlyBraces": [
"if",
"else",
"for",
"while",
"do",
"try",
"catch"
],
"requireSpaceAfterKeywords": [
"if",
"else",
"for",
"while",
"do",
"switch",
"case",
"return",
"try",
"catch",
"typeof"
],
"requireSpaceBeforeBlockStatements": true,
"requireParenthesesAroundIIFE": true,
"requireSpacesInConditionalExpression": true,
"disallowSpacesInNamedFunctionExpression": {
"beforeOpeningRoundBrace": true
},
"disallowSpacesInFunctionDeclaration": {
"beforeOpeningRoundBrace": true
},
"requireSpaceBetweenArguments": true,
"requireBlocksOnNewline": true,
"disallowEmptyBlocks": true,
"disallowSpacesInsideArrayBrackets": true,
"disallowSpacesInsideParentheses": true,
"disallowDanglingUnderscores": true,
"requireCommaBeforeLineBreak": true,
"disallowSpacesInCallExpression": true,
"disallowSpaceAfterPrefixUnaryOperators": true,
"disallowSpaceBeforePostfixUnaryOperators": true,
"disallowSpaceBeforeBinaryOperators": [
","
],
"requireSpacesInForStatement": true,
"requireSpaceBeforeBinaryOperators": true,
"requireSpaceAfterBinaryOperators": true,
"disallowKeywords": [
"with"
],
"disallowMixedSpacesAndTabs": true,
"disallowTrailingWhitespace": true,
"disallowKeywordsOnNewLine": [
"else"
],
"requireLineFeedAtFileEnd": true,
"requireCapitalizedConstructors": true,
"requireDotNotation": true,
"disallowNewlineBeforeBlockStatements": true,
"disallowMultipleLineStrings": true,
"requireSpaceBeforeObjectValues": true,
"validateQuoteMarks": "'",
"requireSpaceAfterLineComment": true,
"validateIndentation": 2,
"validateLineBreaks": "LF",
"disallowSpacesInFunction": {
"beforeOpeningRoundBrace": true
},
"requireSpacesInFunction": {
"beforeOpeningCurlyBrace": true
},
"disallowMultipleLineBreaks": true,
"disallowYodaConditions": true,
"disallowFunctionDeclarations": true,
"disallowMultipleVarDecl": "exceptUndefined",
"requirePaddingNewlinesBeforeKeywords": [
"do",
"for",
"if",
"switch",
"try",
"void",
"while",
"return"
],
"excludeFiles": ["**/node_modules/**", "**/min/**", "**/*.min.js"]
}

View File

@@ -0,0 +1,19 @@
{
"ecmaVersion": 6,
"libs": [
"browser",
"jquery"
],
"dontLoad": [
"node_modules/**"
],
"plugins": {
"es_modules": {},
"node": {},
"angular": {},
"doc_comment": {
"fullDocs": true,
"strong": true
}
}
}

View File

@@ -0,0 +1,15 @@
{
"ecmaVersion": 6,
"libs": [],
"loadEagerly": [
"src/app/**/*.js"
],
"dontLoad": [
"node_modules"
],
"plugins": {
"requirejs": {
"baseURL": "src"
}
}
}

View File

@@ -0,0 +1,18 @@
{
"type": "record",
"name": "Response",
"namespace": "org.rflow.message.data.http",
"aliases": [],
"fields": [
{"name": "client_ip", "type": ["string", "null"]},
{"name": "client_port", "type": ["int", "null"]},
{"name": "server_ip", "type": ["string", "null"]},
{"name": "server_port", "type": ["int", "null"]},
{"name": "protocol", "type": "string"},
{"name": "status_code", "type": "int"},
{"name": "status_reason_phrase", "type": "string"},
{"name": "headers", "type": {"type": "map", "values": "string"}},
{"name": "content", "type": "bytes"}
]
}

View File

@@ -0,0 +1,19 @@
{
"version": "1.0",
"name": "demo",
"description": "demo",
"launch_path": "/index.html",
"icons": {
"128": "/res/icon.png"
},
"developer": {
"name": "Cocos2d-html5",
"url": "http://cocos2d-x.org/"
},
"default_locale": "en",
"installs_allowed_from": [
"*"
],
"orientation": "portrait-primary",
"fullscreen": "true"
}

View File

@@ -0,0 +1,56 @@
{
"short_name": "CC Splitter",
"name": "Credit Card Splitter",
"start_url": "./index.html",
"display": "standalone",
"theme_color": "#000000",
"background_color": "#ffffff",
"lang": "en-GB",
"icons": [
{
"src": "logo-16.png",
"sizes": "16x16",
"type": "image/png"
},
{
"src": "logo-36.png",
"sizes": "36x36",
"type": "image/png"
},
{
"src": "logo-48.png",
"sizes": "48x48",
"type": "image/png"
},
{
"src": "logo-72.png",
"sizes": "72x72",
"type": "image/png"
},
{
"src": "logo-96.png",
"sizes": "96x96",
"type": "image/png"
},
{
"src": "logo-144.png",
"sizes": "144x144",
"type": "image/png"
},
{
"src": "logo-192.png",
"sizes": "192x192",
"type": "image/png"
},
{
"src": "logo-250.png",
"sizes": "250x250",
"type": "image/png"
},
{
"src": "logo-512.png",
"sizes": "512x512",
"type": "image/png"
}
]
}

samples/JSON/small.tfstate (normal file, 122 lines added)
View File

@@ -0,0 +1,122 @@
{
"version": 1,
"serial": 12,
"modules": [
{
"path": [
"root"
],
"outputs": {
"public_az1_subnet_id": "subnet-d658bba0",
"region": "us-west-2",
"vpc_cidr": "10.201.0.0/16",
"vpc_id": "vpc-65814701"
},
"resources": {
"aws_key_pair.onprem": {
"type": "aws_key_pair",
"primary": {
"id": "onprem",
"attributes": {
"id": "onprem",
"key_name": "onprem",
"public_key": "foo"
},
"meta": {
"schema_version": "1"
}
}
}
}
},
{
"path": [
"root",
"bootstrap"
],
"outputs": {
"consul_bootstrap_dns": "consul.bootstrap"
},
"resources": {
"aws_route53_record.oasis-consul-bootstrap-a": {
"type": "aws_route53_record",
"depends_on": [
"aws_route53_zone.oasis-consul-bootstrap"
],
"primary": {
"id": "Z68734P5178QN_consul.bootstrap_A",
"attributes": {
"failover": "",
"fqdn": "consul.bootstrap",
"health_check_id": "",
"id": "Z68734P5178QN_consul.bootstrap_A",
"name": "consul.bootstrap",
"records.#": "6",
"records.1148461392": "10.201.3.8",
"records.1169574759": "10.201.2.8",
"records.1206973758": "10.201.1.8",
"records.1275070284": "10.201.2.4",
"records.1304587643": "10.201.3.4",
"records.1313257749": "10.201.1.4",
"set_identifier": "",
"ttl": "300",
"type": "A",
"weight": "-1",
"zone_id": "Z68734P5178QN"
}
}
},
"aws_route53_record.oasis-consul-bootstrap-ns": {
"type": "aws_route53_record",
"depends_on": [
"aws_route53_zone.oasis-consul-bootstrap",
"aws_route53_zone.oasis-consul-bootstrap",
"aws_route53_zone.oasis-consul-bootstrap",
"aws_route53_zone.oasis-consul-bootstrap",
"aws_route53_zone.oasis-consul-bootstrap"
],
"primary": {
"id": "Z68734P5178QN_consul.bootstrap_NS",
"attributes": {
"failover": "",
"fqdn": "consul.bootstrap",
"health_check_id": "",
"id": "Z68734P5178QN_consul.bootstrap_NS",
"name": "consul.bootstrap",
"records.#": "4",
"records.1796532126": "ns-512.awsdns-00.net.",
"records.2728059479": "ns-1536.awsdns-00.co.uk.",
"records.4092160370": "ns-1024.awsdns-00.org.",
"records.456007465": "ns-0.awsdns-00.com.",
"set_identifier": "",
"ttl": "30",
"type": "NS",
"weight": "-1",
"zone_id": "Z68734P5178QN"
}
}
},
"aws_route53_zone.oasis-consul-bootstrap": {
"type": "aws_route53_zone",
"primary": {
"id": "Z68734P5178QN",
"attributes": {
"comment": "Used to bootstrap consul dns",
"id": "Z68734P5178QN",
"name": "consul.bootstrap",
"name_servers.#": "4",
"name_servers.0": "ns-0.awsdns-00.com.",
"name_servers.1": "ns-1024.awsdns-00.org.",
"name_servers.2": "ns-1536.awsdns-00.co.uk.",
"name_servers.3": "ns-512.awsdns-00.net.",
"tags.#": "0",
"vpc_id": "vpc-65814701",
"vpc_region": "us-west-2",
"zone_id": "Z68734P5178QN"
}
}
}
}
}
]
}

View File

@@ -0,0 +1,77 @@
{
"version": 3,
"terraform_version": "0.11.2",
"serial": 5,
"lineage": "5ffde9fb-4814-4609-a8a6-f1054f1779c1",
"modules": [
{
"path": [
"root"
],
"outputs": {},
"resources": {
"aws_iam_role.iam_for_lambda": {
"type": "aws_iam_role",
"depends_on": [],
"primary": {
"id": "iam_for_lambda",
"attributes": {
"arn": "arn:aws:iam::387412527620:role/iam_for_lambda",
"assume_role_policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Sid\":\"\",\"Effect\":\"Allow\",\"Principal\":{\"Service\":\"lambda.amazonaws.com\"},\"Action\":\"sts:AssumeRole\"}]}",
"create_date": "2018-01-27T04:05:27Z",
"force_detach_policies": "false",
"id": "iam_for_lambda",
"name": "iam_for_lambda",
"path": "/",
"unique_id": "AROAINXWJF2AIJOZMQXOE"
},
"meta": {},
"tainted": false
},
"deposed": [],
"provider": "provider.aws"
},
"aws_lambda_function.query-fitbit": {
"type": "aws_lambda_function",
"depends_on": [
"aws_iam_role.iam_for_lambda"
],
"primary": {
"id": "query-fitbit",
"attributes": {
"arn": "arn:aws:lambda:us-east-1:387412527620:function:query-fitbit",
"dead_letter_config.#": "0",
"description": "",
"environment.#": "0",
"filename": "../lambda/query-fitbit.zip",
"function_name": "query-fitbit",
"handler": "exports.handler",
"id": "query-fitbit",
"invoke_arn": "arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/arn:aws:lambda:us-east-1:387412527620:function:query-fitbit/invocations",
"kms_key_arn": "",
"last_modified": "2018-01-27T04:11:31.185+0000",
"memory_size": "128",
"publish": "false",
"qualified_arn": "arn:aws:lambda:us-east-1:387412527620:function:query-fitbit:$LATEST",
"reserved_concurrent_executions": "0",
"role": "arn:aws:iam::387412527620:role/iam_for_lambda",
"runtime": "nodejs6.10",
"source_code_hash": "mNFY3lZD4jFsVq/f353zMD9MLSBvoaEbObIB1KBnxq4=",
"tags.%": "0",
"timeout": "3",
"tracing_config.#": "1",
"tracing_config.0.mode": "PassThrough",
"version": "$LATEST",
"vpc_config.#": "0"
},
"meta": {},
"tainted": false
},
"deposed": [],
"provider": "provider.aws"
}
},
"depends_on": []
}
]
}

View File

@@ -0,0 +1,23 @@
{
"indent": 4,
"maxlen": 120,
"browser": false,
"couch": false,
"devel": false,
"node": false,
"rhino": false,
"white": true,
"plusplus":true,
"stupid":true,
"predef": [
"setTimeout",
"module",
"exports",
"define",
"require",
"window",
"buster",
"sinon"
]
}

View File

@@ -0,0 +1,19 @@
/* OUTPUT_FORMAT("elf32-littlearm", "elf32-bigarm", "elf32-littlearm") */
/* OUTPUT_ARCH(arm) */
ENTRY(__adbi$entry)
SECTIONS
{
. = 0x00000000 + SIZEOF_HEADERS;
.adbi : {
*(.rodata)
*(.rodata.*)
*(.data) *(.data.*)
*(.bss) *(.bss.*)
*(.text)
*(.text.*)
*(.adbi)
*(.adbi.*)
} = 0
}

samples/Logos/NCHax.x (normal file, 57 lines added)
View File

@@ -0,0 +1,57 @@
#import <UIKit/UIKit.h>
#import <BulletinBoard/BBSectionInfo.h>
#import <UIKit/UIImage+Private.h>
#import <version.h>
static NSString *const kHBDPWeeAppIdentifier = @"ws.hbang.dailypaperweeapp";
#pragma mark - Change section header and icon
// courtesy of benno
BOOL isDailyPaper = NO;
%hook SBBulletinObserverViewController
- (void)_addSection:(BBSectionInfo *)section toCategory:(NSInteger)category widget:(id)widget {
if ([section.sectionID isEqualToString:kHBDPWeeAppIdentifier]) {
isDailyPaper = YES;
%orig;
isDailyPaper = NO;
} else {
%orig;
}
}
%end
%hook SBBulletinListSection
- (void)setDisplayName:(NSString *)displayName {
%orig(isDailyPaper ? @"Current Wallpaper" : displayName);
}
- (void)setIconImage:(UIImage *)iconImage {
%orig(isDailyPaper ? [UIImage imageNamed:@"icon" inBundle:[NSBundle bundleWithPath:@"/Library/PreferenceBundles/DailyPaper.bundle"]] : iconImage);
}
%end
#pragma mark - Enable by default
%hook SBNotificationCenterDataProviderController
- (NSArray *)_copyDefaultEnabledWidgetIDs {
NSArray *defaultWidgets = %orig;
return [[defaultWidgets arrayByAddingObject:kHBDPWeeAppIdentifier] copy];
}
%end
#pragma mark - Constructor
%ctor {
if (!IS_IOS_OR_NEWER(iOS_8_0)) {
%init;
}
}

samples/Logos/NoCarrier.x (normal file, 42 lines added)
View File

@@ -0,0 +1,42 @@
//
// NoCarrier.x
// NoCarrier
//
// Created by Jonas Gessner on 27.01.2014.
// Copyright (c) 2014 Jonas Gessner. All rights reserved.
//
#import <CoreGraphics/CoreGraphics.h>
#include <substrate.h>
%group main
%hook UIStatusBarServiceItemView
- (id)_serviceContentsImage {
return nil;
}
- (CGFloat)extraLeftPadding {
return 0.0f;
}
- (CGFloat)extraRightPadding {
return 0.0f;
}
- (CGFloat)standardPadding {
return 2.0f;
}
%end
%end
%ctor {
@autoreleasepool {
%init(main);
}
}

samples/Logos/Tweak.x (normal file, 239 lines added)
View File

@@ -0,0 +1,239 @@
/*
* ShadowSocks Per-App Proxy Plugin
* https://github.com/linusyang/MobileShadowSocks
*
* Copyright (c) 2014 Linus Yang <laokongzi@gmail.com>
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
*/
#include <UIKit/UIKit.h>
#include <libfinder/LFFinderController.h>
#define FUNC_NAME SCDynamicStoreCopyProxies
#define ORIG_FUNC original_ ## FUNC_NAME
#define CUST_FUNC custom_ ## FUNC_NAME
#define DECL_FUNC(ret, ...) \
extern ret FUNC_NAME(__VA_ARGS__); \
static ret (*ORIG_FUNC)(__VA_ARGS__); \
ret CUST_FUNC(__VA_ARGS__)
#define HOOK_FUNC() \
MSHookFunction(FUNC_NAME, (void *) CUST_FUNC, (void **) &ORIG_FUNC)
typedef const struct __SCDynamicStore *SCDynamicStoreRef;
void MSHookFunction(void *symbol, void *replace, void **result);
static BOOL proxyEnabled = YES;
static BOOL spdyDisabled = YES;
static BOOL finderEnabled = YES;
static BOOL getValue(NSDictionary *dict, NSString *key, BOOL defaultVal)
{
if (dict == nil || key == nil) {
return defaultVal;
}
NSNumber *valObj = [dict objectForKey:key];
if (valObj == nil) {
return defaultVal;
}
return [valObj boolValue];
}
static void updateSettings(void)
{
proxyEnabled = YES;
spdyDisabled = YES;
finderEnabled = YES;
NSDictionary *dict = [[NSDictionary alloc] initWithContentsOfFile:@"/var/mobile/Library/Preferences/com.linusyang.ssperapp.plist"];
if (dict != nil) {
NSString *bundleName = [[NSBundle mainBundle] bundleIdentifier];
if (getValue(dict, @"SSPerAppEnabled", NO) && bundleName != nil) {
NSString *entry = [[NSString alloc] initWithFormat:@"Enabled-%@", bundleName];
proxyEnabled = getValue(dict, entry, NO);
if (getValue(dict, @"SSPerAppReversed", NO)) {
proxyEnabled = !proxyEnabled;
}
[entry release];
}
spdyDisabled = getValue(dict, @"SSPerAppDisableSPDY", YES);
finderEnabled = getValue(dict, @"SSPerAppFinder", YES);
[dict release];
}
}
DECL_FUNC(CFDictionaryRef, SCDynamicStoreRef store)
{
if (proxyEnabled) {
return ORIG_FUNC(store);
}
CFMutableDictionaryRef proxyDict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
int zero = 0;
CFNumberRef zeroNumber = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &zero);
CFDictionarySetValue(proxyDict, CFSTR("HTTPEnable"), zeroNumber);
CFDictionarySetValue(proxyDict, CFSTR("HTTPProxyType"), zeroNumber);
CFDictionarySetValue(proxyDict, CFSTR("HTTPSEnable"), zeroNumber);
CFDictionarySetValue(proxyDict, CFSTR("ProxyAutoConfigEnable"), zeroNumber);
CFRelease(zeroNumber);
return proxyDict;
}
@interface SettingTableViewController <LFFinderActionDelegate>
- (BOOL)useLibFinder;
- (UIViewController *)allocFinderController;
- (void)finderSelectedFilePath:(NSString *)path checkSanity:(BOOL)check;
@end
%group FinderHook
%hook SettingTableViewController
- (BOOL)useLibFinder
{
return finderEnabled;
}
- (UIViewController *)allocFinderController
{
LFFinderController* finder = [[LFFinderController alloc] initWithMode:LFFinderModeDefault];
finder.actionDelegate = self;
return finder;
}
%new
-(void)finder:(LFFinderController*)finder didSelectItemAtPath:(NSString*)path
{
[self finderSelectedFilePath:path checkSanity:NO];
}
%end
%end
%group TwitterHook
%hook T1SPDYConfigurationChangeListener
- (BOOL)_shouldEnableSPDY
{
if (spdyDisabled) {
return NO;
} else {
return %orig;
}
}
%end
%end
%group FacebookHook
%hook FBRequester
- (BOOL)allowSPDY
{
if (spdyDisabled) {
return NO;
} else {
return %orig;
}
}
- (BOOL)useDNSCache
{
if (spdyDisabled) {
return NO;
} else {
return %orig;
}
}
%end
%hook FBNetworkerRequest
- (BOOL)disableSPDY
{
if (spdyDisabled) {
return YES;
} else {
return %orig;
}
}
%end
%hook FBRequesterState
- (BOOL)didUseSPDY
{
if (spdyDisabled) {
return NO;
} else {
return %orig;
}
}
%end
%hook FBAppConfigService
- (BOOL)disableDNSCache
{
if (spdyDisabled) {
return YES;
} else {
return %orig;
}
}
%end
%hook FBNetworker
- (BOOL)_shouldAllowUseOfDNSCache:(id)arg
{
if (spdyDisabled) {
return NO;
} else {
return %orig;
}
}
%end
%hook FBAppSessionController
- (BOOL)networkerShouldAllowUseOfDNSCache:(id)arg
{
if (spdyDisabled) {
return NO;
} else {
return %orig;
}
}
%end
%end
%ctor
{
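// Choose hooks per host app: the libfinder hooks only inside MobileShadowSocks;
// every other app gets the proxy-dictionary hook, plus the SPDY hooks when the
// host is Twitter or Facebook. SpringBoard itself is skipped entirely.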
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
NSString *bundleName = [[NSBundle mainBundle] bundleIdentifier];
if (bundleName != nil && ![bundleName isEqualToString:@"com.apple.springboard"]) {
updateSettings();
CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), NULL, (CFNotificationCallback) updateSettings, CFSTR("com.linusyang.ssperapp.settingschanged"), NULL, CFNotificationSuspensionBehaviorCoalesce);
if ([bundleName isEqualToString:@"com.linusyang.MobileShadowSocks"]) {
%init(FinderHook);
} else {
HOOK_FUNC();
if ([bundleName isEqualToString:@"com.atebits.Tweetie2"]) {
%init(TwitterHook);
} else if ([bundleName isEqualToString:@"com.facebook.Facebook"]) {
%init(FacebookHook);
}
}
}
[pool drain];
}

5
samples/Logos/string1.x Normal file
View File

@@ -0,0 +1,5 @@
# APPLE LOCAL file string workaround 4943900
if { [istarget "*-*-darwin\[9123\]*"] } {
set additional_flags "-framework Foundation -fconstant-cfstrings"
}
return 0

488
samples/Lua/treegen.p8 Normal file
View File

@@ -0,0 +1,488 @@
pico-8 cartridge // http://www.pico-8.com
version 7
// taken from: https://github.com/lvictorino/pico8/blob/master/treegen.p8
__lua__
-- tree generation
-- basic space colonization algorithm
-- by laurent victorino
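-- overview: influence points attract their nearest tree node; each attracted
-- node then grows a child toward the average direction of its influences, and
-- an influence point is removed once a node grows close enough to it.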
tree={} --tree node list
influence={} --influence list
newnodedist=5 -- distance between tree nodes
influencedist=50 -- attraction max distance
influencekilldist=10 -- distance at which an influence node is killed
crownw=64 -- tree crown width
crownh=64 -- tree crown height
crownx=64 -- tree crown x center position
crowny=64 -- tree crown y center position
generate=false -- has generation started?
function _init()
-- randomize properties
newnodedist=rnd(10)+2
influencedist=rnd(60)+20
influencekilldist=rnd(20)+8
crownx=64+rnd(40)*(rnd(1)-rnd(1))
crowny=64+rnd(40)*(rnd(1)-rnd(1))
crownw=rnd(60)+30
crownh=rnd(60)+20
--
generate = false
initialize_root()
initialize_crown()
end
-- initialize first tree node
function initialize_root()
tree={}
add(tree,newnode(rnd(10)+54,127,nil,7))
end
-- initialize crown size and influence
function initialize_crown()
influence={}
-- create an elliptic crown composed of 300->400 influence nodes
for i=0,rnd(100)+300 do
a = rnd(1)
x = crownx + rnd(crownw) * cos(a)
y = crowny + rnd(crownh) * sin(a)
-- add a new influence to the list
add(influence,newnode(x,y,nil,5))
end
end
function _update()
-- c button: generate a new set
if btn(4) and btnp(4) == true then
_init()
end
-- x button: start generation
if btn(5) and btnp(5) == true then
generate=true
end
-- generation loop
if #influence != 0 and generate==true then
-- reset all tree nodes influence
for c in all(tree) do c.resetinfluence(c) end
-- is there any remaining influence?
flag=false
-- for every influence node
-- check which tree node it influences
for i in all(influence) do
closest=nil
for t in all(tree) do
if distvector(i,t) < influencedist
and (closest==nil or abs(distvector(i,t)) < abs(distvector(i,closest))) then
flag=true
closest=t
end
end
if closest!=nil then
closest.addinfluence(closest,i)
end
end
-- if no influence remains stop the generation
if flag == false then
influence={}
generate=false
return
end
-- for every tree node
-- compute the influence vector
-- and add a new tree node to the list
for t in all(tree) do
if #t.influence != 0 then
medv={}
medv.x=0
medv.y=0
for i in all(t.influence) do
dist=distvector(i,t)
medv.x+=(i.x-t.x)/dist -- closest influence nodes are more powerful
medv.y+=(i.y-t.y)/dist
-- destroy influence if too close
if dist < influencekilldist then
del(influence,i)
end
end
-- compute the influence vector
medv.x /= #t.influence
medv.y /= #t.influence
-- normalize influence vector
newn=normalize(medv)
-- compute new node position
newn.x=t.x+newnodedist*newn.x
newn.y=t.y+newnodedist*newn.y
-- add new node to the list
add(tree,newnode(newn.x,newn.y,t))
end
end
end
end
function _draw()
cls()
-- draw tree lines
for t in all(tree) do
if t.parent != nil then
line(t.x,t.y,t.parent.x,t.parent.y,4)
end
end
-- draw influence
for i in all(influence) do
i.draw(i,8)
end
-- helpers
color(13)
print("c:new set\tx:generate")
print("nodes\t\t\t\t\tcount:"..#tree.."\tdist:"..flr(newnodedist))
print("influence\tcount:"..#influence.."\tdist:"..flr(influencedist))
if #influence==0 then
print("generation is over.",0,123)
end
end
-- generate a new node
-- params: xpos,ypos,parent node to be attached to
function newnode(x,y,parent)
n={}
n.x=x
n.y=y
-- set parent
n.parent=parent
-- list of influence nodes
n.influence={}
-- draw the node as a cross
n.draw=function(node,col)
line(node.x,node.y-1,node.x,node.y+1,col)
line(node.x-1,node.y,node.x+1,node.y,col)
end
-- add an influence node to the list
n.addinfluence=function(node,influence)
add(node.influence,influence)
end
-- reset the influence list
n.resetinfluence=function(node)
node.influence={}
end
return n
end
-- return the distance between
-- two vectors
function distvector(v1,v2)
vx=v1.x-v2.x
vy=v1.y-v2.y
return sqrt(vx*vx+vy*vy)
end
-- return the magnitude of a vector
function magnitude(v)
return sqrt(v.x*v.x+v.y*v.y)
end
-- return a normalized vector
function normalize(v)
vp={}
vp.x=v.x/magnitude(v)
vp.y=v.y/magnitude(v)
return vp
end
__gfx__
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00700700000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00077000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00077000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00700700000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
__gff__
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
__map__
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
__sfx__
000100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
__music__
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344
00 41424344

1
samples/Markdown/symlink.md Symbolic link
View File

@@ -0,0 +1 @@
README.mdown

View File

@@ -0,0 +1,83 @@
#Import "<std>"
Using std..
'Set your own path here. Defaults to build folder.
Global path:= AppDir() + "encodeToPng.png"
Function Main()
'Write from PixMap
Local source := New Pixmap( 100, 100 )
For Local y := 0 Until source.Height
For Local x := 0 Until source.Width
'Generates random pixels
source.SetPixelARGB( x, y, ARGB( 255, Rnd(0, 255), 0, Rnd(0, 255) ) )
Next
Next
source.Save( path )
'Read from png to PixMap
Local dest := Pixmap.Load( path )
Local a := ""
Local r := ""
Local g := ""
Local b := ""
For Local y := 0 Until dest.Height
For Local x := 0 Until dest.Width
Local argb := dest.GetPixelARGB(x,y)
a += ARGBToAlpha( argb ) + " "
r += ARGBToRed( argb ) + " "
g += ARGBToGreen( argb ) + " "
b += ARGBToBlue( argb ) + " "
Next
a += "~n"
r += "~n"
g += "~n"
b += "~n"
Next
'Print resulting pixels
Print( " ~nAlpha:~n" + a )
Print( " ~nRed:~n" + r )
Print( " ~nGreen:~n" + g )
Print( " ~nBlue:~n" + b )
End
'**************** Color Functions ****************
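'The ARGB functions pack the four channels into a single UInt laid out as
'$AARRGGBB; the ARGBTo* helpers shift and mask each channel back out.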
Function ARGB:UInt( a:Float, r:Float, g:Float, b:Float )
Assert ( a<=1.0, "Alpha max value is 1.0" )
Assert ( r<=1.0, "Red max value is 1.0" )
Assert ( g<=1.0, "Green max value is 1.0" )
Assert ( b<=1.0, "Blue max value is 1.0" )
Return UInt(a*255) Shl 24 | UInt(r*255) Shl 16 | UInt(g*255) Shl 8 | UInt(b*255)
End
Function ARGB:UInt( a:Int, r:Int, g:Int, b:Int )
Assert ( a<256, "Alpha can't be higher than 255" )
Assert ( r<256, "Red can't be higher than 255" )
Assert ( g<256, "Green can't be higher than 255" )
Assert ( b<256, "Blue can't be higher than 255" )
Return( a Shl 24 | r Shl 16 | g Shl 8 | b )
End
Function ARGBToAlpha:Int( argb:UInt )
Return argb Shr 24 & $ff
End
Function ARGBToRed:Int( argb:UInt )
Return argb Shr 16 & $ff
End
Function ARGBToGreen:Int( argb:UInt )
Return argb Shr 8 & $ff
End
Function ARGBToBlue:Int( argb:UInt )
Return argb & $ff
End

View File

@@ -0,0 +1,185 @@
Namespace example
#rem
multi
line
comment
#end
#rem
nested
#rem
multi
line
#end
comment
#end
'Importing a pre-compiled module from the modules folder
#Import "<mojo>"
'Setting search paths for namespaces
Using mojo..
Using std..
Const ONECONST:Int = 1
Const TWOCONST := 2
Const THREECONST := 3, FOURCONST:Int = 4
Global someVariable:Int = 4
Function Main()
'creating arrays
Local scores:Int[]= New Int[](10,20,30)
Local text:String[]= New String[]( "Hello","There","World" )
' string type
Local string1:String = "Hello world"
Local string2:= "Hello world"
' escape characters in strings
Local string4 := "~qHello World~q"
Local string5 := "~tIndented~n"
Local string6 := "tilde is wavey... ~~"
Print string4
Print string5
Print string6
' string pseudofunctions
Print " Hello World ~n".Trim() ' prints "Hello World" whithout whitespace
Print "Hello World".ToUpper() ' prints "HELLO WORLD"
' preprocessor keywords
#If __TARGET__ = "android"
'DoStuff()
#ElseIf __TARGET__ = "ios"
'DoOtherStuff()
#End
' operators
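' "~" is bitwise xor; ~=, |= and &= apply xor, or and and in place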
Local a := 32
Local b := 32 ~ 0
b ~= 16
b |= 16
b &= 16
Local c := a | b
Print c
'Creates the app instance and a game window, then starts the main App loop, using the Mojo module
New AppInstance
New GameWindow
App.Run()
End
'------------------------------------------ Class Examples ------------------------------------------
'You can extend the Window class to customize its behavior
Class GameWindow Extends Window
Private
Field _spiral :Float[]
Field _circle :Float[]
Public
Method New()
Super.New( "Test", 800, 600, WindowFlags.Resizable )
End
'Properties can be used to create "read-only" values
Property Spiral:Float[]()
Return _spiral
End
'Or to control what happens to a value when assigned
Property Circle:Float[]()
Return _circle
Setter( values:Float[] )
If( values.Length > 2 ) And ( values.Length Mod 2 = 0 )
_circle = values
Else
Print( "Circle values need to be an even number larger than 1" )
End
End
'Methods require a class instance. The keyword Self is optional when accessing fields and properties
'The method Window.OnRender is virtual, and can be overridden
'Width and Height are Properties inherited from the Window class
Method OnRender( canvas:Canvas ) Override
RequestRender()
canvas.Clear( Color.DarkGrey )
canvas.Translate( Width/2.0, Height/2.0 )
canvas.Rotate( -Millisecs()/200.0 )
canvas.Color = New Color( 1, 0.8, 0.2 )
DrawLines( canvas, Spiral )
DrawLines( canvas, Circle, True )
End
'This method is called whenever the window layout changes, like when resizing
Method OnLayout() Override
_spiral = CreateSpiral( 0, 0, Height/1.5, Height/1.5, 100 )
Circle = CreateCircle( 0, 0, Height/1.5, Height/1.5, 100 )
End
'Functions can be called without a GameWindow instance, like "Static Functions" in other languages.
Function DrawLines( canvas:Canvas, lines:Float[], closedShape:Bool = False )
For Local n := 0 Until lines.Length Step 2
Local l := lines.Length - 3
Local x0 := lines[n]
Local y0 := lines[n+1]
Local x1 := n<l? lines[n+2] Else (closedShape? lines[0] Else x0 ) 'Conditional assignment, uses the "?" symbol to test a condition
Local y1 := n<l? lines[n+3] Else (closedShape? lines[1] Else y0 )
canvas.DrawLine( x0, y0, x1, y1 )
Next
End
Function CreateSpiral:Float[]( x:Double, y:Double, width:Double, height:Double, sides:Int = 32, turns:Float = 3.0 )
Local stack := New Stack<Float>
Local radStep := (Pi*2.0)/Float(sides)
Local xMult := 0.0
Local yMult := 0.0
Local radiusX:Float = width/2.0
Local radiusY:Float = height/2.0
Local stepX:Float = radiusX / sides
Local stepY:Float = radiusY / sides
For Local a := 0.0 To Pi*2 Step radStep
stack.Push( ( ( Sin( a*turns ) * radiusX )* xMult ) + x )
stack.Push( ( ( Cos( a*turns ) * radiusY )* yMult ) + y )
xMult += stepX/radiusX
yMult += stepY/radiusY
Next
Return stack.ToArray()
End
Function CreateCircle:Float[]( x:Double, y:Double, width:Double, height:Double, sides:Int = 32 )
Local stack := New Stack<Float>
Local radStep := (Pi*2.0)/Float(sides)
Local radiusX:Float = width/2.0
Local radiusY:Float = height/2.0
For Local a := 0.0 To Pi*2 Step radStep
stack.Push( ( Sin( a ) * radiusX ) + x )
stack.Push( ( Cos( a ) * radiusY ) + y )
Next
Return stack.ToArray()
End
End
'--------- extending with generics -----------------------------------------------------------------------------
Class MyList Extends List<Double>
End
'--------- interfaces ------------------------------------------------------------------------------------------
Interface Computer
Method Boot ()
Method Process ()
Method Display ()
End
'
Class PC Implements Computer
End

115
samples/Monkey/gui.monkey2 Normal file
View File

@@ -0,0 +1,115 @@
#Import "<mojo>"
#Import "<mojox>"
Using std..
Using mojo..
Using mojox..
Function Main()
New AppInstance
New TestGui
App.Run()
End
Class TestGui Extends Window
Field mainDock:DockingView
Field rgtDock:ScrollView
Field graphView:GraphView
Const smallFont:Font = Font.Load( "font::DejaVuSans.ttf", 10 )
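'The constructor docks a 400-pixel-wide ScrollView on the right of a
'DockingView that fills the window; the ScrollView hosts the GraphView.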
Method New()
Super.New( "Test", 1024, 640, WindowFlags.Resizable )
mainDock = New MainDock()
rgtDock = New RightDock()
mainDock.AddView( rgtDock, "right", "400", True )
ContentView = mainDock
End
End
Class MainDock Extends DockingView
Method New()
Layout="fill"
Local newStyle := Style.Copy()
newStyle.BackgroundColor = Color.DarkGrey
newStyle.BorderColor = Color.Black
newStyle.Font = TestGui.smallFont
Style = newStyle
End
Method OnRender( canvas:Canvas ) Override
Super.OnRender( canvas )
canvas.Color = New Color( Rnd(), Rnd(), Rnd() )
canvas.DrawCircle( Frame.Width/4, Frame.Height/2, Frame.Height/4 )
canvas.Color = Color.Aluminum
canvas.DrawText( "gameview:" + App.FPS + " fps", 5, 5 )
End
End
Class RightDock Extends ScrollView
Private
Field _panSpeed := 10.0
Public
Method New()
Layout="fill"
ScrollBarsVisible = True
Local newStyle := Style.Copy()
newStyle.BackgroundColor = Color.Grey
newStyle.BorderColor = Color.Black
newStyle.Font = TestGui.smallFont
Style = newStyle
Local graph:=New GraphView
ContentView = graph
Scroll = New Vec2i( graph.Frame.Width/2, graph.Frame.Height/2 ) 'Doesn't work!
End
Method OnRender( canvas:Canvas ) Override
Super.OnRender( canvas )
canvas.Color = Color.Aluminum
canvas.DrawText( "size:" + Frame + " ,scroll:" + Scroll , 5, 5 )
End
Method OnMouseEvent( event:MouseEvent ) Override
Select event.Type
Case EventType.MouseWheel
Scroll = New Vec2i( Scroll.X+(event.Wheel.X*_panSpeed), Scroll.Y-(event.Wheel.Y*_panSpeed) )
App.RequestRender()
End
End
End
Class GraphView Extends View
Private
Field _panSpeed := 5.0
Field _size := New Vec2i( 1024, 1024 )
Public
Method New()
MinSize=New Vec2i( _size.X, _size.Y )
End
Method OnRender( canvas:Canvas ) Override
Local r:= 20.0
For Local x := 1 Until 10
For Local y := 1 Until 10
Local v := (x/10.0) -0.05
canvas.Color = New Color( v, v, v )
canvas.DrawCircle( (x*100)+r, (y*100)+r, r )
Next
Next
End
End

View File

@@ -0,0 +1,29 @@
'Showcases use of Lambda functions and Generics.
#Import "<std>"
Using std..
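'Stack.Sort takes a comparison lambda; "<=>" yields -1, 0 or +1, so the stack
'ends up ordered by each item's depth field.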
Function Main()
Local testStack := New Stack< MyObject >
For Local n := 1 To 20
Local newItem := New MyObject
newItem.depth = Rnd( 0, 100 )
testStack.Push( newItem )
Next
testStack.Sort( Lambda:Int( x:MyObject,y:MyObject )
Return x.depth<=>y.depth
End )
For Local n := Eachin testStack
Print( n.depth )
Next
End
Struct MyObject
Field depth := 0
End

67
samples/Nextflow/blast.nf Normal file
View File

@@ -0,0 +1,67 @@
#!/usr/bin/env nextflow
/*
* This is free and unencumbered software released into the public domain.
*
* Anyone is free to copy, modify, publish, use, compile, sell, or
* distribute this software, either in source code form or as a compiled
* binary, for any purpose, commercial or non-commercial, and by any
* means.
*
* In jurisdictions that recognize copyright laws, the author or authors
* of this software dedicate any and all copyright interest in the
* software to the public domain. We make this dedication for the benefit
* of the public at large and to the detriment of our heirs and
* successors. We intend this dedication to be an overt act of
* relinquishment in perpetuity of all present and future rights to this
* software under copyright law.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
* EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
* MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
* IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
* OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
* ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
* OTHER DEALINGS IN THE SOFTWARE.
*
* For more information, please refer to <http://unlicense.org/>
*/
/*
* Author Paolo Di Tommaso <paolo.ditommaso@gmail.com>
*/
params.query = "$HOME/sample.fa"
params.db = "$HOME/tools/blast-db/pdb/pdb"
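// Pipeline flow: blast writes `top_hits`, extract expands those hits into
// `sequences` with blastdbcmd, and align runs t_coffee on them; with this
// classic (pre-DSL2) syntax the like-named file declarations link the processes.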
process blast {
output:
file top_hits
"""
blastp -query ${params.query} -db ${params.db} -outfmt 6 \
| head -n 10 \
| cut -f 2 > top_hits
"""
}
process extract {
input:
file top_hits
output:
file sequences
"""
blastdbcmd -db ${params.db} -entry_batch $top_hits > sequences
"""
}
process align {
input:
file sequences
echo true
"""
t_coffee $sequences 2>&- | tee align_result
"""
}

496
samples/Nextflow/callings.nf Executable file
View File

@@ -0,0 +1,496 @@
#!/usr/bin/env nextflow
/*
* This is free and unencumbered software released into the public domain.
*
* Anyone is free to copy, modify, publish, use, compile, sell, or
* distribute this software, either in source code form or as a compiled
* binary, for any purpose, commercial or non-commercial, and by any
* means.
*
* In jurisdictions that recognize copyright laws, the author or authors
* of this software dedicate any and all copyright interest in the
* software to the public domain. We make this dedication for the benefit
* of the public at large and to the detriment of our heirs and
* successors. We intend this dedication to be an overt act of
* relinquishment in perpetuity of all present and future rights to this
* software under copyright law.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
* EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
* MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
* IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
* OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
* ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
* OTHER DEALINGS IN THE SOFTWARE.
*
* For more information, please refer to <http://unlicense.org/>
*/
/*
* 'CalliNGS-NF' - A Nextflow pipeline for variant calling with NGS data
*
* This pipeline reproduces steps from the GATK best practices procedure for SNP
* calling with RNAseq data:
* https://software.broadinstitute.org/gatk/guide/article?id=3891
*
* Anna Vlasova
* Emilio Palumbo
* Paolo Di Tommaso
* Evan Floden
*/
/*
* Define the default parameters
*/
params.genome = "$baseDir/data/genome.fa"
params.variants = "$baseDir/data/known_variants.vcf.gz"
params.blacklist = "$baseDir/data/blacklist.bed"
params.reads = "$baseDir/data/reads/rep1_{1,2}.fq.gz"
params.results = "results"
params.gatk = '/usr/local/bin/GenomeAnalysisTK.jar'
params.gatk_launch = "java -jar $params.gatk"
log.info "C A L L I N G S - N F v 1.0"
log.info "================================"
log.info "genome : $params.genome"
log.info "reads : $params.reads"
log.info "variants : $params.variants"
log.info "blacklist: $params.blacklist"
log.info "results : $params.results"
log.info "gatk : $params.gatk"
log.info ""
/*
* Parse the input parameters
*/
GATK = params.gatk_launch
genome_file = file(params.genome)
variants_file = file(params.variants)
blacklist_file = file(params.blacklist)
reads_ch = Channel.fromFilePairs(params.reads)
/**********
* PART 1: Data preparation
*
* Process 1A: Create a FASTA genome index (.fai) with samtools for GATK
*/
process '1A_prepare_genome_samtools' {
tag "$genome.baseName"
input:
file genome from genome_file
output:
file "${genome}.fai" into genome_index_ch
script:
"""
samtools faidx ${genome}
"""
}
/*
* Process 1B: Create a FASTA genome sequence dictionary with Picard for GATK
*/
process '1B_prepare_genome_picard' {
tag "$genome.baseName"
input:
file genome from genome_file
output:
file "${genome.baseName}.dict" into genome_dict_ch
script:
"""
PICARD=`which picard.jar`
java -jar \$PICARD CreateSequenceDictionary R= $genome O= ${genome.baseName}.dict
"""
}
/*
* Process 1C: Create STAR genome index file.
*/
process '1C_prepare_star_genome_index' {
tag "$genome.baseName"
input:
file genome from genome_file
output:
file "genome_dir" into genome_dir_ch
script:
"""
mkdir genome_dir
STAR --runMode genomeGenerate \
--genomeDir genome_dir \
--genomeFastaFiles ${genome} \
--runThreadN ${task.cpus}
"""
}
/*
* Process 1D: Create a file containing the filtered and recoded set of variants
*/
process '1D_prepare_vcf_file' {
tag "$variantsFile.baseName"
input:
file variantsFile from variants_file
file blacklisted from blacklist_file
output:
set file("${variantsFile.baseName}.filtered.recode.vcf.gz"), file("${variantsFile.baseName}.filtered.recode.vcf.gz.tbi") into prepared_vcf_ch
script:
"""
vcftools --gzvcf $variantsFile -c \
--exclude-bed ${blacklisted} \
--recode | bgzip -c \
> ${variantsFile.baseName}.filtered.recode.vcf.gz
tabix ${variantsFile.baseName}.filtered.recode.vcf.gz
"""
}
/*
* END OF PART 1
*********/
/**********
* PART 2: STAR RNA-Seq Mapping
*
* Process 2: Align RNA-Seq reads to the genome with STAR
*/
process '2_rnaseq_mapping_star' {
tag "$replicateId"
input:
file genome from genome_file
file genomeDir from genome_dir_ch
set replicateId, file(reads) from reads_ch
output:
set replicateId, file('Aligned.sortedByCoord.out.bam'), file('Aligned.sortedByCoord.out.bam.bai') into aligned_bam_ch
script:
"""
# ngs-nf-dev Align reads to genome
STAR --genomeDir $genomeDir \
--readFilesIn $reads \
--runThreadN ${task.cpus} \
--readFilesCommand zcat \
--outFilterType BySJout \
--alignSJoverhangMin 8 \
--alignSJDBoverhangMin 1 \
--outFilterMismatchNmax 999
# 2nd pass (improve alignments using the table of splice junctions and create a new index)
mkdir genomeDir
STAR --runMode genomeGenerate \
--genomeDir genomeDir \
--genomeFastaFiles $genome \
--sjdbFileChrStartEnd SJ.out.tab \
--sjdbOverhang 75 \
--runThreadN ${task.cpus}
# Final read alignments
STAR --genomeDir genomeDir \
--readFilesIn $reads \
--runThreadN ${task.cpus} \
--readFilesCommand zcat \
--outFilterType BySJout \
--alignSJoverhangMin 8 \
--alignSJDBoverhangMin 1 \
--outFilterMismatchNmax 999 \
--outSAMtype BAM SortedByCoordinate \
--outSAMattrRGline ID:$replicateId LB:library PL:illumina PU:machine SM:GM12878
# Index the BAM file
samtools index Aligned.sortedByCoord.out.bam
"""
}
/*
* END OF PART 2
******/
/**********
* PART 3: GATK Prepare Mapped Reads
*
* Process 3: Split reads that contain Ns in their CIGAR string.
* Creates k+1 new reads (where k is the number of N cigar elements)
* that correspond to the segments of the original read beside/between
* the splicing events represented by the Ns in the original CIGAR.
*/
process '3_rnaseq_gatk_splitNcigar' {
tag "$replicateId"
input:
file genome from genome_file
file index from genome_index_ch
file genome_dict from genome_dict_ch
set replicateId, file(bam), file(index) from aligned_bam_ch
output:
set replicateId, file('split.bam'), file('split.bai') into splitted_bam_ch
script:
"""
# SplitNCigarReads and reassign mapping qualities
$GATK -T SplitNCigarReads \
-R $genome -I $bam \
-o split.bam \
-rf ReassignOneMappingQuality \
-RMQF 255 -RMQT 60 \
-U ALLOW_N_CIGAR_READS \
--fix_misencoded_quality_scores
"""
}
/*
* END OF PART 3
******/
/***********
* PART 4: GATK Base Quality Score Recalibration Workflow
*
* Process 4: Base recalibrate to detect systematic errors in base quality scores,
* select unique alignments and index
*
*/
process '4_rnaseq_gatk_recalibrate' {
tag "$replicateId"
input:
file genome from genome_file
file index from genome_index_ch
file dict from genome_dict_ch
set replicateId, file(bam), file(index) from splitted_bam_ch
set file(variants_file), file(variants_file_index) from prepared_vcf_ch
output:
set sampleId, file("${replicateId}.final.uniq.bam"), file("${replicateId}.final.uniq.bam.bai") into (final_output_ch, bam_for_ASE_ch)
script:
sampleId = replicateId.replaceAll(/[12]$/,'')
"""
# Indel Realignment and Base Recalibration
$GATK -T BaseRecalibrator \
--default_platform illumina \
-cov ReadGroupCovariate \
-cov QualityScoreCovariate \
-cov CycleCovariate \
-knownSites ${variants_file} \
-cov ContextCovariate \
-R ${genome} -I ${bam} \
--downsampling_type NONE \
-nct ${task.cpus} \
-o final.rnaseq.grp
$GATK -T PrintReads \
-R ${genome} -I ${bam} \
-BQSR final.rnaseq.grp \
-nct ${task.cpus} \
-o final.bam
# Select only unique alignments, no multimaps
(samtools view -H final.bam; samtools view final.bam| grep -w 'NH:i:1') \
|samtools view -Sb - > ${replicateId}.final.uniq.bam
# Index BAM files
samtools index ${replicateId}.final.uniq.bam
"""
}
/*
* END OF PART 4
******/
/***********
* PART 5: GATK Variant Calling
*
* Process 5: Call variants with GATK HaplotypeCaller.
* Calls SNPs and indels simultaneously via local de-novo assembly of
* haplotypes in an active region.
* Filter called variants with GATK VariantFiltration.
*/
process '5_rnaseq_call_variants' {
tag "$sampleId"
input:
file genome from genome_file
file index from genome_index_ch
file dict from genome_dict_ch
set sampleId, file(bam), file(bai) from final_output_ch.groupTuple()
output:
set sampleId, file('final.vcf') into vcf_files
script:
"""
# fix absolute path in dict file
sed -i 's@UR:file:.*${genome}@UR:file:${genome}@g' $dict
echo "${bam.join('\n')}" > bam.list
# Variant calling
$GATK -T HaplotypeCaller \
-R $genome -I bam.list \
-dontUseSoftClippedBases \
-stand_call_conf 20.0 \
-o output.gatk.vcf.gz
# Variant filtering
$GATK -T VariantFiltration \
-R $genome -V output.gatk.vcf.gz \
-window 35 -cluster 3 \
-filterName FS -filter "FS > 30.0" \
-filterName QD -filter "QD < 2.0" \
-o final.vcf
"""
}
/*
* END OF PART 5
******/
/***********
* PART 6: Post-process variants file and prepare for Allele-Specific Expression and RNA Editing Analysis
*
* Process 6A: Post-process the VCF result
*/
process '6A_post_process_vcf' {
tag "$sampleId"
publishDir "$params.results/$sampleId"
input:
set sampleId, file('final.vcf') from vcf_files
set file('filtered.recode.vcf.gz'), file('filtered.recode.vcf.gz.tbi') from prepared_vcf_ch
output:
set sampleId, file('final.vcf'), file('commonSNPs.diff.sites_in_files') into vcf_and_snps_ch
script:
'''
grep -v '#' final.vcf | awk '$7~/PASS/' |perl -ne 'chomp($_); ($dp)=$_=~/DP\\=(\\d+)\\;/; if($dp>=8){print $_."\\n"};' > result.DP8.vcf
vcftools --vcf result.DP8.vcf --gzdiff filtered.recode.vcf.gz --diff-site --out commonSNPs
'''
}
/*
* Process 6B: Prepare variants file for allele specific expression (ASE) analysis
*/
process '6B_prepare_vcf_for_ase' {
tag "$sampleId"
publishDir "$params.results/$sampleId"
input:
set sampleId, file('final.vcf'), file('commonSNPs.diff.sites_in_files') from vcf_and_snps_ch
output:
set sampleId, file('known_snps.vcf') into vcf_for_ASE
file('AF.histogram.pdf') into gghist_pdfs
script:
'''
awk 'BEGIN{OFS="\t"} $4~/B/{print $1,$2,$3}' commonSNPs.diff.sites_in_files > test.bed
vcftools --vcf final.vcf --bed test.bed --recode --keep-INFO-all --stdout > known_snps.vcf
grep -v '#' known_snps.vcf | awk -F '\\t' '{print $10}' \
|awk -F ':' '{print $2}'|perl -ne 'chomp($_); \
@v=split(/\\,/,$_); if($v[0]!=0 ||$v[1] !=0)\
{print $v[1]/($v[1]+$v[0])."\\n"; }' |awk '$1!=1' \
>AF.4R
gghist.R -i AF.4R -o AF.histogram.pdf
'''
}
/*
* Group data for allele-specific expression.
*
 * The `bam_for_ASE_ch` channel emits tuples having the following structure, holding the final BAM/BAI files:
*
* ( sample_id, file_bam, file_bai )
*
* The `vcf_for_ASE` channel emits tuples having the following structure, holding the VCF file:
*
* ( sample_id, output.vcf )
*
* The BAMs are grouped together and merged with VCFs having the same sample id. Finally
* it creates a channel named `grouped_vcf_bam_bai_ch` emitting the following tuples:
*
* ( sample_id, file_vcf, List[file_bam], List[file_bai] )
*/
bam_for_ASE_ch
.groupTuple()
.phase(vcf_for_ASE)
.map{ left, right ->
def sampleId = left[0]
def bam = left[1]
def bai = left[2]
def vcf = right[1]
tuple(sampleId, vcf, bam, bai)
}
.set { grouped_vcf_bam_bai_ch }
/*
* Process 6C: Allele-Specific Expression analysis with GATK ASEReadCounter.
* Calculates allele counts at a set of positions after applying
* filters that are tuned for enabling allele-specific expression
* (ASE) analysis
*/
process '6C_ASE_knownSNPs' {
tag "$sampleId"
publishDir "$params.results/$sampleId"
input:
file genome from genome_file
file index from genome_index_ch
file dict from genome_dict_ch
set sampleId, file(vcf), file(bam), file(bai) from grouped_vcf_bam_bai_ch
output:
file "ASE.tsv"
script:
"""
echo "${bam.join('\n')}" > bam.list
$GATK -R ${genome} \
-T ASEReadCounter \
-o ASE.tsv \
-I bam.list \
-sites ${vcf}
"""
}


@@ -0,0 +1,50 @@
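// Example Nextflow configuration covering AWS region, cloud autoscaling, environment variables, mail/SMTP, the process executor and trace fields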
aws {
region = 'eu-west-1'
}
cloud {
autoscale {
enabled = true
minInstances = 3
starvingTimeout = '2 min'
terminateWhenIdle = true
}
imageId = 'ami-78ds78d'
instanceProfile = 'MyRole'
instanceType = 'r4.large'
sharedStorageId = 'fs-76ds76s'
spotPrice = 0.06
subnetId = 'subnet-8d98d7s'
}
env {
BAR = 'world'
FOO = 'hola'
}
mail {
from = 'paolo.ditommaso@gmail.com'
smtp {
auth = true
host = 'email-smtp.us-east-1.amazonaws.com'
password = 'my-secret'
port = 587
starttls {
enable = true
required = true
}
user = 'my-name'
}
}
process {
executor = 'slurm'
queue = 'cn-el7'
memory = '16GB'
cpus = 8
container = 'user/rnaseq-nf:latest'
}
trace {
fields = 'task_id,name,status,attempt,exit,queue'
}

135
samples/Nextflow/rnaseq.nf Normal file

@@ -0,0 +1,135 @@
#!/usr/bin/env nextflow
/*
* This is free and unencumbered software released into the public domain.
*
* Anyone is free to copy, modify, publish, use, compile, sell, or
* distribute this software, either in source code form or as a compiled
* binary, for any purpose, commercial or non-commercial, and by any
* means.
*
* In jurisdictions that recognize copyright laws, the author or authors
* of this software dedicate any and all copyright interest in the
* software to the public domain. We make this dedication for the benefit
* of the public at large and to the detriment of our heirs and
* successors. We intend this dedication to be an overt act of
* relinquishment in perpetuity of all present and future rights to this
* software under copyright law.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
* EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
* MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
* IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
* OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
* ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
* OTHER DEALINGS IN THE SOFTWARE.
*
* For more information, please refer to <http://unlicense.org/>
*/
/*
* Proof of concept of a RNAseq pipeline implemented with Nextflow
*
* Authors:
* - Paolo Di Tommaso <paolo.ditommaso@gmail.com>
* - Emilio Palumbo <emiliopalumbo@gmail.com>
* - Evan Floden <evanfloden@gmail.com>
*/
params.reads = "$baseDir/data/ggal/*_{1,2}.fq"
params.transcriptome = "$baseDir/data/ggal/ggal_1_48850000_49020000.Ggal71.500bpflank.fa"
params.outdir = "."
params.multiqc = "$baseDir/multiqc"
log.info """\
R N A S E Q - N F P I P E L I N E
===================================
transcriptome: ${params.transcriptome}
reads : ${params.reads}
outdir : ${params.outdir}
"""
.stripIndent()
transcriptome_file = file(params.transcriptome)
multiqc_file = file(params.multiqc)
Channel
.fromFilePairs( params.reads )
.ifEmpty { error "Cannot find any reads matching: ${params.reads}" }
.into { read_pairs_ch; read_pairs2_ch }
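// Build a salmon index for the reference transcriptome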
process index {
tag "$transcriptome_file.simpleName"
input:
file transcriptome from transcriptome_file
output:
file 'index' into index_ch
script:
"""
salmon index --threads $task.cpus -t $transcriptome -i index
"""
}
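// Quantify transcript abundance for each read pair with salmon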
process quant {
tag "$pair_id"
input:
file index from index_ch
set pair_id, file(reads) from read_pairs_ch
output:
file(pair_id) into quant_ch
script:
"""
salmon quant --threads $task.cpus --libType=U -i index -1 ${reads[0]} -2 ${reads[1]} -o $pair_id
"""
}
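// Run FastQC quality control on each read pair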
process fastqc {
tag "FASTQC on $sample_id"
input:
set sample_id, file(reads) from read_pairs2_ch
output:
file("fastqc_${sample_id}_logs") into fastqc_ch
script:
"""
mkdir fastqc_${sample_id}_logs
fastqc -o fastqc_${sample_id}_logs -f fastq -q ${reads}
"""
}
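// Collect the salmon and FastQC outputs into a single MultiQC report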
process multiqc {
publishDir params.outdir, mode:'copy'
input:
file('*') from quant_ch.mix(fastqc_ch).collect()
file(config) from multiqc_file
output:
file('multiqc_report.html')
script:
"""
cp $config/* .
echo "custom_logo: \$PWD/logo.png" >> multiqc_config.yaml
multiqc .
"""
}
workflow.onComplete {
println ( workflow.success ? "\nDone! Open the following report in your browser --> $params.outdir/multiqc_report.html\n" : "Oops .. something went wrong" )
}

100
samples/Perl/Any.pm Normal file

@@ -0,0 +1,100 @@
use strict; #-*-cperl-*-
use warnings;
use lib qw( ../../../../lib );
=encoding utf8
=head1 NAME
Algorithm::Evolutionary::Fitness::Any - Façade for any function so that it can be used as fitness
=head1 SYNOPSIS
use Algorithm::Evolutionary::Utils qw( string_decode )
sub squares {
my $chrom = shift;
my @values = string_decode( $chrom, 10, -1, 1 );
return $values[0] * $values[1];
}
my $any_eval = new Algorithm::Evolutionary::Fitness::Any \&squares;
=head1 DESCRIPTION
Turns any subroutine or closure into a fitness function. Useful mainly
if you want results cached; it's not really needed otherwise.
=head1 METHODS
=cut
package Algorithm::Evolutionary::Fitness::Any;
use Carp;
use base 'Algorithm::Evolutionary::Fitness::Base';
our $VERSION = '3.2';
=head2 new( $function )
Assigns default variables
=cut
sub new {
my $class = shift;
my $self = { _function => shift || croak "No function array" };
bless $self, $class;
$self->initialize();
return $self;
}
=head2 apply( $individual )
Applies the instantiated problem to a chromosome. It is actually a
wrapper around C<_apply>.
=cut
sub apply {
my $self = shift;
my $individual = shift || croak "Nobody here!!!";
$self->{'_counter'}++;
return $self->_apply( $individual );
}
=head2 _apply( $individual )
This is the method that does the actual work. It applies the defined
function to each individual; results are cached for efficiency.
=cut
sub _apply {
my $self = shift;
my $individual = shift || croak "Nobody here!";
my $chrom = $individual->Chrom();
my $cache = $self->{'_cache'};
if ( $cache->{$chrom} ) {
return $cache->{$chrom};
}
my $result = $self->{'_function'}->($chrom);
if ( (scalar $chrom ) eq $chrom ) {
$cache->{$chrom} = $result;
}
return $result;
}
=head1 Copyright
This file is released under the GPL. See the LICENSE file included in this distribution,
or go to http://www.fsf.org/licenses/gpl.txt
=cut
"What???";


@@ -0,0 +1,20 @@
use strict;
use warnings;
use ExtUtils::MakeMaker;
WriteMakefile(
NAME => 'Algorithm::Evolutionary::Simple',
AUTHOR => 'JJ Merelo <jj@merelo.net>',
VERSION_FROM => 'lib/Algorithm/Evolutionary/Simple.pm',
ABSTRACT_FROM => 'lib/Algorithm/Evolutionary/Simple.pm',
LICENSE => 'gpl',
EXE_FILES => [ 'script/simple-EA.pl', 'script/maxones.pl'],
PREREQ_PM => {
'Test::More' => 0,
'Carp' => 0,
'Exporter' => 0,
'Sort::Key::Top' => 0
},
dist => { COMPRESS => 'gzip -9f', SUFFIX => 'gz', },
clean => { FILES => 'Algorithm-Evolutionary-Simple-*' },
);


@@ -0,0 +1,9 @@
use Rex -feature => ['1.0'];
user "eleccionesugr";
group eleccionesugr => "elecciones-ugr.cloudapp.net";
desc "Install perlbrew";
task "perlbrew", group => "eleccionesugr", sub {
};


@@ -0,0 +1,21 @@
#!/usr/bin/env pwsh
# source: https://github.com/PowerShell/PowerShellStandard/blob/3436bfc162d6804dd11d1d76c4faff486b4b405d/build.ps1
param (
[Parameter(ParameterSetName="Clean")][switch]$Clean,
[Parameter(ParameterSetName="Test")][switch]$Test
)
import-module $PSScriptRoot/PowerShellStandard.psm1 -force
if ( $Clean ) {
Start-Clean
return
}
Start-Build
if ( $Test ) {
Invoke-Test
}

146
samples/RPC/rpc.x Normal file

@@ -0,0 +1,146 @@
/* rpc.x extracted from RFC5531 */
const RPC_VERS = 2;
enum auth_flavor {
AUTH_NONE = 0,
AUTH_SYS = 1,
AUTH_SHORT = 2,
AUTH_DH = 3,
AUTH_KERB = 4, /* RFC2695 */
AUTH_RSA = 5,
RPCSEC_GSS = 6 /* RFC2203 */
/* and more to be defined */
};
typedef opaque opaque_auth_body<400>;
struct opaque_auth {
int flavor; /* may be "pseudo" value outside enum */
opaque_auth_body body;
};
enum msg_type {
CALL = 0,
REPLY = 1
};
enum reply_stat {
MSG_ACCEPTED = 0,
MSG_DENIED = 1
};
enum accept_stat {
SUCCESS = 0, /* RPC executed successfully */
PROG_UNAVAIL = 1, /* remote hasn't exported program */
PROG_MISMATCH = 2, /* remote can't support version # */
PROC_UNAVAIL = 3, /* program can't support procedure */
GARBAGE_ARGS = 4, /* procedure can't decode params */
SYSTEM_ERR = 5 /* e.g. memory allocation failure */
};
enum reject_stat {
RPC_MISMATCH = 0, /* RPC version number != 2 */
AUTH_ERROR = 1 /* remote can't authenticate caller */
};
enum auth_stat {
AUTH_OK = 0, /* success */
/*
* failed at remote end
*/
AUTH_BADCRED = 1, /* bad credential (seal broken) */
AUTH_REJECTEDCRED = 2, /* client must begin new session */
AUTH_BADVERF = 3, /* bad verifier (seal broken) */
AUTH_REJECTEDVERF = 4, /* verifier expired or replayed */
AUTH_TOOWEAK = 5, /* rejected for security reasons */
/*
* failed locally
*/
AUTH_INVALIDRESP = 6, /* bogus response verifier */
AUTH_FAILED = 7, /* reason unknown */
/*
 * AUTH_KERB errors; deprecated. See [RFC2695]
*/
AUTH_KERB_GENERIC = 8, /* kerberos generic error */
AUTH_TIMEEXPIRE = 9, /* time of credential expired */
AUTH_TKT_FILE = 10, /* problem with ticket file */
AUTH_DECODE = 11, /* can't decode authenticator */
AUTH_NET_ADDR = 12, /* wrong net address in ticket */
/*
* RPCSEC_GSS GSS related errors
*/
RPCSEC_GSS_CREDPROBLEM = 13, /* no credentials for user */
RPCSEC_GSS_CTXPROBLEM = 14 /* problem with context */
};
struct rpc_msg {
unsigned int xid;
union rpc_msg_body body;
};
union rpc_msg_body switch (msg_type mtype) {
case CALL:
call_body cbody;
case REPLY:
reply_body rbody;
};
struct call_body {
unsigned int rpcvers; /* must be equal to two (2) */
unsigned int prog;
unsigned int vers;
unsigned int proc;
opaque_auth cred;
opaque_auth verf;
/* procedure-specific parameters start here */
};
union reply_body switch (reply_stat stat) {
case MSG_ACCEPTED:
accepted_reply areply;
case MSG_DENIED:
rejected_reply rreply;
} /*reply*/;
struct accepted_reply {
opaque_auth verf;
union accepted_reply_data reply_data;
};
union accepted_reply_data switch (accept_stat stat) {
case SUCCESS:
void /* opaque results[0] */;
/*
* procedure-specific results start here
*/
case PROG_MISMATCH:
struct {
unsigned int low;
unsigned int high;
} mismatch_info;
default:
/*
* Void. Cases include PROG_UNAVAIL, PROC_UNAVAIL,
* GARBAGE_ARGS, and SYSTEM_ERR.
*/
void;
};
union rejected_reply switch (reject_stat stat) {
case RPC_MISMATCH:
struct {
unsigned int low;
unsigned int high;
} mismatch_info;
case AUTH_ERROR:
auth_stat auth_stat; /* renamed to avoid conflict with discriminator */
};
struct authsys_parms {
unsigned int stamp;
string machinename<255>;
unsigned int uid;
unsigned int gid;
unsigned int gids<16>;
};

228
samples/RPC/rusers.x Normal file

@@ -0,0 +1,228 @@
/*
* Copyright (c) 2010, Oracle America, Inc.
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are
* met:
*
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above
* copyright notice, this list of conditions and the following
* disclaimer in the documentation and/or other materials
* provided with the distribution.
* * Neither the name of the "Oracle America, Inc." nor the names of its
* contributors may be used to endorse or promote products derived
* from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
* INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
* GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
* WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
* NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
%/*
% * Find out about remote users
% */
const RUSERS_MAXUSERLEN = 32;
const RUSERS_MAXLINELEN = 32;
const RUSERS_MAXHOSTLEN = 257;
struct rusers_utmp {
string ut_user<RUSERS_MAXUSERLEN>; /* aka ut_name */
string ut_line<RUSERS_MAXLINELEN>; /* device */
string ut_host<RUSERS_MAXHOSTLEN>; /* host user logged on from */
int ut_type; /* type of entry */
int ut_time; /* time entry was made */
unsigned int ut_idle; /* minutes idle */
};
typedef rusers_utmp utmp_array<>;
#ifdef RPC_HDR
%
%/*
% * Values for ut_type field above.
% */
#endif
const RUSERS_EMPTY = 0;
const RUSERS_RUN_LVL = 1;
const RUSERS_BOOT_TIME = 2;
const RUSERS_OLD_TIME = 3;
const RUSERS_NEW_TIME = 4;
const RUSERS_INIT_PROCESS = 5;
const RUSERS_LOGIN_PROCESS = 6;
const RUSERS_USER_PROCESS = 7;
const RUSERS_DEAD_PROCESS = 8;
const RUSERS_ACCOUNTING = 9;
program RUSERSPROG {
version RUSERSVERS_3 {
int
RUSERSPROC_NUM(void) = 1;
utmp_array
RUSERSPROC_NAMES(void) = 2;
utmp_array
RUSERSPROC_ALLNAMES(void) = 3;
} = 3;
} = 100002;
#ifdef RPC_HDR
%
%
%#ifdef __cplusplus
%extern "C" {
%#endif
%
%#include <rpc/xdr.h>
%
%/*
% * The following structures are used by version 2 of the rusersd protocol.
% * They were not developed with rpcgen, so they do not appear as RPCL.
% */
%
%#define RUSERSVERS_IDLE 2
%#define RUSERSVERS 3 /* current version */
%#define MAXUSERS 100
%
%/*
% * This is the structure used in version 2 of the rusersd RPC service.
% * It corresponds to the utmp structure for BSD systems.
% */
%struct ru_utmp {
% char ut_line[8]; /* tty name */
% char ut_name[8]; /* user id */
% char ut_host[16]; /* host name, if remote */
% long int ut_time; /* time on */
%};
%
%struct utmparr {
% struct ru_utmp **uta_arr;
% int uta_cnt;
%};
%typedef struct utmparr utmparr;
%
%extern bool_t xdr_utmparr (XDR *xdrs, struct utmparr *objp) __THROW;
%
%struct utmpidle {
% struct ru_utmp ui_utmp;
% unsigned int ui_idle;
%};
%
%struct utmpidlearr {
% struct utmpidle **uia_arr;
% int uia_cnt;
%};
%
%extern bool_t xdr_utmpidlearr (XDR *xdrs, struct utmpidlearr *objp) __THROW;
%
%#ifdef __cplusplus
%}
%#endif
#endif
#ifdef RPC_XDR
%bool_t xdr_utmp (XDR *xdrs, struct ru_utmp *objp);
%
%bool_t
%xdr_utmp (XDR *xdrs, struct ru_utmp *objp)
%{
% /* Since the fields are char foo [xxx], we should not free them. */
% if (xdrs->x_op != XDR_FREE)
% {
% char *ptr;
% unsigned int size;
% ptr = objp->ut_line;
% size = sizeof (objp->ut_line);
% if (!xdr_bytes (xdrs, &ptr, &size, size)) {
% return (FALSE);
% }
% ptr = objp->ut_name;
% size = sizeof (objp->ut_name);
% if (!xdr_bytes (xdrs, &ptr, &size, size)) {
% return (FALSE);
% }
% ptr = objp->ut_host;
% size = sizeof (objp->ut_host);
% if (!xdr_bytes (xdrs, &ptr, &size, size)) {
% return (FALSE);
% }
% }
% if (!xdr_long(xdrs, &objp->ut_time)) {
% return (FALSE);
% }
% return (TRUE);
%}
%
%bool_t xdr_utmpptr(XDR *xdrs, struct ru_utmp **objpp);
%
%bool_t
%xdr_utmpptr (XDR *xdrs, struct ru_utmp **objpp)
%{
% if (!xdr_reference(xdrs, (char **) objpp, sizeof (struct ru_utmp),
% (xdrproc_t) xdr_utmp)) {
% return (FALSE);
% }
% return (TRUE);
%}
%
%bool_t
%xdr_utmparr (XDR *xdrs, struct utmparr *objp)
%{
% if (!xdr_array(xdrs, (char **)&objp->uta_arr, (u_int *)&objp->uta_cnt,
% MAXUSERS, sizeof(struct ru_utmp *),
% (xdrproc_t) xdr_utmpptr)) {
% return (FALSE);
% }
% return (TRUE);
%}
%
%bool_t xdr_utmpidle(XDR *xdrs, struct utmpidle *objp);
%
%bool_t
%xdr_utmpidle (XDR *xdrs, struct utmpidle *objp)
%{
% if (!xdr_utmp(xdrs, &objp->ui_utmp)) {
% return (FALSE);
% }
% if (!xdr_u_int(xdrs, &objp->ui_idle)) {
% return (FALSE);
% }
% return (TRUE);
%}
%
%bool_t xdr_utmpidleptr(XDR *xdrs, struct utmpidle **objp);
%
%bool_t
%xdr_utmpidleptr (XDR *xdrs, struct utmpidle **objpp)
%{
% if (!xdr_reference(xdrs, (char **) objpp, sizeof (struct utmpidle),
% (xdrproc_t) xdr_utmpidle)) {
% return (FALSE);
% }
% return (TRUE);
%}
%
%bool_t
%xdr_utmpidlearr (XDR *xdrs, struct utmpidlearr *objp)
%{
% if (!xdr_array(xdrs, (char **)&objp->uia_arr, (u_int *)&objp->uia_cnt,
% MAXUSERS, sizeof(struct utmpidle *),
% (xdrproc_t) xdr_utmpidleptr)) {
% return (FALSE);
% }
% return (TRUE);
%}
#endif

311
samples/RPC/yp.x Normal file

@@ -0,0 +1,311 @@
/* @(#)yp.x 2.1 88/08/01 4.0 RPCSRC */
/*
* Copyright (c) 2010, Oracle America, Inc.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are
* met:
*
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above
* copyright notice, this list of conditions and the following
* disclaimer in the documentation and/or other materials
* provided with the distribution.
* * Neither the name of the "Oracle America, Inc." nor the names of its
* contributors may be used to endorse or promote products derived
* from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
* INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
* GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
* WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
* NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
/*
* Protocol description file for the Yellow Pages Service
*/
const YPMAXRECORD = 1024;
const YPMAXDOMAIN = 64;
const YPMAXMAP = 64;
const YPMAXPEER = 64;
enum ypstat {
YP_TRUE = 1,
YP_NOMORE = 2,
YP_FALSE = 0,
YP_NOMAP = -1,
YP_NODOM = -2,
YP_NOKEY = -3,
YP_BADOP = -4,
YP_BADDB = -5,
YP_YPERR = -6,
YP_BADARGS = -7,
YP_VERS = -8
};
enum ypxfrstat {
YPXFR_SUCC = 1,
YPXFR_AGE = 2,
YPXFR_NOMAP = -1,
YPXFR_NODOM = -2,
YPXFR_RSRC = -3,
YPXFR_RPC = -4,
YPXFR_MADDR = -5,
YPXFR_YPERR = -6,
YPXFR_BADARGS = -7,
YPXFR_DBM = -8,
YPXFR_FILE = -9,
YPXFR_SKEW = -10,
YPXFR_CLEAR = -11,
YPXFR_FORCE = -12,
YPXFR_XFRERR = -13,
YPXFR_REFUSED = -14
};
typedef string domainname<YPMAXDOMAIN>;
typedef string mapname<YPMAXMAP>;
typedef string peername<YPMAXPEER>;
typedef opaque keydat<YPMAXRECORD>;
typedef opaque valdat<YPMAXRECORD>;
struct ypmap_parms {
domainname domain;
mapname map;
unsigned int ordernum;
peername peer;
};
struct ypreq_key {
domainname domain;
mapname map;
keydat key;
};
struct ypreq_nokey {
domainname domain;
mapname map;
};
struct ypreq_xfr {
ypmap_parms map_parms;
unsigned int transid;
unsigned int prog;
unsigned int port;
};
struct ypresp_val {
ypstat stat;
valdat val;
};
struct ypresp_key_val {
ypstat stat;
#ifdef STUPID_SUN_BUG
/* This is the form as distributed by Sun. But even the Sun NIS
servers expect the values in the other order. So their
implementation somehow must change the order internally. We
don't want to follow this bad example since the user should be
able to use rpcgen on this file. */
keydat key;
valdat val;
#else
valdat val;
keydat key;
#endif
};
struct ypresp_master {
ypstat stat;
peername peer;
};
struct ypresp_order {
ypstat stat;
unsigned int ordernum;
};
union ypresp_all switch (bool more) {
case TRUE:
ypresp_key_val val;
case FALSE:
void;
};
struct ypresp_xfr {
unsigned int transid;
ypxfrstat xfrstat;
};
struct ypmaplist {
mapname map;
ypmaplist *next;
};
struct ypresp_maplist {
ypstat stat;
ypmaplist *maps;
};
enum yppush_status {
YPPUSH_SUCC = 1, /* Success */
YPPUSH_AGE = 2, /* Master's version not newer */
YPPUSH_NOMAP = -1, /* Can't find server for map */
YPPUSH_NODOM = -2, /* Domain not supported */
YPPUSH_RSRC = -3, /* Local resource alloc failure */
YPPUSH_RPC = -4, /* RPC failure talking to server */
YPPUSH_MADDR = -5, /* Can't get master address */
YPPUSH_YPERR = -6, /* YP server/map db error */
YPPUSH_BADARGS = -7, /* Request arguments bad */
YPPUSH_DBM = -8, /* Local dbm operation failed */
YPPUSH_FILE = -9, /* Local file I/O operation failed */
YPPUSH_SKEW = -10, /* Map version skew during transfer */
YPPUSH_CLEAR = -11, /* Can't send "Clear" req to local ypserv */
YPPUSH_FORCE = -12, /* No local order number in map use -f flag. */
YPPUSH_XFRERR = -13, /* ypxfr error */
YPPUSH_REFUSED = -14 /* Transfer request refused by ypserv */
};
struct yppushresp_xfr {
unsigned transid;
yppush_status status;
};
/*
* Response structure and overall result status codes. Success and failure
* represent two separate response message types.
*/
enum ypbind_resptype {
YPBIND_SUCC_VAL = 1,
YPBIND_FAIL_VAL = 2
};
struct ypbind_binding {
opaque ypbind_binding_addr[4]; /* In network order */
opaque ypbind_binding_port[2]; /* In network order */
};
union ypbind_resp switch (ypbind_resptype ypbind_status) {
case YPBIND_FAIL_VAL:
unsigned ypbind_error;
case YPBIND_SUCC_VAL:
ypbind_binding ypbind_bindinfo;
};
/* Detailed failure reason codes for response field ypbind_error*/
const YPBIND_ERR_ERR = 1; /* Internal error */
const YPBIND_ERR_NOSERV = 2; /* No bound server for passed domain */
const YPBIND_ERR_RESC = 3; /* System resource allocation failure */
/*
* Request data structure for ypbind "Set domain" procedure.
*/
struct ypbind_setdom {
domainname ypsetdom_domain;
ypbind_binding ypsetdom_binding;
unsigned ypsetdom_vers;
};
/*
* YP access protocol
*/
program YPPROG {
version YPVERS {
void
YPPROC_NULL(void) = 0;
bool
YPPROC_DOMAIN(domainname) = 1;
bool
YPPROC_DOMAIN_NONACK(domainname) = 2;
ypresp_val
YPPROC_MATCH(ypreq_key) = 3;
ypresp_key_val
YPPROC_FIRST(ypreq_key) = 4;
ypresp_key_val
YPPROC_NEXT(ypreq_key) = 5;
ypresp_xfr
YPPROC_XFR(ypreq_xfr) = 6;
void
YPPROC_CLEAR(void) = 7;
ypresp_all
YPPROC_ALL(ypreq_nokey) = 8;
ypresp_master
YPPROC_MASTER(ypreq_nokey) = 9;
ypresp_order
YPPROC_ORDER(ypreq_nokey) = 10;
ypresp_maplist
YPPROC_MAPLIST(domainname) = 11;
} = 2;
} = 100004;
/*
* YPPUSHPROC_XFRRESP is the callback routine for result of YPPROC_XFR
*/
program YPPUSH_XFRRESPPROG {
version YPPUSH_XFRRESPVERS {
void
YPPUSHPROC_NULL(void) = 0;
#ifdef STUPID_SUN_BUG
/* This is the form as distributed by Sun. But even
the Sun NIS servers expect the values in the other
order. So their implementation somehow must change
the order internally. We don't want to follow this
bad example since the user should be able to use
rpcgen on this file. */
yppushresp_xfr
YPPUSHPROC_XFRRESP(void) = 1;
#else
void
YPPUSHPROC_XFRRESP(yppushresp_xfr) = 1;
#endif
} = 1;
} = 0x40000000; /* transient: could be anything up to 0x5fffffff */
/*
* YP binding protocol
*/
program YPBINDPROG {
version YPBINDVERS {
void
YPBINDPROC_NULL(void) = 0;
ypbind_resp
YPBINDPROC_DOMAIN(domainname) = 1;
void
YPBINDPROC_SETDOM(ypbind_setdom) = 2;
} = 2;
} = 100007;

205
samples/Scala/car-ride.kojo Normal file

@@ -0,0 +1,205 @@
// Kojo examples (files ending in .kojo) are licensed under The MIT License:
// Copyright (C) 2009-2018 Lalit Pant <pant.lalit@gmail.com> and the Kojo Dev Team.
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
// Use the four arrow keys to avoid the blue cars
// You gain energy every second, and lose energy for every collision
// You lose if your energy drops below zero, or you hit the edges of the screen
// You win if you stay alive for a minute
switchToDefault2Perspective()
val carHeight = 100
val markerHeight = 80
// The collision polygon for the (very similarly sized) car images car1.png and car2.png
val carE = trans(2, 14) -> Picture {
repeat(2) {
forward(70); right(45); forward(20); right(45)
forward(18); right(45); forward(20); right(45)
}
}
def car(img: String) = PicShape.image(img, carE)
val cars = collection.mutable.Map.empty[Picture, Vector2D]
val carSpeed = 3
val pResponse = 3
var pVel = Vector2D(0, 0)
var disabledTime = 0L
val bplayer = newMp3Player
val cplayer = newMp3Player
def createCar() {
val c = trans(cb.x + random(cb.width.toInt), cb.y + cb.height) -> car("/media/car-ride/car2.png")
draw(c)
cars += c -> Vector2D(0, -carSpeed)
}
val markers = collection.mutable.Set.empty[Picture]
def createMarker() {
val mwidth = 20
val m = fillColor(white) * penColor(white) *
trans(cb.x + cb.width / 2 - mwidth / 2, cb.y + cb.height) -> PicShape.rect(markerHeight, mwidth)
draw(m)
markers += m
}
cleari()
drawStage(darkGray)
val cb = canvasBounds
val player = car("/media/car-ride/car1.png")
draw(player)
drawAndHide(carE)
timer(1200) {
createMarker()
createCar()
}
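// Main game loop: steer the player with the arrow keys, scroll markers and oncoming cars, and handle collisions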
animate {
player.moveToFront()
val enabled = epochTimeMillis - disabledTime > 300
if (enabled) {
if (isKeyPressed(Kc.VK_LEFT)) {
pVel = Vector2D(-pResponse, 0)
player.transv(pVel)
}
if (isKeyPressed(Kc.VK_RIGHT)) {
pVel = Vector2D(pResponse, 0)
player.transv(pVel)
}
if (isKeyPressed(Kc.VK_UP)) {
pVel = Vector2D(0, pResponse)
player.transv(pVel)
if (!isMp3Playing) {
playMp3Sound("/media/car-ride/car-accel.mp3")
}
}
else {
stopMp3()
}
if (isKeyPressed(Kc.VK_DOWN)) {
pVel = Vector2D(0, -pResponse)
player.transv(pVel)
if (!bplayer.isMp3Playing) {
bplayer.playMp3Sound("/media/car-ride/car-brake.mp3")
}
}
else {
bplayer.stopMp3()
}
}
else {
player.transv(pVel)
}
if (player.collidesWith(stageLeft) || player.collidesWith(stageRight)) {
cplayer.playMp3Sound("/media/car-ride/car-crash.mp3")
player.setOpacity(0.5)
drawMessage("You Crashed!", red)
stopAnimation()
}
else if (player.collidesWith(stageTop)) {
pVel = Vector2D(0, -pResponse)
player.transv(pVel * 2)
disabledTime = epochTimeMillis
}
else if (player.collidesWith(stageBot)) {
pVel = Vector2D(0, pResponse)
player.transv(pVel * 2)
disabledTime = epochTimeMillis
}
cars.foreach { cv =>
val (c, vel) = cv
c.moveToFront()
if (player.collidesWith(c)) {
cplayer.playMp3Sound("/media/car-ride/car-crash.mp3")
pVel = bouncePicVectorOffPic(player, pVel - vel, c) / 2
player.transv(pVel * 3)
c.transv(-pVel * 3)
disabledTime = epochTimeMillis
updateEnergyCrash()
}
else {
val newVel = Vector2D(vel.x + randomDouble(1) / 2 - 0.25, vel.y)
cars += c -> newVel
c.transv(newVel)
}
if (c.position.y + carHeight < cb.y) {
c.erase()
cars -= c
}
}
markers.foreach { m =>
m.translate(0, -carSpeed * 2)
if (m.position.y + markerHeight < cb.y) {
m.erase()
markers -= m
}
}
}
var energyLevel = 0
def energyText = s"Energy: $energyLevel"
val energyLabel = trans(cb.x + 10, cb.y + cb.height - 10) -> PicShape.textu(energyText, 20, blue)
def updateEnergyTick() {
energyLevel += 2
energyLabel.update(energyText)
}
def updateEnergyCrash() {
energyLevel -= 10
energyLabel.update(energyText)
if (energyLevel < 0) {
drawMessage("You're out of energy! You Lose", red)
stopAnimation()
}
}
def drawMessage(m: String, c: Color) {
val te = textExtent(m, 30)
val pic = penColor(c) * trans(cb.x + (cb.width - te.width) / 2, 0) -> PicShape.text(m, 30)
draw(pic)
}
def manageGameScore() {
var gameTime = 0
val timeLabel = trans(cb.x + 10, cb.y + 50) -> PicShape.textu(gameTime, 20, blue)
draw(timeLabel)
draw(energyLabel)
timeLabel.forwardInputTo(stageArea)
timer(1000) {
gameTime += 1
timeLabel.update(gameTime)
updateEnergyTick()
if (gameTime == 60) {
drawMessage("Time up! You Win", green)
stopAnimation()
}
}
}
manageGameScore()
playMp3Loop("/media/car-ride/car-move.mp3")
activateCanvas()
// Car images, via google images, from http://motor-kid.com/race-cars-top-view.html
// and www.carinfopic.com
// Car sounds from http://soundbible.com


@@ -0,0 +1,53 @@
// Kojo examples (files ending in .kojo) are licensed under The MIT License:
// Copyright (C) 2009-2018 Lalit Pant <pant.lalit@gmail.com> and the Kojo Dev Team.
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
// Example from http://lalitpant.blogspot.in/2012/05/recursive-drawing-with-kojo.html
// This example is based on Kojo Pictures
val size = 100
def S = Picture {
repeat (4) {
forward(size)
right()
}
}
def stem = scale(0.13, 1) * penColor(noColor) * fillColor(black) -> S
clear()
setBackground(Color(255, 170, 29))
invisible()
def drawing(n: Int): Picture = {
if (n == 1)
stem
else
GPics(stem,
trans(2, size-5) * brit(0.05) -> GPics(
rot(25) * scale(0.72) -> drawing(n-1),
rot(25) * trans(0, size * 0.72) * rot(-75) * scale(0.55) -> drawing(n-1)
)
)
}
val pic = trans(0, -100) -> drawing(10)
draw(pic)


@@ -0,0 +1,84 @@
// Kojo examples (files ending in .kojo) are licensed under The MIT License:
// Copyright (C) 2009-2018 Lalit Pant <pant.lalit@gmail.com> and the Kojo Dev Team.
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
// Run this program to get basic control over the turtle via some buttons
// Useful for smaller kids
// Change these values to tweak the behavior of the buttons
val fdStep = 50
val fdStep2 = 10
val rtStep = 90
val rtStep2 = 10
val bgColor = white
val sBgColor = "white"
// End tweak region
clear()
clearOutput()
beamsOn()
val width = canvasBounds.width
val height = canvasBounds.height
setBackground(bgColor)
setPenColor(purple)
def action(code: String) {
interpret(code); println(code)
}
val cmd = Map(
"forward1" -> s"forward($fdStep)",
"forward2" -> s"forward($fdStep2)",
"hop1" -> s"hop($fdStep)",
"hop2" -> s"hop($fdStep2)",
"right1" -> s"right($rtStep)",
"right2" -> s"right($rtStep2)",
"left1" -> s"left($rtStep)",
"left2" -> s"left($rtStep2)"
)
def eraseCmds(n: Int) =
s"saveStyle(); setPenColor($sBgColor); setPenThickness(4); back($n); restoreStyle()"
def button(forcmd: String) = PicShape.button(cmd(forcmd)) { action(cmd(forcmd)) }
val panel = trans(-width / 2, -height / 2) * scale(1.4) -> VPics(
HPics(
button("left2"),
button("forward2"),
button("right2"),
button("hop2"),
PicShape.button(s"erase($fdStep2)") { action(eraseCmds(fdStep2)) }
),
HPics(
button("left1"),
button("forward1"),
button("right1"),
button("hop1"),
PicShape.button(s"erase($fdStep)") { action(eraseCmds(fdStep)) }
)
)
draw(panel)
println("// Paste the generated program below into the script editor")
println("// and run it -- to reproduce your drawing")
println("clear()")

417
samples/TOML/filenames/Cargo.lock generated Normal file

@@ -0,0 +1,417 @@
[[package]]
name = "aho-corasick"
version = "0.6.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "ansi_term"
version = "0.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "atty"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
"termion 1.5.1 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "bitflags"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "bytecount"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"simd 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "cfg-if"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "clap"
version = "2.31.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"ansi_term 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)",
"atty 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
"bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"strsim 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
"textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "crossbeam"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "encoding_rs"
version = "0.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
"simd 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "fnv"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "fuchsia-zircon"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"fuchsia-zircon-sys 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "fuchsia-zircon-sys"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "glob"
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "globset"
version = "0.3.0"
dependencies = [
"aho-corasick 0.6.4 (registry+https://github.com/rust-lang/crates.io-index)",
"fnv 1.0.6 (registry+https://github.com/rust-lang/crates.io-index)",
"glob 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "grep"
version = "0.1.8"
dependencies = [
"log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"regex-syntax 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "ignore"
version = "0.4.1"
dependencies = [
"crossbeam 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
"globset 0.3.0",
"lazy_static 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"same-file 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
"tempdir 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"thread_local 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
"walkdir 2.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "lazy_static"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "libc"
version = "0.2.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "log"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "memchr"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "memmap"
version = "0.6.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "num_cpus"
version = "1.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rand"
version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "redox_syscall"
version = "0.1.37"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "redox_termios"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"redox_syscall 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "regex"
version = "0.2.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"aho-corasick 0.6.4 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"regex-syntax 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)",
"thread_local 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
"utf8-ranges 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "regex-syntax"
version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"ucd-util 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "remove_dir_all"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "ripgrep"
version = "0.8.1"
dependencies = [
"atty 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
"bytecount 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)",
"clap 2.31.2 (registry+https://github.com/rust-lang/crates.io-index)",
"encoding_rs 0.7.2 (registry+https://github.com/rust-lang/crates.io-index)",
"globset 0.3.0",
"grep 0.1.8",
"ignore 0.4.1",
"lazy_static 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"memmap 0.6.2 (registry+https://github.com/rust-lang/crates.io-index)",
"num_cpus 1.8.0 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"same-file 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
"termcolor 0.3.6",
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "same-file"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "simd"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "strsim"
version = "0.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "tempdir"
version = "0.3.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"rand 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
"remove_dir_all 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "termcolor"
version = "0.3.6"
dependencies = [
"wincolor 0.1.6",
]
[[package]]
name = "termion"
version = "1.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)",
"redox_syscall 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)",
"redox_termios 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "textwrap"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "thread_local"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"lazy_static 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
"unreachable 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "ucd-util"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "unicode-width"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "unreachable"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "utf8-ranges"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "void"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "walkdir"
version = "2.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"same-file 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "winapi"
version = "0.3.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi-i686-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi-x86_64-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "winapi-i686-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi-x86_64-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "wincolor"
version = "0.1.6"
dependencies = [
"winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
[metadata]
"checksum aho-corasick 0.6.4 (registry+https://github.com/rust-lang/crates.io-index)" = "d6531d44de723825aa81398a6415283229725a00fa30713812ab9323faa82fc4"
"checksum ansi_term 0.11.0 (registry+https://github.com/rust-lang/crates.io-index)" = "ee49baf6cb617b853aa8d93bf420db2383fab46d314482ca2803b40d5fde979b"
"checksum atty 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "af80143d6f7608d746df1520709e5d141c96f240b0e62b0aa41bdfb53374d9d4"
"checksum bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "b3c30d3802dfb7281680d6285f2ccdaa8c2d8fee41f93805dba5c4cf50dc23cf"
"checksum bytecount 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "882585cd7ec84e902472df34a5e01891202db3bf62614e1f0afe459c1afcf744"
"checksum cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "d4c819a1287eb618df47cc647173c5c4c66ba19d888a6e50d605672aed3140de"
"checksum clap 2.31.2 (registry+https://github.com/rust-lang/crates.io-index)" = "f0f16b89cbb9ee36d87483dc939fe9f1e13c05898d56d7b230a0d4dff033a536"
"checksum crossbeam 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)" = "24ce9782d4d5c53674646a6a4c1863a21a8fc0cb649b3c94dfc16e45071dea19"
"checksum encoding_rs 0.7.2 (registry+https://github.com/rust-lang/crates.io-index)" = "98fd0f24d1fb71a4a6b9330c8ca04cbd4e7cc5d846b54ca74ff376bc7c9f798d"
"checksum fnv 1.0.6 (registry+https://github.com/rust-lang/crates.io-index)" = "2fad85553e09a6f881f739c29f0b00b0f01357c743266d478b68951ce23285f3"
"checksum fuchsia-zircon 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "2e9763c69ebaae630ba35f74888db465e49e259ba1bc0eda7d06f4a067615d82"
"checksum fuchsia-zircon-sys 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "3dcaa9ae7725d12cdb85b3ad99a434db70b468c09ded17e012d86b5c1010f7a7"
"checksum glob 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)" = "8be18de09a56b60ed0edf84bc9df007e30040691af7acd1c41874faac5895bfb"
"checksum lazy_static 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c8f31047daa365f19be14b47c29df4f7c3b581832407daabe6ae77397619237d"
"checksum libc 0.2.40 (registry+https://github.com/rust-lang/crates.io-index)" = "6fd41f331ac7c5b8ac259b8bf82c75c0fb2e469bbf37d2becbba9a6a2221965b"
"checksum log 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)" = "89f010e843f2b1a31dbd316b3b8d443758bc634bed37aabade59c686d644e0a2"
"checksum memchr 2.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "796fba70e76612589ed2ce7f45282f5af869e0fdd7cc6199fa1aa1f1d591ba9d"
"checksum memmap 0.6.2 (registry+https://github.com/rust-lang/crates.io-index)" = "e2ffa2c986de11a9df78620c01eeaaf27d94d3ff02bf81bfcca953102dd0c6ff"
"checksum num_cpus 1.8.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c51a3322e4bca9d212ad9a158a02abc6934d005490c054a2778df73a70aa0a30"
"checksum rand 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)" = "eba5f8cb59cc50ed56be8880a5c7b496bfd9bd26394e176bc67884094145c2c5"
"checksum redox_syscall 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)" = "0d92eecebad22b767915e4d529f89f28ee96dbbf5a4810d2b844373f136417fd"
"checksum redox_termios 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "7e891cfe48e9100a70a3b6eb652fef28920c117d366339687bd5576160db0f76"
"checksum regex 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)" = "aec3f58d903a7d2a9dc2bf0e41a746f4530e0cab6b615494e058f67a3ef947fb"
"checksum regex-syntax 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)" = "b2550876c31dc914696a6c2e01cbce8afba79a93c8ae979d2fe051c0230b3756"
"checksum remove_dir_all 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)" = "dfc5b3ce5d5ea144bb04ebd093a9e14e9765bcfec866aecda9b6dec43b3d1e24"
"checksum same-file 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "cfb6eded0b06a0b512c8ddbcf04089138c9b4362c2f696f3c3d76039d68f3637"
"checksum simd 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)" = "3dd0805c7363ab51a829a1511ad24b6ed0349feaa756c4bc2f977f9f496e6673"
"checksum strsim 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)" = "bb4f380125926a99e52bc279241539c018323fab05ad6368b56f93d9369ff550"
"checksum tempdir 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)" = "15f2b5fb00ccdf689e0149d1b1b3c03fead81c2b37735d812fa8bddbbf41b6d8"
"checksum termion 1.5.1 (registry+https://github.com/rust-lang/crates.io-index)" = "689a3bdfaab439fd92bc87df5c4c78417d3cbe537487274e9b0b2dce76e92096"
"checksum textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c0b59b6b4b44d867f1370ef1bd91bfb262bf07bf0ae65c202ea2fbc16153b693"
"checksum thread_local 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "279ef31c19ededf577bfd12dfae728040a21f635b06a24cd670ff510edd38963"
"checksum ucd-util 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "fd2be2d6639d0f8fe6cdda291ad456e23629558d466e2789d2c3e9892bda285d"
"checksum unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)" = "bf3a113775714a22dcb774d8ea3655c53a32debae63a063acc00a91cc586245f"
"checksum unreachable 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "382810877fe448991dfc7f0dd6e3ae5d58088fd0ea5e35189655f84e6814fa56"
"checksum utf8-ranges 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "662fab6525a98beff2921d7f61a39e7d59e0b425ebc7d0d9e66d316e55124122"
"checksum void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6a02e4885ed3bc0f2de90ea6dd45ebcbb66dacffe03547fadbb0eeae2770887d"
"checksum walkdir 2.1.4 (registry+https://github.com/rust-lang/crates.io-index)" = "63636bd0eb3d00ccb8b9036381b526efac53caf112b7783b730ab3f8e44da369"
"checksum winapi 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)" = "04e3bd221fcbe8a271359c04f21a76db7d0c6028862d1bb5512d85e1e2eb5bb3"
"checksum winapi-i686-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6"
"checksum winapi-x86_64-pc-windows-gnu 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f"

samples/sed/hanoi.sed (new file, 103 lines)

@@ -0,0 +1,103 @@
# Towers of Hanoi in sed.
#
# @(#)hanoi.sed 8.1 (Berkeley) 6/6/93
# $FreeBSD$
#
#
# Ex:
# Run "sed -f hanoi.sed", and enter:
#
# :abcd: : :<CR>
#
# note -- TWO carriage returns were once required, this will output the
# sequence of states involved in moving 4 rings, the largest called "a" and
# the smallest called "d", from the first to the second of three towers, so
# that the rings on any tower at any time are in descending order of size.
# You can start with a different arrangement and a different number of rings,
# say :ce:b:ax: and it will give the shortest procedure for moving them all
# to the middle tower. The rules are: the names of the rings must all be
# lower-case letters, they must be input within 3 fields (representing the
# towers) and delimited by 4 colons, such that the letters within each field
# are in alphabetical order (i.e. rings are in descending order of size).
#
# For the benefit of anyone who wants to figure out the script, an "internal"
# line of the form
# b:0abx:1a2b3 :2 :3x2
# has the following meaning: the material after the three markers :1, :2,
# and :3 represents the three towers; in this case the current set-up is
# ":ab : :x :". The numbers after a, b and x in these fields indicate
# that the next time it gets a chance, it will move a to tower 2, move b
# to tower 3, and move x to tower 2. The string after :0 just keeps track
# of the alphabetical order of the names of the rings. The b at the
# beginning means that it is now dealing with ring b (either about to move
# it, or re-evaluating where it should next be moved to).
#
# Although this version is "limited" to 26 rings because of the size of the
# alphabet, one could write a script using the same idea in which the rings
# were represented by arbitrary [strings][within][brackets], and in place of
# the built-in line of the script giving the order of the letters of the
# alphabet, it would accept from the user a line giving the ordering to be
# assumed, e.g. [ucbvax][decvax][hplabs][foo][bar].
#
# George Bergman
# Math, UC Berkeley 94720 USA
# cleaning, diagnostics
s/ *//g
/^$/d
/[^a-z:]/{a\
Illegal characters: use only a-z and ":". Try again.
d
}
/^:[a-z]*:[a-z]*:[a-z]*:$/!{a\
Incorrect format: use\
\ : string1 : string2 : string3 :<CR>\
Try again.
d
}
/\([a-z]\).*\1/{a\
Repeated letters not allowed. Try again.
d
}
# initial formatting
h
s/[a-z]/ /g
G
s/^:\( *\):\( *\):\( *\):\n:\([a-z]*\):\([a-z]*\):\([a-z]*\):$/:1\4\2\3:2\5\1\3:3\6\1\2:0/
s/[a-z]/&2/g
s/^/abcdefghijklmnopqrstuvwxyz/
:a
s/^\(.\).*\1.*/&\1/
s/.//
/^[^:]/ba
s/\([^0]*\)\(:0.*\)/\2\1:/
s/^[^0]*0\(.\)/\1&/
:b
# outputting current state without markers
h
s/.*:1/:/
s/[123]//gp
g
:c
# establishing destinations
/^\(.\).*\1:1/td
/^\(.\).*:1[^:]*\11/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\31/
/^\(.\).*:1[^:]*\12/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\33/
/^\(.\).*:1[^:]*\13/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\32/
/^\(.\).*:2[^:]*\11/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\33/
/^\(.\).*:2[^:]*\12/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\32/
/^\(.\).*:2[^:]*\13/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\31/
/^\(.\).*:3[^:]*\11/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\32/
/^\(.\).*:3[^:]*\12/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\31/
/^\(.\).*:3[^:]*\13/s/^\(.\)\(.*\1\([a-z]\).*\)\3./\3\2\33/
bc
# iterate back to find smallest out-of-place ring
:d
s/^\(.\)\(:0[^:]*\([^:]\)\1.*:\([123]\)[^:]*\1\)\4/\3\2\4/
td
# move said ring (right, resp. left)
s/^\(.\)\(.*\)\1\([23]\)\(.*:\3[^ ]*\) /\1\2 \4\1\3/
s/^\(.\)\(.*:\([12]\)[^ ]*\) \(.*\)\1\3/\1\2\1\3\4 /
tb
s/.*/Done! Try another, or end with ^D./p
d
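The header comments above double as this sample's documentation: they describe how to invoke it (run "sed -f hanoi.sed" and enter a line such as ":abcd: : :") and how its "internal" state lines encode the three towers plus each ring's next destination. As a quick illustration of that encoding only (not part of the diff), here is a small Ruby sketch; the helper name and the hard-coded state line are purely for demonstration, assuming the format described in the comments.
# Illustrative helper, not part of hanoi.sed: decode an "internal" state
# line such as "b:0abx:1a2b3 :2 :3x2" into the three towers it represents.
def towers(state)
  # Split on the :1/:2/:3 tower markers, keep the three tower fields, and
  # strip the per-ring destination digits and the padding spaces.
  state.split(/:[123]/)[1..3].map { |field| field.delete("0-9 ") }
end
p towers("b:0abx:1a2b3 :2 :3x2")  # => ["ab", "", "x"], i.e. the ":ab : :x :" set-up described above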

(modified file)

@@ -1,6 +1,7 @@
#!/usr/bin/env ruby
require "optparse"
require "open3"
ROOT = File.expand_path("../../", __FILE__)
@@ -42,6 +43,17 @@ def log(msg)
puts msg if $verbose
end
def command(*args)
log "$ #{args.join(' ')}"
output, status = Open3.capture2e(*args)
if !status.success?
output.each_line do |line|
log " > #{line}"
end
warn "Command failed. Aborting."
exit 1
end
end
usage = """Usage:
#{$0} [-v|--verbose] [--replace grammar] url
@@ -51,12 +63,12 @@ Examples:
"""
$replace = nil
$verbose = false
$verbose = true
OptionParser.new do |opts|
opts.banner = usage
opts.on("-v", "--verbose", "Print verbose feedback to STDOUT") do
$verbose = true
opts.on("-q", "--quiet", "Do not print output unless there's a failure") do
$verbose = false
end
opts.on("-rSUBMODULE", "--replace=SUBMODDULE", "Replace an existing grammar submodule.") do |name|
$replace = name
@@ -72,6 +84,10 @@ unless $url
exit 1;
end
# Exit early if docker isn't installed or running.
log "Checking docker is installed and running"
command('docker', 'ps')
# Ensure the given URL is an HTTPS link
parts = parse_url $url
https = "https://#{parts[:host]}/#{parts[:user]}/#{parts[:repo]}"
@@ -82,23 +98,24 @@ Dir.chdir(ROOT)
if repo_old
log "Deregistering: #{repo_old}"
`git submodule deinit #{repo_old}`
`git rm -rf #{repo_old}`
`script/grammar-compiler -update`
command('git', 'submodule', 'deinit', repo_old)
command('git', 'rm', '-rf', repo_old)
command('script/grammar-compiler', 'update', '-f')
end
log "Registering new submodule: #{repo_new}"
`git submodule add -f #{https} #{repo_new}`
exit 1 if $?.exitstatus > 0
`script/grammar-compiler -add #{repo_new}`
command('git', 'submodule', 'add', '-f', https, repo_new)
command('script/grammar-compiler', 'add', repo_new)
log "Confirming license"
if repo_old
`script/licensed`
command('script/licensed')
else
`script/licensed --module "#{repo_new}"`
repo_new = File.absolute_path(repo_new)
command('script/licensed', '--module', repo_new)
end
log "Updating grammar documentation in vendor/README.md"
`bundle exec rake samples`
`script/list-grammars`
command('bundle', 'exec', 'rake', 'samples')
command('script/sort-submodules')
command('script/list-grammars')

script/build-grammars-tarball (new executable file, 9 lines)

@@ -0,0 +1,9 @@
#!/bin/sh
set -e
cd "$(dirname "$0")/.."
rm -rf ./linguist-grammars
./script/grammar-compiler compile -o linguist-grammars || true
tar -zcvf linguist-grammars.tar.gz linguist-grammars

(modified file)

@@ -6,7 +6,9 @@ cd "$(dirname "$0")/.."
image="linguist/grammar-compiler:latest"
mkdir -p grammars
docker pull $image
exec docker run --rm \
-u $(id -u $USER):$(id -g $USER) \
-v $PWD:/src/linguist \
-w /src/linguist -ti $image "$@"
-w /src/linguist $image "$@"

(modified file)

@@ -40,12 +40,12 @@ OptionParser.new do |opts|
end
end.parse!
source = Licensed::Source::Filesystem.new(module_path || "vendor/grammars/*/", type: "grammar")
config = Licensed::Configuration.new
source = Licensed::Source::Filesystem.new(module_path || "#{File.expand_path("../", File.dirname(__FILE__))}/vendor/grammars/*/", type: "grammar")
config = Licensed::Configuration.load_from(File.expand_path("../vendor/licenses/config.yml", File.dirname(__FILE__)))
config.sources << source
command = if ARGV[0] == "verify"
Licensed::Command::Verify.new(config)
command = if ARGV[0] == "status"
Licensed::Command::Status.new(config)
else
Licensed::Command::Cache.new(config)
end

script/sort-submodules (new executable file, 50 lines)

@@ -0,0 +1,50 @@
#!/usr/bin/env ruby
require "optparse"
ROOT = File.expand_path "../../", __FILE__
# Extract and sort a list of submodules
def sort_entries(file_data)
submodules = []
file_data.scan(/(^\[submodule[^\n]+\n)((?:\t[^\n]+\n)+)/).each do |head, body|
path = body.match(/^\tpath\s*=\s*\K(.+)$/)[0]
submodules << [path, head + body]
end
submodules.sort! { |a,b| a[0] <=> b[0] }
submodules.collect { |i| i[1] }
end
usage = <<-EOH
Usage:
#{$0} [-t|--test] [-h|--help]
Examples:
#{$0} # Update .gitmodules file in-place
#{$0} --help # Display this help message
#{$0} --test # Exit with an error code if .gitmodules needs sorting
EOH
$testing = false
OptionParser.new do |opts|
opts.banner = usage
opts.on("-h", "--help") do
puts usage
exit
end
opts.on("-t", "--test", "Don't update file; only test if it's unsorted") do
$testing = true
end
end.parse!
unsorted = File.read("#{ROOT}/.gitmodules")
sorted = sort_entries(unsorted).join
if $testing
exit unsorted == sorted
else
File.write "#{ROOT}/.gitmodules", sorted
end
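As a usage sketch (not part of the diff), the sort_entries helper defined above can be exercised directly, assuming it is in scope; the submodule names, paths, and URLs below are invented for this example.
# Illustrative only: a deliberately unsorted two-entry .gitmodules fragment.
unsorted = "[submodule \"zzz\"]\n" \
           "\tpath = vendor/grammars/zzz\n" \
           "\turl = https://example.com/zzz\n" \
           "[submodule \"aaa\"]\n" \
           "\tpath = vendor/grammars/aaa\n" \
           "\turl = https://example.com/aaa\n"
puts sort_entries(unsorted).join
# Prints the [submodule "aaa"] block before [submodule "zzz"],
# because entries are ordered by their path values.
The TestPedantic change later in this diff runs this same script with -t so the build fails whenever .gitmodules is left unsorted.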

test/fixtures/Perl 6/chromosome.pl (new vendored file, 9 lines)

@@ -0,0 +1,9 @@
class Chromosome {
has Seq $.chromosome is rw;
has $.fitness is rw;
}
my $len = 32;
my $this-chromosome = Chromosome.new( chromosome => map( { rand >= 0.5 ?? True !! False }, 1..$len ) );
say $this-chromosome.chromosome();

test/fixtures/Perl/01-methods.pl (new vendored file, 51 lines)

@@ -0,0 +1,51 @@
#!perl
use Test::More;
use Test::Exception;
use_ok 'Music::ScaleNote';
my $msn = Music::ScaleNote->new(
scale_note => 'C',
scale_name => 'pminor',
# verbose => 1,
);
isa_ok $msn, 'Music::ScaleNote';
my $x;
throws_ok { $x = $msn->get_offset() }
qr/note_name, note_format or offset not provided/, 'invalid get_offset';
my $format = 'midinum';
$x = $msn->get_offset(
note_name => 60,
note_format => $format,
offset => 1,
);
is $x->format($format), 63, 'get_offset';
$format = 'ISO';
$x = $msn->get_offset(
note_name => 'D#4',
note_format => $format,
offset => -1,
);
is $x->format($format), 'C4', 'get_offset';
throws_ok {
$x = $msn->get_offset(
note_name => 'C0',
note_format => $format,
offset => -1,
)
} qr/Octave: -1 out of bounds/, 'out of bounds';
throws_ok {
$x = $msn->get_offset(
note_name => 'A#127',
note_format => $format,
offset => 1,
)
} qr/Octave: 128 out of bounds/, 'out of bounds';
done_testing();

(modified file)

@@ -169,6 +169,9 @@ class TestBlob < Minitest::Test
assert sample_blob_memory("JavaScript/jquery-1.6.1.min.js").generated?
assert sample_blob_memory("JavaScript/jquery-1.4.2.min.js").generated?
# Cargo generated composer.lock file
assert sample_blob_memory("TOML/filenames/Cargo.lock").generated?
# Composer generated composer.lock file
assert sample_blob_memory("JSON/filenames/composer.lock").generated?
@@ -307,5 +310,36 @@ class TestBlob < Minitest::Test
included = sample_blob_memory("HTML/pages.html")
assert_predicate included, :include_in_language_stats?
# Test detectable override (i.e by .gitattributes)
def prose.detectable?; true end
assert_predicate prose, :include_in_language_stats?
included_not_detectable = included.clone()
def included_not_detectable.detectable?; false end
refute_predicate included_not_detectable, :include_in_language_stats?
# Test not included if vendored, documentation or generated overridden
# even if detectable
included_vendored = included.clone()
def included_vendored.vendored?; true end
refute_predicate included_vendored, :include_in_language_stats?
def included_vendored.detectable?; true end
refute_predicate included_vendored, :include_in_language_stats?
included_documentation = included.clone()
def included_documentation.documentation?; true end
refute_predicate included_documentation, :include_in_language_stats?
def included_documentation.detectable?; true end
refute_predicate included_documentation, :include_in_language_stats?
included_generated = included.clone()
def included_generated.generated?; true end
refute_predicate included_generated, :include_in_language_stats?
def included_generated.detectable?; true end
refute_predicate included_generated, :include_in_language_stats?
end
end

(modified file)

@@ -534,6 +534,14 @@ class TestFileBlob < Minitest::Test
assert sample_blob("subproject/gradlew.bat").vendored?
assert sample_blob("subproject/gradle/wrapper/gradle-wrapper.properties").vendored?
# Maven
assert sample_blob("mvnw").vendored?
assert sample_blob("mvnw.cmd").vendored?
assert sample_blob(".mvn/wrapper/maven-wrapper.properties").vendored?
assert sample_blob("subproject/mvnw").vendored?
assert sample_blob("subproject/mvnw.cmd").vendored?
assert sample_blob("subproject/.mvn/wrapper/maven-wrapper.properties").vendored?
# Octicons
assert sample_blob("octicons.css").vendored?
assert sample_blob("public/octicons.min.css").vendored?

(modified file)

@@ -66,7 +66,10 @@ class TestGenerated < Minitest::Test
generated_sample_without_loading_data("go/vendor/gopkg.in/some/nested/path/foo.go")
# .NET designer file
generated_sample_without_loading_data("Dummu/foo.designer.cs")
generated_sample_without_loading_data("Dummy/foo.designer.cs")
generated_sample_without_loading_data("Dummy/foo.Designer.cs")
generated_sample_without_loading_data("Dummy/foo.designer.vb")
generated_sample_without_loading_data("Dummy/foo.Designer.vb")
# Composer generated composer.lock file
generated_sample_without_loading_data("JSON/composer.lock")

(modified file)

@@ -5,45 +5,35 @@ class TestGrammars < Minitest::Test
# List of projects that are allowed without licenses
PROJECT_WHITELIST = [
"vendor/grammars/Sublime-Lasso",
"vendor/grammars/blitzmax"
"vendor/grammars/Sublime-Lasso", # No license file
"vendor/grammars/blitzmax", # No license file
"vendor/grammars/creole", # License filename is not LICENSE(.*)?
].freeze
HASH_WHITELIST = [
"bc12b3b4917eab9aedb87ec1305c2a4376e34fd1", # TextMate bundles
"16c4748566b3dd996594af0410a1875b22d3a2b3", # language-yaml and atom-salt
"ff21db2554d69d78b2220db5615b16bbba0788d3", # factor
"b4381ebae3235e91aaf5ccab1e8e94e9ad4faef4", # jflex.tmbundle
"2edac46b0a63309c96442d2826321a442217472f", # Agda.tmbundle
"7dfce11e2e3579ee43b83e69b1b64e77a2e378f0", # ant.tmbundle
"79e72fd673dcebadd8fbace8d43db3da96d2c09f", # bro-sublime
"62b97e52b78439c14550a44a3fe51332aeffb3a1", # elixir-tmbundle
"75cf04a9121ca7bb5a9c122b33007ac016ba72e7", # factor
"0acff2bb1536a3942a39ac74987ffd9c44905a6b", # FreeMarker.tmbundle
"ee77ce4cf9121bccc3e37ba6b98f8e7acd589aaf", # gap-tmbundle
"4cfc7ce12de920ccc836bbab2d748151d5ba7e38", # go-tmbundle
"6c2e34d62c08f97a3e2ece3eedc65fbd99873ff4", # idl.tmbundle
"e5212ae103917a9c2c3c1429a4569df466686fbd", # Isabelle.tmbundle
"bb56ce634fb7ddd38eee988c593ab7cb98a04f64", # jflex.tmbundle
"41cdc7e9f9d2e62eb8ac68a1a9359b9c39a7a9bf", # mako-tmbundle
"7821982b18bc35d6925cc16ece68d9c71f1fbba3", # moonscript-tmbundle
"c235154dbf7864612ac0d337ef5fe79a586b061a", # PHP-Twig.tmbundle
"0c216b112f3a4e6d5848128504d8378d8c7eee00", # r.tmbundle
"da39a3ee5e6b4b0d3255bfef95601890afd80709", # SCSS.tmbundle
"b5432a1e1055de7eeede2dddf91e009480651fd6", # jasmin-sublime
"170b35df61879139b88379a8f1bfd86289c13599", # language-clojure
"60e1fe192238a032341d5dd3cd80535459fc84e4", # language-coffee-script
"94fbd554ec1837fb7c508fd7425326639c3f4103", # language-csharp
"70fb557a431891c2d634c33fa7367feab5066fd6", # language-javascript
"8653305b358375d0fced85dc24793b99919b11ef", # language-shellscript
"9f0c0b0926a18f5038e455e8df60221125fc3111", # elixir-tmbundle
"a4dadb2374282098c5b8b14df308906f5347d79a", # mako-tmbundle
"e06722add999e7428048abcc067cd85f1f7ca71c", # r.tmbundle
"50b14a0e3f03d7ca754dac42ffb33302b5882b78", # smalltalk-tmbundle
"eafbc4a2f283752858e6908907f3c0c90188785b", # gap-tmbundle
"22b3bf41b9e3e8c22357ee12265f149d68aae60a", # Stylus
"c87e7e574fca543941650e5b0a144b44c02c55d8", # language-crystal
"ace112feb693358db2970d0805f6894b745e14b5", # atom-language-purescript
"a626362e3efd030c1d97c0faf422cf8c2dfaea54", # FreeMarker.tmbundle
"15a394f6bc43400946570b299aee8ae264a1e3ff", # language-renpy
"74bb588102e8f332970a0fcabe36299e0806f130", # language-less
"2f03492b52d7dd83b4e7472f01b87c6121e5b1a4", # monkey
"784da5ce445892bc3e26beeb6a4402bbc5ca997e", # ant.tmbundle
"bdab9fdc21e6790b479ccb5945b78bc0f6ce2493", # language-blade
"c9118c370411f2f049c746c0fd096554e877aea2", # atom-language-perl6
"15a502335012f27f8a5991139298edb87a6e467d", # atom-language-rust
"304be6184f7f344d44a1d13bddf511019624fd22", # language-css
"8c538244ba88ef9902a4faf11a2b9acec46f2a4e", # sublime-nginx
"82c356d6ecb143a8a20e1658b0d6a2d77ea8126f", # idl.tmbundle
"9dafd4e2a79cb13a6793b93877a254bc4d351e74", # sublime-text-ox
"8e111741d97ba2e27b3d18a309d426b4a37e604f", # sublime-varnish
"68539730d3cde34355f429f2267e265c1e030912", # smalltalk-tmbundle
"4b5f67a54532ca6e49ba44cd135a510a74712e07", # Stylus
"23d2538e33ce62d58abda2c039364b92f64ea6bc", # sublime-angelscript
"53714285caad3c480ebd248c490509695d10404b", # atom-language-julia
"966085b715baa0b0b67b40924123f92f90acd0ba", # sublime-shen
"3df4ef028c6384b64bc59b8861d6c52093b2116d", # sublime-text-ox
"fd47e09f1fbdb3c26e2960d0aa2b8535bbc31188", # sublimetext-cuda-cpp
"93360925b1805be2b3f0a18e207649fcb524b991", # Std license in README.md of many TextMate grammars like abap.tmbundle
].freeze
# List of allowed SPDX license names
@@ -101,7 +91,7 @@ class TestGrammars < Minitest::Test
end
def test_submodules_have_recognized_licenses
unrecognized = submodule_licenses.select { |k,v| v.nil? && Licensee::FSProject.new(k).license_file }
unrecognized = submodule_licenses.select { |k,v| v.nil? && Licensee.project(k).license_file }
unrecognized.reject! { |k,v| PROJECT_WHITELIST.include?(k) }
message = "The following submodules have unrecognized licenses:\n* #{unrecognized.keys.join("\n* ")}\n"
message << "Please ensure that the project's LICENSE file contains the full text of the license"
@@ -186,15 +176,22 @@ class TestGrammars < Minitest::Test
# If the license is unrecognized, return its hash
def submodule_license(submodule)
# Prefer Licensee to detect a submodule's license
project = Licensee::FSProject.new(submodule, detect_readme: true)
return project.license.key if project.license
project = Licensee.project(submodule, detect_packages: true, detect_readme: true)
return project.license.key if project.licenses.length == 1 && !project.license.pseudo_license?
# If we have more than one license, return the first one that isn't a
# pseudo-license (other or no-license), if any
if project.licenses.length > 1
first_real_license = project.licenses.reject{ |f| f.pseudo_license? }.first
return first_real_license.key unless first_real_license.nil?
end
# We know a license exists, but no method was able to recognize it.
# We return the license hash in this case, to uniquely identify it.
if project.license_file
return project.license_file.hash
return project.license_file.content_hash
elsif project.readme
return project.readme.hash
return project.readme.content_hash
end
end
end

(modified file)

@@ -13,8 +13,9 @@ class TestHeuristics < Minitest::Test
end
def all_fixtures(language_name, file="*")
Dir.glob("#{samples_path}/#{language_name}/#{file}") -
["#{samples_path}/#{language_name}/filenames"]
fixs = Dir.glob("#{samples_path}/#{language_name}/#{file}") -
["#{samples_path}/#{language_name}/filenames"]
fixs.reject { |f| File.symlink?(f) }
end
def test_no_match
@@ -23,6 +24,10 @@ class TestHeuristics < Minitest::Test
assert_equal [], results
end
def test_symlink_empty
assert_equal [], Heuristics.call(file_blob("Markdown/symlink.md"), [Language["Markdown"]])
end
def assert_heuristics(hash)
candidates = hash.keys.map { |l| Language[l] }

(modified file)

@@ -473,4 +473,10 @@ class TestLanguage < Minitest::Test
assert_nil Language.find_by_name(',')
assert_nil Language.find_by_alias(',')
end
def test_detect_prefers_markdown_for_md
blob = Linguist::FileBlob.new(File.join(samples_path, "Markdown/symlink.md"))
match = Linguist.detect(blob)
assert_equal Language["Markdown"], match
end
end

(modified file)

@@ -44,6 +44,11 @@ class TestPedantic < Minitest::Test
assert_sorted tests
end
def test_submodules_are_sorted
system(File.expand_path("../../script/sort-submodules", __FILE__) + " -t")
assert $?.success?
end
def assert_sorted(list)
list.each_cons(2) do |previous, item|
flunk "#{previous} should come after #{item}" if previous > item

(modified file)

@@ -121,4 +121,16 @@ class TestRepository < Minitest::Test
# overridden .gitattributes
assert rakefile.generated?
end
def test_linguist_override_detectable?
attr_commit = "8f86998866f6f2c8aa14e0dd430e61fd25cff720"
linguist_repo(attr_commit).read_index
# markdown is overridden by .gitattributes to be detectable, html to not be detectable
markdown = Linguist::LazyBlob.new(rugged_repository, attr_commit, "samples/Markdown/tender.md")
html = Linguist::LazyBlob.new(rugged_repository, attr_commit, "samples/HTML/pages.html")
assert_predicate markdown, :detectable?
refute_predicate html, :detectable?
end
end

(modified file)

@@ -42,6 +42,15 @@ class TestSamples < Minitest::Test
end
end
def test_filename_listed
Samples.each do |sample|
if sample[:filename]
listed_filenames = Language[sample[:language]].filenames
assert_includes listed_filenames, sample[:filename], "#{sample[:path]} isn't listed as a filename for #{sample[:language]} in languages.yml"
end
end
end
# Check that there aren't samples with extensions or interpreters that
# aren't explicitly defined in languages.yml
languages_yml = File.expand_path("../../lib/linguist/languages.yml", __FILE__)

(modified file)

@@ -1,16 +1,13 @@
FROM golang:1.9.2
RUN apt-get update
RUN apt-get upgrade -y
RUN apt-get install -y curl gnupg
WORKDIR /go/src/github.com/github/linguist/tools/grammars
RUN curl -sL https://deb.nodesource.com/setup_6.x | bash -
RUN apt-get install -y nodejs
RUN npm install -g season
RUN apt-get install -y cmake
RUN cd /tmp && git clone https://github.com/vmg/pcre
RUN mkdir -p /tmp/pcre/build && cd /tmp/pcre/build && \
RUN curl -sL https://deb.nodesource.com/setup_6.x | bash - && \
apt-get update && \
apt-get install -y nodejs cmake && \
npm install -g season && \
cd /tmp && git clone https://github.com/vmg/pcre && \
mkdir -p /tmp/pcre/build && cd /tmp/pcre/build && \
cmake .. \
-DPCRE_SUPPORT_JIT=ON \
-DPCRE_SUPPORT_UTF=ON \
@@ -22,14 +19,12 @@ RUN mkdir -p /tmp/pcre/build && cd /tmp/pcre/build && \
-DPCRE_BUILD_PCREGREP=OFF \
-DPCRE_BUILD_TESTS=OFF \
-G "Unix Makefiles" && \
make && make install
RUN rm -rf /tmp/pcre
make && make install && \
rm -rf /tmp/pcre && \
cd /go && go get -u github.com/golang/dep/cmd/dep && \
rm -rf /var/lib/apt/lists/*
RUN go get -u github.com/golang/dep/cmd/dep
WORKDIR /go/src/github.com/github/linguist/tools/grammars
COPY . .
RUN dep ensure
RUN go install ./cmd/grammar-compiler
RUN dep ensure && go install ./cmd/grammar-compiler
ENTRYPOINT ["grammar-compiler"]

(modified file)

@@ -25,6 +25,12 @@
packages = ["."]
revision = "06020f85339e21b2478f756a78e295255ffa4d6a"
[[projects]]
name = "github.com/urfave/cli"
packages = ["."]
revision = "cfb38830724cc34fedffe9a2a29fb54fa9169cd1"
version = "v1.20.0"
[[projects]]
name = "gopkg.in/cheggaaa/pb.v1"
packages = ["."]
@@ -40,6 +46,6 @@
[solve-meta]
analyzer-name = "dep"
analyzer-version = 1
inputs-digest = "eb10157687c05a542025c119a5280abe429e29141bde70dd437d48668f181861"
inputs-digest = "ba2e3150d728692b49e3e2d652b6ea23db82777c340e0c432cd4af6f0eef9f55"
solver-name = "gps-cdcl"
solver-version = 1

(modified file)

@@ -17,3 +17,7 @@
[[constraint]]
name = "gopkg.in/cheggaaa/pb.v1"
version = "1.0.18"
[[constraint]]
name = "github.com/urfave/cli"
version = "1.20.0"

(modified file)

@@ -1,80 +1,120 @@
package main
import (
"flag"
"fmt"
"os"
"os/exec"
"github.com/github/linguist/tools/grammars/compiler"
"github.com/urfave/cli"
)
var linguistRoot = flag.String("linguist", "", "path to Linguist installation")
var protoOut = flag.String("proto", "", "dump Protobuf library")
var jsonOut = flag.String("json", "", "dump JSON output")
var addGrammar = flag.String("add", "", "add a new grammar source")
var updateList = flag.Bool("update", false, "update grammars.yml instead of verifying its contents")
var report = flag.String("report", "", "write report to file")
func cwd() string {
cwd, _ := os.Getwd()
return cwd
}
func fatal(err error) {
fmt.Fprintf(os.Stderr, "FATAL: %s\n", err)
os.Exit(1)
func wrap(err error) error {
return cli.NewExitError(err, 255)
}
func main() {
flag.Parse()
app := cli.NewApp()
app.Name = "Linguist Grammars Compiler"
app.Usage = "Compile user-submitted grammars and check them for errors"
if _, err := exec.LookPath("csonc"); err != nil {
fatal(err)
app.Flags = []cli.Flag{
cli.StringFlag{
Name: "linguist-path",
Value: cwd(),
Usage: "path to Linguist root",
},
}
if *linguistRoot == "" {
cwd, err := os.Getwd()
if err != nil {
fatal(err)
}
*linguistRoot = cwd
app.Commands = []cli.Command{
{
Name: "add",
Usage: "add a new grammar source",
Flags: []cli.Flag{
cli.BoolFlag{
Name: "force, f",
Usage: "ignore compilation errors",
},
},
Action: func(c *cli.Context) error {
conv, err := compiler.NewConverter(c.String("linguist-path"))
if err != nil {
return wrap(err)
}
if err := conv.AddGrammar(c.Args().First()); err != nil {
if !c.Bool("force") {
return wrap(err)
}
}
if err := conv.WriteGrammarList(); err != nil {
return wrap(err)
}
return nil
},
},
{
Name: "update",
Usage: "update grammars.yml with the contents of the grammars library",
Flags: []cli.Flag{
cli.BoolFlag{
Name: "force, f",
Usage: "write grammars.yml even if grammars fail to compile",
},
},
Action: func(c *cli.Context) error {
conv, err := compiler.NewConverter(c.String("linguist-path"))
if err != nil {
return wrap(err)
}
if err := conv.ConvertGrammars(true); err != nil {
return wrap(err)
}
if err := conv.Report(); err != nil {
if !c.Bool("force") {
return wrap(err)
}
}
if err := conv.WriteGrammarList(); err != nil {
return wrap(err)
}
return nil
},
},
{
Name: "compile",
Usage: "convert the grammars from the library",
Flags: []cli.Flag{
cli.StringFlag{Name: "proto-out, P"},
cli.StringFlag{Name: "out, o"},
},
Action: func(c *cli.Context) error {
conv, err := compiler.NewConverter(c.String("linguist-path"))
if err != nil {
return cli.NewExitError(err, 1)
}
if err := conv.ConvertGrammars(false); err != nil {
return cli.NewExitError(err, 1)
}
if out := c.String("proto-out"); out != "" {
if err := conv.WriteProto(out); err != nil {
return cli.NewExitError(err, 1)
}
}
if out := c.String("out"); out != "" {
if err := conv.WriteJSON(out); err != nil {
return cli.NewExitError(err, 1)
}
}
if err := conv.Report(); err != nil {
return wrap(err)
}
return nil
},
},
}
conv, err := compiler.NewConverter(*linguistRoot)
if err != nil {
fatal(err)
}
if *addGrammar != "" {
if err := conv.AddGrammar(*addGrammar); err != nil {
fatal(err)
}
}
if err := conv.ConvertGrammars(*updateList); err != nil {
fatal(err)
}
if err := conv.WriteGrammarList(); err != nil {
fatal(err)
}
if *protoOut != "" {
if err := conv.WriteProto(*protoOut); err != nil {
fatal(err)
}
}
if *jsonOut != "" {
if err := conv.WriteJSON(*jsonOut); err != nil {
fatal(err)
}
}
if *report == "" {
conv.Report(os.Stderr)
} else {
f, err := os.Create(*report)
if err != nil {
fatal(err)
}
conv.Report(f)
f.Close()
}
app.Run(os.Args)
}

(modified file)

@@ -3,7 +3,6 @@ package compiler
import (
"encoding/json"
"fmt"
"io"
"io/ioutil"
"os"
"path"
@@ -52,6 +51,16 @@ func (conv *Converter) work() {
conv.wg.Done()
}
func (conv *Converter) tmpScopes() map[string]bool {
scopes := make(map[string]bool)
for _, ary := range conv.grammars {
for _, s := range ary {
scopes[s] = true
}
}
return scopes
}
func (conv *Converter) AddGrammar(source string) error {
repo := conv.Load(source)
if len(repo.Files) == 0 {
@@ -61,17 +70,30 @@ func (conv *Converter) AddGrammar(source string) error {
conv.grammars[source] = repo.Scopes()
conv.modified = true
knownScopes := conv.tmpScopes()
repo.FixRules(knownScopes)
if len(repo.Errors) > 0 {
fmt.Fprintf(os.Stderr, "The new grammar %s contains %d errors:\n",
repo, len(repo.Errors))
for _, err := range repo.Errors {
fmt.Fprintf(os.Stderr, " - %s\n", err)
}
fmt.Fprintf(os.Stderr, "\n")
return fmt.Errorf("failed to compile the given grammar")
}
fmt.Printf("OK! added grammar source '%s'\n", source)
for scope := range repo.Files {
fmt.Printf("\tnew scope: %s\n", scope)
}
return nil
}
func (conv *Converter) ScopeMap() map[string]*Repository {
func (conv *Converter) AllScopes() map[string]bool {
// Map from scope -> Repository first to error check
// possible duplicates
allScopes := make(map[string]*Repository)
for _, repo := range conv.Loaded {
for scope := range repo.Files {
if original := allScopes[scope]; original != nil {
@@ -82,7 +104,12 @@ func (conv *Converter) ScopeMap() map[string]*Repository {
}
}
return allScopes
// Convert to scope -> bool
scopes := make(map[string]bool)
for s := range allScopes {
scopes[s] = true
}
return scopes
}
func (conv *Converter) ConvertGrammars(update bool) error {
@@ -112,13 +139,16 @@ func (conv *Converter) ConvertGrammars(update bool) error {
conv.modified = true
}
knownScopes := conv.ScopeMap()
knownScopes := conv.AllScopes()
for source, repo := range conv.Loaded {
repo.FixRules(knownScopes)
if update {
conv.grammars[source] = repo.Scopes()
scopes := repo.Scopes()
if len(scopes) > 0 {
conv.grammars[source] = scopes
}
} else {
expected := conv.grammars[source]
repo.CompareScopes(expected)
@@ -190,7 +220,7 @@ func (conv *Converter) WriteGrammarList() error {
return ioutil.WriteFile(ymlpath, outyml, 0666)
}
func (conv *Converter) Report(w io.Writer) {
func (conv *Converter) Report() error {
var failed []*Repository
for _, repo := range conv.Loaded {
if len(repo.Errors) > 0 {
@@ -202,13 +232,20 @@ func (conv *Converter) Report(w io.Writer) {
return failed[i].Source < failed[j].Source
})
total := 0
for _, repo := range failed {
fmt.Fprintf(w, "- [ ] %s (%d errors)\n", repo, len(repo.Errors))
fmt.Fprintf(os.Stderr, "- [ ] %s (%d errors)\n", repo, len(repo.Errors))
for _, err := range repo.Errors {
fmt.Fprintf(w, " - [ ] %s\n", err)
fmt.Fprintf(os.Stderr, " - [ ] %s\n", err)
}
fmt.Fprintf(w, "\n")
fmt.Fprintf(os.Stderr, "\n")
total += len(repo.Errors)
}
if total > 0 {
return fmt.Errorf("the grammar library contains %d errors", total)
}
return nil
}
func NewConverter(root string) (*Converter, error) {

(modified file)

@@ -14,16 +14,22 @@ var GrammarAliases = map[string]string{
}
var KnownFields = map[string]bool{
"comment": true,
"uuid": true,
"author": true,
"comments": true,
"macros": true,
"fileTypes": true,
"firstLineMatch": true,
"keyEquivalent": true,
"foldingStopMarker": true,
"foldingStartMarker": true,
"foldingEndMarker": true,
"limitLineLength": true,
"comment": true,
"uuid": true,
"author": true,
"comments": true,
"macros": true,
"fileTypes": true,
"firstLineMatch": true,
"keyEquivalent": true,
"foldingStopMarker": true,
"foldingStartMarker": true,
"foldingEndMarker": true,
"limitLineLength": true,
"hideFromUser": true,
"injectionSelector": true,
"swallow": true,
"foregroundColor": true,
"backgroundColor": true,
"increaseIndentPattern": true,
}

(modified file)

@@ -81,7 +81,7 @@ func (repo *Repository) CompareScopes(scopes []string) {
}
}
func (repo *Repository) FixRules(knownScopes map[string]*Repository) {
func (repo *Repository) FixRules(knownScopes map[string]bool) {
for _, file := range repo.Files {
w := walker{
File: file,
@@ -108,6 +108,11 @@ func isValidGrammar(path string, info os.FileInfo) bool {
return false
}
// Tree-Sitter grammars are not supported
if strings.HasPrefix(filepath.Base(path), "tree-sitter-") {
return false
}
dir := filepath.Dir(path)
ext := filepath.Ext(path)
@@ -117,7 +122,7 @@ func isValidGrammar(path string, info os.FileInfo) bool {
case ".tmlanguage", ".yaml-tmlanguage":
return true
case ".cson", ".json":
return strings.HasSuffix(dir, "/grammars")
return strings.HasSuffix(dir, "/grammars") || strings.HasSuffix(dir, "/syntaxes")
default:
return false
}

(modified file)

@@ -6,6 +6,7 @@ import (
"os/exec"
"path"
"path/filepath"
"sort"
"strings"
)
@@ -14,14 +15,43 @@ type fsLoader struct {
abspath string
}
var preferredGrammars = map[string]int{
".tmlanguage": 0,
".cson": 1,
".json": 1,
".plist": 2,
".yaml-tmlanguage": 3,
}
func findPreferredExtension(ext []string) string {
if len(ext) > 1 {
sort.Slice(ext, func(i, j int) bool {
a := strings.ToLower(ext[i])
b := strings.ToLower(ext[j])
return preferredGrammars[a] < preferredGrammars[b]
})
}
return ext[0]
}
func (l *fsLoader) findGrammars() (files []string, err error) {
grammars := make(map[string][]string)
err = filepath.Walk(l.abspath,
func(path string, info os.FileInfo, err error) error {
if err == nil && isValidGrammar(path, info) {
files = append(files, path)
ext := filepath.Ext(path)
base := path[0 : len(path)-len(ext)]
grammars[base] = append(grammars[base], ext)
}
return nil
})
for base, ext := range grammars {
pref := findPreferredExtension(ext)
files = append(files, base+pref)
}
return
}

(modified file)

@@ -19,7 +19,7 @@ func (w *walker) checkInclude(rule *grammar.Rule) {
}
include = strings.Split(include, "#")[0]
_, ok := w.Known[include]
ok := w.Known[include]
if !ok {
if !w.Missing[include] {
w.Missing[include] = true
@@ -73,7 +73,7 @@ func (w *walker) walk(rule *grammar.Rule) {
type walker struct {
File *LoadedFile
Known map[string]*Repository
Known map[string]bool
Missing map[string]bool
Errors []error
}

vendor/README.md (vendored, 19 changed lines)

@@ -24,7 +24,6 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **APL:** [Alhadis/language-apl](https://github.com/Alhadis/language-apl)
- **Apollo Guidance Computer:** [Alhadis/language-agc](https://github.com/Alhadis/language-agc)
- **AppleScript:** [textmate/applescript.tmbundle](https://github.com/textmate/applescript.tmbundle)
- **Arduino:** [textmate/c.tmbundle](https://github.com/textmate/c.tmbundle)
- **AsciiDoc:** [zuckschwerdt/asciidoc.tmbundle](https://github.com/zuckschwerdt/asciidoc.tmbundle)
- **ASN.1:** [ajLangley12/language-asn1](https://github.com/ajLangley12/language-asn1)
- **ASP:** [textmate/asp.tmbundle](https://github.com/textmate/asp.tmbundle)
@@ -72,6 +71,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **Common Lisp:** [textmate/lisp.tmbundle](https://github.com/textmate/lisp.tmbundle)
- **Common Workflow Language:** [manabuishii/language-cwl](https://github.com/manabuishii/language-cwl)
- **Component Pascal:** [textmate/pascal.tmbundle](https://github.com/textmate/pascal.tmbundle)
- **CoNLL-U:** [odanoburu/conllu-linguist-grammar](https://github.com/odanoburu/conllu-linguist-grammar)
- **Cool:** [anunayk/cool-tmbundle](https://github.com/anunayk/cool-tmbundle)
- **Coq:** [mkolosick/Sublime-Coq](https://github.com/mkolosick/Sublime-Coq)
- **Cpp-ObjDump:** [nanoant/assembly.tmbundle](https://github.com/nanoant/assembly.tmbundle)
@@ -151,7 +151,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **Handlebars:** [daaain/Handlebars](https://github.com/daaain/Handlebars)
- **Harbour:** [hernad/atom-language-harbour](https://github.com/hernad/atom-language-harbour)
- **Haskell:** [atom-haskell/language-haskell](https://github.com/atom-haskell/language-haskell)
- **Haxe:** [clemos/haxe-sublime-bundle](https://github.com/clemos/haxe-sublime-bundle)
- **Haxe:** [vshaxe/haxe-TmLanguage](https://github.com/vshaxe/haxe-TmLanguage)
- **HCL:** [alexlouden/Terraform.tmLanguage](https://github.com/alexlouden/Terraform.tmLanguage)
- **HLSL:** [tgjones/shaders-tmLanguage](https://github.com/tgjones/shaders-tmLanguage)
- **HTML:** [textmate/html.tmbundle](https://github.com/textmate/html.tmbundle)
@@ -160,7 +160,8 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **HTML+EEX:** [elixir-lang/elixir-tmbundle](https://github.com/elixir-lang/elixir-tmbundle)
- **HTML+ERB:** [atom/language-ruby](https://github.com/atom/language-ruby)
- **HTML+PHP:** [textmate/php.tmbundle](https://github.com/textmate/php.tmbundle)
- **HTTP:** [httpspec/sublime-highlighting](https://github.com/httpspec/sublime-highlighting)
- **HTTP:** [samsalisbury/Sublime-HTTP](https://github.com/samsalisbury/Sublime-HTTP)
- **HXML:** [vshaxe/haxe-TmLanguage](https://github.com/vshaxe/haxe-TmLanguage)
- **IDL:** [mgalloy/idl.tmbundle](https://github.com/mgalloy/idl.tmbundle)
- **Idris:** [idris-hackers/idris-sublime](https://github.com/idris-hackers/idris-sublime)
- **Inform 7:** [erkyrath/language-inform7](https://github.com/erkyrath/language-inform7)
@@ -189,7 +190,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **KiCad Legacy Layout:** [Alhadis/language-pcb](https://github.com/Alhadis/language-pcb)
- **KiCad Schematic:** [Alhadis/language-pcb](https://github.com/Alhadis/language-pcb)
- **Kit:** [textmate/html.tmbundle](https://github.com/textmate/html.tmbundle)
- **Kotlin:** [vkostyukov/kotlin-sublime-package](https://github.com/vkostyukov/kotlin-sublime-package)
- **Kotlin:** [nishtahir/language-kotlin](https://github.com/nishtahir/language-kotlin)
- **LabVIEW:** [textmate/xml.tmbundle](https://github.com/textmate/xml.tmbundle)
- **Lasso:** [bfad/Sublime-Lasso](https://github.com/bfad/Sublime-Lasso)
- **Latte:** [textmate/php-smarty.tmbundle](https://github.com/textmate/php-smarty.tmbundle)
@@ -214,7 +215,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **Marko:** [marko-js/marko-tmbundle](https://github.com/marko-js/marko-tmbundle)
- **Mask:** [tenbits/sublime-mask](https://github.com/tenbits/sublime-mask)
- **Mathematica:** [shadanan/mathematica-tmbundle](https://github.com/shadanan/mathematica-tmbundle)
- **Matlab:** [textmate/matlab.tmbundle](https://github.com/textmate/matlab.tmbundle)
- **Matlab:** [mathworks/MATLAB-Language-grammar](https://github.com/mathworks/MATLAB-Language-grammar)
- **Maven POM:** [textmate/maven.tmbundle](https://github.com/textmate/maven.tmbundle)
- **Max:** [textmate/json.tmbundle](https://github.com/textmate/json.tmbundle)
- **MAXScript:** [Alhadis/language-maxscript](https://github.com/Alhadis/language-maxscript)
@@ -239,6 +240,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **NetLinx+ERB:** [amclain/sublime-netlinx](https://github.com/amclain/sublime-netlinx)
- **NetLogo:** [textmate/lisp.tmbundle](https://github.com/textmate/lisp.tmbundle)
- **NewLisp:** [textmate/lisp.tmbundle](https://github.com/textmate/lisp.tmbundle)
- **Nextflow:** [nextflow-io/atom-language-nextflow](https://github.com/nextflow-io/atom-language-nextflow)
- **Nginx:** [brandonwamboldt/sublime-nginx](https://github.com/brandonwamboldt/sublime-nginx)
- **Nim:** [Varriount/NimLime](https://github.com/Varriount/NimLime)
- **Ninja:** [khyo/language-ninja](https://github.com/khyo/language-ninja)
@@ -313,6 +315,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **RobotFramework:** [shellderp/sublime-robot-plugin](https://github.com/shellderp/sublime-robot-plugin)
- **Roff:** [Alhadis/language-roff](https://github.com/Alhadis/language-roff)
- **Rouge:** [atom/language-clojure](https://github.com/atom/language-clojure)
- **RPC:** [textmate/c.tmbundle](https://github.com/textmate/c.tmbundle)
- **RPM Spec:** [waveclaw/language-rpm-spec](https://github.com/waveclaw/language-rpm-spec)
- **Ruby:** [atom/language-ruby](https://github.com/atom/language-ruby)
- **RUNOFF:** [Alhadis/language-roff](https://github.com/Alhadis/language-roff)
@@ -321,11 +324,12 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **SaltStack:** [saltstack/atom-salt](https://github.com/saltstack/atom-salt)
- **SAS:** [rpardee/sas.tmbundle](https://github.com/rpardee/sas.tmbundle)
- **Sass:** [nathos/sass-textmate-bundle](https://github.com/nathos/sass-textmate-bundle)
- **Scala:** [mads379/scala.tmbundle](https://github.com/mads379/scala.tmbundle)
- **Scala:** [scala/vscode-scala-syntax](https://github.com/scala/vscode-scala-syntax)
- **Scaml:** [scalate/Scalate.tmbundle](https://github.com/scalate/Scalate.tmbundle)
- **Scheme:** [textmate/scheme.tmbundle](https://github.com/textmate/scheme.tmbundle)
- **Scilab:** [textmate/scilab.tmbundle](https://github.com/textmate/scilab.tmbundle)
- **SCSS:** [MarioRicalde/SCSS.tmbundle](https://github.com/MarioRicalde/SCSS.tmbundle)
- **sed:** [Alhadis/language-sed](https://github.com/Alhadis/language-sed)
- **ShaderLab:** [tgjones/shaders-tmLanguage](https://github.com/tgjones/shaders-tmLanguage)
- **Shell:** [atom/language-shellscript](https://github.com/atom/language-shellscript)
- **ShellSession:** [atom/language-shellscript](https://github.com/atom/language-shellscript)
@@ -336,6 +340,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **Smalltalk:** [tomas-stefano/smalltalk-tmbundle](https://github.com/tomas-stefano/smalltalk-tmbundle)
- **Smarty:** [textmate/php-smarty.tmbundle](https://github.com/textmate/php-smarty.tmbundle)
- **SMT:** [SRI-CSL/SMT.tmbundle](https://github.com/SRI-CSL/SMT.tmbundle)
- **Solidity:** [davidhq/SublimeEthereum](https://github.com/davidhq/SublimeEthereum)
- **SourcePawn:** [github-linguist/sublime-sourcepawn](https://github.com/github-linguist/sublime-sourcepawn)
- **SPARQL:** [peta/turtle.tmbundle](https://github.com/peta/turtle.tmbundle)
- **Spline Font Database:** [Alhadis/language-fontforge](https://github.com/Alhadis/language-fontforge)
@@ -372,7 +377,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
- **TypeScript:** [Microsoft/TypeScript-TmLanguage](https://github.com/Microsoft/TypeScript-TmLanguage)
- **Unified Parallel C:** [textmate/c.tmbundle](https://github.com/textmate/c.tmbundle)
- **Unity3D Asset:** [atom/language-yaml](https://github.com/atom/language-yaml)
- **Unix Assembly:** [Nessphoro/sublimeassembly](https://github.com/Nessphoro/sublimeassembly)
- **Unix Assembly:** [calculuswhiz/Assembly-Syntax-Definition](https://github.com/calculuswhiz/Assembly-Syntax-Definition)
- **Uno:** [atom/language-csharp](https://github.com/atom/language-csharp)
- **UnrealScript:** [textmate/java.tmbundle](https://github.com/textmate/java.tmbundle)
- **UrWeb:** [gwalborn/UrWeb-Language-Definition](https://github.com/gwalborn/UrWeb-Language-Definition)

Some files were not shown because too many files have changed in this diff.