Compare commits


219 Commits

Author SHA1 Message Date
Vicent Marti
98b99e38bb Merge pull request #2654 from github/vmg/git-linguist-fixes
Small fixes for git-linguist
2015-09-28 11:12:49 +02:00
Vicent Marti
d8e3bec499 Bump version 2015-09-28 01:45:49 -07:00
Vicent Marti
7c759d4d29 git-linguist: Do not write cache if repo is gone 2015-09-28 01:45:49 -07:00
Vicent Marti
41d438b47e repository: Do not attempt to scan large repos 2015-09-28 01:45:49 -07:00
Vicent Marti
41911d6921 git-linguist: Properly handle $GIT_DIR from git 2015-09-28 01:45:49 -07:00
Arfon Smith
dca18d77cb Merge pull request #2656 from iblech/better-test-description
Clarified that only nonprimary extensions should be sorted
2015-09-27 20:34:30 +01:00
Ingo Blechschmidt
040af5dad2 Clarify that only nonprimary extensions should be sorted 2015-09-25 19:23:06 +02:00
Arfon Smith
4867c49bd9 Merge pull request #2642 from github/license-in-gemspec
Include LICENSE in gem
2015-09-23 09:19:18 +01:00
Brandon Keepers
a354eddf4b Update github-linguist.gemspec 2015-09-22 16:33:08 -04:00
Vicent Marti
9b78c533a5 Merge pull request #2641 from github/js-syntax-fix
github-linguist-grammars 4.6.2
2015-09-21 21:56:33 +02:00
Mislav Marohnić
090ea576b9 github-linguist-grammars 4.6.2 2015-09-21 21:23:40 +02:00
Mislav Marohnić
6a2d33a4b3 Bump language-javascript for syntax highlighting fix
This is primarily to pull
https://github.com/atom/language-javascript/pull/227

Full changelog:
c5c381e...7b14bbb (diff-46d5c1ca71eaebb92619d6c7abc9388d)
2015-09-21 19:04:30 +02:00
Vicent Marti
2c62da7834 Merge pull request #2636 from github/vmg/git-linguist-oid
git-linguist: Delay loading @commit_oid
2015-09-16 16:48:12 +02:00
Vicent Marti
0145a0adb2 git-linguist: Delay loading @commit_oid 2015-09-16 05:50:35 -07:00
Vicent Marti
473282d64c Merge pull request #2630 from github/vmg/memory
4.6.0: Reduce memory pressure
2015-09-15 11:19:45 +02:00
Vicent Marti
c2c068e9db Bump version to 4.6.0 2015-09-14 08:43:10 -07:00
Vicent Marti
13d1f662d1 Add the git-linguist helper 2015-09-14 08:42:51 -07:00
Arfon Smith
bdd57f58a0 Merge pull request #2625 from github/handlebars-group
Adding Handlebars to the HTML group
2015-09-09 09:18:11 +01:00
Arfon Smith
b1bcabd6e6 Adding Handlebars to the HTML group 2015-09-08 12:25:05 +01:00
Arfon Smith
e128c3fa82 Merge pull request #2622 from miksen/patch-1
Language bar clarification in README.md
2015-09-08 12:08:42 +01:00
Arfon Smith
efac9fe750 Merge pull request #2624 from pchaigno/jsx-grammar
New JSX language under JavaScript group
2015-09-08 11:01:44 +01:00
Arfon Smith
2b8545a8fa Merge pull request #2567 from pcantrell/objc-import-statement
#import "*.h" detection for Objective-C
2015-09-08 10:53:32 +01:00
Vicent Marti
b275b5d728 Soften memory pressure 2015-09-07 22:03:29 +02:00
Paul Chaignon
1f46cfafa7 New JSX language under JavaScript group
A specific grammar is needed to highlight .jsx files
Thus, they are now a distinct language, but still in the JavaScript group
2015-09-05 13:31:17 +02:00
miksen
b1dcdf3418 Language bar clarification in README.md
Clarified what the percentages in the Language bar are based on.
2015-09-04 11:49:10 +02:00
Paul Cantrell
4bfd65deb8 #import "*.h" detection for Objective-C 2015-09-03 22:10:27 -05:00
Arfon Smith
61102812a0 Merge pull request #2619 from github/linguist-generated
Adding support for generated overrides
2015-09-03 15:02:16 +01:00
Arfon Smith
580cfce7fb Adding support for generated overrides 2015-09-03 14:39:27 +01:00
Arfon Smith
f1383d7a45 Merge pull request #2616 from scttnlsn/patch-1
Ignore spec fixtures
2015-09-02 18:35:07 +01:00
Scott Nelson
e4ce5bfe39 Ignore spec fixtures 2015-09-02 12:52:25 -04:00
Arfon Smith
6ed64f25a2 Merge pull request #2607 from Alhadis/master
Add .geojson/.topojson as JSON extensions / Fix NCL colour
2015-09-02 10:13:31 +01:00
Alhadis
114a331106 Add ".topojson" as a JSON extension 2015-09-02 07:04:51 +10:00
Arfon Smith
9aa24a216a Merge pull request #2612 from github/cut-release-v4.5.15
Cut release v4.5.15
2015-09-01 21:04:40 +01:00
Arfon Smith
13702451ab Bumping to v4.5.15 2015-09-01 19:13:02 +01:00
Arfon Smith
f0242f6f97 Updating grammars 2015-09-01 19:12:27 +01:00
Arfon Smith
9775820398 Merge pull request #2591 from jtbandes/master
Vendored definitions for Xcode-related files
2015-09-01 18:53:08 +01:00
Arfon Smith
7daf26bcd0 Merge pull request #2604 from larsbrinkhoff/alphabetise-heuristics
Alphabetise heuristics.
2015-09-01 15:25:49 +01:00
Arfon Smith
231f705098 Merge pull request #2606 from thejameskyle/jsproj
Add .jsproj extension
2015-09-01 15:23:53 +01:00
Arfon Smith
893ab8fd8d Merge pull request #2610 from ismail-arilik/patch-1
Add color for the PL/SQL language.
2015-09-01 15:20:49 +01:00
ismail-arilik
5afdd2c533 Add color for the PL/SQL language.
I have added a color for the PL/SQL language: #dad8d8. I took this color from a window background that evokes the interface of the language.
2015-09-01 14:23:59 +03:00
Lars Brinkhoff
e4f5c0066a Add checks to keep heuristics alphabetised. 2015-09-01 08:21:34 +02:00
Alhadis
a167f852dd Alphabetise JSON extensions 2015-09-01 07:33:07 +10:00
Alhadis
b428bce126 Quote NCL language's colour value
Without double-quotes, the hex colour is interpreted as a YAML comment.

Originally added in 2d39258.
2015-09-01 07:24:17 +10:00
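The quoting issue described in that commit can be reproduced directly with Ruby's YAML parser (a minimal illustration; the `color` key stands in for the languages.yml field):

```ruby
require 'yaml'

# Unquoted, everything after '#' is a YAML comment, so the value is lost.
YAML.safe_load('color: #dad8d8')    # => {"color"=>nil}

# Double-quoting preserves the hex colour.
YAML.safe_load('color: "#dad8d8"')  # => {"color"=>"#dad8d8"}
```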
Alhadis
e62d0e19a5 Add ".geojson" as a JSON extension 2015-09-01 07:12:18 +10:00
James Kyle
9b8bf9068f Move jsproj example to xml directory 2015-08-31 11:35:01 -07:00
James Kyle
6e05edc350 Add .jsproj extension 2015-08-31 10:26:29 -07:00
Lars Brinkhoff
dd8eaf2893 Alphabetise heuristics. 2015-08-31 08:53:51 +02:00
Ben Balter
e24efad5ff Merge pull request #2594 from github/license-help
Add some additional help text to license test
2015-08-26 14:55:37 -04:00
Arfon Smith
58a34cdb7d Merge pull request #2589 from mandel/master
Add the X10 language.
2015-08-26 19:36:54 +01:00
Louis Mandel
b1c6b330e9 Switch to Apache License. 2015-08-26 14:26:30 -04:00
Ben Balter
7c3e265033 also add help for unapproved licenses 2015-08-26 12:31:29 -04:00
Louis Mandel
13695a716c Update X10 grammar license. 2015-08-26 09:46:02 -04:00
Ben Balter
c9e43804d6 add some help text to license test 2015-08-26 09:26:04 -04:00
Arfon Smith
1535e3553e Merge pull request #2580 from LeonaMorro/master
add *.lslp as an additional extension for LSL (LindenScriptingLanguage)
2015-08-26 14:09:48 +01:00
Arfon Smith
0ac05bbbeb Merge pull request #2582 from pchaigno/vendor-libraries
Vendored JS files
2015-08-26 14:02:57 +01:00
Arfon Smith
d3f979d640 Merge pull request #2592 from jtbandes/typo
Fix typo in Obj-C heuristic keyword
2015-08-26 12:52:05 +01:00
Jacob Bandes-Storch
0e9ded45dc Fix typo in Obj-C heuristic keyword
`synchronised` → `synchronized`
2015-08-26 00:34:00 -07:00
Jacob Bandes-Storch
a2ca886510 Vendored definitions for Xcode-related files 2015-08-25 23:42:20 -07:00
Louis Mandel
25a1af3775 Add the X10 language (http://x10-lang.org/). 2015-08-24 13:26:43 -04:00
Paul Chaignon
0d8e0a2970 Sublime Text workspace files as vendored 2015-08-24 12:32:26 +02:00
Paul Chaignon
c0fff6c8a8 Make Slick regexp more general 2015-08-21 12:30:01 +02:00
Paul Chaignon
e6b4428614 Tests for new vendored files 2015-08-21 12:24:04 +02:00
Paul Chaignon
4e6e69833d Test for new CodeMirror regexp 2015-08-21 12:15:17 +02:00
Paul Chaignon
1d9faff4c6 New JS vendored files 2015-08-21 12:13:42 +02:00
Paul Chaignon
7025cbe760 Fix CodeMirror regex for vendored files 2015-08-21 12:12:52 +02:00
LeonaMorro
e922b7c2ca added *.lslp to samples/LSL folder 2015-08-21 11:44:23 +02:00
LeonaMorro
96518d2d0f added *.lslp as LSL(Linden Scripting Language) 2015-08-21 11:17:12 +02:00
Arfon Smith
1241b20ba1 Merge pull request #2578 from blakeembrey/correct-raml-type
Make RAML a markup language
2015-08-20 20:56:21 +01:00
Ben Balter
f03f5c1628 Merge pull request #2568 from github/licensee
Use Licensee to classify submodule licenses
2015-08-20 15:42:50 -04:00
Ben Balter
cb550a3662 remove some random submodules 2015-08-20 15:30:51 -04:00
Ben Balter
d1f90d61c5 Merge branch 'master' into licensee 2015-08-20 15:25:15 -04:00
Blake Embrey
16e65fe189 Make RAML a markup language
RAML was originally merged as a data language, but this seems like an incorrect definition. I changed it to be markup instead, which will also result in RAML appearing in repo statistics.
2015-08-20 10:07:44 -07:00
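In languages.yml, a change like this amounts to flipping the `type` field. An illustrative entry (the exact surrounding fields are assumptions, not a quote from the repository):

```yaml
RAML:
  type: markup        # was: data
  tm_scope: source.yaml
  ace_mode: yaml
  extensions:
  - .raml
```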
Ben Balter
62a0faa729 let us 2015-08-20 12:23:52 -04:00
Ben Balter
fbb3ab2292 batch license test output 2015-08-20 11:38:31 -04:00
Arfon Smith
b3b75e5ef8 Merge pull request #2574 from github/perl-t
Adding Perl/Perl6 heuristic for '.t'
2015-08-20 10:29:52 +01:00
Arfon Smith
8b36210db5 Merge pull request #2573 from a0viedo/patch-1
relativize link in readme
2015-08-20 10:25:02 +01:00
Arfon Smith
a74f3b3e46 Adding Perl/Perl6 heuristic for '.t' 2015-08-20 10:16:52 +01:00
Alejandro Oviedo
e214a52de5 relativize link in readme
...so it could link properly in other branches and forks.
2015-08-19 19:15:05 -03:00
Arfon Smith
0624a9395c Merge pull request #2571 from pchaigno/prolog-grammar
New grammars for Prolog and ECLiPSe
2015-08-19 21:44:13 +01:00
Arfon Smith
b2e7f7ffa6 Merge pull request #2570 from pchaigno/typescript-grammar
New grammar for Typescript
2015-08-19 21:41:15 +01:00
Arfon Smith
b312b39a10 Merge pull request #2572 from pchaigno/rmarkdown-tmscope
TextMate scope for RMarkdown
2015-08-19 21:38:42 +01:00
Paul Chaignon
80e2d112b2 tm_scope for RMarkdown 2015-08-19 22:07:23 +02:00
Paul Chaignon
519b169df0 New grammar for Typescript from Sublime Text package 2015-08-19 21:30:24 +02:00
Paul Chaignon
5c2cfbc334 Remove Typescript grammar 2015-08-19 21:27:40 +02:00
Paul Chaignon
7d91e4959a Dissociate ECLiPSe from Prolog
ECLiPSe syntax is slightly different from Prolog syntax
ECLiPSe is in the Prolog group so it will only be highlighted differently
2015-08-19 20:56:14 +02:00
Paul Chaignon
0c5aa2a7eb Merge branch 'master' into prolog-grammar 2015-08-19 20:46:15 +02:00
Paul Chaignon
0d7a264981 Update submodule for Prolog grammar 2015-08-19 20:43:16 +02:00
Arfon Smith
52ff2d2e74 Merge pull request #2557 from pchaigno/mozilla-public-license
Recognize the Mozilla Public License for grammars
2015-08-19 19:21:33 +01:00
Ben Balter
8a7ceaa845 bump licensee to support ruby 1.9.3 2015-08-19 13:22:31 -04:00
Ben Balter
fd9ce2d1cf use licensee to classify submodule licenses 2015-08-19 12:54:21 -04:00
Arfon Smith
c7bab11ebe Merge pull request #2566 from github/cut-release-v4.5.14
v4.5.14 version bump
2015-08-19 10:36:52 +01:00
Arfon Smith
6995fc28b6 v4.5.14 version bump 2015-08-19 07:01:07 +01:00
Arfon Smith
102f14d0e9 Grammars update 2015-08-19 06:59:39 +01:00
Arfon Smith
aac168402b Merge pull request #2565 from pchaigno/aspectj-grammar
Grammar for AspectJ
2015-08-19 06:49:16 +01:00
Paul Chaignon
152d49513f Grammar for AspectJ from Sublime Text package 2015-08-19 00:11:34 +02:00
Arfon Smith
d5564c808d Merge pull request #2560 from larsbrinkhoff/sexp
Add .sexp to Common Lisp.
2015-08-18 21:15:46 +01:00
Lars Brinkhoff
82410e07b2 Add .sexp to Common Lisp.
config.sexp by Jeremie Dimino; Apache License 2.0.
rss.sexp by Dan Lentz; LGPL 2.1.
2015-08-18 12:45:47 +02:00
Arfon Smith
94d90b30b5 Merge pull request #2561 from pchaigno/labview-xml
Highlight LabVIEW files as XML
2015-08-18 12:25:38 +02:00
Paul Chaignon
06997f0da2 Fix scope of grammar for LabVIEW 2015-08-14 14:17:32 +02:00
Paul Chaignon
55aafa416d Highlight LabVIEW files as XML 2015-08-14 13:56:28 +02:00
Paul Chaignon
6226a46988 Recognize the Mozilla Public License for grammars 2015-08-13 17:35:01 +02:00
Paul Chaignon
8d216f0c43 Grammar for Prolog from Sublime Text package
MPLv2 License is currently unrecognized
2015-08-13 17:15:04 +02:00
Paul Chaignon
7f5bb25542 Remove grammar for Prolog from TextMate bundle 2015-08-13 16:56:21 +02:00
Arfon Smith
5fcdf6adc2 Merge pull request #2555 from larsbrinkhoff/zone
Add .arpa to DNS Zone.
2015-08-13 10:18:51 +01:00
Lars Brinkhoff
6a565a849b Add .arpa to DNS Zone.
sample.arpa by Freeswitch project; Mozilla Public License 1.1.
2015-08-13 07:38:17 +02:00
Arfon Smith
66fc67e34c Merge pull request #2554 from github/mcandre-master
DNS Zone files
2015-08-12 18:54:04 +01:00
Arfon Smith
7cf140940e Fixing up the build 2015-08-12 17:20:29 +01:00
Arfon Smith
60e90bab23 Removing un-used Bind bundle 2015-08-12 17:06:39 +01:00
Arfon Smith
4f58258186 Removing erroneous submodule 2015-08-12 17:05:57 +01:00
Arfon Smith
03e2904ebf Merge branch 'master' of https://github.com/mcandre/linguist into mcandre-master 2015-08-12 15:15:28 +01:00
Andrew Pennebaker
bea90b256e use st2-zonefile (MIT licensed) instead of Bind.tmbundle (unlicensed) 2015-08-11 14:01:45 -05:00
Arfon Smith
8eb37ba956 Merge pull request #2541 from github/reworking-documentation-paths
Reworking documentation paths
2015-08-11 09:44:19 +01:00
Arfon Smith
8d20c1fb59 More inclusive documentation matches for License and Readme files 2015-08-11 09:39:41 +01:00
Arfon Smith
9a1abf0c49 Merge branch 'master' into reworking-documentation-paths 2015-08-11 09:10:08 +01:00
Arfon Smith
5aae7a4000 Merge pull request #2552 from github/cut-release-v4.5.13
Bumping version to v4.5.13
2015-08-11 07:50:53 +01:00
Arfon Smith
d9509a1750 Bumping version to v4.5.13 2015-08-11 07:01:05 +01:00
Arfon Smith
978c448fb8 Merge pull request #2551 from github/name-checkin
Catching one more edge case
2015-08-11 06:53:25 +01:00
Arfon Smith
997c0fca10 Catching one more edge case 2015-08-11 06:48:54 +01:00
Arfon Smith
3ae6e68492 Merge pull request #2549 from github/name-checkin
Don't blow up if empty string/nil passed to alias methods
2015-08-10 22:17:07 +01:00
Arfon Smith
851c93a1f7 Don't blow up if empty string/nil passed to alias methods 2015-08-10 22:07:28 +01:00
Arfon Smith
a5f7355e16 Merge pull request #2547 from github/grammars
Grammar updates
2015-08-10 15:10:20 +01:00
Arfon Smith
18ffdbaa65 Grammar updates 2015-08-10 15:07:27 +01:00
Arfon Smith
c089222bc6 Merge pull request #2545 from pchaigno/yaml-tmlanguage
YAML extensions for TextMate and Sublime Text grammars
2015-08-10 09:54:20 +01:00
Arfon Smith
37f9535d27 Merge pull request #2546 from ammaraskar/master
Make regex for vim modeline more lenient
2015-08-10 09:53:46 +01:00
Ammar Askar
4650368bc2 Make regex for vim modeline more lenient
This change allows the filetype/language to be retrieved from more complex vim modelines. The current regex only allows a `set` line that contains the filetype/ft parameter and nothing else.
2015-08-10 00:42:14 -05:00
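A lenient matcher in the spirit of this change might look like the following (a sketch under assumed syntax rules, not Linguist's actual regex):

```ruby
# Sketch of a lenient vim modeline matcher: tolerates other options
# around filetype/ft/syntax instead of requiring a lone "set ft=..." line.
MODELINE = /(?:vim?|ex):\s*(?:set\s+)?(?:[^:\s]+\s+)*(?:filetype|ft|syntax)=(\w+)/

MODELINE.match("# vim: set tabstop=4 ft=ruby noexpandtab:")[1]  # => "ruby"
MODELINE.match("/* vim: ft=c */")[1]                            # => "c"
```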
Paul Chaignon
88b14ed455 .syntax extension for YAML 2015-08-09 14:13:48 +02:00
Paul Chaignon
54a2a47bc0 YAML-tmLanguage extension for YAML 2015-08-09 14:11:01 +02:00
Arfon Smith
ffcc970140 Merge pull request #2542 from github/brewfile
Highlight Brewfile as Ruby
2015-08-07 22:26:55 +01:00
Joshua Peek
7a811e39e0 Add sample Brewfile 2015-08-07 14:01:08 -07:00
Joshua Peek
11f158cbb3 Highlight Brewfile as Ruby 2015-08-07 11:12:30 -07:00
Arfon Smith
5d5550c48b Moving vendored definitions to documentation 2015-08-07 10:42:31 +01:00
Arfon Smith
fd570d906a Adding examples path to documentation.yml 2015-08-07 10:35:18 +01:00
Arfon Smith
deab0662f9 Merge pull request #2447 from Ryman/rustup
Split on comma in language name if no match is found
2015-08-07 10:28:21 +01:00
Arfon Smith
7238f50a6b Merge pull request #2539 from rji/puppet-lang-updates
Updates for the Puppet language
2015-08-07 10:25:32 +01:00
Arfon Smith
499fcd1f3f Merge pull request #2540 from pchaigno/makefile.inc
Filename Makefile.inc for Makefile
2015-08-06 09:33:55 +01:00
Paul Chaignon
dc0ddc82d6 Filename Makefile.inc for Makefile 2015-08-06 10:08:30 +02:00
Arfon Smith
436fc34cb9 Merge pull request #2538 from BerkeleyTrue/patch-1
Add codemirror's demo directory
2015-08-06 09:01:01 +01:00
Roger Ignazio
f072cd96e3 Add hiera_include() sample for the Puppet language
Prior to this commit, some Puppet files were being incorrectly
identified as Pascal when they contained only the following content:

  hiera_include('classes')

This commit adds a hiera_include() sample for the Puppet language to
correct this behavior.
2015-08-05 14:02:09 -07:00
Roger Ignazio
3441a001c7 Modify Puppet color based on style guide
Prior to this commit, the Puppet language was colored to #332A77. The
Puppet Labs style guide (https://puppetlabs.com/styleguide/brand)
specifies Puppet Dark Purple to be #302B6D. Alternately, Puppet Purple,
a lighter variant, may be used: #7C6AAB.

Keeping with the dark purple theme, this commit modifies the Puppet
language to use Puppet Dark Purple, hex #302B6D.
2015-08-05 14:00:08 -07:00
Berkeley Martinez
bc747844ea Add codemirror's demo directory
This accounts for roughly 150 HTML files throwing off the statistics of one of my projects
2015-08-05 12:22:21 -07:00
Arfon Smith
a887f58bcc Merge pull request #2537 from imsys/xbase-extra
#2504 - xBase aliases and extension .prw
2015-08-05 14:06:24 +01:00
Arthur Helfstein Fragoso
f42afef6e0 order - .prw should come after .ch
1) Failure:
TestPedantic#test_extensions_are_sorted
[/home/arthur/Projects/linguist/test/test_pedantic.rb:15]:
.prw should come after .ch
2015-08-05 05:58:02 -03:00
Arthur Helfstein Fragoso
18eaf22cb9 Added xBase/AdvPL sample file 2015-08-05 05:19:41 -03:00
Arthur Helfstein Fragoso
d94f427e12 xBase: Add aliases and extension .prw
xBase: Add aliases:
* advpl
* clipper
* foxpro

And the extension .prw
2015-08-05 05:06:11 -03:00
Arfon Smith
b94eb42db6 Merge pull request #2536 from github/slim-grammar
Adding Slim tm_scope
2015-08-04 22:04:25 +01:00
Arfon Smith
d2297f5516 Adding Slim tm_scope 2015-08-04 21:56:11 +01:00
Arfon Smith
ef6f58b828 Merge pull request #2535 from pchaigno/heuristics-case-insensitive
Case-insensitive extension match for heuristic rules
2015-08-04 21:35:54 +01:00
Paul Chaignon
eb0bf16cce Case-insensitive extension match for heuristic rules 2015-08-04 17:28:52 +02:00
Arfon Smith
ca51415540 Merge pull request #2534 from github/cut-release-v4.5.11
Bumping version to v4.5.11
2015-08-04 14:05:10 +01:00
Arfon Smith
8ae32e1d47 Bumping version to v4.5.11 2015-08-04 13:29:52 +01:00
Arfon Smith
0a6165c4d9 Updating csharp scopes 2015-08-04 13:27:21 +01:00
Arfon Smith
cf8521a629 Grammar updates 2015-08-04 13:18:27 +01:00
Arfon Smith
b11c7f3dc0 Merge pull request #2513 from yyx990803/master
add syntax highlight for *.vue component files
2015-08-04 12:47:12 +01:00
Arfon Smith
01151aad5c Merge pull request #2533 from github/rrebol
Adding back R/Rebol heuristics
2015-08-04 12:45:44 +01:00
Arfon Smith
6b283068a9 Adding back R/Rebol heuristics 2015-08-04 12:20:15 +01:00
Arfon Smith
ccd7d4d89d Merge pull request #2532 from github/heuristic-fixes
Fixing up some new heuristics
2015-08-04 12:11:17 +01:00
Arfon Smith
208ec3906f Fixing up some new heuristics 2015-08-04 12:06:41 +01:00
Arfon Smith
84d4fccb4d Merge pull request #2441 from pchaigno/associate-heuristic-with-extension
Associate heuristic rules with file extensions
2015-08-04 12:00:47 +01:00
Arfon Smith
8d8ea959ee Merge pull request #2527 from radeksimko/hcl-as-ruby
Parse HCL as Ruby, not JavaScript
2015-07-30 14:42:46 +01:00
Radek Simko
1c73db499f Parse HCL as Ruby, not JavaScript 2015-07-30 11:31:06 +01:00
Arfon Smith
16a4b4947f Merge pull request #2526 from github/cut-releasev4.5.10
Bumping to v4.5.10
2015-07-29 15:21:31 +01:00
Arfon Smith
4b2abb2064 Bumping to v4.5.10 2015-07-29 14:50:11 +01:00
Arfon Smith
c581b6a5a7 Merge pull request #2525 from github/grammars-update
Grammars update
2015-07-29 14:37:08 +01:00
Arfon Smith
4c66582f87 Grammars update 2015-07-29 14:27:35 +01:00
Arfon Smith
11388a5355 Merge pull request #1899 from sethvargo/patch-1
Add HCL to languages.yml
2015-07-29 14:14:38 +01:00
Arfon Smith
24ca98b1a3 Merge pull request #2524 from github/more-encompassing-number-skips
More encompassing number skips
2015-07-29 14:07:04 +01:00
Arfon Smith
90a293727d Merge branch 'master' into more-encompassing-number-skips 2015-07-29 13:54:51 +01:00
Arfon Smith
e869f6c173 Merge pull request #2438 from edm00se/master
add support for XPages
2015-07-25 17:52:01 +01:00
Evan You
5b187d1f20 update vue-syntax-highlight version 2015-07-24 23:40:43 -04:00
Evan You
7b5d1c075d add syntax highlight for *.vue component files 2015-07-21 18:51:55 -04:00
Arfon Smith
07173d2238 Merge pull request #2515 from github/xbase-ch
xBase .ch
2015-07-21 15:26:45 +01:00
Arfon Smith
6b747f7d65 Adding Charity and heuristic for xBase .ch files 2015-07-21 14:59:47 +01:00
Arfon Smith
aef19d72f9 Merge branch 'master' into xbase-ch 2015-07-21 13:55:59 +01:00
Arfon Smith
e1a661bffc Merge pull request #2506 from hdgarrood/master
Use a different grammar for PureScript
2015-07-21 13:37:05 +01:00
Arfon Smith
560f9b15d7 Merge pull request #2510 from joelparkerhenderson/master
Add documentation categorization for CHANGELOG et. al.
2015-07-20 16:30:57 +01:00
Paul Chaignon
452fc59d4f Merge branch 'master' into associate-heuristic-with-extension 2015-07-20 13:08:13 +02:00
Joel Parker Henderson
682cc2d82d Add documentation categorization for CHANGELOG et. al. 2015-07-19 19:36:12 -06:00
Harry Garrood
29197736c7 Use a different grammar for PureScript 2015-07-19 09:18:09 +01:00
Arfon Smith
e1dbd68713 Merge branch 'master' of github.com:github/linguist 2015-07-16 14:43:21 -07:00
Arfon Smith
0ecb865797 Merge branch 'rpavlick-master' 2015-07-16 14:40:52 -07:00
Arfon Smith
1ced06483e Merge branch 'master' of https://github.com/rpavlick/linguist into rpavlick-master 2015-07-16 14:40:21 -07:00
Garen Torikian
861cee33d5 Merge pull request #2502 from vszakats/patch-1
CONTRIBUTING.md: minor URL cleanups
2015-07-15 15:40:51 -07:00
Viktor Szakats
6b882438b0 CONTRIBUTING.md: minor URL cleanups
use `https://` and `.svg`, follow a redirect.
2015-07-15 13:39:27 +02:00
Eric McCormick
87eb4577ea trying this again 2015-07-13 21:31:02 -05:00
Eric McCormick
7563bf43e9 Revert "removed samples"
This reverts commit ce8cfed7ff.
2015-07-13 21:28:36 -05:00
Eric McCormick
ce8cfed7ff removed samples 2015-07-13 21:22:54 -05:00
Eric McCormick
8742de9a88 pulling .xsp, .form, .view from PR
Per comment in [PR 2438](https://github.com/github/linguist/pull/2438#issuecomment-120588670), pulling all but .xsp-config and .xsp.metadata for uniqueness and non-overlapping assignment based on file extension. This should be set / overridden in the `.gitattributes` file, [as demonstrated here](https://github.com/edm00se/AnAppOfIceAndFire/blob/master/.gitattributes), to assign the XPages language to the project files.
2015-07-13 21:11:06 -05:00
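The override mechanism referenced in that comment uses git attributes; for example, to force the XPages language onto these extensions in a given repository (illustrative patterns):

```gitattributes
*.xsp  linguist-language=XPages
*.form linguist-language=XPages
*.view linguist-language=XPages
```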
Viktor Szakats
4dcdb0c79c xBase: add .ch extension + sample 2015-07-13 10:32:08 +02:00
rpavlick
2d392581e2 adding NCL language 2015-07-09 07:17:01 -07:00
Paul Chaignon
25d160e850 Merge branch 'master' into associate-heuristic-with-extension 2015-07-04 23:03:32 +02:00
Paul Chaignon
e688c865bc Merge branch 'master' into associate-heuristic-with-extension 2015-07-04 22:48:06 +02:00
Paul Chaignon
8bf1defdc1 Merge branch 'master' into associate-heuristic-with-extension
Conflicts:
	lib/linguist/heuristics.rb
2015-06-18 21:54:59 +02:00
Kevin Butler
bc8d65e7d3 Add 1.0 rust sample and add file with extern crate usage 2015-06-10 17:58:36 +01:00
Kevin Butler
3180c5d554 Allow delimiting by comma in the language name 2015-06-10 15:37:31 +01:00
Paul Chaignon
be122ca1a5 Fix test for Perl heuristic
Improve heuristic rule for Perl6
Separate heuristic rules for .pl (with Prolog) and .pm (without Prolog)
2015-06-06 19:55:04 +02:00
Paul Chaignon
b05f6f0018 Test for the new heuristic definitions 2015-06-06 18:49:36 +02:00
Paul Chaignon
e811021806 Fix tests to use the correct extension in heuristic rules
Extend a few tests where only one file was tested for a language
2015-06-06 18:27:59 +02:00
Paul Chaignon
656f4f440d Several extensions can be associated to a heuristic rule 2015-06-06 17:44:02 +02:00
Paul Chaignon
7fb62de4d7 Associate each heuristic rule to a file extension 2015-06-06 15:37:41 +02:00
Eric McCormick
26a5325dc3 undoing color
apparently a bad idea, Travis CI didn't like it
2015-06-05 11:04:34 -05:00
Eric McCormick
4881e0aa51 added color to languages.yml 2015-06-05 10:36:40 -05:00
Eric McCormick
743f7c76de resorted entry in languages.yml to be consistent with other blocks 2015-06-05 06:19:02 -05:00
Eric McCormick
f8ef01f557 updated tm_scope to none in languages.yml
It looks like it helps to read the comments at the top of the file, who knew!
2015-06-05 06:13:36 -05:00
Eric McCormick
402fa5c2cd added navbar.xsp, xsp-config, xsp.metadata for completeness of file extensions and corrected xsp.metadata extension in languages.yml 2015-06-04 22:51:55 -05:00
Eric McCormick
5ac1e847a5 samples added for XPages design elements (.xsp) and affiliated NSF-based elements (.form, .view)
- demoServerRESTconsumption, src: https://gist.github.com/edm00se/15249dba8ff3fea38312, license: Creative Commons 3.0
- house.form, houses.view, house.xsp, src: https://github.com/edm00se/AnAppOfIceAndFire, license: Creative Commons 3.0
- UnpMainxsp, src: https://github.com/teamstudio/xcontrols-domino/blob/master/sampler-app/XPages/UnpMain.xsp, license: Apache v2.0
- xLogin.xsp, src: http://openntf.org/XSnippets.nsf/snippet.xsp?id=xpages-form-login-with-session-variable, license: Apache 2.0
2015-06-04 22:09:49 -05:00
Eric McCormick
0737a21e38 XPages added as programming language
XPages design element extensions and NSF-based design elements (Form, View) extensions added (all readable when marked up using XML settings for ACE); .jss (Domino SSJS) already aliased under JavaScript
2015-06-04 21:37:58 -05:00
Andrew Pennebaker
03369b8a6c use https url for travis support 2015-05-03 12:51:52 -05:00
Andrew Pennebaker
3b2ddb1a18 classify DNS zone as a data format 2015-05-01 11:25:16 -05:00
Andrew Pennebaker
1e20b12241 zone: add ace_mode 2015-04-30 15:40:44 -05:00
Andrew Pennebaker
81c41df15c zone: add tm_scope 2015-04-30 15:39:32 -05:00
Andrew Pennebaker
8b736189e0 better name for DNS zone 2015-04-30 15:34:43 -05:00
Andrew Pennebaker
188d2367df add sample zone file 2015-04-30 15:25:52 -05:00
Andrew Pennebaker
5aeac500da list zone in grammars.yml 2015-04-30 15:24:46 -05:00
Andrew Pennebaker
5730ab28ab list zone in languages.yml 2015-04-30 15:23:31 -05:00
Andrew Pennebaker
1c56b03a28 highlight DNS zone (BIND) files 2015-04-30 15:14:26 -05:00
Stefan Johnson
885b5aab41 Changed tokenizer number literals to be more encompassing
Number-literal skipping now also covers hexadecimal and C-style literals.
2015-02-20 14:08:39 +11:00
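A broader skip of this kind can be sketched with a single regex covering decimal, hexadecimal, and C-style suffixed literals (an illustration, not the tokenizer's actual pattern):

```ruby
# Matches decimal, float, hex, and C-style suffixed number literals,
# so a tokenizer can drop them before classification.
NUMBER = /\b0[xX][0-9a-fA-F]+[uUlL]*\b|\b\d+(?:\.\d+)?(?:[eE][+-]?\d+)?[uUlLfF]*\b/

"i = 0xFF + 100UL + 3.14f".scan(NUMBER)  # => ["0xFF", "100UL", "3.14f"]
```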
Seth Vargo
5217f19faa Alphabetize order 2014-12-18 17:53:53 -05:00
Seth Vargo
296d170ba9 Add sample for HCL 2014-12-18 17:52:30 -05:00
Seth Vargo
a97fd74399 Add HCL to languages.yml
More information on HCL: https://github.com/hashicorp/hcl
2014-12-18 17:46:26 -05:00
154 changed files with 9031 additions and 460 deletions

33
.gitmodules vendored
View File

@@ -169,9 +169,6 @@
 [submodule "vendor/grammars/sublime-idris"]
 	path = vendor/grammars/sublime-idris
 	url = https://github.com/laughedelic/sublime-idris
-[submodule "vendor/grammars/sublime-better-typescript"]
-	path = vendor/grammars/sublime-better-typescript
-	url = https://github.com/lavrton/sublime-better-typescript
 [submodule "vendor/grammars/moonscript-tmbundle"]
 	path = vendor/grammars/moonscript-tmbundle
 	url = https://github.com/leafo/moonscript-tmbundle
@@ -397,9 +394,6 @@
 [submodule "vendor/grammars/processing.tmbundle"]
 	path = vendor/grammars/processing.tmbundle
 	url = https://github.com/textmate/processing.tmbundle
-[submodule "vendor/grammars/prolog.tmbundle"]
-	path = vendor/grammars/prolog.tmbundle
-	url = https://github.com/textmate/prolog.tmbundle
 [submodule "vendor/grammars/python-django.tmbundle"]
 	path = vendor/grammars/python-django.tmbundle
 	url = https://github.com/textmate/python-django.tmbundle
@@ -659,3 +653,30 @@
 [submodule "vendor/grammars/language-xbase"]
 	path = vendor/grammars/language-xbase
 	url = https://github.com/hernad/atom-language-harbour
+[submodule "vendor/grammars/language-ncl"]
+	path = vendor/grammars/language-ncl
+	url = https://github.com/rpavlick/language-ncl.git
+[submodule "vendor/grammars/atom-language-purescript"]
+	path = vendor/grammars/atom-language-purescript
+	url = https://github.com/freebroccolo/atom-language-purescript
+[submodule "vendor/grammars/vue-syntax-highlight"]
+	path = vendor/grammars/vue-syntax-highlight
+	url = https://github.com/vuejs/vue-syntax-highlight
+[submodule "vendor/grammars/st2-zonefile"]
+	path = vendor/grammars/st2-zonefile
+	url = https://github.com/sixty4k/st2-zonefile
+[submodule "vendor/grammars/sublimeprolog"]
+	path = vendor/grammars/sublimeprolog
+	url = https://github.com/alnkpa/sublimeprolog
+[submodule "vendor/grammars/sublime-aspectj"]
+	path = vendor/grammars/sublime-aspectj
+	url = https://github.com/pchaigno/sublime-aspectj
+[submodule "vendor/grammars/sublime-typescript"]
+	path = vendor/grammars/sublime-typescript
+	url = https://github.com/Microsoft/TypeScript-Sublime-Plugin
+[submodule "vendor/grammars/X10"]
+	path = vendor/grammars/X10
+	url = git@github.com:x10-lang/x10-highlighting.git
+[submodule "vendor/grammars/language-babel"]
+	path = vendor/grammars/language-babel
+	url = https://github.com/gandm/language-babel

View File

@@ -58,7 +58,7 @@ Syntax highlighting in GitHub is performed using TextMate-compatible grammars. T
 Assuming your code is being detected as the right language, in most cases this is due to a bug in the language grammar rather than a bug in Linguist. [`grammars.yml`][grammars] lists all the grammars we use for syntax highlighting on github.com. Find the one corresponding to your code's programming language and submit a bug report upstream. If you can, try to reproduce the highlighting problem in the text editor that the grammar is designed for (TextMate, Sublime Text, or Atom) and include that information in your bug report.
-You can also try to fix the bug yourself and submit a Pull Request. [TextMate's documentation](http://manual.macromates.com/en/language_grammars) offers a good introduction on how to work with TextMate-compatible grammars. You can test grammars using [Lightshow](https://github-lightshow.herokuapp.com).
+You can also try to fix the bug yourself and submit a Pull Request. [TextMate's documentation](https://manual.macromates.com/en/language_grammars) offers a good introduction on how to work with TextMate-compatible grammars. You can test grammars using [Lightshow](https://github-lightshow.herokuapp.com).
 Once the bug has been fixed upstream, we'll pick it up for GitHub in the next release of Linguist.
@@ -74,9 +74,9 @@ To run the tests:
 bundle exec rake test
-Sometimes getting the tests running can be too much work, especially if you don't have much Ruby experience. It's okay: be lazy and let our build bot [Travis](http://travis-ci.org/#!/github/linguist) run the tests for you. Just open a pull request and the bot will start cranking away.
+Sometimes getting the tests running can be too much work, especially if you don't have much Ruby experience. It's okay: be lazy and let our build bot [Travis](https://travis-ci.org/#!/github/linguist) run the tests for you. Just open a pull request and the bot will start cranking away.
-Here's our current build status: [![Build Status](https://secure.travis-ci.org/github/linguist.png?branch=master)](http://travis-ci.org/github/linguist)
+Here's our current build status: [![Build Status](https://api.travis-ci.org/github/linguist.svg?branch=master)](https://travis-ci.org/github/linguist)
 ## Releasing

View File

@@ -13,10 +13,10 @@ See [Troubleshooting](#troubleshooting) and [`CONTRIBUTING.md`](/CONTRIBUTING.md
![language stats bar](https://cloud.githubusercontent.com/assets/173/5562290/48e24654-8ddf-11e4-8fe7-735b0ce3a0d3.png)
The Language stats bar is built by aggregating the languages of each file in that repository. If it is reporting a language that you don't expect:
The Language stats bar displays languages percentages for the files in the repository. The percentages are calculated based on the bytes of code for each language as reported by the [List Languages](https://developer.github.com/v3/repos/#list-languages) API. If the bar is reporting a language that you don't expect:
0. Click on the name of the language in the stats bar to see a list of the files that are identified as that language.
0. If you see files that you didn't write, consider moving the files into one of the [paths for vendored code](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
0. If you see files that you didn't write, consider moving the files into one of the [paths for vendored code](/lib/linguist/vendor.yml), or use the [manual overrides](#overrides) feature to ignore them.
0. If the files are being misclassified, search for [open issues][issues] to see if anyone else has already reported the issue. Any information you can add, especially links to public repositories, is helpful.
0. If there are no reported issues of this misclassification, [open an issue][new-issue] and include a link to the repository or a sample of the code that is being misclassified.


@@ -62,7 +62,7 @@ namespace :benchmark do
corpus = File.expand_path(ENV["CORPUS"] || "samples")
require 'linguist/language'
require 'linguist'
results = Hash.new
Dir.glob("#{corpus}/**/*").each do |file|

bin/git-linguist Executable file

@@ -0,0 +1,137 @@
#!/usr/bin/env ruby
require 'linguist'
require 'rugged'
require 'optparse'
require 'json'
require 'tmpdir'
require 'zlib'
class GitLinguist
def initialize(path, commit_oid, incremental = true)
@repo_path = path
@commit_oid = commit_oid
@incremental = incremental
end
def linguist
if @commit_oid.nil?
raise "git-linguist must be called with a specific commit OID to perform language computation"
end
repo = Linguist::Repository.new(rugged, @commit_oid)
if @incremental && stats = load_language_stats
old_commit_oid, old_stats = stats
# A cache with NULL oid means that we want to freeze
# these language stats in place and stop computing
# them (for performance reasons)
return old_stats if old_commit_oid == NULL_OID
repo.load_existing_stats(old_commit_oid, old_stats)
end
result = yield repo
save_language_stats(@commit_oid, repo.cache)
result
end
def load_language_stats
version, oid, stats = load_cache
if version == LANGUAGE_STATS_CACHE_VERSION && oid && stats
[oid, stats]
end
end
def save_language_stats(oid, stats)
cache = [LANGUAGE_STATS_CACHE_VERSION, oid, stats]
write_cache(cache)
end
def clear_language_stats
File.unlink(cache_file)
end
def disable_language_stats
save_language_stats(NULL_OID, {})
end
protected
NULL_OID = ("0" * 40).freeze
LANGUAGE_STATS_CACHE = 'language-stats.cache'
LANGUAGE_STATS_CACHE_VERSION = "v3:#{Linguist::VERSION}"
def rugged
@rugged ||= Rugged::Repository.bare(@repo_path)
end
def cache_file
File.join(@repo_path, LANGUAGE_STATS_CACHE)
end
def write_cache(object)
return unless File.directory? @repo_path
begin
tmp_path = Dir::Tmpname.make_tmpname(cache_file, nil)
File.open(tmp_path, "wb") do |f|
marshal = Marshal.dump(object)
f.write(Zlib::Deflate.deflate(marshal))
end
File.rename(tmp_path, cache_file)
rescue => e
(File.unlink(tmp_path) rescue nil)
raise e
end
end
def load_cache
marshal = File.open(cache_file, "rb") { |f| Zlib::Inflate.inflate(f.read) }
Marshal.load(marshal)
rescue SystemCallError, ::Zlib::DataError, ::Zlib::BufError, TypeError
nil
end
end
def git_linguist(args)
incremental = true
commit = nil
parser = OptionParser.new do |opts|
opts.banner = "Usage: git-linguist [OPTIONS] stats|breakdown|dump-cache|clear|disable"
opts.on("-f", "--force", "Force a full rescan") { incremental = false }
opts.on("--commit=COMMIT", "Commit to index") { |v| commit = v}
end
parser.parse!(args)
git_dir = `git rev-parse --git-dir`.strip
raise "git-linguist must be run in a Git repository" unless $?.success?
wrapper = GitLinguist.new(git_dir, commit, incremental)
case args.pop
when "stats"
wrapper.linguist do |linguist|
puts JSON.dump(linguist.languages)
end
when "breakdown"
wrapper.linguist do |linguist|
puts JSON.dump(linguist.breakdown_by_file)
end
when "dump-cache"
puts JSON.dump(wrapper.load_language_stats)
when "clear"
wrapper.clear_language_stats
when "disable"
wrapper.disable_language_stats
else
$stderr.print(parser.help)
exit 1
end
end
git_linguist(ARGV)
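The caching scheme used by `write_cache` and `load_cache` above (a versioned array, Marshal-dumped, Zlib-deflated, and swapped into place with an atomic rename so readers never see a partial file) can be exercised in isolation. This sketch uses stand-in names and a temp directory, not Linguist's API:

```ruby
require 'zlib'
require 'tmpdir'

# Stand-in for "v3:#{Linguist::VERSION}"; bumping it invalidates old caches.
CACHE_VERSION = "v3:example"

# Dump, deflate, write to a temp file, then rename atomically, mirroring
# GitLinguist#write_cache above.
def write_cache(path, oid, stats)
  tmp_path = "#{path}.tmp"
  File.open(tmp_path, "wb") do |f|
    f.write(Zlib::Deflate.deflate(Marshal.dump([CACHE_VERSION, oid, stats])))
  end
  File.rename(tmp_path, path)
end

# Read the cache back; a missing file, corruption, or a version mismatch
# yields nil, which forces a full rescan rather than crashing.
def read_cache(path)
  version, oid, stats = Marshal.load(Zlib::Inflate.inflate(File.binread(path)))
  [oid, stats] if version == CACHE_VERSION && oid && stats
rescue SystemCallError, Zlib::DataError, Zlib::BufError, TypeError
  nil
end

path = File.join(Dir.mktmpdir, "language-stats.cache")
write_cache(path, "0" * 40, { "Ruby" => 1234 })
p read_cache(path)  # the [oid, stats] pair round-trips
```

The temp-file-plus-rename dance matters because git-linguist writes the cache directly into `$GIT_DIR`, where a concurrent reader must never observe a half-written file.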


@@ -10,8 +10,8 @@ Gem::Specification.new do |s|
s.homepage = "https://github.com/github/linguist"
s.license = "MIT"
s.files = Dir['lib/**/*'] - ['lib/linguist/grammars.rb']
s.executables << 'linguist'
s.files = Dir['lib/**/*'] - ['lib/linguist/grammars.rb'] + ['LICENSE']
s.executables = ['linguist', 'git-linguist']
s.add_dependency 'charlock_holmes', '~> 0.7.3'
s.add_dependency 'escape_utils', '~> 1.1.0'
@@ -24,4 +24,6 @@ Gem::Specification.new do |s|
s.add_development_dependency 'rake'
s.add_development_dependency 'yajl-ruby'
s.add_development_dependency 'color-proximity', '~> 0.2.1'
s.add_development_dependency 'licensee', '~> 4.7.4'
end


@@ -144,6 +144,8 @@ vendor/grammars/VBDotNetSyntax:
- source.vbnet
vendor/grammars/Vala-TMBundle:
- source.vala
vendor/grammars/X10:
- source.x10
vendor/grammars/abap.tmbundle:
- source.abap
vendor/grammars/actionscript3-tmbundle:
@@ -176,6 +178,8 @@ vendor/grammars/assembly.tmbundle:
- source.x86asm
vendor/grammars/atom-fsharp/:
- source.fsharp
vendor/grammars/atom-language-purescript/:
- source.purescript
vendor/grammars/atom-salt:
- source.python.salt
- source.yaml.salt
@@ -310,6 +314,9 @@ vendor/grammars/json.tmbundle:
- source.json
vendor/grammars/kotlin-sublime-package:
- source.Kotlin
vendor/grammars/language-babel/:
- source.js.jsx
- source.regexp.babel
vendor/grammars/language-clojure:
- source.clojure
vendor/grammars/language-coffee-script:
@@ -318,6 +325,7 @@ vendor/grammars/language-coffee-script:
vendor/grammars/language-crystal:
- source.crystal
vendor/grammars/language-csharp:
- source.cake
- source.cs
- source.csx
- source.nant-build
@@ -328,9 +336,12 @@ vendor/grammars/language-hy:
vendor/grammars/language-javascript:
- source.js
- source.js.regexp
- source.js.regexp.replacement
vendor/grammars/language-jsoniq/:
- source.jq
- source.xq
vendor/grammars/language-ncl:
- source.ncl
vendor/grammars/language-python:
- source.python
- source.regexp.python
@@ -426,8 +437,6 @@ vendor/grammars/powershell:
- source.powershell
vendor/grammars/processing.tmbundle:
- source.processing
vendor/grammars/prolog.tmbundle:
- source.prolog
vendor/grammars/protobuf-tmbundle:
- source.protobuf
vendor/grammars/puppet-textmate-bundle:
@@ -465,6 +474,8 @@ vendor/grammars/smalltalk-tmbundle:
- source.smalltalk
vendor/grammars/sql.tmbundle:
- source.sql
vendor/grammars/st2-zonefile:
- text.zone_file
vendor/grammars/standard-ml.tmbundle:
- source.cm
- source.ml
@@ -472,10 +483,10 @@ vendor/grammars/sublime-MuPAD:
- source.mupad
vendor/grammars/sublime-apl/:
- source.apl
vendor/grammars/sublime-aspectj/:
- source.aspectj
vendor/grammars/sublime-befunge:
- source.befunge
vendor/grammars/sublime-better-typescript:
- source.ts
vendor/grammars/sublime-bsv:
- source.bsv
vendor/grammars/sublime-cirru:
@@ -514,6 +525,9 @@ vendor/grammars/sublime-text-ox/:
- source.ox
vendor/grammars/sublime-text-pig-latin/:
- source.pig_latin
vendor/grammars/sublime-typescript/:
- source.ts
- source.tsx
vendor/grammars/sublime-varnish:
- source.varnish.vcl
vendor/grammars/sublime_cobol:
@@ -524,6 +538,9 @@ vendor/grammars/sublime_cobol:
vendor/grammars/sublime_man_page_support:
- source.man
- text.groff
vendor/grammars/sublimeprolog/:
- source.prolog
- source.prolog.eclipse
vendor/grammars/sublimetext-cuda-cpp:
- source.cuda-c++
vendor/grammars/swift.tmbundle:
@@ -540,6 +557,8 @@ vendor/grammars/turtle.tmbundle:
- source.turtle
vendor/grammars/verilog.tmbundle:
- source.verilog
vendor/grammars/vue-syntax-highlight:
- text.html.vue
vendor/grammars/x86-assembly-textmate-bundle:
- source.asm.x86
vendor/grammars/xc.tmbundle/:


@@ -13,11 +13,18 @@
- (^|/)[Dd]ocumentation/
- (^|/)javadoc/
- ^man/
- ^[Ee]xamples/
## Documentation files ##
- (^|/)CHANGE(S|LOG)?(\.|$)
- (^|/)CONTRIBUTING(\.|$)
- (^|/)COPYING(\.|$)
- (^|/)INSTALL(\.|$)
- (^|/)LICEN[CS]E(\.|$)
- (^|/)[Ll]icen[cs]e(\.|$)
- (^|/)README(\.|$)
- (^|/)[Rr]eadme(\.|$)
# Samples folders
- ^[Ss]amples/


@@ -241,22 +241,26 @@ module Linguist
return lines[0].include?("Code generated by")
end
PROTOBUF_EXTENSIONS = ['.py', '.java', '.h', '.cc', '.cpp']
# Internal: Is the blob a C++, Java or Python source file generated by the
# Protocol Buffer compiler?
#
# Returns true or false.
def generated_protocol_buffer?
return false unless ['.py', '.java', '.h', '.cc', '.cpp'].include?(extname)
return false unless PROTOBUF_EXTENSIONS.include?(extname)
return false unless lines.count > 1
return lines[0].include?("Generated by the protocol buffer compiler. DO NOT EDIT!")
end
APACHE_THRIFT_EXTENSIONS = ['.rb', '.py', '.go', '.js', '.m', '.java', '.h', '.cc', '.cpp']
# Internal: Is the blob generated by Apache Thrift compiler?
#
# Returns true or false
def generated_apache_thrift?
return false unless ['.rb', '.py', '.go', '.js', '.m', '.java', '.h', '.cc', '.cpp'].include?(extname)
return false unless APACHE_THRIFT_EXTENSIONS.include?(extname)
return false unless lines.count > 1
return lines[0].include?("Autogenerated by Thrift Compiler") || lines[1].include?("Autogenerated by Thrift Compiler")


@@ -13,11 +13,14 @@ module Linguist
# ])
#
# Returns an Array of languages, or empty if none matched or were inconclusive.
def self.call(blob, languages)
def self.call(blob, candidates)
data = blob.data
@heuristics.each do |heuristic|
return Array(heuristic.call(data)) if heuristic.matches?(languages)
if heuristic.matches?(blob.name)
languages = Array(heuristic.call(data))
return languages if languages.any? || languages.all? { |l| candidates.include?(l) }
end
end
[] # No heuristics matched
@@ -30,7 +33,7 @@ module Linguist
#
# Examples
#
# disambiguate "Perl", "Prolog" do |data|
# disambiguate ".pm" do |data|
# if data.include?("use strict")
# Language["Perl"]
# elsif /^[^#]+:-/.match(data)
@@ -38,22 +41,23 @@ module Linguist
# end
# end
#
def self.disambiguate(*languages, &heuristic)
@heuristics << new(languages, &heuristic)
def self.disambiguate(*extensions, &heuristic)
@heuristics << new(extensions, &heuristic)
end
# Internal: Array of defined heuristics
@heuristics = []
# Internal
def initialize(languages, &heuristic)
@languages = languages
def initialize(extensions, &heuristic)
@extensions = extensions
@heuristic = heuristic
end
# Internal: Check if this heuristic matches the candidate languages.
def matches?(candidates)
candidates.any? && candidates.all? { |l| @languages.include?(l.name) }
def matches?(filename)
filename = filename.downcase
@extensions.any? { |ext| filename.end_with?(ext) }
end
# Internal: Perform the heuristic
@@ -62,99 +66,9 @@ module Linguist
end
# Common heuristics
ObjectiveCRegex = /^[ \t]*@(interface|class|protocol|property|end|synchronised|selector|implementation)\b/
ObjectiveCRegex = /^\s*(@(interface|class|protocol|property|end|synchronised|selector|implementation)\b|#import\s+.+\.h[">])/
disambiguate "BitBake", "BlitzBasic" do |data|
if /^\s*; /.match(data) || data.include?("End Function")
Language["BlitzBasic"]
elsif /^\s*(# |include|require)\b/.match(data)
Language["BitBake"]
end
end
disambiguate "C#", "Smalltalk" do |data|
if /![\w\s]+methodsFor: /.match(data)
Language["Smalltalk"]
elsif /^\s*namespace\s*[\w\.]+\s*{/.match(data) || /^\s*\/\//.match(data)
Language["C#"]
end
end
disambiguate "Objective-C", "C++", "C" do |data|
if ObjectiveCRegex.match(data)
Language["Objective-C"]
elsif (/^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>/.match(data) ||
/^\s*template\s*</.match(data) || /^[ \t]*try/.match(data) || /^[ \t]*catch\s*\(/.match(data) || /^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+/.match(data) || /^[ \t]*(private|public|protected):$/.match(data) || /std::\w+/.match(data))
Language["C++"]
end
end
disambiguate "Perl", "Perl6", "Prolog" do |data|
if data.include?("use v6")
Language["Perl6"]
elsif data.match(/use strict|use\s+v?5\./)
Language["Perl"]
elsif /^[^#]+:-/.match(data)
Language["Prolog"]
end
end
disambiguate "ECL", "Prolog" do |data|
if /^[^#]+:-/.match(data)
Language["Prolog"]
elsif data.include?(":=")
Language["ECL"]
end
end
disambiguate "IDL", "Prolog", "INI", "QMake" do |data|
if /^[^#]+:-/.match(data)
Language["Prolog"]
elsif data.include?("last_client=")
Language["INI"]
elsif data.include?("HEADERS") && data.include?("SOURCES")
Language["QMake"]
elsif /^\s*function[ \w,]+$/.match(data)
Language["IDL"]
end
end
disambiguate "GAP", "Scilab" do |data|
if (data.include?("gap> "))
Language["GAP"]
# Heads up - we don't usually write heuristics like this (with no regex match)
else
Language["Scilab"]
end
end
disambiguate "Common Lisp", "OpenCL", "Cool" do |data|
if /^\s*\((defun|in-package|defpackage) /i.match(data)
Language["Common Lisp"]
elsif /^class/x.match(data)
Language["Cool"]
elsif /\/\* |\/\/ |^\}/.match(data)
Language["OpenCL"]
end
end
disambiguate "Hack", "PHP" do |data|
if data.include?("<?hh")
Language["Hack"]
elsif /<?[^h]/.match(data)
Language["PHP"]
end
end
disambiguate "Scala", "SuperCollider" do |data|
if /\^(this|super)\./.match(data) || /^\s*(\+|\*)\s*\w+\s*{/.match(data) || /^\s*~\w+\s*=\./.match(data)
Language["SuperCollider"]
elsif /^\s*import (scala|java)\./.match(data) || /^\s*val\s+\w+\s*=/.match(data) || /^\s*class\b/.match(data)
Language["Scala"]
end
end
disambiguate "AsciiDoc", "AGS Script", "Public Key" do |data|
disambiguate ".asc" do |data|
if /^(----[- ]BEGIN|ssh-(rsa|dss)) /.match(data)
Language["Public Key"]
elsif /^[=-]+(\s|\n)|{{[A-Za-z]/.match(data)
@@ -164,7 +78,57 @@ module Linguist
end
end
disambiguate "FORTRAN", "Forth", "Formatted" do |data|
disambiguate ".bb" do |data|
if /^\s*; /.match(data) || data.include?("End Function")
Language["BlitzBasic"]
elsif /^\s*(# |include|require)\b/.match(data)
Language["BitBake"]
end
end
disambiguate ".ch" do |data|
if /^\s*#\s*(if|ifdef|ifndef|define|command|xcommand|translate|xtranslate|include|pragma|undef)\b/i.match(data)
Language["xBase"]
end
end
disambiguate ".cl" do |data|
if /^\s*\((defun|in-package|defpackage) /i.match(data)
Language["Common Lisp"]
elsif /^class/x.match(data)
Language["Cool"]
elsif /\/\* |\/\/ |^\}/.match(data)
Language["OpenCL"]
end
end
disambiguate ".cs" do |data|
if /![\w\s]+methodsFor: /.match(data)
Language["Smalltalk"]
elsif /^\s*namespace\s*[\w\.]+\s*{/.match(data) || /^\s*\/\//.match(data)
Language["C#"]
end
end
disambiguate ".d" do |data|
if /^module /.match(data)
Language["D"]
elsif /^((dtrace:::)?BEGIN|provider |#pragma (D (option|attributes)|ident)\s)/.match(data)
Language["DTrace"]
elsif /(\/.*:( .* \\)$| : \\$|^ : |: \\$)/.match(data)
Language["Makefile"]
end
end
disambiguate ".ecl" do |data|
if /^[^#]+:-/.match(data)
Language["ECLiPSe"]
elsif data.include?(":=")
Language["ECL"]
end
end
disambiguate ".for", ".f" do |data|
if /^: /.match(data)
Language["Forth"]
elsif /^([c*][^a-z]| (subroutine|program)\s|\s*!)/i.match(data)
@@ -172,7 +136,17 @@ module Linguist
end
end
disambiguate "F#", "Forth", "GLSL", "Filterscript" do |data|
disambiguate ".fr" do |data|
if /^(: |also |new-device|previous )/.match(data)
Language["Forth"]
elsif /^\s*(import|module|package|data|type) /.match(data)
Language["Frege"]
else
Language["Text"]
end
end
disambiguate ".fs" do |data|
if /^(: |new-device)/.match(data)
Language["Forth"]
elsif /^\s*(#light|import|let|module|namespace|open|type)/.match(data)
@@ -184,7 +158,48 @@ module Linguist
end
end
disambiguate "Limbo", "M", "MUF", "Mathematica", "Matlab", "Mercury", "Objective-C" do |data|
disambiguate ".gs" do |data|
Language["Gosu"] if /^uses java\./.match(data)
end
disambiguate ".h" do |data|
if ObjectiveCRegex.match(data)
Language["Objective-C"]
elsif (/^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>/.match(data) ||
/^\s*template\s*</.match(data) || /^[ \t]*try/.match(data) || /^[ \t]*catch\s*\(/.match(data) || /^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+/.match(data) || /^[ \t]*(private|public|protected):$/.match(data) || /std::\w+/.match(data))
Language["C++"]
end
end
disambiguate ".l" do |data|
if /\(def(un|macro)\s/.match(data)
Language["Common Lisp"]
elsif /^(%[%{}]xs|<.*>)/.match(data)
Language["Lex"]
elsif /^\.[a-z][a-z](\s|$)/i.match(data)
Language["Groff"]
elsif /^\((de|class|rel|code|data|must)\s/.match(data)
Language["PicoLisp"]
end
end
disambiguate ".ls" do |data|
if /^\s*package\s*[\w\.\/\*\s]*\s*{/.match(data)
Language["LoomScript"]
else
Language["LiveScript"]
end
end
disambiguate ".lsp", ".lisp" do |data|
if /^\s*\((defun|in-package|defpackage) /i.match(data)
Language["Common Lisp"]
elsif /^\s*\(define /.match(data)
Language["NewLisp"]
end
end
disambiguate ".m" do |data|
if ObjectiveCRegex.match(data)
Language["Objective-C"]
elsif data.include?(":- module")
@@ -202,45 +217,117 @@ module Linguist
end
end
disambiguate "Gosu", "JavaScript" do |data|
Language["Gosu"] if /^uses java\./.match(data)
end
disambiguate "LoomScript", "LiveScript" do |data|
if /^\s*package\s*[\w\.\/\*\s]*\s*{/.match(data)
Language["LoomScript"]
else
Language["LiveScript"]
disambiguate ".ml" do |data|
if /(^\s*module)|let rec |match\s+(\S+\s)+with/.match(data)
Language["OCaml"]
elsif /=> |case\s+(\S+\s)+of/.match(data)
Language["Standard ML"]
end
end
disambiguate "Common Lisp", "NewLisp" do |data|
if /^\s*\((defun|in-package|defpackage) /i.match(data)
Language["Common Lisp"]
elsif /^\s*\(define /.match(data)
Language["NewLisp"]
end
end
disambiguate "TypeScript", "XML" do |data|
if data.include?("<TS ")
disambiguate ".mod" do |data|
if data.include?('<!ENTITY ')
Language["XML"]
elsif /MODULE\s\w+\s*;/i.match(data) || /^\s*END \w+;$/i.match(data)
Language["Modula-2"]
else
Language["TypeScript"]
[Language["Linux Kernel Module"], Language["AMPL"]]
end
end
disambiguate "Frege", "Forth", "Text" do |data|
if /^(: |also |new-device|previous )/.match(data)
Language["Forth"]
elsif /^\s*(import|module|package|data|type) /.match(data)
Language["Frege"]
else
disambiguate ".ms" do |data|
if /^[.'][a-z][a-z](\s|$)/i.match(data)
Language["Groff"]
elsif /((^|\s)move?[. ])|\.(include|globa?l)\s/.match(data)
Language["GAS"]
end
end
disambiguate ".n" do |data|
if /^[.']/.match(data)
Language["Groff"]
elsif /^(module|namespace|using)\s/.match(data)
Language["Nemerle"]
end
end
disambiguate ".ncl" do |data|
if data.include?("THE_TITLE")
Language["Text"]
end
end
disambiguate "PLSQL", "SQLPL", "PLpgSQL", "SQL" do |data|
disambiguate ".nl" do |data|
if /^(b|g)[0-9]+ /.match(data)
Language["NL"]
else
Language["NewLisp"]
end
end
disambiguate ".php" do |data|
if data.include?("<?hh")
Language["Hack"]
elsif /<?[^h]/.match(data)
Language["PHP"]
end
end
disambiguate ".pl" do |data|
if /^(use v6|(my )?class|module)/.match(data)
Language["Perl6"]
elsif /use strict|use\s+v?5\./.match(data)
Language["Perl"]
elsif /^[^#]+:-/.match(data)
Language["Prolog"]
end
end
disambiguate ".pm", ".t" do |data|
if /^(use v6|(my )?class|module)/.match(data)
Language["Perl6"]
elsif /use strict|use\s+v?5\./.match(data)
Language["Perl"]
end
end
disambiguate ".pro" do |data|
if /^[^#]+:-/.match(data)
Language["Prolog"]
elsif data.include?("last_client=")
Language["INI"]
elsif data.include?("HEADERS") && data.include?("SOURCES")
Language["QMake"]
elsif /^\s*function[ \w,]+$/.match(data)
Language["IDL"]
end
end
disambiguate ".r" do |data|
if /\bRebol\b/i.match(data)
Language["Rebol"]
elsif data.include?("<-")
Language["R"]
end
end
disambiguate ".rs" do |data|
if /^(use |fn |mod |pub |macro_rules|impl|#!?\[)/.match(data)
Language["Rust"]
elsif /#include|#pragma\s+(rs|version)|__attribute__/.match(data)
Language["RenderScript"]
end
end
disambiguate ".sc" do |data|
if /\^(this|super)\./.match(data) || /^\s*(\+|\*)\s*\w+\s*{/.match(data) || /^\s*~\w+\s*=\./.match(data)
Language["SuperCollider"]
elsif /^\s*import (scala|java)\./.match(data) || /^\s*val\s+\w+\s*=/.match(data) || /^\s*class\b/.match(data)
Language["Scala"]
end
end
disambiguate ".sql" do |data|
if /^\\i\b|AS \$\$|LANGUAGE '+plpgsql'+/i.match(data) || /SECURITY (DEFINER|INVOKER)/i.match(data) || /BEGIN( WORK| TRANSACTION)?;/i.match(data)
#Postgres
Language["PLpgSQL"]
@@ -256,75 +343,20 @@ module Linguist
end
end
disambiguate "D", "DTrace", "Makefile" do |data|
if /^module /.match(data)
Language["D"]
elsif /^((dtrace:::)?BEGIN|provider |#pragma (D (option|attributes)|ident)\s)/.match(data)
Language["DTrace"]
elsif /(\/.*:( .* \\)$| : \\$|^ : |: \\$)/.match(data)
Language["Makefile"]
end
end
disambiguate "OCaml", "Standard ML" do |data|
if /(^\s*module)|let rec |match\s+(\S+\s)+with/.match(data)
Language["OCaml"]
elsif /=> |case\s+(\S+\s)+of/.match(data)
Language["Standard ML"]
end
end
disambiguate "XML", "Modula-2", "Linux Kernel Module", "AMPL" do |data|
if data.include?('<!ENTITY ')
disambiguate ".ts" do |data|
if data.include?("<TS ")
Language["XML"]
elsif /MODULE\s\w+\s*;/i.match(data) || /^\s*END \w+;$/i.match(data)
Language["Modula-2"]
else
[Language["Linux Kernel Module"], Language["AMPL"]]
Language["TypeScript"]
end
end
disambiguate "NL", "NewLisp" do |data|
if /^(b|g)[0-9]+ /.match(data)
Language["NL"]
disambiguate ".tst" do |data|
if (data.include?("gap> "))
Language["GAP"]
# Heads up - we don't usually write heuristics like this (with no regex match)
else
Language["NewLisp"]
end
end
disambiguate "Rust", "RenderScript" do |data|
if /^(use |fn |mod |pub |macro_rules|impl|#!?\[)/.match(data)
Language["Rust"]
elsif /#include|#pragma\s+(rs|version)|__attribute__/.match(data)
Language["RenderScript"]
end
end
disambiguate "Common Lisp", "Lex", "Groff", "PicoLisp" do |data|
if /\(def(un|macro)\s/.match(data)
Language["Common Lisp"]
elsif /^(%[%{}]xs|<.*>)/.match(data)
Language["Lex"]
elsif /^\.[a-z][a-z](\s|$)/i.match(data)
Language["Groff"]
elsif /^\((de|class|rel|code|data|must)\s/.match(data)
Language["PicoLisp"]
end
end
disambiguate "Groff", "Nemerle" do |data|
if /^[.']/.match(data)
Language["Groff"]
elsif /^(module|namespace|using)\s/.match(data)
Language["Nemerle"]
end
end
disambiguate "GAS", "Groff" do |data|
if /^[.'][a-z][a-z](\s|$)/i.match(data)
Language["Groff"]
elsif /((^|\s)move?[. ])|\.(include|globa?l)\s/.match(data)
Language["GAS"]
Language["Scilab"]
end
end
end
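The refactor above keys heuristics on file extensions instead of candidate-language sets. A minimal standalone version of that registry (the class and method names here are illustrative, not Linguist's API) looks like:

```ruby
# Minimal extension-keyed heuristic registry, mirroring the shape of
# Linguist::Heuristics after this change. Names are illustrative only.
class MiniHeuristic
  @heuristics = []

  # Register a disambiguation block for one or more extensions.
  def self.disambiguate(*extensions, &block)
    @heuristics << new(extensions, &block)
  end

  # Run the first heuristic whose extension matches the filename.
  def self.call(filename, data)
    h = @heuristics.find { |heuristic| heuristic.matches?(filename) }
    h && h.run(data)
  end

  def initialize(extensions, &block)
    @extensions = extensions
    @block = block
  end

  # Match on the lowercased filename suffix, as matches? does above.
  def matches?(filename)
    filename = filename.downcase
    @extensions.any? { |ext| filename.end_with?(ext) }
  end

  def run(data)
    @block.call(data)
  end
end

MiniHeuristic.disambiguate ".pl" do |data|
  data.include?("use strict") ? "Perl" : "Prolog"
end

puts MiniHeuristic.call("script.PL", "use strict;\n")  # prints "Perl"
```

Matching on the lowercased suffix is what lets a single `.pl` rule cover `.pl`, `.PL`, and so on, without the caller having to compute candidate languages first.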


@@ -150,7 +150,8 @@ module Linguist
#
# Returns the Language or nil if none was found.
def self.find_by_name(name)
name && @name_index[name.downcase]
return nil if name.to_s.empty?
name && (@name_index[name.downcase] || @name_index[name.split(',').first.downcase])
end
# Public: Look up Language by one of its aliases.
@@ -164,7 +165,8 @@ module Linguist
#
# Returns the Language or nil if none was found.
def self.find_by_alias(name)
name && @alias_index[name.downcase]
return nil if name.to_s.empty?
name && (@alias_index[name.downcase] || @alias_index[name.split(',').first.downcase])
end
# Public: Look up Languages by filename.
@@ -240,7 +242,8 @@ module Linguist
#
# Returns the Language or nil if none was found.
def self.[](name)
name && @index[name.downcase]
return nil if name.to_s.empty?
name && (@index[name.downcase] || @index[name.split(',').first.downcase])
end
# Public: A List of popular languages
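The lookup change above adds two guards: an early nil for empty names, and a fallback that retries with the text before the first comma. A standalone sketch, with a plain hash standing in for Linguist's internal `@name_index`:

```ruby
# Stand-in for the internal @name_index (lowercased name => language).
NAME_INDEX = { "ruby" => :Ruby, "html" => :HTML }

def find_by_name(name)
  return nil if name.to_s.empty?
  NAME_INDEX[name.downcase] || NAME_INDEX[name.split(',').first.downcase]
end

p find_by_name("Ruby")        # => :Ruby
p find_by_name("HTML, Ruby")  # => :HTML (falls back to the part before the comma)
p find_by_name(nil)           # => nil
```

Without the `name.to_s.empty?` guard, `nil.downcase` in the original one-liner could only be avoided by the short-circuiting `name &&`; the comma fallback would otherwise raise on an empty split.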


@@ -8,7 +8,8 @@
# Use "text" if a mode does not exist.
# wrap - Boolean wrap to enable line wrapping (default: false)
# extensions - An Array of associated extensions (the first one is
# considered the primary extension)
# considered the primary extension, the others should be
# listed alphabetically)
# interpreters - An Array of associated interpreters
# searchable - Boolean flag to enable searching (defaults to true)
# search_term - Deprecated: Some languages maybe indexed under a
@@ -214,7 +215,7 @@ AspectJ:
color: "#a957b0"
extensions:
- .aj
tm_scope: none
tm_scope: source.aspectj
ace_mode: text
Assembly:
@@ -501,6 +502,13 @@ Chapel:
- .chpl
ace_mode: text
Charity:
type: programming
extensions:
- .ch
tm_scope: none
ace_mode: text
ChucK:
type: programming
extensions:
@@ -609,6 +617,7 @@ Common Lisp:
- .lsp
- .ny
- .podsl
- .sexp
interpreters:
- lisp
- sbcl
@@ -743,6 +752,14 @@ DM:
tm_scope: source.c++
ace_mode: c_cpp
DNS Zone:
type: data
extensions:
- .zone
- .arpa
tm_scope: text.zone_file
ace_mode: text
DTrace:
type: programming
aliases:
@@ -827,6 +844,14 @@ ECL:
tm_scope: none
ace_mode: text
ECLiPSe:
type: programming
group: prolog
extensions:
- .ecl
tm_scope: source.prolog.eclipse
ace_mode: prolog
Eagle:
type: markup
color: "#814C05"
@@ -1261,6 +1286,14 @@ Groovy Server Pages:
tm_scope: text.html.jsp
ace_mode: jsp
HCL:
type: programming
extensions:
- .hcl
- .tf
ace_mode: ruby
tm_scope: source.ruby
HTML:
type: markup
tm_scope: text.html.basic
@@ -1335,6 +1368,7 @@ Haml:
Handlebars:
type: markup
color: "#01a9d6"
group: HTML
aliases:
- hbs
- htmlbars
@@ -1513,7 +1547,9 @@ JSON:
searchable: false
extensions:
- .json
- .geojson
- .lock
- .topojson
filenames:
- .jshintrc
- composer.lock
@@ -1541,6 +1577,14 @@ JSONiq:
- .jq
tm_scope: source.jq
JSX:
type: programming
group: JavaScript
extensions:
- .jsx
tm_scope: source.js.jsx
ace_mode: javascript
Jade:
group: HTML
type: markup
@@ -1594,7 +1638,6 @@ JavaScript:
- .jsfl
- .jsm
- .jss
- .jsx
- .njs
- .pac
- .sjs
@@ -1687,6 +1730,7 @@ LSL:
ace_mode: lsl
extensions:
- .lsl
- .lslp
interpreters:
- lsl
color: '#3d9970'
@@ -1695,8 +1739,8 @@ LabVIEW:
type: programming
extensions:
- .lvproj
tm_scope: none
ace_mode: text
tm_scope: text.xml
ace_mode: xml
Lasso:
type: programming
@@ -1918,6 +1962,7 @@ Makefile:
- GNUmakefile
- Kbuild
- Makefile
- Makefile.inc
- makefile
interpreters:
- make
@@ -2091,6 +2136,14 @@ Myghty:
tm_scope: none
ace_mode: text
NCL:
type: programming
color: "#28431f"
extensions:
- .ncl
tm_scope: source.ncl
ace_mode: text
NL:
type: data
extensions:
@@ -2401,6 +2454,7 @@ PLSQL:
type: programming
ace_mode: sql
tm_scope: source.plsql.oracle
color: "#dad8d8"
extensions:
- .pls
- .pkb
@@ -2595,11 +2649,11 @@ Prolog:
color: "#74283c"
extensions:
- .pl
- .ecl
- .pro
- .prolog
interpreters:
- swipl
tm_scope: source.prolog
ace_mode: prolog
Propeller Spin:
@@ -2630,7 +2684,7 @@ Public Key:
Puppet:
type: programming
color: "#332A77"
color: "#302B6D"
extensions:
- .pp
filenames:
@@ -2659,7 +2713,7 @@ PureScript:
color: "#1D222D"
extensions:
- .purs
tm_scope: source.haskell
tm_scope: source.purescript
ace_mode: haskell
Python:
@@ -2737,7 +2791,7 @@ R:
ace_mode: r
RAML:
type: data
type: markup
ace_mode: yaml
tm_scope: source.yaml
color: "#77d9fb"
@@ -2780,7 +2834,7 @@ RMarkdown:
ace_mode: markdown
extensions:
- .rmd
tm_scope: none
tm_scope: source.gfm
Racket:
type: programming
@@ -2910,6 +2964,7 @@ Ruby:
- .pryrc
- Appraisals
- Berksfile
- Brewfile
- Buildfile
- Deliverfile
- Fastfile
@@ -3158,6 +3213,7 @@ Slim:
color: "#ff8f77"
extensions:
- .slim
tm_scope: text.slim
ace_mode: text
Smali:
@@ -3336,6 +3392,7 @@ Text:
extensions:
- .txt
- .fr
- .ncl
tm_scope: none
ace_mode: text
@@ -3500,6 +3557,14 @@ Volt:
tm_scope: source.d
ace_mode: d
Vue:
type: markup
color: "#2c3e50"
extensions:
- .vue
tm_scope: text.html.vue
ace_mode: html
Web Ontology Language:
type: markup
color: "#9cc9dd"
@@ -3515,6 +3580,16 @@ WebIDL:
tm_scope: source.webidl
ace_mode: text
X10:
type: programming
aliases:
- xten
ace_mode: text
extensions:
- .x10
color: "#4B6BEF"
tm_scope: source.x10
XC:
type: programming
color: "#99DA07"
@@ -3552,6 +3627,7 @@ XML:
- .iml
- .ivy
- .jelly
- .jsproj
- .kml
- .launch
- .mdpolicy
@@ -3612,6 +3688,14 @@ XML:
- Web.config
- packages.config
XPages:
type: programming
extensions:
- .xsp-config
- .xsp.metadata
tm_scope: none
ace_mode: xml
XProc:
type: programming
extensions:
@@ -3676,7 +3760,9 @@ YAML:
- .yml
- .reek
- .rviz
- .syntax
- .yaml
- .yaml-tmlanguage
ace_mode: yaml
Yacc:
@@ -3781,8 +3867,13 @@ wisp:
xBase:
type: programming
color: "#403a40"
aliases:
- advpl
- clipper
- foxpro
extensions:
- .prg
- .ch
- .prw
tm_scope: source.harbour
ace_mode: text


@@ -4,7 +4,11 @@ require 'rugged'
module Linguist
class LazyBlob
GIT_ATTR = ['linguist-documentation', 'linguist-language', 'linguist-vendored']
GIT_ATTR = ['linguist-documentation',
'linguist-language',
'linguist-vendored',
'linguist-generated']
GIT_ATTR_OPTS = { :priority => [:index], :skip_system => true }
GIT_ATTR_FLAGS = Rugged::Repository::Attributes.parse_opts(GIT_ATTR_OPTS)
@@ -31,14 +35,6 @@ module Linguist
name, GIT_ATTR, GIT_ATTR_FLAGS)
end
def vendored?
if attr = git_attributes['linguist-vendored']
return boolean_attribute(attr)
else
return super
end
end
def documentation?
if attr = git_attributes['linguist-documentation']
boolean_attribute(attr)
@@ -47,6 +43,22 @@ module Linguist
end
end
def generated?
if attr = git_attributes['linguist-generated']
boolean_attribute(attr)
else
super
end
end
def vendored?
if attr = git_attributes['linguist-vendored']
return boolean_attribute(attr)
else
super
end
end
def language
return @language if defined?(@language)
@@ -67,6 +79,10 @@ module Linguist
@size
end
def cleanup!
@data.clear if @data
end
protected
# Returns true if the attribute is present and not the string "false".
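The overrides above (`documentation?`, `generated?`, `vendored?`) all reduce to the same check: use the gitattributes value when present, treating anything except the literal string "false" as true, and fall back to the built-in heuristic otherwise. A standalone sketch of that logic, with a plain hash in place of Rugged's attribute lookup:

```ruby
# True unless the attribute value is the string "false", mirroring
# LazyBlob#boolean_attribute.
def boolean_attribute(attr)
  attr != "false"
end

# linguist-generated override with a fallback, shaped like generated? above.
def generated?(git_attributes, fallback)
  if (attr = git_attributes['linguist-generated'])
    boolean_attribute(attr)
  else
    fallback
  end
end

p generated?({ 'linguist-generated' => 'true' }, false)   # => true
p generated?({ 'linguist-generated' => 'false' }, true)   # => false
p generated?({}, false)                                   # => false
```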


@@ -126,12 +126,13 @@ module Linguist
end
protected
MAX_TREE_SIZE = 100_000
def compute_stats(old_commit_oid, cache = nil)
return {} if current_tree.count_recursive(MAX_TREE_SIZE) >= MAX_TREE_SIZE
old_tree = old_commit_oid && Rugged::Commit.lookup(repository, old_commit_oid).tree
read_index
diff = Rugged::Tree.diff(repository, old_tree, current_tree)
# Clear file map and fetch full diff if any .gitattributes files are changed
@@ -157,8 +158,11 @@ module Linguist
blob = Linguist::LazyBlob.new(repository, delta.new_file[:oid], new, mode.to_s(8))
next unless blob.include_in_language_stats?
file_map[new] = [blob.language.group.name, blob.size]
if blob.include_in_language_stats?
file_map[new] = [blob.language.group.name, blob.size]
end
blob.cleanup!
end
end
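The `count_recursive` guard above is what lets git-linguist skip very large repositories: Rugged stops counting at the supplied limit, and `compute_stats` returns an empty result instead of walking the tree. A sketch of that early return, where the stub tree (not Rugged) is this example's assumption:

```ruby
MAX_TREE_SIZE = 100_000

# Stub for Rugged::Tree#count_recursive, which stops counting once it
# reaches the supplied limit.
class StubTree
  def initialize(entries)
    @entries = entries
  end

  def count_recursive(limit)
    [@entries, limit].min
  end
end

def compute_stats(tree)
  return {} if tree.count_recursive(MAX_TREE_SIZE) >= MAX_TREE_SIZE
  { "Ruby" => 100 }  # placeholder for the real diff-driven stats walk
end

p compute_stats(StubTree.new(200_000))  # => {}
p compute_stats(StubTree.new(50))       # => {"Ruby"=>100}
```

Counting with a cap keeps the guard itself cheap: a repository with millions of entries costs at most `MAX_TREE_SIZE` iterations before the scan is abandoned.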


@@ -2,7 +2,7 @@ module Linguist
module Strategy
class Modeline
EmacsModeline = /-\*-\s*(?:(?!mode)[\w-]+\s*:\s*(?:[\w+-]+)\s*;?\s*)*(?:mode\s*:)?\s*([\w+-]+)\s*(?:;\s*(?!mode)[\w-]+\s*:\s*[\w+-]+\s*)*;?\s*-\*-/i
VimModeline = /vim:\s*set\s*(?:ft|filetype)=(\w+):/i
VimModeline = /vim:\s*set.*\s(?:ft|filetype)=(\w+)\s?.*:/i
# Public: Detects language based on Vim and Emacs modelines
#
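The widened Vim pattern above tolerates other options appearing in the same `set` clause. A quick check with both regex versions copied verbatim from the diff:

```ruby
# Old and new Vim modeline patterns, verbatim from the diff above.
OLD_VIM = /vim:\s*set\s*(?:ft|filetype)=(\w+):/i
NEW_VIM = /vim:\s*set.*\s(?:ft|filetype)=(\w+)\s?.*:/i

line = "# vim: set ts=2 sw=2 ft=ruby :"
p line =~ OLD_VIM         # nil: the old pattern needs ft= right after set
p NEW_VIM.match(line)[1]  # => "ruby" even with ts/sw options in front
```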


@@ -96,7 +96,7 @@ module Linguist
end
# Skip number literals
elsif s.scan(/(0x)?\d(\d|\.)*/)
elsif s.scan(/(0x\h(\h|\.)*|\d(\d|\.)*)([uU][lL]{0,2}|([eE][-+]\d*)?[fFlL]*)/)
# SGML style brackets
elsif token = s.scan(/<[^\s<>][^<>]*>/)
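The broadened number-literal pattern above lets the tokenizer skip hex literals and C-style suffixes it previously left behind. A quick StringScanner check, with the regex copied from the diff:

```ruby
require 'strscan'

# The replacement pattern from the diff: hex literals, decimal literals,
# unsigned/long suffixes, exponents, and float suffixes.
NUMBER = /(0x\h(\h|\.)*|\d(\d|\.)*)([uU][lL]{0,2}|([eE][-+]\d*)?[fFlL]*)/

%w[0xDEADbeef 1.5e-3f 42UL 3.14].each do |literal|
  p StringScanner.new(literal).scan(NUMBER)
end
```

Each scan consumes the whole literal, whereas the old `(0x)?\d(\d|\.)*` stopped at the first hex letter or suffix character, leaving fragments like `DEADbeef` to be tokenized as identifiers.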


@@ -78,6 +78,9 @@
# Haxelib projects often contain a neko bytecode file named run.n
- run.n$
# Bootstrap Datepicker
- bootstrap-datepicker/
## Commonly Bundled JavaScript frameworks ##
# jQuery
@@ -88,6 +91,34 @@
- (^|/)jquery\-ui(\-\d\.\d+(\.\d+)?)?(\.\w+)?\.(js|css)$
- (^|/)jquery\.(ui|effects)\.([^.]*)\.(js|css)$
# jQuery Gantt
- jquery.fn.gantt.js
# jQuery fancyBox
- jquery.fancybox.js
# Fuel UX
- fuelux.js
# jQuery File Upload
- (^|/)jquery\.fileupload(-\w+)?\.js$
# Slick
- (^|/)slick\.\w+.js$
# Leaflet plugins
- (^|/)Leaflet\.Coordinates-\d+\.\d+\.\d+\.src\.js$
- leaflet.draw-src.js
- leaflet.draw.css
- Control.FullScreen.css
- Control.FullScreen.js
- leaflet.spin.js
- wicket-leaflet.js
# Sublime Text workspace files
- .sublime-project
- .sublime-workspace
# Prototype
- (^|/)prototype(.*)\.js$
- (^|/)effects\.js$
@@ -122,7 +153,7 @@
- (^|/)Chart\.js$
# Codemirror
- (^|/)[Cc]ode[Mm]irror/(lib|mode|theme|addon|keymap)
- (^|/)[Cc]ode[Mm]irror/(\d+\.\d+/)?(lib|mode|theme|addon|keymap|demo)
# SyntaxHighlighter - http://alexgorbatchev.com/
- (^|/)shBrush([^.]*)\.js$
@@ -164,6 +195,11 @@
## Obj-C ##
# Xcode
- \.xctemplate/
- \.imageset/
# Carthage
- ^Carthage/
@@ -179,6 +215,10 @@
# Fabric
- Fabric.framework/
# git config files
- gitattributes$
- gitignore$
- gitmodules$
## Groovy ##
@@ -224,21 +264,9 @@
# Html5shiv
- (^|/)html5shiv\.js$
# Samples folders
- ^[Ss]amples/
# LICENSE, README, git config files
- ^COPYING$
- LICENSE$
- License$
- gitattributes$
- gitignore$
- gitmodules$
- ^README$
- ^readme$
# Test fixtures
- ^[Tt]ests?/fixtures/
- ^[Ss]pecs?/fixtures/
# PhoneGap/Cordova
- (^|/)cordova([^.]*)\.js$
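The vendor.yml entries above are regexes matched against repository paths to flag vendored files. A minimal illustration (pattern strings copied from the hunks above; the example paths are invented):

```ruby
# A few vendor.yml-style entries, each tested against a repo path.
VENDOR = [
  /(^|\/)jquery\.fileupload(-\w+)?\.js$/,
  /(^|\/)Leaflet\.Coordinates-\d+\.\d+\.\d+\.src\.js$/,
  /(^|\/)cordova([^.]*)\.js$/,
]

def vendored?(path)
  VENDOR.any? { |re| path =~ re }
end

puts vendored?("assets/js/jquery.fileupload-ui.js") # => true
puts vendored?("lib/main.js")                       # => false
```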


@@ -1,3 +1,3 @@
module Linguist
-VERSION = "4.5.9"
+VERSION = "4.6.3"
end


@@ -0,0 +1,6 @@
%
% Some very badly written Charity
%
data LA(A) -> D = ss: A -> D
| ff: -> D.


@@ -0,0 +1,2 @@
((exe_name hello)
(link_order (world hello)))


@@ -0,0 +1,103 @@
(:TURTLE
(:@PREFIX "rdf:" "<http://www.w3.org/1999/02/22-rdf-syntax-ns#>")
(:@PREFIX "owl:" "<http://www.w3.org/2002/07/owl#>")
(:@PREFIX "dc:" "<http://purl.org/dc/elements/1.1/>")
(:@PREFIX "xsd:" "<http://www.w3.org/2001/XMLSchema#>")
(:@PREFIX "rdfs:" "<http://www.w3.org/2000/01/rdf-schema#>")
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/channel>")
(:PREDICATE-OBJECT-LIST
(:URIREF #1="<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>")
(:OBJECTS
(:QNAME "rdfs:Class")))
(:PREDICATE-OBJECT-LIST
(:QNAME "rdfs:comment")
(:OBJECTS
(:STRING "An RSS information channel.")))
(:PREDICATE-OBJECT-LIST
(:QNAME "rdfs:isDefinedBy")
(:OBJECTS
(:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST
(:QNAME "rdfs:label")
(:OBJECTS
(:STRING "Channel"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/description>")
(:PREDICATE-OBJECT-LIST
(:URIREF #1#)
(:OBJECTS
(:QNAME "rdf:Property")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment")
(:OBJECTS (:STRING "A short text description of the subject.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Description")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:subPropertyOf") (:OBJECTS (:QNAME "dc:description"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/image>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdfs:Class")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment") (:OBJECTS (:STRING "An RSS image.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Image"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/item>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdfs:Class")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment") (:OBJECTS (:STRING "An RSS item.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Item"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/items>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdf:Property")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment")
(:OBJECTS
(:STRING "Points to a list of rss:item elements that are members of the subject channel.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Items"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/link>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdf:Property")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment")
(:OBJECTS (:STRING "The URL to which an HTML rendering of the subject will link.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Link")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:subPropertyOf") (:OBJECTS (:QNAME "dc:identifier"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/name>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdf:Property")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment")
(:OBJECTS (:STRING "The text input field's (variable) name.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Name"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/textinput>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdfs:Class")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment") (:OBJECTS (:STRING "An RSS text input.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Text Input"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/title>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdf:Property")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment")
(:OBJECTS (:STRING "A descriptive title for the channel.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "Title")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:subPropertyOf") (:OBJECTS (:QNAME "dc:title"))))
(:TRIPLES (:URIREF "<http://purl.org/rss/1.0/url>")
(:PREDICATE-OBJECT-LIST (:URIREF #1#) (:OBJECTS (:QNAME "rdf:Property")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:comment")
(:OBJECTS
(:STRING
"The URL of the image to be used in the 'src' attribute of the channel's image tag when rendered as HTML.")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:isDefinedBy")
(:OBJECTS (:URIREF "<http://purl.org/rss/1.0/>")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:label") (:OBJECTS (:STRING "URL")))
(:PREDICATE-OBJECT-LIST (:QNAME "rdfs:subPropertyOf") (:OBJECTS (:QNAME "dc:identifier")))))


@@ -0,0 +1,13 @@
$ORIGIN 0.0.0.c.2.1.0.3.0.0.2.1.e.f.f.3.ip6.arpa.
$TTL 60
@ IN SOA ns root (
2002042901 ; SERIAL
7200 ; REFRESH
600 ; RETRY
36000000 ; EXPIRE
120 ; MINIMUM
)
NS ns.example.com.
c.a.7.e.d.7.e.f.f.f.0.2.8.0.a.0 PTR sip01.example.com.


@@ -0,0 +1,12 @@
$TTL 3d
@ IN SOA root.localhost. root.sneaky.net. (
2015042907 ; serial
3d ; refresh
1h ; retry
12d ; expire
2h ; negative response TTL
)
IN NS root.localhost.
IN NS localhost. ; secondary name server is preferably externally maintained
www IN A 3.141.59.26

samples/HCL/example.hcl Normal file

@@ -0,0 +1,6 @@
consul = "1.2.3.4"
// This is a comment
template "foo" {
bar = "zip"
}

samples/HCL/example.tf Normal file

@@ -0,0 +1,13 @@
resource "aws_instance" "web" {
// Copies the myapp.conf file to /etc/myapp.conf
provisioner "file" {
source = "conf/myapp.conf"
destination = "/etc/myapp.conf"
}
// Copies the configs.d folder to /etc/configs.d
provisioner "file" {
source = "conf/configs.d"
destination = "/etc"
}
}

samples/JSON/geo.geojson Normal file

@@ -0,0 +1,82 @@
{
"type": "FeatureCollection",
"features": [
{
"type": "Feature",
"properties": {
"name": "Australia Post - North Ryde BC",
"geo": [-33.787792, 151.13288],
"streetAddress": "11 Waterloo Road",
"addressLocality": "Macquarie Park",
"addressRegion": "New South Wales",
"addressCountry": "Australia",
"postalCode": "2113"
},
"geometry": {
"type": "Point",
"coordinates": [151.13288, -33.787792, 0]
}
},
{
"type": "Feature",
"properties": {
"name": "George Weston Foods Limited",
"geo": [-37.8263884, 144.9105381],
"streetAddress": "Level 3, 187 Todd Road",
"addressLocality": "Port Melbourne",
"addressRegion": "Victoria",
"addressCountry": "Australia",
"postalCode": "3207"
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[144.9097088901841, -37.82622654171794, 0],
[144.9099724266943, -37.82679388891783, 0],
[144.9110127325916, -37.82651526396403, 0],
[144.9112227645738, -37.82655667152123, 0],
[144.9113739439796, -37.82618552508767, 0],
[144.9112740633105, -37.82615750100924, 0],
[144.9111355846674, -37.82584493693527, 0],
[144.9097088901841, -37.82622654171794, 0]
]
]
}
},
{
"type": "Feature",
"properties": {
"name": "George Weston Foods Limited",
"geo": [-37.05202791502396, 144.2085614999388],
"streetAddress": "67 Richards Road",
"addressLocality": "Castlemaine",
"addressRegion": "Victoria",
"addressCountry": "Australia",
"postalCode": "3450"
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[144.2052428913937, -37.04906391287216, 0],
[144.205540392692, -37.05049727485623, 0],
[144.2059800881858, -37.05066835966983, 0],
[144.206490656024, -37.05279538900776, 0],
[144.2064525845008, -37.05366195881602, 0],
[144.2084322301922, -37.0538920493147, 0],
[144.2084811895712, -37.05266519735124, 0],
[144.2079784002005, -37.05041270555773, 0],
[144.2074017905817, -37.04817406993293, 0],
[144.2061363939852, -37.04834972871226, 0],
[144.2052428913937, -37.04906391287216, 0]
]
]
}
}
]
}

File diff suppressed because one or more lines are too long

samples/JSX/sample.jsx Normal file

@@ -0,0 +1,23 @@
'use strict';
const React = require('react')
module.exports = React.createClass({
render: function() {
let {feeds, log} = this.props;
log.info(feeds);
return <div className="feed-list">
<h3>News Feed's</h3>
<ul>
{feeds.map(function(feed) {
return <li key={feed.name} className={feed.fetched ? 'loaded' : 'loading'}>
{feed.data && feed.data.length > 0 ?
<span>{feed.name} <span className='light'>({feed.data.length})</span></span>
: 'feed.name' }
</li>
})}
</ul>
</div>;
}
});

samples/LSL/LSL.lslp Normal file

@@ -0,0 +1,74 @@
/*
Testing syntax highlighting
for the Linden Scripting Language
*/
integer someIntNormal = 3672;
integer someIntHex = 0x00000000;
integer someIntMath = PI_BY_TWO;
integer event = 5673;// 'event' is invalid.illegal
key someKeyTexture = TEXTURE_DEFAULT;
string someStringSpecial = EOF;
some_user_defined_function_without_return_type(string inputAsString)
{
llSay(PUBLIC_CHANNEL, inputAsString);
}
string user_defined_function_returning_a_string(key inputAsKey)
{
return (string)inputAsKey;
}
default
{
state_entry()
{
key someKey = NULL_KEY;
someKey = llGetOwner();
string someString = user_defined_function_returning_a_string(someKey);
some_user_defined_function_without_return_type(someString);
}
touch_start(integer num_detected)
{
list agentsInRegion = llGetAgentList(AGENT_LIST_REGION, []);
integer numOfAgents = llGetListLength(agentsInRegion);
integer index; // defaults to 0
for (; index <= numOfAgents - 1; index++) // for each agent in region
{
llRegionSayTo(llList2Key(agentsInRegion, index), PUBLIC_CHANNEL, "Hello, Avatar!");
}
}
touch_end(integer num_detected)
{
someIntNormal = 3672;
someIntHex = 0x00000000;
someIntMath = PI_BY_TWO;
event = 5673;// 'event' is invalid.illegal
someKeyTexture = TEXTURE_DEFAULT;
someStringSpecial = EOF;
llSetInventoryPermMask("some item", MASK_NEXT, PERM_ALL);// 'llSetInventoryPermMask' is reserved.godmode
llWhisper(PUBLIC_CHANNEL, "Leaving \"default\" now...");
state other;
}
}
state other
{
state_entry()
{
llWhisper(PUBLIC_CHANNEL, "Entered \"state other\", returning to \"default\" again...");
state default;
}
}


@@ -0,0 +1,31 @@
# $OpenBSD: Makefile.inc,v 1.2 2003/11/14 20:09:20 drahn Exp $
# $NetBSD: Makefile.inc,v 1.1 1996/09/30 16:34:59 ws Exp $
.if !defined(__stand_makefile_inc)
__stand_makefile_inc=1
KERN_AS= library
S=$(.CURDIR)/../../../$(R)
.if !make(libdep) && !make(sadep) && !make(salibdir) && !make(kernlibdir) && !make(obj) && !defined(NOMACHINE)
.BEGIN:
@([ -h machine ] || ln -s $(S)/arch/$(MACHINE)/include machine)
.endif
#
EXTRACFLAGS= -msoft-float
REAL_VIRT?= -v
ENTRY?= _start
INCLUDES+= -I. -I$(.OBJDIR) -I$(.CURDIR)/.. -I$(S)/arch -I$(S)
INCLUDES+= -I$(S)/lib/libsa
DEFS+= -DSTANDALONE
CFLAGS+= $(INCLUDES) $(DEFS) $(EXTRACFLAGS)
CFLAGS+= -fno-stack-protector
LDFLAGS?= -X -N -Ttext $(RELOC) -e $(ENTRY)
cleandir:
rm -rf lib machine
.endif


@@ -0,0 +1,109 @@
undef("PrnOscPat_driver")
function PrnOscPat_driver(eof[*][*][*]:numeric, eof_ts[*][*]:numeric, kPOP[1]:integer)
; =================================================================
; compute Principal Oscillation Patterns (POPs)
; =================================================================
local dim_ts, dim_eof, neof, ntim, nlat, mlon, dnam_ts, dnam_eof, neof, j \
, cov0, cov1, cov0_inverse, A, z, Z, pr, pi, zr, zi, mean, stdev \
, evlr, eigi, eigr
begin
dim_ts = dimsizes(eof_ts) ; (neof,ntim)
dim_eof = dimsizes(eof) ; (neof,nlat,mlon)
ntim = dim_ts(1)
neof = dim_eof(0)
nlat = dim_eof(1)
mlon = dim_eof(2)
dnam_ts = getvardims(eof_ts) ; dimension names
dnam_eof= getvardims(eof) ; used at end for meta data
; =================================================================
; lag-0 and lag-1 matrices
; =================================================================
if (get_ncl_version().eq."6.1.2") then ; bug in 6.1.2
cov0 = covcorm(eof_ts,(/1,0/)) ; lag-0 covariance matrix
else
cov0 = covcorm(eof_ts,(/0,1/)) ; lag-0 covariance matrix (n x n)
end if
; either
cov1 = covcorm_xy(eof_ts, eof_ts, (/0,1,0/)) ; lag-1
;cov1 = covcorm_xy(eof_ts(:,0:ntim-2) \ ; alternative, brute force
; ,eof_ts(:,1:ntim-1), (/0,0,0/))
;printVarSummary(cov1)
; =================================================================
; matrix A contains information for evolution of the POP system.
; POPs are eigenvectors of A.
; =================================================================
cov0_inverse = inverse_matrix(cov0)
A = cov1#inverse_matrix(cov0) ; [*][*] => neof x neof
; =================================================================
; NCL 6.1.1 of dgeevx: evlr(2,2,N,N) ; (left(0)/right(1), real(0)/imag(1),:,:)
; Eigenvalues are returned as attributes: eigi = evlr@eigi ; eigr = evlr@eigr
; =================================================================
evlr = dgeevx_lapack(A, "B", "V", "V", "B", False)
; =================================================================
; POP time series from eigenvalues and right eigenvectors
; =================================================================
;PR = (/ evlr(1,0,:,:) /) ; right ev (1), real part (0)
;PI = (/ evlr(1,1,:,:) /) ; right ev (1), imag part (1)
; kPOP is what we want; use the right eigenvector
pr = (/ evlr(1,0,kPOP-1,:) /) ; right ev (1), real part (0), row 'kPOP-1'
pi = (/ evlr(1,1,kPOP-1,:) /) ; right ev (1), imag part (1), row 'kPOP-1'
z = inverse_matrix( (/ (/sum(pr*pr), sum(pr*pi)/) \
, (/sum(pr*pi), sum(pi*pi)/) /))#(/pr,pi/)#eof_ts
; complex conjugate
z = (/z(0,:), -z(1,:)/) ; real & imag series
z = dim_rmvmean_n(z,1)
mean = dim_avg_n(z,1) ; calculate mean
stdev= dim_stddev_n(z,1) ; calculate stdev
z = dim_standardize_n(z,1,1) ; standardize time series
z!0 = "nPOP" ; add meta data
z!1 = dnam_ts(1)
z&nPOP = (/0,1/)
z&$dnam_ts(1)$ = eof_ts&$dnam_ts(1)$
z@stdev = stdev
z@mean = mean
z@long_name = "POP timeseries"
;printVarSummary(z)
; =================================================================
; POP spatial patterns
; =================================================================
zr = pr(0)*eof(0,:,:) ; construct POP spatial domain
zi = pi(0)*eof(0,:,:)
do j=1,neof-1
zr = zr + pr(j)*eof(j,:,:)
zi = zi + pi(j)*eof(j,:,:)
end do
Z = (/zr*stdev(0), -zi*stdev(1)/) ; scale patterns by time series stdev
Z!0 = "nPOP" ; add meta data
Z!1 = dnam_eof(1)
Z!2 = dnam_eof(2)
Z&nPOP = (/0,1/)
Z&$dnam_eof(1)$ = eof&$dnam_eof(1)$
Z&$dnam_eof(2)$ = eof&$dnam_eof(2)$
Z@long_name = "POP pattern"
;printVarSummary(Z)
; =================================================================
; return POP time series and POP spatial patterns as a
; variable of type 'list' which contains 2 variables
; =================================================================
return( [/z, Z/] ) ; this is type "list"
end


@@ -0,0 +1,115 @@
;*************************************************
; WRF static: panel different variables
;************************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRF_contributed.ncl"
begin
;************************************************
; open file and read in data
;************************************************
f = addfile("static.wrfsi.nc", "r")
;************************************************
; Read variables
;************************************************
use = f->use(0,0,:,:) ; land use dominant category
stl = f->stl(0,0,:,:) ; top layer (0-30cm) dom cat soiltype
sbl = f->sbl(0,0,:,:) ; bottom layer (30-90cm) dom cat soiltype
lat2d = f->lat(0,0,:,:)
lon2d = f->lon(0,0,:,:)
lsMask= f->lnd(0,0,:,:) ; land (1) water (0) mask
;************************************************
; Use mask function to set all ocean areas to _FillValue
;************************************************
use = mask(use,lsMask,1)
stl = mask(stl,lsMask,1)
sbl = mask(sbl,lsMask,1)
;************************************************
; Associate 2D coordinates with variables for plotting
;************************************************
use@lat2d = lat2d
use@lon2d = lon2d
stl@lat2d = lat2d
stl@lon2d = lon2d
sbl@lat2d = lat2d
sbl@lon2d = lon2d
;************************************************
; The file should be examined via: ncdump -v grid_type static.wrsi
; This will print the grid type. Then enter it below.
;************************************************
projection = "mercator"
;************************************************
; create plots
;************************************************
wks = gsn_open_wks("ps" ,"WRF_static") ; ps,pdf,x11,ncgm,eps
gsn_define_colormap(wks ,"BlAqGrYeOrReVi200"); choose colormap
res = True ; plot mods desired
res@gsnSpreadColors = True ; use full range of colormap
res@cnFillOn = True ; color plot desired
res@cnLinesOn = False ; turn off contour lines
res@cnLineLabelsOn = False ; turn off contour labels
res@cnLevelSpacingF = 1 ; manually specify interval
res@cnFillMode = "RasterFill" ; activate raster mode
res@lbLabelAutoStride = True ; let NCL figure lb stride
;************************************************
; Turn on lat / lon labeling
;************************************************
;;res@pmTickMarkDisplayMode = "Always" ; turn on tickmarks
dimll = dimsizes(lat2d)
nlat = dimll(0)
mlon = dimll(1)
res@mpProjection = projection
res@mpLimitMode = "Corners"
res@mpLeftCornerLatF = lat2d(0,0)
res@mpLeftCornerLonF = lon2d(0,0)
res@mpRightCornerLatF = lat2d(nlat-1,mlon-1)
res@mpRightCornerLonF = lon2d(nlat-1,mlon-1)
res@mpCenterLonF = f->LoV ; set center longitude
if (projection.eq."LambertConformal") then
res@mpLambertParallel1F = f->Latin1
res@mpLambertParallel2F = f->Latin2
res@mpLambertMeridianF = f->LoV
end if
res@mpFillOn = False ; turn off map fill
res@mpOutlineDrawOrder = "PostDraw" ; draw continental outline last
res@mpOutlineBoundarySets = "GeophysicalAndUSStates" ; state boundaries
;;res@tfDoNDCOverlay = True ; True only for 'native' grid
res@gsnAddCyclic = False ; data are not cyclic
;************************************************
; allocate array for 3 plots
;************************************************
plts = new (3,"graphic")
;************************************************
; Tell NCL not to draw or advance frame for individual plots
;************************************************
res@gsnDraw = False ; (a) do not draw
res@gsnFrame = False ; (b) do not advance 'frame'
plts(0) = gsn_csm_contour_map(wks,use,res)
plts(1) = gsn_csm_contour_map(wks,stl,res)
plts(2) = gsn_csm_contour_map(wks,sbl,res)
;************************************************
; create panel: panel plots have their own set of resources
;************************************************
resP = True ; modify the panel plot
resP@txString = "Land Use and Soil Type"
resP@gsnMaximize = True ; maximize panel area
resP@gsnPanelRowSpec = True ; specify 1 top, 2 lower level
gsn_panel(wks,plts,(/1,2/),resP) ; now draw as one plot
end

160
samples/NCL/WRF_track_1.ncl Normal file

@@ -0,0 +1,160 @@
;********************************************************
; Plot storm tracks from wrfout files.
;********************************************************
;
; JUN-18-2005
; So-Young Ha (MMM/NCAR)
; SEP-01-2006
; Slightly modified by Mary Haley to add some extra comments.
; ===========================================
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRF_contributed.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRFUserARW.ncl"
begin
; DATES
date = (/1512,1600,1612,1700,1712,1800,1812,1900/)
ndate = dimsizes(date)
sdate = sprinti("%4.0i",date)
; Experiment name (for legend)
EXP = (/"EXP_I"/) ; (/"EXP_I","EXP_II","EXP_III"/)
nexp = dimsizes(EXP)
; To get lat/lon info.
a = addfile("wrfout_d01_2003-07-15_00:00:00.nc","r")
lat2d = a->XLAT(0,:,:)
lon2d = a->XLONG(0,:,:)
dimll = dimsizes(lat2d)
nlat = dimll(0)
mlon = dimll(1)
; Sea Level Pressure
slp = wrf_user_getvar(a,"slp",0)
dims = dimsizes(slp)
; Array for track
time = new(ndate,string)
imin = new(ndate,integer)
jmin = new(ndate,integer)
smin = new(ndate,integer)
; =======
; ndate
; =======
fs = systemfunc("ls wrfout*00")
nfs= dimsizes(fs)
if(nfs .ne. ndate) then
print("Check input data:"+nfs+" .ne. "+ndate)
end if
do ifs=0,nfs-1
f = addfile(fs(ifs)+".nc","r")
time(ifs) = wrf_user_list_times(f)
; print(time(ifs))
slp2d = wrf_user_getvar(f,"slp",0)
; We need to convert 2-D array to 1-D array to find the minima.
slp1d = ndtooned(slp2d)
smin(ifs) = minind(slp1d)
; Convert the index for 1-D array back to the indices for 2-D array.
minij = ind_resolve(ind(slp1d.eq.min(slp2d)),dims)
imin(ifs) = minij(0,0)
jmin(ifs) = minij(0,1)
; print(time(ifs)+" : "+min(slp2d)+" ("+imin(ifs)+","+jmin(ifs)+")")
end do
;
; Graphics section
wks=gsn_open_wks("ps","track") ; Open PS file.
gsn_define_colormap(wks,"BlGrYeOrReVi200") ; Change color map.
res = True
res@gsnDraw = False ; Turn off draw.
res@gsnFrame = False ; Turn off frame advance.
res@gsnMaximize = True ; Maximize plot in frame.
res@tiMainString = "Hurricane Isabel" ; Main title
WRF_map_c(a,res,0) ; Set up map resources
; (plot options)
plot = gsn_csm_map(wks,res) ; Create a map.
; Set up resources for polymarkers.
gsres = True
gsres@gsMarkerIndex = 16 ; filled dot
;gsres@gsMarkerSizeF = 0.005 ; default - 0.007
cols = (/5,160,40/)
; Set up resources for polylines.
res_lines = True
res_lines@gsLineThicknessF = 3. ; 3x as thick
dot = new(ndate,graphic) ; Make sure each gsn_add_polyxxx call
line = new(ndate,graphic) ; is assigned to a unique variable.
; Loop through each date and add polylines to the plot.
do i = 0,ndate-2
res_lines@gsLineColor = cols(0)
xx=(/lon2d(imin(i),jmin(i)),lon2d(imin(i+1),jmin(i+1))/)
yy=(/lat2d(imin(i),jmin(i)),lat2d(imin(i+1),jmin(i+1))/)
line(i) = gsn_add_polyline(wks,plot,xx,yy,res_lines)
end do
lon1d = ndtooned(lon2d)
lat1d = ndtooned(lat2d)
; Loop through each date and add polymarkers to the plot.
do i = 0,ndate-1
print("dot:"+lon1d(smin(i))+","+lat1d(smin(i)))
gsres@gsMarkerColor = cols(0)
dot(i)=gsn_add_polymarker(wks,plot,lon1d(smin(i)),lat1d(smin(i)),gsres)
end do
; Date (Legend)
txres = True
txres@txFontHeightF = 0.015
txres@txFontColor = cols(0)
txid1 = new(ndate,graphic)
; Loop through each date and draw a text string on the plot.
do i = 0, ndate-1
txres@txJust = "CenterRight"
ix = smin(i) - 4
print("Eye:"+ix)
if(i.eq.1) then
txres@txJust = "CenterLeft"
ix = ix + 8
end if
txid1(i) = gsn_add_text(wks,plot,sdate(i),lon1d(ix),lat1d(ix),txres)
end do
; Add marker and text for legend. (Or you can just use "pmLegend" instead.)
txres@txJust = "CenterLeft"
txid2 = new(nexp,graphic)
pmid2 = new(nexp,graphic)
do i = 0,nexp-1
gsres@gsMarkerColor = cols(i)
txres@txFontColor = cols(i)
ii = ((/129,119,109/)) ; ilat
jj = ((/110,110,110/)) ; jlon
ji = ii*mlon+jj ; col x row
pmid2(i) = gsn_add_polymarker(wks,plot,lon1d(ji(i)),lat1d(ji(i)),gsres)
txid2(i) = gsn_add_text(wks,plot,EXP(i),lon1d(ji(i)+5),lat1d(ji(i)),txres)
end do
draw(plot)
frame(wks)
end

samples/NCL/cru_8.ncl Normal file

@@ -0,0 +1,129 @@
;*****************************************************
; cru_8.ncl
;
; Concepts illustrated:
; - Plotting CRU (Climate Research Unit)/ BADC data
; - Selecting a sub-period
; - calculating a climatology
; - Drawing raster contours; very basic graphics
;
;*****************************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl" ; not needed 6.20 onward
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
; create references (pointers) to the files
diri = "./"
fcld = addfile(diri+"cru_ts3.21.1901.2012.cld.dat.nc", "r")
fdtr = addfile(diri+"cru_ts3.21.1901.2012.dtr.dat.nc", "r")
ffrs = addfile(diri+"cru_ts3.21.1901.2012.frs.dat.nc", "r")
fpet = addfile(diri+"cru_ts3.21.1901.2012.pet.dat.nc", "r")
fpre = addfile(diri+"cru_ts3.21.1901.2012.pre.dat.nc", "r")
ftmn = addfile(diri+"cru_ts3.21.1901.2012.tmn.dat.nc", "r")
ftmp = addfile(diri+"cru_ts3.21.1901.2012.tmp.dat.nc", "r")
ftmx = addfile(diri+"cru_ts3.21.1901.2012.tmx.dat.nc", "r")
fvap = addfile(diri+"cru_ts3.21.1901.2012.vap.dat.nc", "r")
fwet = addfile(diri+"cru_ts3.21.1901.2012.wet.dat.nc", "r")
; specify start & last dates (arbitrary)
ymStrt = 199101
ymLast = 200012
; get index values of start/last dates
time = fcld->time
yyyymm = cd_calendar(time, -1)
ntStrt = ind(yyyymm.eq.ymStrt) ; index values
ntLast = ind(yyyymm.eq.ymLast)
; read time segment
cld = fcld->cld(ntStrt:ntLast,:,:)
dtr = fdtr->dtr(ntStrt:ntLast,:,:)
frs = ffrs->frs(ntStrt:ntLast,:,:)
pet = fpet->pet(ntStrt:ntLast,:,:)
pre = fpre->pre(ntStrt:ntLast,:,:)
tmn = ftmn->tmn(ntStrt:ntLast,:,:)
tmp = ftmp->tmp(ntStrt:ntLast,:,:)
tmx = ftmx->tmx(ntStrt:ntLast,:,:)
vap = fvap->vap(ntStrt:ntLast,:,:)
wet = fwet->wet(ntStrt:ntLast,:,:)
printVarSummary(cld) ; [time | 120] x [lat | 360] x [lon | 720]
; calculate monthly climatologies
cldclm = clmMonTLL(cld)
dtrclm = clmMonTLL(dtr)
frsclm = clmMonTLL(frs)
petclm = clmMonTLL(pet)
preclm = clmMonTLL(pre)
tmnclm = clmMonTLL(tmn)
tmpclm = clmMonTLL(tmp)
tmxclm = clmMonTLL(tmx)
vapclm = clmMonTLL(vap)
wetclm = clmMonTLL(wet)
printVarSummary(cldclm) ; [month | 12] x [lat | 360] x [lon | 720]
;************************************
; create plots ... very simple
;************************************
nt = 6
month = "July"
yrStrt = ymStrt/100
yrLast = ymLast/100
title = month+": "+yrStrt+"-"+yrLast
wks = gsn_open_wks("ps","cru") ; open a ps file
gsn_define_colormap(wks,"ncl_default") ; choose colormap; not needed 6.20 onward
plot = new(2,graphic) ; create graphic array
res = True
res@cnFillOn = True ; turn on color fill; not needed 6.20 onward
res@cnFillMode = "RasterFill" ; Raster Mode
res@cnLinesOn = False ; Turn off contour lines
res@gsnDraw = False ; do not draw picture
res@gsnFrame = False ; do not advance frame
res@lbOrientation = "Vertical" ; vertical label bar
resp = True
resp@gsnMaximize = True ; make ps, eps, pdf large
resp@txString = title+": CLD, FRS"
plot(0)=gsn_csm_contour_map_ce(wks,cldclm(nt,:,:),res)
plot(1)=gsn_csm_contour_map_ce(wks,frsclm(nt,:,:),res)
gsn_panel(wks,plot,(/2,1/),resp)
resp@txString = title+": PET, VAP"
plot(0)=gsn_csm_contour_map_ce(wks,petclm(nt,:,:),res)
plot(1)=gsn_csm_contour_map_ce(wks,vapclm(nt,:,:),res)
gsn_panel(wks,plot,(/2,1/),resp)
resp@txString = title+": TMN, TMX"
plot(0)=gsn_csm_contour_map_ce(wks,tmnclm(nt,:,:),res)
plot(1)=gsn_csm_contour_map_ce(wks,tmxclm(nt,:,:),res)
gsn_panel(wks,plot,(/2,1/),resp)
resp@txString = title+": TMP, DTR"
plot(0)=gsn_csm_contour_map_ce(wks,tmpclm(nt,:,:),res)
plot(1)=gsn_csm_contour_map_ce(wks,dtrclm(nt,:,:),res)
gsn_panel(wks,plot,(/2,1/),resp)
resp@txString = title+": WET, PRE"
plot(0)=gsn_csm_contour_map_ce(wks,wetclm(nt,:,:),res)
;colors = (/ ... /)
;res@cnFillPalette = colors ; optional: distinct colors for categories
res@cnLevelSelectionMode = "ExplicitLevels" ; use unequal spacing
res@cnLevels = (/2.0,10,25,37.5,50,75,100,125,150,175,200,300,400,500,750/)
plot(1)=gsn_csm_contour_map_ce(wks,preclm(nt,:,:),res)
gsn_panel(wks,plot,(/2,1/),resp)


@@ -0,0 +1,20 @@
;******************** Inputs Regarding Input and Output Data *************************************
;netCDFFilePath = "NULL-MYD04_L2.051-MIL2ASAE.0022-AERONET_AOD_L2.2-20112106165049.nc"
;outputFilePath = "plot-output"
;******************* Inputs Regarding Data Structure ***********************************************
;lPlotVariablesList = "mean_AERONET_AOD_L2_2_AOD0558intrp_Ames,mean_MIL2ASAE_0022_AOD0866b_Ames"
;rPlotVariablesList = "medn_MYD04_L2_051_AOD0550dpbl_l_Ames"
;xDimName = "time"
;xDimSize = 365
;******************* Inputs Regarding the View Annotations ****************************************
;title = "MAPSS Time Series"
;yLAxisLabel = "Mean AOD"
;yRAxisLabel = "Median AOD"
;*******************END INPUTS ********************************************************************

samples/NCL/hdf4sds_7.ncl Normal file

@@ -0,0 +1,128 @@
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
;**************************************************************
; User Input
;***************************************************************
; INPUT
diri = "./" ; input directory
fili = "wv_LV3_MET08_20050102_12345678_L00013712E00013712.hdf"
pltDir = "./" ; directory for plot output
sfx = get_file_suffix(fili,1)
;pltName = sfx@fBase ; output graphic name
pltName = "hdf4sds"
pltType = "ps"
;***************************************************************
; End User Input
;***************************************************************
;***************************************************************
; Open SEVIRI L3 'wv' HDF file
;***************************************************************
; Note the rather unusual data format: flag *prepended* to data value
;***************************************************************
; integer twc_lv3 ( fakeDim0, fakeDim1 )
; long_name : total water vapour column + flag
; units : fmmmm
; format : I4
; valid_range : ( 10000, 38000 )
; _FillValue : -99
; legend_01 : f = flag
; legend_02 : f = 1 averaged level 2 values
; legend_03 : f = 2 interpolated from averaged level 2 values
; legend_04 : f = 3 gaps filled with NVAP climatology
; legend_05 : mmmm = water vapour column in mm * 100. as integer
; legend_06 : Example: 11025 means: flag = 1, 10.25 mm water vapour column
; min_lat : -74.75
; max_lat : 61.75
; min_lon : -75.25
; max_lon : 75.25
; dlat : 0.5
; dlon : 0.5
;---------------------------------------------------------------
f = addfile (diri+fili, "r")
ifx = f->twc_lv3 ; fmmmm (integer)
printVarSummary(ifx)
flag = ifx/10000 ; extract flag
ix = ifx - flag*10000 ; extract mmmm
x = ix*0.01 ; scale
; create meta data for 'x'
dimx = dimsizes(x)
nlat = dimx(0) ; grid size x(nlat,mlon)
mlon = dimx(1)
lat = fspan(ifx@min_lat, ifx@max_lat, nlat)
lat@units = "degrees_north"
lon = fspan(ifx@min_lon, ifx@max_lon, mlon)
lon@units = "degrees_east"
x!0 = "lat"
x!1 = "lon"
x&lat = lat
x&lon = lon
x@long_name = "SEVIRI: Total Water Vapor"
x@units = "mm"
delete( [/ifx, ix/] ) ; no longer needed
;***************************************************************
; Create plot
;***************************************************************
wks = gsn_open_wks(pltType, pltDir+pltName)
plot = new (2, "graphic")
res = True ; plot mods desired
res@gsnAddCyclic = False ; data not global
res@gsnDraw = False
res@gsnFrame = False
res@cnFillOn = True ; turn on color fill
res@cnLinesOn = False ; turn off contour lines
res@cnFillMode = "RasterFill" ; Raster Mode
res@cnLineLabelsOn = False ; Turn off contour labels
res@cnMissingValFillColor= "background" ; "foreground"
res@mpCenterLonF = 0.5*(min(x&lon) + max(x&lon))
res@mpMinLatF = min(x&lat)
res@mpMaxLatF = max(x&lat)
res@mpMinLonF = min(x&lon)
res@mpMaxLonF = max(x&lon)
;res@lbOrientation = "Vertical"
plot(0) = gsn_csm_contour_map_ce(wks,x, res)
; plot flag
copy_VarCoords(x, flag)
flag@long_name = "Flag"
flag@units = "1=avg(L2), 2=int(L2), 3=NVAP"
print(flag&lat+" "+flag(:,{30}))
res@cnLevelSelectionMode = "ManualLevels" ; set manual contour levels
res@cnMinLevelValF = 2 ; set min contour level
res@cnMaxLevelValF = 3 ; one less than max
res@cnLevelSpacingF = 1 ; set contour spacing
res@lbLabelStrings = ispan(1,3,1) ; 1, 2, 3
res@lbLabelPosition = "Center" ; label position
res@lbLabelAlignment = "BoxCenters"
res@gsnLeftString = ""
res@gsnRightString = ""
res@gsnCenterString = "flag: 1=avg(L2), 2=int(L2), 3=NVAP"
plot(1) = gsn_csm_contour_map_ce(wks,flag, res)
resP = True ; modify the panel plot
resP@txString = fili
resP@gsnMaximize = True
gsn_panel(wks,plot,(/1,2/),resP) ; now draw as one plot
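The `fmmmm` packing decoded above (quality flag in the ten-thousands digit, water vapour column in mm * 100 in the remaining digits) can be sanity-checked in a few lines. Python is used here purely for illustration; it is not part of the sample:

```python
# Sanity check of the integer packing used in the SEVIRI sample above:
# ifx = flag*10000 + mmmm, where mmmm is the water vapour column in mm*100.
def decode_fmmmm(ifx):
    flag = ifx // 10000        # flag digit
    mmmm = ifx - flag * 10000  # remaining four digits
    return flag, mmmm * 0.01   # scale mm*100 -> mm

flag, wv = decode_fmmmm(11025)
print(flag, wv)  # → 1 10.25, matching the legend_06 example in the header
```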

samples/NCL/mask_12.ncl Normal file
@@ -0,0 +1,125 @@
;----------------------------------------------------------------------
; mask_12.ncl
;
; Concepts illustrated:
; - Using a worldwide shapefile to create a land/ocean mask
; - Masking a data array based on a geographical area
; - Attaching shapefile polylines to a map plot
; - Attaching lat/lon points to a map using gsn_coordinates
;----------------------------------------------------------------------
; Downloaded GSHHS shapefiles from:
;
; http://www.ngdc.noaa.gov/mgg/shorelines/data/gshhg/latest/
;
; Used the "coarsest" one: "GSHHS_shp/c/GSHHS_c_L1.shp".
;----------------------------------------------------------------------
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
load "./shapefile_mask_data.ncl"
;----------------------------------------------------------------------
; Main code
;----------------------------------------------------------------------
begin
WRITE_MASK = True
DEBUG = False
;---Read data to plot and mask
dir = "$NCARG_ROOT/lib/ncarg/data/cdf/"
cdf_prefix = "uv300"
cdf_file = dir + cdf_prefix + ".nc"
fin = addfile(cdf_file,"r")
u = fin->U(1,:,:)
;
; Create a mask array the same size as "u", using
; lat/lon data read off a shapefile.
;
shpfile = "GSHHS_shp/c/GSHHS_c_L1.shp"
opt = True
opt@return_mask = True
land_mask = shapefile_mask_data(u,shpfile,opt)
;---Mask "u" against land and ocean.
u_land_mask = where(land_mask.eq.1,u,u@_FillValue)
u_ocean_mask = where(land_mask.eq.0,u,u@_FillValue)
copy_VarMeta(u,u_land_mask)
copy_VarMeta(u,u_ocean_mask)
;---Start the graphics
wks = gsn_open_wks("ps","mask")
res = True
res@gsnMaximize = True ; maximize plot in frame
res@gsnDraw = False ; don't draw plot yet
res@gsnFrame = False ; don't advance frame yet
res@cnFillOn = True
res@cnLineLabelsOn = False
res@cnLinesOn = False
;---Make sure both plots have same contour levels
mnmxint = nice_mnmxintvl(min(u),max(u),25,False)
res@cnLevelSelectionMode = "ManualLevels"
res@cnMinLevelValF = mnmxint(0)
res@cnMaxLevelValF = mnmxint(1)
res@cnLevelSpacingF = mnmxint(2)
res@lbLabelBarOn = False
res@gsnAddCyclic = False
res@mpFillOn = False
res@mpOutlineOn = False
res@gsnRightString = ""
res@gsnLeftString = ""
;---Create plot of original data and attach shapefile outlines
res@tiMainString = "Original data with shapefile outlines"
map_data = gsn_csm_contour_map(wks,u,res)
dum1 = gsn_add_shapefile_polylines(wks,map_data,shpfile,False)
;---Create plots of masked data
res@tiMainString = "Original data masked against land"
map_land_mask = gsn_csm_contour_map(wks,u_land_mask,res)
res@tiMainString = "Original data masked against ocean"
map_ocean_mask = gsn_csm_contour_map(wks,u_ocean_mask,res)
if(DEBUG) then
mkres = True
; mkres@gsMarkerSizeF = 0.007
mkres@gsnCoordsAttach = True
gsn_coordinates(wks,map_data,u,mkres)
mkres@gsnCoordsNonMissingColor = "yellow"
mkres@gsnCoordsMissingColor = "black"
gsn_coordinates(wks,map_land_mask,u_land_mask,mkres)
gsn_coordinates(wks,map_ocean_mask,u_ocean_mask,mkres)
end if
;---Add shapefile outlines
dum2 = gsn_add_shapefile_polylines(wks,map_land_mask,shpfile,False)
dum3 = gsn_add_shapefile_polylines(wks,map_ocean_mask,shpfile,False)
;---Draw all three plots on one page
pres = True
pres@gsnMaximize = True
pres@gsnPanelLabelBar = True
gsn_panel(wks,(/map_data,map_land_mask,map_ocean_mask/),(/3,1/),pres)
if(WRITE_MASK) then
delete(fin) ; Close file before we open again.
;
; Make copy of file so we don't overwrite original.
; This is not necessary, but it's safer.
;
new_cdf_file = cdf_prefix + "_with_mask.nc"
system("/bin/cp " + cdf_file + " " + new_cdf_file)
finout = addfile(new_cdf_file,"w")
filevardef(finout, "land_mask", typeof(land_mask), (/ "lat", "lon" /) )
finout->land_mask = (/land_mask/)
end if
end
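The masking step above uses NCL's `where(land_mask.eq.1, u, u@_FillValue)`: keep a value where the mask condition holds, substitute the fill value elsewhere. A minimal sketch of the same logic on 1-D data, in Python for illustration only:

```python
# Emulate NCL's where(mask.eq.keep, values, _FillValue) on 1-D data.
FILL = -999.0  # stand-in for u@_FillValue

def mask_where(values, mask, keep):
    return [v if m == keep else FILL for v, m in zip(values, mask)]

u    = [1.0, 2.0, 3.0, 4.0]
land = [1, 0, 1, 0]            # 1 = land, 0 = ocean
print(mask_where(u, land, 1))  # → [1.0, -999.0, 3.0, -999.0]
print(mask_where(u, land, 0))  # → [-999.0, 2.0, -999.0, 4.0]
```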

samples/NCL/mcsst_1.ncl Normal file
@@ -0,0 +1,115 @@
;*****************************************************
; mcsst_1.ncl
;
; Concepts illustrated:
; - Plotting NAVO MCSST data
; - Using fbindirread to read in fortran binary data
; - Converting "byte" data to "float"
; - Adding meta data (attributes and coordinates) to a variable
; - Adding gray to an existing color map
; - Spanning all but the last two colors in a color map for contour fill
; - Drawing raster contours
;
;*****************************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
;***************************************
; type of data available on file
;***************************************
; ipar=0 Weekly Binned Sea Surface Temperature
; ipar=1 Number of Points in Bin
; ipar=2 Weekly Binned Sea Surface Temperature Anomaly
; ipar=3 Interpolated Sea Surface Temperature
; ipar=4 Interpolated Sea Surface Temperature Anomaly
;***************************************
begin
ipar = 3
fname = "2001311d18N16.dat"
tmp = fbindirread(fname,ipar,(/1024,2048/),"byte")
;***************************************
; convert to float and then change to true SST
;***************************************
xslope = 0.15
if(ipar.eq.4.or.ipar.eq.2)then ; anom has different intercept
yint = -20.0
end if
if(ipar.eq.3.or.ipar.eq.0)then
yint = -3.0
end if
sst = new((/1024,2048/),"float") ; create float var
sst = tmp*xslope+yint ; convert to float
delete(tmp) ; delete unnecessary array
;***************************************
; assign missing values. The original missing value was zero, but since it was
; not assigned in NCL, it was not recognized. The new missing values are
; listed below. These will be changed later.
;***************************************
if(ipar.eq.4)then
sst@_FillValue = -20
end if
if(ipar.eq.3.or.ipar.eq.0)then
sst@_FillValue = -3
end if
;***************************************
; create coordinate variables
;***************************************
nlat = 1024
dy = 180./nlat
lat = (90. -(ispan(0,1023,1)*dy))-dy/2
lat!0 = "lat"
lat&lat = lat
lat@units = "degrees_north"
nlon = 2048
dx = 360./nlon
lon = (ispan(0,2047,1)*dx)+dx/2-180. ; note -180. added by sjm to align
lon!0 = "lon"
lon&lon = lon
lon@units = "degrees_east"
;***************************************
; fill out the netCDF data model
;***************************************
sst!0 = "lat" ; name dimensions
sst!1 = "lon" ; ditto
sst = sst(::-1,:) ; reverse lat orientation
sst@long_name = "NAVO MCSST" ; assign long_name
sst@units = "deg C" ; assign units
sst&lat = lat ; assign lat cv
sst&lon = lon ; assign lon cv
sst@_FillValue = -999. ; assign missing value
;***************************************
; get year and day from filename
;***************************************
res = True ; plot mods desired
title = stringtochar(fname) ; parse file name to get date
year = title(0:3)
jday = title(4:6)
res@gsnCenterString = year+" "+jday ; create center string
;***************************************
; create plot
;***************************************
wks = gsn_open_wks("ps","mcsst") ; open workstation (plot destination)
gsn_define_colormap(wks,"BlGrYeOrReVi200") ; choose colormap
;
; This will not be necessary in V6.1.0 and later. Named colors can
; be used without having to first add them to the color map.
;
d = NhlNewColor(wks,0.8,0.8,0.8) ; add gray to colormap
res@cnFillOn = True ; turn on color
res@gsnSpreadColors = True ; use full range of colormap
res@gsnSpreadColorStart = 2 ; start at color 2
res@gsnSpreadColorEnd = -3 ; don't use added gray
res@cnLinesOn = False ; no contour lines
res@cnFillDrawOrder = "PreDraw" ; draw contours before continents
res@gsnMaximize = True ; maximize plot
; For a grid this size, it is better to use raster mode. It will be
; significantly faster, and will not go over NCL's 16MB default plot size.
res@cnFillMode = "RasterFill" ; turn on raster mode
plot = gsn_csm_contour_map_ce(wks,sst,res) ; contour the variable
end
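The byte-to-SST conversion above is a linear rescale, `sst = byte * xslope + yint`, where the intercept depends on the product type (`ipar`). A quick check of the `ipar=3` case, in Python for illustration only:

```python
# MCSST byte-to-temperature conversion for ipar = 0 or 3 (SST products):
# slope 0.15 degC per count, intercept -3.0 degC.
XSLOPE, YINT = 0.15, -3.0

def byte_to_sst(b):
    return b * XSLOPE + YINT

print(byte_to_sst(100))  # → 12.0 (degC)
```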

samples/NCL/primero.ncl Normal file
@@ -0,0 +1,3 @@
val=102
a=val/4.
print(a)

samples/NCL/topo_9.ncl Normal file
@@ -0,0 +1,172 @@
;----------------------------------------------------------------------
; topo_9.ncl
;
; Concepts illustrated:
; - Recreating a jpeg topographic image as an NCL map object
; - Zooming in on a jpeg image
; - Drawing a box around an area of interest on a map
; - Attaching polylines to a map
; - Using "overlay" to overlay multiple contour plots
; - Using more than 256 colors per frame
; - Using functions for cleaner code
;----------------------------------------------------------------------
; NOTE: This example will only work with NCL V6.1.0 and later.
;
; This script recreates a JPEG image that was converted to a NetCDF
; file with color separated bands using the open source tool
; "gdal_translate":
;
; gdal_translate -ot Int16 -of netCDF EarthMap_2500x1250.jpg \
; EarthMap_2500x1250.nc
;----------------------------------------------------------------------
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
;----------------------------------------------------------------------
; This function imports a JPEG image that's on the whole globe,
; and recreates it as an NCL map object that is zoomed in on the
; southern tip of Africa.
;----------------------------------------------------------------------
undef("recreate_jpeg_image")
function recreate_jpeg_image(wks,minlat,maxlat,minlon,maxlon)
begin
orig_jpg_filename = "EarthMap_2500x1250.jpg"
nc_filename = "EarthMap_2500x1250.nc"
;--You could use a system call to do the NetCDF conversion
; cmd = "gdal_translate -ot Int16 -of netCDF " + jpeg_filename + \
; " " + nc_filename)
; system(cmd)
;---Read the three bands of data
f = addfile(nc_filename,"r")
Band1 = where(f->Band1.gt.255, 255, f->Band1) ; red channel
Band2 = where(f->Band2.gt.255, 255, f->Band2) ; green channel
Band3 = where(f->Band3.gt.255, 255, f->Band3) ; blue channel
band_dims = dimsizes(Band3)
nlat = band_dims(0)
nlon = band_dims(1)
print("dimensions of image = " + nlat + " x " + nlon)
;
; Add lat/lon data so we can overlay on a map, and/or
; overlay contours. We know the image is global,
; cylindrical equidistant, and centered about lon=0.
;
lat = fspan( -90, 90,nlat)
lon = fspan(-180,180,nlon)
lat@units = "degrees_north"
lon@units = "degrees_east"
Band1!0 = "lat"
Band1!1 = "lon"
Band2!0 = "lat"
Band2!1 = "lon"
Band3!0 = "lat"
Band3!1 = "lon"
Band1&lat = lat
Band1&lon = lon
Band2&lat = lat
Band2&lon = lon
Band3&lat = lat
Band3&lon = lon
res = True
res@gsnMaximize = True
res@gsnFrame = False ; Don't draw or advance
res@gsnDraw = False ; frame yet.
res@cnFillOn = True
res@cnFillMode = "RasterFill" ; Raster fill can be faster
res@cnLevelSelectionMode = "EqualSpacedLevels"
res@cnMaxLevelCount = 254
res@cnFillBackgroundColor = (/ 1., 1., 1., 1./)
res@cnLinesOn = False ; Turn off contour lines.
res@cnLineLabelsOn = False ; Turn off contour labels
res@cnInfoLabelOn = False ; Turn off info label
res@lbLabelBarOn = False ; Turn off labelbar
res@gsnRightString = "" ; Turn off subtitles
res@gsnLeftString = ""
res@pmTickMarkDisplayMode = "Always"
;---Construct RGBA colormaps...
ramp = fspan(0., 1., 255)
reds = new((/255, 4/), float)
greens = new((/255, 4/), float)
blues = new((/255, 4/), float)
reds = 0
greens = 0
blues = 0
reds(:,0) = ramp
greens(:,1) = ramp
blues(:,2) = ramp
; The red contour map is plotted fully opaque; the green and blue
; are plotted completely transparent. When overlain, the colors
; combine (rather magically).
reds(:,3) = 1.
greens(:,3) = 0
blues(:,3) = 0
res@cnFillColors = greens
greenMap = gsn_csm_contour(wks, Band2, res)
res@cnFillColors = blues
blueMap = gsn_csm_contour(wks, Band3, res)
;---This will be our base, so make it a map plot.
res@cnFillColors = reds
res@gsnAddCyclic = False
res@mpFillOn = False
;---Zoom in on area of interest
res@mpMinLatF = minlat
res@mpMaxLatF = maxlat
res@mpMinLonF = minlon
res@mpMaxLonF = maxlon
redMap = gsn_csm_contour_map(wks, Band1, res)
;---Overlay everything to create the topo map
overlay(redMap, greenMap)
overlay(redMap, blueMap)
return(redMap)
end
;----------------------------------------------------------------------
; Main code
;----------------------------------------------------------------------
begin
;---Recreating jpeg images only works for X11 and PNG.
wks = gsn_open_wks("png","topo")
;---Southern part of Africa
minlat = -40
maxlat = 5
minlon = 10
maxlon = 40
map = recreate_jpeg_image(wks,minlat,maxlat,minlon,maxlon)
;---Overlay a red box
lonbox = (/ 15, 35, 35, 15, 15/)
latbox = (/-30,-30,-10,-10,-30/)
lnres = True
lnres@gsLineColor = "red" ; red box
lnres@gsLineThicknessF = 4.0 ; make box thicker
box = gsn_add_polyline(wks,map,lonbox,latbox,lnres)
draw(map) ; Drawing the map will draw the red box
frame(wks)
end

samples/NCL/traj_3.ncl Normal file
@@ -0,0 +1,120 @@
;*************************************************
; traj_3.ncl
;*************************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
external TRAJ "./particle.so"
;*************************************************
begin
path = "./data.asc"
data = asciiread(path,(/500,6/),"float")
;*************************************************
; some parameters
;*************************************************
np = 1
nq = 500
ncor= 8
xrot = new((/np,nq/),float)
yrot = new((/np,nq/),float)
xaxis = new(ncor,float)
yaxis = new(ncor,float)
;**************************************************
; convert data into rotated format
;**************************************************
TRAJ::particle(path,xrot,yrot,nq,np,xaxis,yaxis,ncor)
;**************************************************
; create plot
;**************************************************
wks = gsn_open_wks("ps","traj") ; Open an ps file
xyres = True
xyres@gsnFrame = False ; don't advance the frame
xyres@gsnDraw = False ; don't draw individual plots
xyres@tmXTBorderOn = False ; don't draw top axis
xyres@tmXBBorderOn = False ; don't draw bottom axis
xyres@tmYRBorderOn = False ; don't draw right axis
xyres@tmYLBorderOn = False ; don't draw left axis
xyres@tmXTOn = False ; don't draw top-axis tick marks
xyres@tmXBOn = False ; don't draw bottom-axis tick marks
xyres@tmYROn = False ; don't draw right-axis tick marks
xyres@tmYLOn = False ; don't draw left-axis tick marks
xyres@xyLineColors = (/"red"/) ; set the line color to red
xyres@xyLineThicknessF = 4.0 ; 4 times the line thickness
xyres@trXMaxF = 15000 ; choose range of axis even though
xyres@trXMinF = -10000 ; we don't see them
xyres@trYMaxF = 1000
xyres@trYMinF = -1000
plot = gsn_xy(wks,xrot,yrot,xyres) ; Draw trajectory
;**********************************************
; create arrays needed for the bounding box
;**********************************************
a1 = new(5,float)
b1 = new(5,float)
a2 = new(5,float)
b2 = new(5,float)
a3 = new(2,float)
b3 = new(2,float)
a4 = new(2,float)
b4 = new(2,float)
a5 = new(2,float)
b5 = new(2,float)
a6 = new(2,float)
b6 = new(2,float)
a0 = new(2,float)
b0 = new(2,float)
;**********************************************
; determine values of each bounding line from information
; returned from particle.f
;**********************************************
a1(0:3) = xaxis(:3)
b1(0:3) = yaxis(:3)
a1(4) = xaxis(0)
b1(4) = yaxis(0)
a2(0:3) = xaxis(4:)
b2(0:3) = yaxis(4:)
a2(4) = xaxis(4)
b2(4) = yaxis(4)
a3 = xaxis(0:4:4)
b3 = yaxis(0:4:4)
a4 = xaxis(1:5:4)
b4 = yaxis(1:5:4)
a5 = xaxis(2:6:4)
b5 = yaxis(2:6:4)
a6 = xaxis(3:7:4)
b6 = yaxis(3:7:4)
a0(0) = xaxis(3)
b0(0) = yaxis(3)
a0(1) = xrot(0,0)
b0(1) = yrot(0,0)
;***************************************************************
; create bounding box by drawing multiple xy plots on top of
; each other. each with their individual axis turned off.
;***************************************************************
xyres@xyLineColors = (/"black"/) ; line color
xyres@xyLineThicknessF = 1.0 ; regular line thickness
bottom = gsn_xy(wks,a1,b1,xyres) ; Draw the bottom bounding box.
top = gsn_xy(wks,a2,b2,xyres) ; Draw the top bounding box.
side1 = gsn_xy(wks,a3,b3,xyres) ; Draw a side line.
side2 = gsn_xy(wks,a4,b4,xyres) ; Draw a side line.
side3 = gsn_xy(wks,a5,b5,xyres) ; Draw a side line.
side4 = gsn_xy(wks,a6,b6,xyres) ; Draw a side line.
;***************************************************************
; now draw a large brown line to represent the chimney
;***************************************************************
xyres@xyLineColors = (/"brown"/) ; chimney color
xyres@xyLineThicknessF = 9.0 ; thick line
xyres@tiMainString = "Pollutant Trajectory in a 3D Volume"
chimney = gsn_xy(wks,a0,b0,xyres) ; Draw the chimney.
draw(wks)
frame(wks)
end

samples/NCL/tsdiagram_1.ncl Normal file
@@ -0,0 +1,167 @@
; Read potential temp (TEMP), salinity (SALT)
; Compute potential density (PD) for specified range PD(t,s)
; (use ncl function based on Yeager's algorithm for rho computation)
; Assumes annual and zonally averaged input data set (i.e., one time slice)
; Used K.Lindsay's "za" for zonal avg -- already binned into basins
; Plots temp vs salt (scatter plot), pd overlay
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/shea_util.ncl"
begin
; ================================> ; PARAMETERS
case = "PHC2_gx1v3"
ocnfile = "za_PHC2_T_S_gx1v3.nc"
depth_min = 14895.82 ; in cm, depth of first layer to be included
depth_max = 537499.9
;
; plot limits
;
smincn = 32.5
smaxcn = 37.0
tmincn = -2.
tmaxcn = 22.
;
; Choose basin index
;
; 0 = global 1 = southern ocean 2 = pacific 3 = indian 6 = atlantic
; 8 = labrador 9 = GIN 10 = arctic
;
bi = 2
;=====> basin check
if(bi.lt.0.or.bi.gt.10) then
print("basin index "+ bi + " not supported")
exit
end if
if(bi.eq.0) then
basin = "Global"
blab = "global"
end if
if(bi.eq.1) then
basin = "Southern Ocean"
blab = "so"
end if
if(bi.eq.2) then
basin = "Pacific Ocean"
blab = "pacific"
end if
if(bi.eq.3) then
basin = "Indian Ocean"
blab = "indian"
end if
if(bi.eq.6) then
basin = "Atlantic Ocean"
blab = "atlanticn"
end if
if(bi.eq.8) then
basin = "Labrador Sea"
blab = "lab"
end if
if(bi.eq.9) then
basin = "GIN Sea"
blab = "gin"
end if
if(bi.eq.10) then
basin = "Arctic Ocean"
blab = "arctic"
end if
;=====> initial resource settings
wks = gsn_open_wks("ps","tsdiagram") ; Open a Postscript file
;===== data
focn = addfile(ocnfile, "r")
salt = focn->SALT(0,:,{depth_min:depth_max},:) ;(basins, z_t, lat_t)
temp = focn->TEMP(0,:,{depth_min:depth_max},:)
;====section out choice basin
temp_ba = temp(bi,:,:)
salt_ba = salt(bi,:,:)
;===== put into scatter array format
tdata_ba = ndtooned(temp_ba)
sdata_ba = ndtooned(salt_ba)
ydata = tdata_ba
xdata = sdata_ba
;============== compute potential density (PD), using rho_mwjf
;
; for potential density, depth = 0. (i.e. density as if brought to surface)
;
;===========================================================================
; WARNING: T-S diagrams use POTENTIAL DENSITY... if you set depth to something
; other than 0, then you will be plotting density contours computed for the
; specified depth layer.
;===========================================================================
depth = 0. ;in meters
tspan = fspan(tmincn,tmaxcn,51)
sspan = fspan(smincn,smaxcn,51)
; the more points the better... using Yeager's numbers
t_range = conform_dims((/51,51/),tspan,0)
s_range = conform_dims((/51,51/),sspan,1)
pd = rho_mwjf(t_range,s_range,depth)
pd!0 = "temp"
pd!1 = "salt"
pd&temp = tspan
pd&salt = sspan
pd = 1000.*(pd-1.) ; Put into kg/m3 pot den units
; printVarSummary(pd)
; printVarInfo(pd,"rho_mwjf")
;=================Graphics
;--- scatter plot
res = True
res@gsnMaximize = True
res@gsnDraw = False
res@gsnFrame = False
res@xyMarkLineModes = "Markers"
res@xyMarkers = 16
res@xyMarkerColors = "black"
res@pmLegendDisplayMode = "Never"
res@txFontHeightF = 0.01
res@tiMainString = case + " ANN AVG: T-S Diagram"
res@tiXAxisString = salt@units
res@tiXAxisFontHeightF = 0.02
res@tiYAxisString = temp@units
res@tiYAxisFontHeightF = 0.02
res@trXMinF = smincn
res@trXMaxF = smaxcn
res@trYMinF = tmincn
res@trYMaxF = tmaxcn
res@gsnRightString = depth_min/100. + "-"+depth_max/100. +"m"
res@gsnLeftString = basin
plot = gsn_csm_xy(wks,xdata,ydata,res)
;----- pd overlay
resov = True
resov@gsnDraw = False
resov@gsnFrame = False
resov@cnLevelSelectionMode = "AutomaticLevels"
resov@cnInfoLabelOn = False
resov@cnLineLabelPlacementMode = "Constant"
resov@cnLineLabelFontHeightF = 0.02
plotpd = gsn_csm_contour(wks,pd,resov)
overlay(plot,plotpd)
draw(plot)
frame(wks)
end
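The `pd = 1000.*(pd-1.)` step above converts the density returned by `rho_mwjf` (values near 1.0xx) into the conventional sigma units used on T-S diagrams. Sketched in Python for illustration only:

```python
# Convert rho (~1.0xx, as returned by rho_mwjf) to sigma units,
# i.e. the potential density anomaly in kg/m3: sigma = 1000 * (rho - 1).
def to_sigma(rho):
    return 1000.0 * (rho - 1.0)

print(round(to_sigma(1.025), 6))  # → 25.0
```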

samples/NCL/unique_9.ncl Normal file
@@ -0,0 +1,141 @@
;************************************
; unique_9.ncl
;
; Concepts illustrated:
; - Drawing raster contours over a map
; - Creating a topography plot using raster contours
; - Reading data from binary files
; - Manually creating lat/lon coordinate arrays
; - Customizing a labelbar for a contour plot
;************************************
; This example generates a topo map over
; the area of Trinidad, Colorado.
;************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
begin
wks = gsn_open_wks("ps","unique")
;----------------- read the west binary data -------------------------
binfile = "trinidad-w.bin"
quad_name = fbinrecread(binfile,0,60,"character")
map_cornersW = fbinrecread(binfile,1,4,"double")
lonW = fbinrecread(binfile,2,(/1201/),"double")
latW = fbinrecread(binfile,3,(/1201/),"double")
minmax_elevW = fbinrecread(binfile,4,2,"double")
tmpW = fbinrecread(binfile,5,(/1201,1201/),"integer")
;----------------- read the east binary data -------------------------
binfile = "trinidad-e.bin"
quad_name = fbinrecread(binfile,0,60,"character")
map_cornersE = fbinrecread(binfile,1,4,"double")
lonE = fbinrecread(binfile,2,(/1201/),"double")
latE = fbinrecread(binfile,3,(/1201/),"double")
minmax_elevE = fbinrecread(binfile,4,2,"double")
tmpE = fbinrecread(binfile,5,(/1201,1201/),"integer")
;----------------------------------------------------------------------
min_elev = min((/minmax_elevW(0),minmax_elevE(0)/))*3.28
max_elev = max((/minmax_elevW(1),minmax_elevE(1)/))*3.28
lat = new(1201,"double")
lat = latW
lat!0 = "lat"
lat&lat = latW ; same as latE
lat@long_name = "latitude"
lat@units = "degrees_north"
lon = new(2401,"double")
lon(0:1200) = lonW
lon(1201:2400) = lonE(1:1200)
lon!0 = "lon"
lon&lon = lon
lon@long_name = "longitude"
lon@units = "degrees_east"
data = new((/1201,2401/),"float") ; (lat,lon)
data!0 = "lat"
data&lat = lat
data!1 = "lon"
data&lon = lon
data(:,0:1200) = (/tmpW*3.28/) ; convert to feet
data(:,1201:2400) = (/tmpE(:,1:1200)*3.28/) ; convert to feet
;-------------------------------------------------------------
;
; Define colormap.
;
cmap = (/(/1.00, 1.00, 1.00/),(/0.00, 0.00, 0.00/), \
(/0.51, 0.13, 0.94/),(/0.00, 0.00, 0.59/), \
(/0.00, 0.00, 0.80/),(/0.25, 0.41, 0.88/), \
(/0.12, 0.56, 1.00/),(/0.00, 0.75, 1.00/), \
(/0.63, 0.82, 1.00/),(/0.82, 0.96, 1.00/), \
(/1.00, 1.00, 0.78/),(/1.00, 0.88, 0.20/), \
(/1.00, 0.67, 0.00/),(/1.00, 0.43, 0.00/), \
(/1.00, 0.00, 0.00/),(/0.78, 0.00, 0.00/), \
(/0.63, 0.14, 0.14/),(/1.00, 0.41, 0.70/)/)
gsn_define_colormap(wks,cmap)
res = True
res@gsnMaximize = True
res@gsnAddCyclic = False
; map plot resources
res@mpFillOn = False
res@mpLimitMode = "Corners"
res@mpDataBaseVersion = "Ncarg4_1"
res@mpOutlineBoundarySets = "AllBoundaries"
res@mpLeftCornerLonF = map_cornersW(0)
res@mpLeftCornerLatF = map_cornersW(1)
res@mpRightCornerLonF = map_cornersE(2)
res@mpRightCornerLatF = map_cornersE(3)
; contour resources
res@cnFillOn = True
res@cnLinesOn = False
res@cnFillMode = "RasterFill"
res@cnLevelSelectionMode = "ExplicitLevels"
res@cnLevels = (/ 5000., 6000., 7000., 8000., 8500., 9000., \
9500.,10000.,10500.,11000.,11500.,12000., \
12500.,13000.,13500./)
; tickmark resources
res@pmTickMarkDisplayMode = "Always"
res@tmXBLabelFontHeightF = 0.010
; labelbar resources
res@pmLabelBarWidthF = 0.60
res@txFontHeightF = 0.012
res@lbTitleString = "elevation above mean sea level (feet)"
res@lbTitleFontHeightF = 0.012
res@lbLabelFontHeightF = 0.008
res@lbTitleOffsetF = -0.27
res@lbBoxMinorExtentF = 0.15
res@pmLabelBarOrthogonalPosF = -.05
; title resources
res@tiMainString = "USGS DEM TRINIDAD (1 x 2 degrees)"
res@tiMainOffsetYF = -0.02 ; Move title down towards graphic.
res@tiMainFontHeightF = 0.015
res@gsnLeftString = "Min Elevation: "+min_elev
res@gsnRightString = "Max Elevation: "+max_elev
res@gsnCenterString = "Scale 1:250,000"
plot = gsn_csm_contour_map(wks,data,res)
end
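The west and east DEM tiles above are each 1201 columns wide and share their boundary column, so stitching drops the duplicated first column of the east tile: 1201 + 1200 = 2401 unique longitudes. The index bookkeeping, sketched in Python for illustration only:

```python
# Two 1201-point tiles sharing one edge column stitch to 2401 points.
lon_w = list(range(0, 1201))     # stand-in for lonW
lon_e = list(range(1200, 2401))  # stand-in for lonE; lon_e[0] == lon_w[-1]
lon = lon_w + lon_e[1:]          # drop the shared column once
print(len(lon))  # → 2401
```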

samples/NCL/viewport_4.ncl Normal file
@@ -0,0 +1,131 @@
; ***********************************************
; viewport_4.ncl
;
; Concepts illustrated:
; - Drawing an XY plot with multiple curves
; - Using drawNDCGrid to draw a nicely labeled NDC grid
; - Changing the size/shape of an XY plot using viewport resources
; - Drawing two XY plots on the same page using viewport resources
; - Drawing polylines, polymarkers, and text in NDC space
; - Using "getvalues" to retrieve resource values
; - Maximizing plots after they've been created
; ***********************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/shea_util.ncl"
;********************************************************************
; Draw a box around the viewport of the given object.
;********************************************************************
procedure draw_vp_box(wks,plot)
local vpx, vpy, vpw, vph, xbox, ybox, lnres, mkres, txres
begin
; Retrieve the viewport values of the drawable object.
getvalues plot
"vpXF" : vpx
"vpYF" : vpy
"vpWidthF" : vpw
"vpHeightF" : vph
end getvalues
; Set up some marker resources.
mkres = True
mkres@gsMarkerIndex = 16 ; filled dot
mkres@gsMarkerSizeF = 0.02 ; larger than default
mkres@gsMarkerColor = "Red"
; Draw a single marker at the vpXF/vpYF location.
gsn_polymarker_ndc(wks,vpx,vpy,mkres)
; Set up some text resources.
txres = True
txres@txJust = "BottomLeft"
txres@txFontHeightF = 0.018
txres@txFontColor = "Blue"
txres@txBackgroundFillColor = "white"
gsn_text_ndc(wks,"(vpXF="+vpx+", vpYF="+vpy+")",vpx,vpy+0.02,txres)
; Set up some line resources.
lnres = True
lnres@gsLineColor = "Red" ; line color
lnres@gsLineThicknessF = 2.0 ; twice as thick
; Draw lines indicating the width and height
xline = (/vpx, vpx+vpw/)
yline = (/vpy-0.05,vpy-0.05/)
gsn_polyline_ndc(wks,xline,yline,lnres)
xline = (/vpx+0.05,vpx+0.05/)
yline = (/vpy,vpy-vph/)
gsn_polyline_ndc(wks,xline,yline,lnres)
txres@txJust = "CenterCenter"
gsn_text_ndc(wks,"vpWidthF = " + vpw,vpx+vpw/2.,vpy-0.05,txres)
txres@txAngleF = 90.
gsn_text_ndc(wks,"vpHeightF = " + vph,vpx+0.05,vpy-vph/2.,txres)
end
;********************************************************************
; Main code
;********************************************************************
begin
;************************************************
; read in data
;************************************************
f = addfile ("$NCARG_ROOT/lib/ncarg/data/cdf/uv300.nc","r")
u = f->U ; get u data
;************************************************
; plotting parameters
;************************************************
wks = gsn_open_wks ("ps","viewport") ; open workstation
res = True ; plot mods desired
res@gsnFrame = False ; don't advance frame yet
res@vpWidthF = 0.8 ; set width and height
res@vpHeightF = 0.3
; First plot
res@tiMainString = "Plot 1"
res@vpXF = 0.15
res@vpYF = 0.9 ; Higher on the page
plot1 = gsn_csm_xy (wks,u&lat,u(0,:,{82}),res) ; create plot
; Second plot
res@tiMainString = "Plot 2"
res@vpXF = 0.15 ; Same X location as first plot
res@vpYF = 0.4 ; Lower on the page
plot2 = gsn_csm_xy (wks,u&lat,u(0,:,{3}),res) ; create plot
; Advance the frame
frame(wks)
; Now draw the two plots with illustrations.
drawNDCGrid(wks) ; Draw helpful grid lines showing NDC square.
draw(plot1) ; Draw the two plots
draw(plot2)
draw_vp_box(wks,plot1) ; Draw boxes around the two viewports.
draw_vp_box(wks,plot2)
frame(wks) ; Advance the frame.
;
; Uncomment the next two lines if you want to maximize these plots for
; PS or PDF output.
;
; psres = True
; maximize_output(wks,psres) ; calls draw and frame for you
end

@@ -0,0 +1,120 @@
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
begin
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;
; Example of plotting station model data over a map
; illustrating how the wind barb directions are adjusted
; for the map projection.
;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;
; City names.
;
cities = (/ "NCAR", "Seattle", "San Francisco", \
"Los Angeles", "Billings", "El Paso", \
"Houston", "Kansas City", "Minneapolis", \
"Chicago", "Detroit", "Atlanta", \
"Miami", "New York", "Eugene", \
"Boise", "Salt Lake", "Phoenix", \
"Albuquerque", "Bismarck", "Tulsa", \
"Dallas", "Little Rock", "Lexington", \
"Charlotte", "Norfolk", "Bangor" \
/)
city_lats = (/ 40.0, 47.6, 37.8, \
34.1, 45.8, 31.8, \
29.8, 39.1, 45.0, \
41.9, 42.3, 33.8, \
25.8, 40.8, 44.1, \
43.6, 40.7, 33.5, \
35.1, 46.7, 36.0, \
32.8, 34.7, 38.1, \
35.2, 36.8, 44.8 \
/)
city_lons = (/ -105.0, -122.3, -122.4, \
-118.3, -108.5, -106.5, \
-095.3, -094.1, -093.8, \
-087.6, -083.1, -084.4, \
-080.2, -074.0, -123.1, \
-116.2, -111.9, -112.1, \
-106.6, -100.8, -096.0, \
-096.8, -092.3, -084.1, \
-080.8, -076.3, -068.8 \
/)
;
; Station model data for the 27 cities.
;
imdat = (/"11000000751126021360300004955054054600007757087712", \
"11103100011104021080300004959055050600517043080369", \
"11206200031102021040300004963056046601517084081470", \
"11309300061000021020300004967057042602017125082581", \
"11412400091002021010300004971058038602517166083592", \
"11515500121004020000300004975050034603017207084703", \
"11618600151006020030300004979051030603507248085814", \
"11721700181008020050300004983052026604007289086925", \
"11824800211009020070300004987053022604507323087036", \
"11927900241011020110300004991054018605017364088147", \
"11030000271013020130300004995055014605517405089258", \
"11133100301015020170300004999056010606017446080369", \
"11236200331017020200300004000057006606517487081470", \
"11339300361019020230300004004058002607017528082581", \
"11442400391021020250300004008050000607517569083692", \
"11545500421023020270300004012051040608017603084703", \
"11648600451025020290300004017052008608517644085814", \
"11751700481027020310300004021053012609017685086925", \
"11854800511029020330300004025054016609507726087036", \
"11958900541031020360300004029055018610007767088147", \
"11060000571033020380300004033056030610507808089258", \
"11163100601035020410300004037057034611007849080369", \
"11266200631037020430300004041058043611507883081470", \
"11369300661039020470300004045050041612007924082581", \
"11472400691041020500300004048051025612507965083692", \
"11575500721043020530300004051052022613507996084703", \
"11678600751048021580300004055053013614007337085814" \
/)
;
; Define a color map and open a workstation.
;
cmap = (/ \
(/ 1., 1., 1. /), \ ; color index 0 - white
(/ 0., 0., 0. /) \ ; color index 1 - black
/)
wks = gsn_open_wks("ps","weather_sym")
gsn_define_colormap(wks,cmap)
;
; Draw a world map.
;
mpres = True
mpres@gsnFrame = False
mpres@mpSatelliteDistF = 1.3
mpres@mpOutlineBoundarySets = "USStates"
mpres@mpCenterLatF = 40.
mpres@mpCenterLonF = -97.
mpres@mpCenterRotF = 35.
map = gsn_map(wks,"Satellite",mpres)
;
; Scale the station model plot (all aspects of the station
; model plots are scaled as per the size of the wind barb).
;
wmsetp("wbs",0.018)
;
; In the middle of Nebraska, draw a wind barb for a north wind
; with a magnitude of 15 knots.
;
wmbarbmap(wks,42.,-99.,0.,-15.)
;
; Draw the station model data at the selected cities. The call
; to wmsetp informs wmstnm that the wind barbs will be drawn over
; a map. To illustrate the adjustment for plotting the model
; data over a map, all winds are from the north.
;
wmsetp("ezf",1)
wmstnm(wks,city_lats,city_lons,imdat)
frame(wks)
end

151
samples/NCL/xy_29.ncl Normal file

@@ -0,0 +1,151 @@
; xy_29.ncl
;
; Concepts illustrated:
; - Reading data from an ASCII file with headers
; - Creating a separate procedure to create a specific plot
; - Attaching polymarkers to an XY plot
;
; This script was originally from Dr. Birgit Hassler (NOAA)
;****************************************************
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
;************************************************
; Plot Procedure
;************************************************
procedure plotTCOPolym(pltName[1]:string, pltType[1]:string, filName[1]:string \
,xTitle[1]:string , yTitle[1]:string \
,year[*]:numeric, y[*]:numeric)
local wks, res, ntim, gsres, MarkerCol, OldYear, i, xmarker, ymarker
begin
wks = gsn_open_wks(pltType,pltName)
gsn_define_colormap(wks,"default")
res = True
res@gsnMaximize = True ; make "ps", "eps", "pdf" large
res@vpHeightF = 0.5 ; change aspect ratio of plot
res@vpWidthF = 0.75
res@vpXF = 0.15 ; start plot at x ndc coord
res@tiXAxisString = xTitle
res@tiYAxisString = yTitle
res@tiMainString = filName
ntim = dimsizes(year)
res@trXMinF = year(0)-1
res@trXMaxF = year(ntim-1)+1
res@gsnDraw = False
res@gsnFrame = False
res@xyMarkLineMode = "markers"
res@xyMarker = 16
res@xyMarkerColor = "Background"
plot = gsn_csm_xy (wks,year,y,res) ; create plot framework
; add different color polymarkers for each year
gsres = True
MarkerCol = 2
OldYear = year(0)
do i=0,ntim-1
xmarker = year(i)
ymarker = y(i)
if (i.gt.0) then
if (year(i).gt.OldYear) then
MarkerCol = MarkerCol+1
end if
OldYear = year(i)
end if
gsres@gsMarkerColor = MarkerCol
gsres@gsMarkerIndex = 16
;gsres@gsMarkerSizeF = 15.0
; add (attach) polymarkers to existing plot object
plot@$unique_string("dum")$ = gsn_add_polymarker(wks,plot,xmarker,ymarker,gsres)
end do
draw(plot)
frame(wks)
end
;***********************************************************
; MAIN
;***********************************************************
pltType = "ps" ; "ps", "eps", "png", "x11"
; read multiple ascii file names
;;fili = "Southpole_TCOTimeSeries_11.dat"
diri = "./"
fili = systemfunc("cd "+diri+" ; ls *TCOT*dat")
print(fili)
nfil = dimsizes(fili)
nhead= 4 ; number of header lines on ascii file(s)
ncol = 4 ; year, month, day, O3
do nf=0,nfil-1
sfx = get_file_suffix(fili(nf), 0) ; sfx = ".dat"
filx = sfx@fBase ; filx= "Southpole_TCOTimeSeries_11"
; read ascii files
data = readAsciiTable(diri+fili(nf), ncol, "float", nhead)
dimd = dimsizes(data)
ntim = dimd(0) ; # rows
year = toint( data(:,0) ) ; user decision ... convert to integer
mon = toint( data(:,1) )
day = toint( data(:,2) )
hour = new (ntim, "integer", "No_FillValue")
mn = new (ntim, "integer", "No_FillValue")
sec = new (ntim, "double" , "No_FillValue")
hour = 0
mn = 0
sec = 0d0
; create COARDS/udunits time variable
;;tunits = "days since 1900-01-01 00:00:0.0"
tunits = "days since "+year(0)+"-"+mon(0)+"-"+day(0)+" 00:00:0.0"
time = cd_inv_calendar(year,mon,day,hour,mn,sec,tunits, 0)
time!0 = "time"
time&time = time
;printVarSummary(time)
; create a Gregorian 'date' variable
date = year*10000 + mon*100 + day
date!0 = "time"
date@units = "yyyymmdd"
date&time = time
;printVarSummary(date)
O3 = data(:,3)
O3@long_name = "total column ozone"
O3@units = "DU"
O3!0 = "time"
O3&time = time
;printVarSummary(O3)
;print(" ")
;print(date+" "+time+" "+O3)
; plot
yTitle = O3@long_name
year@long_name = "YEAR"
plotTCOPolym (filx, pltType, fili(nf), year@long_name, yTitle, year, O3)
delete(time) ; delete ... size (# rows) may change in the next file
delete(date)
delete(year)
delete(mon )
delete(day )
delete(mn )
delete(sec )
delete(O3 )
delete(data)
end do


@@ -0,0 +1,16 @@
//
// Siesta.h
// Siesta
//
// Created by Paul on 2015/6/14.
// Copyright © 2015 Bust Out Solutions. MIT license.
//
#import <UIKit/UIKit.h>
//! Project version number for Siesta.
FOUNDATION_EXPORT double SiestaVersionNumber;
//! Project version string for Siesta.
FOUNDATION_EXPORT const unsigned char SiestaVersionString[];

699
samples/Perl6/List.pm Normal file

@@ -0,0 +1,699 @@
# for our tantrums
my class X::TypeCheck { ... }
my role Supply { ... }
my sub combinations($n, $k) {
my @result;
my @stack;
return ([],) unless $k;
@stack.push(0);
gather while @stack {
my $index = @stack - 1;
my $value = @stack.pop;
while $value < $n {
@result[$index++] = $value++;
@stack.push($value);
if $index == $k {
take [@result];
$value = $n; # fake a last
}
}
}
}
my sub permutations(Int $n) {
$n == 1 ?? ( [0,] ) !!
gather for ^$n -> $i {
my @i = grep none($i), ^$n;
take [$i, @i[@$_]] for permutations($n - 1);
}
}
my class List does Positional { # declared in BOOTSTRAP
# class List is Iterable is Cool
# has Mu $!items; # VM's array of our reified elements
# has Mu $!flattens; # true if this list flattens its parcels
# has Mu $!nextiter; # iterator for generating remaining elements
method new(|) {
my Mu $args := nqp::p6argvmarray();
nqp::shift($args);
nqp::p6list($args, self.WHAT, Mu);
}
multi method Bool(List:D:) { self.gimme(1).Bool }
multi method Int(List:D:) { self.elems }
multi method end(List:D:) { self.elems - 1 }
multi method Numeric(List:D:) { self.elems }
multi method Str(List:D:) { self.join(' ') }
# Pretend we're a Match assuming we're a list of Matches
method to() { self.elems ?? self[self.end].to !! Nil }
method from() { self.elems ?? self[0].from !! Nil }
method fmt($format = '%s', $separator = ' ') {
self.map({ .fmt($format) }).join($separator);
}
method flat() { self.flattens
?? self
!! nqp::p6list(nqp::list(self), List, Bool::True)
}
method list() { self }
method lol() {
self.gimme(0);
my Mu $rpa := nqp::clone($!items);
nqp::push($rpa, $!nextiter) if $!nextiter.defined;
nqp::p6list($rpa, LoL, Mu);
}
method flattens() { $!flattens }
method Capture() {
self.gimme(*);
my $cap := nqp::create(Capture);
nqp::bindattr($cap, Capture, '$!list', $!items);
$cap
}
method Parcel() {
my Mu $rpa := nqp::clone(nqp::p6listitems(self));
nqp::push($rpa, $!nextiter) if $!nextiter.defined;
nqp::p6parcel($rpa, Any);
}
method Supply(List:D:) { Supply.from-list(self) }
multi method at_pos(List:D: int \pos) is rw {
fail X::OutOfRange.new(:what<Index>,:got(pos),:range<0..Inf>)
if nqp::islt_i(pos,0);
self.exists_pos(pos) ?? nqp::atpos($!items,pos) !! Nil;
}
multi method at_pos(List:D: Int:D \pos) is rw {
my int $pos = nqp::unbox_i(pos);
fail X::OutOfRange.new(:what<Index>,:got(pos),:range<0..Inf>)
if nqp::islt_i($pos,0);
self.exists_pos($pos) ?? nqp::atpos($!items,$pos) !! Nil;
}
method eager() { self.gimme(*); self }
method elems() {
return 0 unless self.DEFINITE;
return nqp::elems(nqp::p6listitems(self)) unless nqp::defined($!nextiter);
# Get as many elements as we can. If gimme stops before
# reaching the end of the list, assume the list is infinite.
my $n := self.gimme(*);
nqp::defined($!nextiter) ?? Inf !! $n
}
multi method exists_pos(List:D: int $pos) {
return False if nqp::islt_i($pos,0);
self.gimme($pos + 1);
nqp::p6bool(
nqp::not_i(nqp::isnull(nqp::atpos($!items,$pos)))
);
}
multi method exists_pos(List:D: Int:D $pos) {
return False if $pos < 0;
self.gimme($pos + 1);
nqp::p6bool(
nqp::not_i(nqp::isnull(nqp::atpos($!items,nqp::unbox_i($pos))))
);
}
method gimme($n, :$sink) {
return unless self.DEFINITE;
# loop through iterators until we have at least $n elements
my int $count = nqp::elems(nqp::p6listitems(self));
if nqp::istype($n, Whatever) || nqp::istype($n, Num) && nqp::istrue($n == Inf) {
while $!nextiter.DEFINITE && !$!nextiter.infinite {
$!nextiter.reify(*, :$sink);
$count = nqp::elems($!items);
}
}
else {
my int $target = $n.Int;
while nqp::isconcrete($!nextiter) && $count < $target {
$!nextiter.reify($target - $count, :$sink);
$count = nqp::elems($!items);
}
}
# return the number of elements we have now
$count
}
multi method infinite(List:D:) { $!nextiter.infinite }
method iterator() {
# Return a reified ListIter containing our currently reified elements
# and any subsequent iterator.
my $iter := nqp::create(ListIter);
nqp::bindattr($iter, ListIter, '$!nextiter', $!nextiter);
nqp::bindattr($iter, ListIter, '$!reified', self.Parcel());
$iter;
}
method munch($n is copy) {
$n = 0 if $n < 0;
$n = self.gimme($n) if nqp::not_i(nqp::istype($n, Int))
|| nqp::not_i(nqp::islist($!items))
|| nqp::islt_i(nqp::elems($!items), nqp::unbox_i($n));
nqp::p6parcel(
nqp::p6shiftpush(nqp::list(), $!items, nqp::unbox_i($n)),
Any
)
}
proto method pick(|) { * }
multi method pick() {
fail "Cannot .pick from infinite list" if self.infinite;
my $elems = self.elems;
$elems ?? self.at_pos($elems.rand.floor) !! Nil;
}
multi method pick($n is copy) {
fail "Cannot .pick from infinite list" if self.infinite;
## We use a version of Fisher-Yates shuffle here to
## replace picked elements with elements from the end
## of the list, resulting in an O(n) algorithm.
my $elems = self.elems;
return unless $elems;
$n = Inf if nqp::istype($n, Whatever);
$n = $elems if $n > $elems;
return self.at_pos($elems.rand.floor) if $n == 1;
my Mu $rpa := nqp::clone($!items);
my $i;
my Mu $v;
gather while $n > 0 {
$i = nqp::rand_I(nqp::decont($elems), Int);
$elems--; $n--;
$v := nqp::atpos($rpa, nqp::unbox_i($i));
# replace selected element with last unpicked one
nqp::bindpos($rpa, nqp::unbox_i($i),
nqp::atpos($rpa, nqp::unbox_i($elems)));
take-rw $v;
}
}
method pop() is parcel {
my $elems = self.gimme(*);
fail 'Cannot .pop from an infinite list' if $!nextiter.defined;
$elems > 0
?? nqp::pop($!items)
!! fail 'Element popped from empty list';
}
method shift() is parcel {
# make sure we have at least one item, then shift+return it
nqp::islist($!items) && nqp::existspos($!items, 0) || self.gimme(1)
?? nqp::shift($!items)
!! fail 'Element shifted from empty list';
}
my &list_push = multi method push(List:D: *@values) {
fail 'Cannot .push an infinite list' if @values.infinite;
nqp::p6listitems(self);
my $elems = self.gimme(*);
fail 'Cannot .push to an infinite list' if $!nextiter.DEFINITE;
# push is always eager
@values.gimme(*);
# need type checks?
my $of := self.of;
unless $of =:= Mu {
X::TypeCheck.new(
operation => '.push',
expected => $of,
got => $_,
).throw unless nqp::istype($_, $of) for @values;
}
nqp::splice($!items,
nqp::getattr(@values, List, '$!items'),
$elems, 0);
self;
}
multi method push(List:D: \value) {
if nqp::iscont(value) || nqp::not_i(nqp::istype(value, Iterable)) && nqp::not_i(nqp::istype(value, Parcel)) {
$!nextiter.DEFINITE && self.gimme(*);
fail 'Cannot .push to an infinite list' if $!nextiter.DEFINITE;
nqp::p6listitems(self);
nqp::istype(value, self.of)
?? nqp::push($!items, nqp::assign(nqp::p6scalarfromdesc(nqp::null), value))
!! X::TypeCheck.new(
operation => '.push',
expected => self.of,
got => value,
).throw;
self
}
else {
list_push(self, value)
}
}
multi method unshift(List:D: \value) {
if nqp::iscont(value) || !(nqp::istype(value, Iterable) || nqp::istype(value, Parcel)) {
nqp::p6listitems(self);
value.gimme(*) if nqp::istype(value, List); # fixes #121994
nqp::istype(value, self.of)
?? nqp::unshift($!items, my $ = value)
!! X::TypeCheck.new(
operation => '.unshift',
expected => self.of,
got => value,
).throw;
self
}
else {
callsame();
}
}
multi method unshift(List:D: *@values) {
fail 'Cannot .unshift an infinite list' if @values.infinite;
nqp::p6listitems(self);
# don't bother with type checks
my $of := self.of;
if ( $of =:= Mu ) {
nqp::unshift($!items, @values.pop) while @values;
}
# we must check types
else {
while @values {
my $value := @values.pop;
if nqp::istype($value, $of) {
nqp::unshift($!items, $value);
}
# huh?
else {
X::TypeCheck.new(
operation => '.unshift',
expected => $of,
got => $value,
).throw;
}
}
}
self
}
method plan(List:D: |args) {
nqp::p6listitems(self);
my $elems = self.gimme(*);
fail 'Cannot add plan to an infinite list' if $!nextiter.defined;
# # need type checks?
# my $of := self.of;
#
# unless $of =:= Mu {
# X::TypeCheck.new(
# operation => '.push',
# expected => $of,
# got => $_,
# ).throw unless nqp::istype($_, $of) for @values;
# }
nqp::bindattr(self, List, '$!nextiter', nqp::p6listiter(nqp::list(args.list), self));
Nil;
}
proto method roll(|) { * }
multi method roll() {
fail "Cannot .roll from infinite list" if self.infinite;
my $elems = self.elems;
$elems ?? self.at_pos($elems.rand.floor) !! Nil;
}
multi method roll($n is copy) {
fail "Cannot .roll from infinite list" if self.infinite;
my $elems = self.elems;
return unless $elems;
$n = Inf if nqp::istype($n, Whatever);
return self.at_pos($elems.rand.floor) if $n == 1;
gather while $n > 0 {
take nqp::atpos($!items, nqp::unbox_i($elems.rand.floor.Int));
$n--;
}
}
method reverse() {
self.gimme(*);
fail 'Cannot .reverse from an infinite list' if $!nextiter.defined;
my Mu $rev := nqp::list();
my Mu $orig := nqp::clone($!items);
nqp::push($rev, nqp::pop($orig)) while $orig;
my $rlist := nqp::create(self.WHAT);
nqp::bindattr($rlist, List, '$!items', $rev);
$rlist;
}
method rotate(Int $n is copy = 1) {
self.gimme(*);
fail 'Cannot .rotate an infinite list' if $!nextiter.defined;
my $items = nqp::p6box_i(nqp::elems($!items));
return self if !$items;
$n %= $items;
return self if $n == 0;
my Mu $res := nqp::clone($!items);
if $n > 0 {
nqp::push($res, nqp::shift($res)) while $n--;
}
elsif $n < 0 {
nqp::unshift($res, nqp::pop($res)) while $n++;
}
my $rlist := nqp::create(self.WHAT);
nqp::bindattr($rlist, List, '$!items', $res);
$rlist;
}
method splice($offset = 0, $size?, *@values) {
self.gimme(*);
my $o = $offset;
my $s = $size;
my $elems = self.elems;
$o = $o($elems) if nqp::istype($o, Callable);
X::OutOfRange.new(
what => 'offset argument to List.splice',
got => $offset,
range => (0..^self.elems),
).fail if $o < 0;
$s //= self.elems - ($o min $elems);
$s = $s(self.elems - $o) if nqp::istype($s, Callable);
X::OutOfRange.new(
what => 'size argument to List.splice',
got => $size,
range => (0..^(self.elems - $o)),
).fail if $s < 0;
my @ret = self[$o..($o + $s - 1)];
nqp::splice($!items,
nqp::getattr(@values.eager, List, '$!items'),
$o.Int, $s.Int);
@ret;
}
method sort($by = &infix:<cmp>) {
fail 'Cannot .sort an infinite list' if self.infinite; #MMD?
# Instead of sorting elements directly, we sort a Parcel of
# indices from 0..^$list.elems, then use that Parcel as
# a slice into self. This is for historical reasons: on
# Parrot we delegate to RPA.sort. The JVM implementation
# uses a Java collection sort. MoarVM has its sort algorithm
# implemented in NQP.
# nothing to do here
my $elems := self.elems;
return self if $elems < 2;
# Range is currently optimized for fast Parcel construction.
my $index := Range.new(0, $elems, :excludes-max).reify(*);
my Mu $index_rpa := nqp::getattr($index, Parcel, '$!storage');
# if $by.arity < 2, then we apply the block to the elements
# for sorting.
if ($by.?count // 2) < 2 {
my $list = self.map($by).eager;
nqp::p6sort($index_rpa, -> $a, $b { $list.at_pos($a) cmp $list.at_pos($b) || $a <=> $b });
}
else {
my $list = self.eager;
nqp::p6sort($index_rpa, -> $a, $b { $by($list.at_pos($a), $list.at_pos($b)) || $a <=> $b });
}
self[$index];
}
multi method ACCEPTS(List:D: $topic) { self }
method uniq(|c) {
DEPRECATED('unique', |<2014.11 2015.11>);
self.unique(|c);
}
proto method unique(|) {*}
multi method unique() {
my $seen := nqp::hash();
my str $target;
gather for @.list {
$target = nqp::unbox_s($_.WHICH);
unless nqp::existskey($seen, $target) {
nqp::bindkey($seen, $target, 1);
take $_;
}
}
}
multi method unique( :&as!, :&with! ) {
my @seen;  # should be Mu, but doesn't work in settings :-(
my Mu $target;
gather for @.list {
$target = &as($_);
if first( { with($target,$_) }, @seen ) =:= Nil {
@seen.push($target);
take $_;
}
};
}
multi method unique( :&as! ) {
my $seen := nqp::hash();
my str $target;
gather for @.list {
$target = &as($_).WHICH;
unless nqp::existskey($seen, $target) {
nqp::bindkey($seen, $target, 1);
take $_;
}
}
}
multi method unique( :&with! ) {
nextwith() if &with === &[===]; # use optimized version
my @seen; # should be Mu, but doesn't work in settings :-(
my Mu $target;
gather for @.list {
$target := $_;
if first( { with($target,$_) }, @seen ) =:= Nil {
@seen.push($target);
take $_;
}
}
}
my @secret;
proto method squish(|) {*}
multi method squish( :&as!, :&with = &[===] ) {
my $last = @secret;
my str $which;
gather for @.list {
$which = &as($_).Str;
unless with($which,$last) {
$last = $which;
take $_;
}
}
}
multi method squish( :&with = &[===] ) {
my $last = @secret;
gather for @.list {
unless with($_,$last) {
$last = $_;
take $_;
}
}
}
proto method rotor(|) {*}
multi method rotor(1, 0) { self }
multi method rotor($elems = 2, $overlap = 1) {
X::OutOfRange.new(
what => 'Overlap argument to List.rotor',
got => $overlap,
range => (0 .. $elems - 1),
).fail unless 0 <= $overlap < $elems;
X::OutOfRange.new(
what => 'Elements argument to List.rotor',
got => $elems,
range => (0 .. *),
).fail unless 0 <= $elems;
my $finished = 0;
gather while $finished + $overlap < self.gimme($finished + $elems) {
take item self[$finished ..^ $finished + $elems];
$finished += $elems - $overlap
}
}
multi method gist(List:D:) {
@(self).map( -> $elem {
given ++$ {
when 101 { '...' }
when 102 { last }
default { $elem.gist }
}
} ).join: ' ';
}
multi method perl(List:D \SELF:) {
self.gimme(*);
self.Parcel.perl ~ '.list'
~ (nqp::iscont(SELF) ?? '.item' !! '')
}
method REIFY(Parcel \parcel, Mu \nextiter) {
nqp::splice($!items, nqp::getattr(parcel, Parcel, '$!storage'),
nqp::elems($!items), 0);
nqp::bindattr(self, List, '$!nextiter', nextiter);
parcel
}
method FLATTENABLE_LIST() { self.gimme(*); $!items }
method FLATTENABLE_HASH() { nqp::hash() }
multi method DUMP(List:D: :$indent-step = 4, :%ctx?) {
return DUMP(self, :$indent-step) unless %ctx;
my $flags := ("\x221e" if self.infinite);
my Mu $attrs := nqp::list();
nqp::push($attrs, '$!flattens');
nqp::push($attrs, $!flattens );
nqp::push($attrs, '$!items' );
nqp::push($attrs, $!items );
nqp::push($attrs, '$!nextiter');
nqp::push($attrs, $!nextiter );
self.DUMP-OBJECT-ATTRS($attrs, :$indent-step, :%ctx, :$flags);
}
multi method keys(List:D:) {
self.values.map: { (state $)++ }
}
multi method kv(List:D:) {
gather for self.values {
take (state $)++;
take-rw $_;
}
}
multi method values(List:D:) {
my Mu $rpa := nqp::clone(nqp::p6listitems(self));
nqp::push($rpa, $!nextiter) if $!nextiter.defined;
nqp::p6list($rpa, List, self.flattens);
}
multi method pairs(List:D:) {
self.values.map: {; (state $)++ => $_ }
}
method reduce(List: &with) {
fail('can only reduce with arity 2')
unless &with.arity <= 2 <= &with.count;
return unless self.DEFINITE;
my \vals = self.values;
my Mu $val = vals.shift;
$val = with($val, $_) for vals;
$val;
}
method sink() {
self.gimme(*, :sink) if self.DEFINITE && $!nextiter.DEFINITE;
Nil;
}
# this is a remnant of a previous implementation of .push(), which
# apparently is used by LoL. Please remove when no longer necessary.
method STORE_AT_POS(Int \pos, Mu \v) is rw {
nqp::bindpos($!items, nqp::unbox_i(pos), v)
}
proto method combinations($?) {*}
multi method combinations( Int $of ) {
([self[@$_]] for combinations(self.elems, $of).eager)
}
multi method combinations( Range $of = 0 .. * ) {
gather for @$of {
last if $_ > self.elems;
take self.combinations($_);
}
}
method permutations() {
# need block on Moar because of RT#121830
gather { take [self[@$_]] for permutations(self.elems).eager }
}
}
sub eager(|) {
nqp::p6parcel(nqp::p6argvmarray(), Any).eager
}
sub flat(|) {
nqp::p6list(nqp::p6argvmarray(), List, Bool::True)
}
sub list(|) {
nqp::p6list(nqp::p6argvmarray(), List, Mu)
}
proto sub infix:<xx>(|) { * }
multi sub infix:<xx>() { fail "No zero-arg meaning for infix:<xx>" }
multi sub infix:<xx>(Mu \x) { x }
multi sub infix:<xx>(Mu \x, $n is copy, :$thunked!) {
$n = nqp::p6bool(nqp::istype($n, Whatever)) ?? Inf !! $n.Int;
GatherIter.new({ take x.() while --$n >= 0; }, :infinite($n == Inf)).list
}
multi sub infix:<xx>(Mu \x, Whatever, :$thunked!) {
GatherIter.new({ loop { take x.() } }, :infinite(True)).flat
}
multi sub infix:<xx>(Mu \x, Whatever) {
GatherIter.new({ loop { take x } }, :infinite(True)).flat
}
multi sub infix:<xx>(Mu \x, $n) {
my int $size = $n.Int;
my Mu $rpa := nqp::list();
if $size > 0 {
nqp::setelems($rpa, $size);
nqp::setelems($rpa, 0);
$size = $size + 1;
nqp::push($rpa,x) while $size = $size - 1;
}
nqp::p6parcel($rpa, Any);
}
proto sub pop(@) {*}
multi sub pop(@a) { @a.pop }
proto sub shift(@) {*}
multi sub shift(@a) { @a.shift }
proto sub unshift(|) {*}
multi sub unshift(\a, \elem) { a.unshift: elem }
multi sub unshift(\a, *@elems) { a.unshift: @elems }
proto sub push(|) {*}
multi sub push(\a, \elem) { a.push: elem }
multi sub push(\a, *@elems) { a.push: @elems }
sub reverse(*@a) { @a.reverse }
sub rotate(@a, Int $n = 1) { @a.rotate($n) }
sub reduce (&with, *@list) { @list.reduce(&with) }
sub splice(@arr, $offset = 0, $size?, *@values) {
@arr.splice($offset, $size, @values)
}
multi sub infix:<cmp>(@a, @b) { (@a Zcmp @b).first(&prefix:<?>) || @a <=> @b }
# vim: ft=perl6 expandtab sw=4


@@ -0,0 +1 @@
hiera_include('classes')


@@ -0,0 +1,9 @@
tap 'caskroom/cask'
tap 'telemachus/brew', 'https://telemachus@bitbucket.org/telemachus/brew.git'
brew 'emacs', args: ['with-cocoa', 'with-gnutls']
brew 'redis', restart_service: true
brew 'mongodb'
brew 'sphinx'
brew 'imagemagick'
brew 'mysql'
cask 'google-chrome'

2324
samples/Rust/hashmap.rs Normal file

File diff suppressed because it is too large

12
samples/Rust/main.rs Normal file

@@ -0,0 +1,12 @@
extern crate foo;
extern crate bar;
use foo::{self, quix};
use bar::car::*;
use bar;
fn main() {
println!("Hello {}", "World");
panic!("Goodbye")
}

18
samples/Text/01_top.ncl Normal file

@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="ISO-8859-1"?>
<ncl id="topProperty" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
<head>
<regionBase>
<region height="50%" id="imageReg" top="50%"/>
</regionBase>
<descriptorBase>
<descriptor id="imageDescriptor" region="imageReg"/>
</descriptorBase>
</head>
<body>
<port component="image" id="entry"/>
<media descriptor="imageDescriptor" id="image" src="../resources/images/background.jpg"/>
</body>
</ncl>


@@ -0,0 +1,34 @@
G04 DipTrace 2.4.0.2*
%INLIDARLite.ncl*%
%MOIN*%
%ADD11C,0.0394*%
%FSLAX44Y44*%
G04*
G70*
G90*
G75*
G01*
%LNBoardOutline*%
%LPD*%
X0Y23622D2*
D11*
X27953D1*
Y0D1*
X0D1*
Y23622D1*
X591Y23110D2*
X13819D1*
X591Y591D2*
Y11614D1*
Y12087D2*
Y23110D1*
X14291D2*
X27520D1*
X591Y591D2*
X13819D1*
X14291D2*
X27520D1*
Y11614D1*
Y12087D2*
Y23110D1*
M02*


@@ -0,0 +1,22 @@
#define YmakeRoot $(DESTDIR)@prefix@
#define ManRoot $(DESTDIR)@mandir@
#define LibRoot $(DESTDIR)@libdir@/ncarg
#define SharePath $(DESTDIR)@datadir@
#define BuildWithF90 TRUE
#define IncSearch -I/usr/include/netcdf -I/usr/include/udunits2 -I/usr/include/freetype2 -I/usr/include/gdal
#define LibSearch -L@libdir@/hdf
#define BuildNetCDF4 1
#define NetCDF4lib -lnetcdf
#define BuildCAIRO 1
#define CAIROlib -lcairo -lfreetype
#define BuildGDAL 1
#define GDALlib -lgdal
#define BuildHDFEOS 0
#define BuildHDFEOS5 0
#define BuildTRIANGLE 0
#define HDFlib -lmfhdf -ldf -ljpeg -lz
#define HDF5lib -lhdf5_hl -lhdf5
#define BuildUdunits 1
#define UdUnitslib -ludunits2

46
samples/Text/main.ncl Normal file

@@ -0,0 +1,46 @@
<?xml version="1.0" encoding="ISO-8859-1"?>
<!--
2008 PUC-RIO/LABORATORIO TELEMIDIA,
Some Rights Reserved.
This program is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License version 2 as published by
the Free Software Foundation.
This program is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE. See the GNU General Public License version 2 for more
details.
You should have received a copy of the GNU General Public License version 2
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA
-->
<ncl id="teste" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.ncl.org.br/NCL3.0/EDTVProfile
http://www.ncl.org.br/NCL3.0/profiles/NCL30EDTV.xsd">
<head>
<regionBase>
<region id="luaRegion" width="100%" height="100%"/>
</regionBase>
<descriptorBase>
<descriptor id="luaDesc" region="luaRegion" focusIndex="luaIdx"/>
</descriptorBase>
</head>
<body>
<port id="init" component="lua"/>
<media type="application/x-ginga-settings" id="programSettings">
<property name="currentKeyMaster" value="luaIdx"/>
</media>
<media id="lua" descriptor="luaDesc" src="game.lua"/>
</body>
</ncl>

45
samples/Text/min-help.ncl Normal file

@@ -0,0 +1,45 @@
THE_URL:file://localhost/Users/hubery/Public/ucar/Document/Functions/Built-in/min.shtml
THE_TITLE:min
NCL Home > Documentation > Functions > General applied math
min
Computes the minimum value of a multi-dimensional array.
Prototype
function min (
value : numeric
)
return_val [1] : numeric
Arguments
value
An array of one or more numeric values of any dimension.
Return value
Returns a scalar of the same type as value.
Description
This function returns the minimum value for an array of any dimensionality. Missing values are ignored; a missing value
is returned only if all values are missing.
See Also
max, minind, maxind, dim_min, dim_max, dim_min_n, dim_max_n
Examples
Example 1
f = (/2.1, 3.2, 4.3, 5.4, 6.5, 7.6, 8.7, 9.8/)
min_f = min(f)
print(min_f) ; Should be 2.1
©2015 UCAR | Privacy Policy | Terms of Use | Contact the Webmaster | Sponsored by NSF
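The missing-value rule described above (missing values are ignored, and a missing value comes back only when every element is missing) can be sketched in Python with NumPy, using NaN in place of NCL's _FillValue. The `ncl_min` helper below is a hypothetical stand-in for illustration, not part of NCL or NumPy:

```python
import numpy as np

def ncl_min(value):
    """Minimum of an array of any dimensionality.

    Missing values (NaN here) are ignored; a missing value is
    returned only if every element is missing, mirroring NCL's min.
    """
    arr = np.asarray(value, dtype=float)
    if np.all(np.isnan(arr)):
        return np.nan  # all values missing -> result is missing
    return np.nanmin(arr)

f = np.array([2.1, 3.2, 4.3, 5.4, 6.5, 7.6, 8.7, 9.8])
print(ncl_min(f))  # 2.1, matching the NCL example above
```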

21
samples/Text/receiver.ncl Normal file

@@ -0,0 +1,21 @@
<?xml version="1.0" encoding="ISO-8859-1"?>
<ncl id="sender" xmlns="http://www.ncl.org.br/NCL3.0/EDTVProfile">
<head>
<regionBase>
<region id="rTV" width="100%" height="100%" zIndex="1"/>
</regionBase>
<descriptorBase>
<descriptor id="dTV" region="rTV" />
</descriptorBase>
</head>
<body>
<port id="pLua" component="lua" />
<media id="lua" descriptor="dTV" src="receiver.lua" />
</body>
</ncl>


@@ -0,0 +1,40 @@
THE_URL:file://localhost/Users/hubery/Public/ucar/Document/Functions/Contributed/rmMonAnnCycLLT.shtml
THE_TITLE:rmMonAnnCycLLT
NCL Home > Documentation > Functions > Climatology
rmMonAnnCycLLT
Removes the annual cycle from "monthly" data.
Prototype
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
function rmMonAnnCycLLT (
x [*][*][*] : numeric
)
return_val [dimsizes(x)] : typeof(x)
Arguments
x
A three-dimensional array of monthly values, dimensioned lat x lon x time. The time dimension must be a multiple of 12.
Return value
The results are returned in an array of the same type and dimensionality as x. If the input data contains metadata, these
will be retained.
Description
This function removes the annual cycle from monthly data (12 months per year) by subtracting the long-term mean of each
month.
See Also
rmMonAnnCycLLT, rmMonAnnCycTLL, rmMonAnnCycLLLT
©2015 UCAR | Privacy Policy | Terms of Use | Contact the Webmaster | Sponsored by NSF
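The operation this function performs — subtracting each calendar month's long-term mean from every occurrence of that month — can be sketched in Python with NumPy. This is an illustrative stand-in only; it assumes time is the rightmost dimension (the "T" in the lat x lon x time layout above) and that it is a multiple of 12:

```python
import numpy as np

def rm_mon_ann_cyc(x):
    """Remove the annual cycle from monthly data.

    x: array dimensioned (..., time), time a multiple of 12.
    Each calendar month's long-term mean is subtracted from
    every value belonging to that month.
    """
    ntim = x.shape[-1]
    assert ntim % 12 == 0, "time dimension must be a multiple of 12"
    nyrs = ntim // 12
    # long-term mean for each of the 12 calendar months
    clim = x.reshape(x.shape[:-1] + (nyrs, 12)).mean(axis=-2)
    return x - np.tile(clim, nyrs)

# 2 identical years of a pure annual cycle: anomalies are all zero
x = np.tile(np.arange(12.0), 2)
print(rm_mon_ann_cyc(x))
```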


@@ -0,0 +1,35 @@
THE_URL:file://localhost/Users/hubery/Public/ucar/Document/Functions/Contributed/zonalAve.shtml
THE_TITLE:zonalAve
NCL Home > Documentation > Functions > General applied math
zonalAve
Computes a zonal average of the input array.
Prototype
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
function zonalAve (
x : numeric
)
return_val : typeof(x)
Arguments
x
An array of any size and type.
Return value
The results are returned in an array of the same type and one dimension smaller than x. Metadata are preserved.
Description
This function computes a zonal average of the input array x. If the input array has a "long_name" or "short_name"
attribute, it will be updated.
©2015 UCAR | Privacy Policy | Terms of Use | Contact the Webmaster | Sponsored by NSF
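A zonal average is simply a mean over the longitude dimension, which is why the result is one dimension smaller than the input. A minimal NumPy sketch, assuming longitude is the rightmost dimension (the usual NCL convention; metadata handling is omitted):

```python
import numpy as np

def zonal_ave(x):
    """Average over the rightmost (longitude) dimension.

    Returns an array one dimension smaller than x.
    """
    return np.asarray(x).mean(axis=-1)

# a 2 x 3 (lat x lon) field collapses to 2 zonal means
field = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])
print(zonal_ave(field))  # [2. 5.]
```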

21
samples/Vue/basic.vue Normal file

@@ -0,0 +1,21 @@
<style>
.red {
color: #f00;
}
</style>
<template>
<div>
<h2 v-class="red">{{msg}}</h2>
</div>
</template>
<script>
module.exports = {
data: function () {
return {
msg: 'Hello from Vue!'
}
}
}
</script>


@@ -0,0 +1,31 @@
<style lang="stylus">
font-stack = Helvetica, sans-serif
primary-color = #999
body
font 100% font-stack
color primary-color
</style>
<template lang="jade">
div
h1 {{msg}}
comp-a
comp-b
</template>
<script lang="babel">
import compA from './components/a.vue'
import compB from './components/b.vue'
export default {
data () {
return {
msg: 'Hello from Babel!'
}
},
components: {
'comp-a': compA,
'comp-b': compB
}
}
</script>

72
samples/X10/ArraySum.x10 Normal file

@@ -0,0 +1,72 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
/**
* A simple illustration of loop parallelization within a single place.
*/
public class ArraySum {
var sum:Long;
val data:Rail[Long];
public def this(n:Long) {
// Create a Rail with n elements (0..(n-1)), all initialized to 1.
data = new Rail[Long](n, 1);
sum = 0;
}
def sum(a:Rail[Long], start:Long, last:Long) {
var mySum: Long = 0;
for (i in start..(last-1)) {
mySum += a(i);
}
return mySum;
}
def sum(numThreads:Long) {
val mySize = data.size/numThreads;
finish for (p in 0..(numThreads-1)) async {
val mySum = sum(data, p*mySize, (p+1)*mySize);
// Multiple activities will simultaneously update
// this location -- so use an atomic operation.
atomic sum += mySum;
}
}
public static def main(args:Rail[String]) {
var size:Long = 5*1000*1000;
if (args.size >=1)
size = Long.parse(args(0));
Console.OUT.println("Initializing.");
val a = new ArraySum(size);
val P = [1,2,4];
//warmup loop
Console.OUT.println("Warming up.");
for (numThreads in P)
a.sum(numThreads);
for (numThreads in P) {
Console.OUT.println("Starting with " + numThreads + " threads.");
a.sum=0;
var time: long = - System.nanoTime();
a.sum(numThreads);
time += System.nanoTime();
Console.OUT.println("For p=" + numThreads
+ " result: " + a.sum
+ ((size==a.sum)? " ok" : " bad")
+ " (time=" + (time/(1000*1000)) + " ms)");
}
}
}


@@ -0,0 +1,50 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.xrx.Runtime;
/**
* Demonstrate how to instantiate the X10 runtime as an executor service
* submit jobs to the runtime, wait jobs to complete and cancel all jobs
*
* Compile with: x10c -O -EXECUTOR_MODE=true Cancellation.x10
* Run with: X10_CANCELLABLE=true X10_NPLACES=4 x10 -DX10RT_IMPL=JavaSockets Cancellation
*/
class Cancellation {
static def job(id:Long, iterations:Long) = ()=>{
at (Place.places().next(here)) async {
for (i in 1..iterations) {
finish for (p in Place.places()) {
at (p) async Console.OUT.println(here+" says hello (job " + id + ", iteration " + i + ")");
}
Console.ERR.println();
System.sleep(200);
}
}
};
public static def main(args:Rail[String]):void {
val w1 = Runtime.submit(job(1, 5));
w1.await(); Console.ERR.println("Job 1 completed\n");
val w2 = Runtime.submit(job(2, 1000));
System.threadSleep(1000);
val c1 = Runtime.cancelAll();
try { w2.await(); } catch (e:Exception) { Console.ERR.println("Job 2 aborted with exception " + e +"\n"); }
c1.await(); // waiting for cancellation to be processed
System.threadSleep(1000);
Runtime.submit(job(3, 1000));
Runtime.submit(job(4, 1000));
System.threadSleep(1000);
val c2 = Runtime.cancelAll();
c2.await();
Console.ERR.println("Goodbye\n");
}
}
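For comparison, the submit/await/cancel lifecycle above is close to what Java's ExecutorService offers on a single JVM. A minimal editor-added Java sketch (class name and job bodies are invented for illustration; this does not model X10's multi-place cancellation):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CancellationSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Submit a short job and block until it finishes, like w1.await().
        Future<?> w1 = pool.submit(() -> System.out.println("job 1 done"));
        w1.get();

        // Submit a long-running job, then cancel it, in the spirit of
        // Runtime.cancelAll() (but per-future and single-JVM only).
        Future<?> w2 = pool.submit(() -> {
            try { Thread.sleep(60_000); } catch (InterruptedException e) { /* cancelled */ }
        });
        boolean cancelled = w2.cancel(true);  // interrupts the sleeping task
        System.out.println("job 2 cancelled: " + cancelled);
        pool.shutdown();
    }
}
```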

samples/X10/Fibonacci.x10

@@ -0,0 +1,52 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
/**
* This is a small program to illustrate the use of
* <code>async</code> and <code>finish</code> in a
* prototypical recursive divide-and-conquer algorithm.
* It is obviously not intended to show an efficient way to
* compute Fibonacci numbers in X10.<p>
*
* The heart of the example is the <code>run</code> method,
* which directly embodies the recursive definition of
* <pre>
* fib(n) = fib(n-1)+fib(n-2);
* </pre>
* by using an <code>async</code> to compute <code>fib(n-1)</code> while
* the current activity computes <code>fib(n-2)</code>. A <code>finish</code>
* is used to ensure that both computations are complete before
* their results are added together to compute <code>fib(n)</code>
*/
public class Fibonacci {
public static def fib(n:long) {
if (n<=2) return 1;
val f1:long;
val f2:long;
finish {
async { f1 = fib(n-1); }
f2 = fib(n-2);
}
return f1 + f2;
}
public static def main(args:Rail[String]) {
val n = (args.size > 0) ? Long.parse(args(0)) : 10;
Console.OUT.println("Computing fib("+n+")");
val f = fib(n);
Console.OUT.println("fib("+n+") = "+f);
}
}
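For readers coming from the JVM, the finish/async pair above corresponds closely to Java's fork/join framework: fork() plays the role of async, and join() stands in for the enclosing finish. A minimal editor-added sketch (the class name FibTask is invented for illustration):

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class FibTask extends RecursiveTask<Long> {
    final long n;
    FibTask(long n) { this.n = n; }

    @Override protected Long compute() {
        if (n <= 2) return 1L;
        FibTask f1 = new FibTask(n - 1);
        f1.fork();                              // like: async { f1 = fib(n-1); }
        long f2 = new FibTask(n - 2).compute(); // current thread computes fib(n-2)
        return f1.join() + f2;                  // join, like the enclosing finish
    }

    public static void main(String[] args) {
        long n = args.length > 0 ? Long.parseLong(args[0]) : 10;
        long f = new ForkJoinPool().invoke(new FibTask(n));
        System.out.println("fib(" + n + ") = " + f);
    }
}
```

As in the X10 sample, this illustrates the pattern; it is not an efficient way to compute Fibonacci numbers.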


@@ -0,0 +1,86 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.array.*;
import x10.compiler.Foreach;
import x10.compiler.Inline;
/**
* This is a sample program illustrating how to use
* X10's array classes. It also illustrates the use
* of foreach to achieve intra-place parallelism.
*
* The program solves a set of 2D partial differential
* equations by iteratively applying a 5-point stencil
* operation until convergence is reached.
*/
public class HeatTransfer_v0 {
static val EPSILON = 1.0e-5;
val N:Long;
val A:Array_2[Double]{self!=null};
val Tmp:Array_2[Double]{self!=null};
public def this(size:Long) {
N = size;
A = new Array_2[Double](N+2, N+2); // zero-initialized N+2 * N+2 array of doubles
for (j in 1..N) A(0, j) = 1; // set one border row to 1
Tmp = new Array_2[Double](A);
}
final @Inline def stencil(x:Long, y:Long):Double {
return (A(x-1,y) + A(x+1,y) + A(x,y-1) + A(x,y+1)) / 4;
}
def run() {
val is = new DenseIterationSpace_2(1,1,N,N);
var delta:Double;
do {
// Compute new values, storing in tmp
delta = Foreach.blockReduce(is,
(i:Long, j:Long)=>{
Tmp(i,j) = stencil(i,j);
// Reduce max element-wise delta (A now holds previous values)
return Math.abs(Tmp(i,j) - A(i,j));
},
(a:Double, b:Double)=>Math.max(a,b), 0.0
);
// swap backing data of A and Tmp
Array.swap(A, Tmp);
} while (delta > EPSILON);
}
def prettyPrintResult() {
for (i in 1..N) {
for (j in 1..N) {
Console.OUT.printf("%1.4f ",A(i,j));
}
Console.OUT.println();
}
}
public static def main(args:Rail[String]) {
val n = args.size > 0 ? Long.parse(args(0)) : 8;
Console.OUT.println("HeatTransfer example with N="+n+" and epsilon="+EPSILON);
Console.OUT.println("Initializing data structures");
val ht = new HeatTransfer_v0(n);
Console.OUT.println("Beginning computation...");
val start = System.nanoTime();
ht.run();
val stop = System.nanoTime();
Console.OUT.printf("...completed in %1.3f seconds.\n", ((stop-start) as double)/1e9);
if (n <= 10) {
ht.prettyPrintResult();
}
}
}
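The stencil plus blockReduce step boils down to one grid sweep. A small editor-added Java sketch of a single Jacobi iteration (the grid size and hot border row mirror the constructor above; the parallel Foreach and the A/Tmp swap are omitted):

```java
public class StencilSketch {
    // One Jacobi sweep over the interior of an (n+2) x (n+2) grid:
    // writes the 5-point average into tmp and returns the max element-wise delta.
    static double sweep(double[][] a, double[][] tmp, int n) {
        double delta = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                tmp[i][j] = (a[i-1][j] + a[i+1][j] + a[i][j-1] + a[i][j+1]) / 4;
                delta = Math.max(delta, Math.abs(tmp[i][j] - a[i][j]));
            }
        }
        return delta;
    }

    public static void main(String[] args) {
        int n = 4;
        double[][] a = new double[n+2][n+2];
        double[][] tmp = new double[n+2][n+2];
        for (int j = 1; j <= n; j++) a[0][j] = 1;  // hot border row, as in the constructor
        System.out.println("max delta after one sweep: " + sweep(a, tmp, n));
    }
}
```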


@@ -0,0 +1,114 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.array.*;
import x10.compiler.Foreach;
import x10.util.Team;
/**
* This is a sample program illustrating how to use
* X10's distributed array classes. It also illustrates the use
* of foreach to achieve intra-place parallelism and the mixture
* of APGAS finish/async/at with Team collective operations.
*
* This version of the program uses a vanilla DistArray without
* ghost regions. As a result, the stencil function does
* inefficient fine-grained neighbor communication to get individual values.
* Compare this to HeatTransfer_v2 which utilizes ghost regions and
* bulk ghost-region exchange functions.
*
* The program solves a set of 2D partial differential
* equations by iteratively applying a 5-point stencil
* operation until convergence is reached.
*/
public class HeatTransfer_v1 {
static val EPSILON = 1.0e-5;
val N:Long;
val A:DistArray_BlockBlock_2[Double]{self!=null};
val Tmp:DistArray_BlockBlock_2[Double]{self!=null};
public def this(size:Long) {
N = size;
val init = (i:Long, j:Long)=>i==0 ? 1.0 : 0.0;
A = new DistArray_BlockBlock_2[Double](N+2, N+2, init);
Tmp = new DistArray_BlockBlock_2[Double](N+2, N+2, init);
}
final def stencil(x:Long, y:Long):Double {
val cls = (dx:Long, dy:Long)=>{
val p = A.place(x+dx, y+dy);
p == here ? A(x+dx,y+dy) : at (p) A(x+dx,y+dy)
};
val tmp = cls(-1,0) + cls(1,0) + cls(0,-1) + cls(0,1);
return tmp / 4;
}
def run() {
val myTeam = new Team(A.placeGroup());
finish for (p in A.placeGroup()) at (p) async {
// Compute the subset of the local indices on which
// we want to apply the stencil (the interior points of the N+2 x N+2 grid)
val li = A.localIndices();
val interior = new DenseIterationSpace_2(li.min(0) == 0 ? 1 : li.min(0),
li.min(1) == 0 ? 1 : li.min(1),
li.max(0) == N+1 ? N : li.max(0),
li.max(1) == N+1 ? N : li.max(1));
var delta:Double;
do {
// Compute new values, storing in tmp
val myDelta = Foreach.blockReduce(interior,
(i:Long, j:Long)=>{
Tmp(i,j) = stencil(i,j);
// Reduce max element-wise delta (A now holds previous values)
return Math.abs(Tmp(i,j) - A(i,j));
},
(a:Double, b:Double)=>Math.max(a,b), 0.0
);
myTeam.barrier();
// Unlike Array, DistArray doesn't provide an optimized swap.
// So, until it does, we have to copy the data elements.
Foreach.block(interior, (i:Long, j:Long)=>{
A(i,j) = Tmp(i,j);
});
delta = myTeam.allreduce(myDelta, Team.MAX);
} while (delta > EPSILON);
}
}
def prettyPrintResult() {
for (i in 1..N) {
for (j in 1..N) {
val x = at (A.place(i,j)) A(i,j);
Console.OUT.printf("%1.4f ", x);
}
Console.OUT.println();
}
}
public static def main(args:Rail[String]) {
val n = args.size > 0 ? Long.parse(args(0)) : 8;
Console.OUT.println("HeatTransfer example with N="+n+" and epsilon="+EPSILON);
Console.OUT.println("Initializing data structures");
val ht = new HeatTransfer_v1(n);
Console.OUT.println("Beginning computation...");
val start = System.nanoTime();
ht.run();
val stop = System.nanoTime();
Console.OUT.printf("...completed in %1.3f seconds.\n", ((stop-start) as double)/1e9);
if (n <= 10) {
ht.prettyPrintResult();
}
}
}


@@ -0,0 +1,44 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
/**
* The classic hello world program, with a twist - prints a message
* from the command line at every Place.
* The messages from each Place may appear in any order, but the
* finish ensures that the last message printed will be "Goodbye"
* <pre>
* Typical output:
* [dgrove@linchen samples]$ ./HelloWholeWorld 'best wishes'
* Place(1) says hello and best wishes
* Place(2) says hello and best wishes
* Place(3) says hello and best wishes
* Place(0) says hello and best wishes
* Goodbye
* [dgrove@linchen samples]$
* </pre>
*/
class HelloWholeWorld {
public static def main(args:Rail[String]):void {
if (args.size < 1) {
Console.OUT.println("Usage: HelloWholeWorld message");
return;
}
finish for (p in Place.places()) {
at (p) async Console.OUT.println(here+" says hello and "+args(0));
}
Console.OUT.println("Goodbye");
}
}


@@ -0,0 +1,23 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
/**
* The classic hello world program, shows how to output to the console.
*/
class HelloWorld {
public static def main(Rail[String]) {
Console.OUT.println("Hello World!" );
}
}

samples/X10/Histogram.x10

@@ -0,0 +1,45 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
public class Histogram {
public static def compute(data:Rail[Int], numBins:Int) {
val bins = new Rail[Int](numBins);
finish for (i in data.range) async {
val b = data(i) % numBins;
atomic bins(b)++;
}
return bins;
}
public static def run(N:Int, S:Int):Boolean {
val a = new Rail[Int](N, (i:long)=> i as int);
val b = compute(a, S);
val v = b(0);
var ok:Boolean = true;
for (x in b.range) ok &= (b(x)==v);
return ok;
}
public static def main(args:Rail[String]) {
if (args.size != 2L) {
Console.OUT.println("Usage: Histogram SizeOfArray NumberOfBins");
return;
}
val N = Int.parse(args(0));
val S = Int.parse(args(1));
val ok = run(N,S);
if (ok) {
Console.OUT.println("Test ok.");
} else {
Console.OUT.println("Test failed.");
}
}
}
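The `atomic bins(b)++` above is the entire synchronization story: many activities race to increment shared counters. An editor-added Java analogue using AtomicIntegerArray and a parallel stream (names invented here):

```java
import java.util.concurrent.atomic.AtomicIntegerArray;
import java.util.stream.IntStream;

public class HistogramSketch {
    // Parallel histogram: incrementAndGet mirrors X10's `atomic bins(b)++`.
    static int[] compute(int[] data, int numBins) {
        AtomicIntegerArray bins = new AtomicIntegerArray(numBins);
        IntStream.range(0, data.length).parallel()
                 .forEach(i -> bins.incrementAndGet(data[i] % numBins));
        int[] result = new int[numBins];
        for (int b = 0; b < numBins; b++) result[b] = bins.get(b);
        return result;
    }

    public static void main(String[] args) {
        // 0..99 spread over 4 bins: every bin should end up with 25 entries.
        int[] bins = compute(IntStream.range(0, 100).toArray(), 4);
        for (int b : bins) System.out.println(b);
    }
}
```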

samples/X10/Integrate.x10

@@ -0,0 +1,55 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
/**
* This is a slightly more realistic example of the
* basic computational pattern of using async/finish
* to express recursive divide-and-conquer algorithms.
* The program does integration via Gaussian Quadrature.
* <p>
* It also can serve as an example of using a closure.
*/
public class Integrate {
static val epsilon = 1.0e-9;
val fun:(double)=>double;
public def this(f:(double)=>double) { fun = f; }
public def computeArea(left:double, right:double) {
return recEval(left, fun(left), right, fun(right), 0);
}
private def recEval(l:double, fl:double, r:double, fr:double, a:double) {
val h = (r - l) / 2;
val hh = h / 2;
val c = l + h;
val fc = fun(c);
val al = (fl + fc) * hh;
val ar = (fr + fc) * hh;
val alr = al + ar;
if (Math.abs(alr - a) < epsilon) return alr;
val expr1:double;
val expr2:double;
finish {
async { expr1 = recEval(c, fc, r, fr, ar); };
expr2 = recEval(l, fl, c, fc, al);
}
return expr1 + expr2;
}
public static def main(args:Rail[String]) {
val obj = new Integrate((x:double)=>(x*x + 1.0) * x);
val xMax = args.size > 0 ? Long.parse(args(0)) : 10;
val area = obj.computeArea(0, xMax);
Console.OUT.println("The area of (x*x +1) * x from 0 to "+xMax+" is "+area);
}
}
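Stripped of the async/finish pair, recEval is ordinary adaptive quadrature by interval halving. A sequential editor-added Java rendering of the same recursion (for (x*x+1)*x on [0,10] the exact area is 10^4/4 + 10^2/2 = 2550):

```java
public class IntegrateSketch {
    static final double EPSILON = 1.0e-9;

    interface Fun { double apply(double x); }

    // Same recursion as recEval above, but both halves evaluated sequentially.
    static double recEval(Fun f, double l, double fl, double r, double fr, double a) {
        double h = (r - l) / 2, c = l + h;
        double fc = f.apply(c);
        double al = (fl + fc) * h / 2;   // trapezoid over the left half
        double ar = (fr + fc) * h / 2;   // trapezoid over the right half
        double alr = al + ar;
        if (Math.abs(alr - a) < EPSILON) return alr;
        return recEval(f, c, fc, r, fr, ar) + recEval(f, l, fl, c, fc, al);
    }

    static double area(Fun f, double left, double right) {
        return recEval(f, left, f.apply(left), right, f.apply(right), 0);
    }

    public static void main(String[] args) {
        Fun f = x -> (x * x + 1.0) * x;
        System.out.println("area = " + area(f, 0, 10));
    }
}
```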

samples/X10/KMeans.x10

@@ -0,0 +1,151 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
import x10.util.Random;
/**
* A KMeans object o can compute K means of a given set of
* points of dimension o.myDim.
* <p>
* This class implements a sequential program that is readily parallelizable.
*
* For a scalable, high-performance version of this benchmark see
* KMeans.x10 in the X10 Benchmarks (separate download from x10-lang.org)
*/
public class KMeans(myDim:Long) {
static val DIM=2;
static val K=4;
static val POINTS=2000;
static val ITERATIONS=50;
static val EPS=0.01F;
static type ValVector(k:Long) = Rail[Float]{self.size==k};
static type ValVector = ValVector(DIM);
static type Vector(k:Long) = Rail[Float]{self.size==k};
static type Vector = Vector(DIM);
static type SumVector(d:Long) = V{self.dim==d};
static type SumVector = SumVector(DIM);
/**
* V represents the sum of 'count' number of vectors of dimension 'dim'.
*/
static class V(dim:Long) implements (Long)=>Float {
var vec: Vector(dim);
var count:Int;
def this(dim:Long, init:(Long)=>Float): SumVector(dim) {
property(dim);
vec = new Rail[Float](this.dim, init);
count = 0n;
}
public operator this(i:Long) = vec(i);
def makeZero() {
for (i in 0..(dim-1))
vec(i) =0.0F;
count=0n;
}
def addIn(a:ValVector(dim)) {
for (i in 0..(dim-1))
vec(i) += a(i);
count++;
}
def div(f:Int) {
for (i in 0..(dim-1))
vec(i) /= f;
}
def dist(a:ValVector(dim)):Float {
var dist:Float=0.0F;
for (i in 0..(dim-1)) {
val tmp = vec(i)-a(i);
dist += tmp*tmp;
}
return dist;
}
def dist(a:SumVector(dim)):Float {
var dist:Float=0.0F;
for (i in 0..(dim-1)) {
val tmp = vec(i)-a(i);
dist += tmp*tmp;
}
return dist;
}
def print() {
Console.OUT.println();
for (i in 0..(dim-1)) {
Console.OUT.print((i>0? " " : "") + vec(i));
}
}
def normalize() { div(count);}
def count() = count;
}
def this(myDim:Long):KMeans{self.myDim==myDim} {
property(myDim);
}
static type KMeansData(myK:Long, myDim:Long)= Rail[SumVector(myDim)]{self.size==myK};
/**
* Compute myK means for the given set of points of dimension myDim.
*/
def computeMeans(myK:Long, points:Rail[ValVector(myDim)]):KMeansData(myK, myDim) {
var redCluster : KMeansData(myK, myDim) =
new Rail[SumVector(myDim)](myK, (i:long)=> new V(myDim, (j:long)=>points(i)(j)));
var blackCluster: KMeansData(myK, myDim) =
new Rail[SumVector(myDim)](myK, (i:long)=> new V(myDim, (j:long)=>0.0F));
for (i in 1..ITERATIONS) {
val tmp = redCluster;
redCluster = blackCluster;
blackCluster=tmp;
for (p in 0..(POINTS-1)) {
var closest:Long = -1;
var closestDist:Float = Float.MAX_VALUE;
val point = points(p);
for (k in 0..(myK-1)) { // compute closest mean in cluster.
val dist = blackCluster(k).dist(point);
if (dist < closestDist) {
closestDist = dist;
closest = k;
}
}
redCluster(closest).addIn(point);
}
for (k in 0..(myK-1))
redCluster(k).normalize();
var b:Boolean = true;
for (k in 0..(myK-1)) {
if (redCluster(k).dist(blackCluster(k)) > EPS) {
b=false;
break;
}
}
if (b)
break;
for (k in 0..(myK-1))
blackCluster(k).makeZero();
}
return redCluster;
}
public static def main (Rail[String]) {
val rnd = new Random(0);
val points = new Rail[ValVector](POINTS,
(long)=>new Rail[Float](DIM, (long)=>rnd.nextFloat()));
val result = new KMeans(DIM).computeMeans(K, points);
for (k in 0..(K-1)) result(k).print();
}
}
// vim: shiftwidth=4:tabstop=4:expandtab
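The inner loops of computeMeans are the assignment step of Lloyd's algorithm: find the closest current mean for each point. That kernel, isolated as an editor-added Java sketch (names invented):

```java
public class NearestCentroid {
    // Squared-distance nearest-centroid search, as in the loop over
    // blackCluster(k).dist(point) above.
    static int closest(float[] point, float[][] centroids) {
        int best = -1;
        float bestDist = Float.MAX_VALUE;
        for (int k = 0; k < centroids.length; k++) {
            float dist = 0;
            for (int d = 0; d < point.length; d++) {
                float t = point[d] - centroids[k][d];
                dist += t * t;
            }
            if (dist < bestDist) { bestDist = dist; best = k; }
        }
        return best;
    }

    public static void main(String[] args) {
        float[][] centroids = { {0f, 0f}, {1f, 1f} };
        System.out.println(closest(new float[]{0.1f, 0.2f}, centroids)); // nearest is (0,0)
    }
}
```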

samples/X10/KMeansDist.x10

@@ -0,0 +1,147 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.array.*;
import x10.io.Console;
import x10.util.Random;
/**
* A low performance formulation of distributed KMeans using fine-grained asyncs.
*
* For a highly optimized and scalable version of this benchmark, see
* KMeans.x10 in the X10 Benchmarks (separate download from x10-lang.org)
*/
public class KMeansDist {
static val DIM=2;
static val CLUSTERS=4;
static val POINTS=2000;
static val ITERATIONS=50;
public static def main (Rail[String]) {
val world = Place.places();
val local_curr_clusters =
PlaceLocalHandle.make[Array_2[Float]](world, () => new Array_2[Float](CLUSTERS, DIM));
val local_new_clusters =
PlaceLocalHandle.make[Array_2[Float]](world, () => new Array_2[Float](CLUSTERS, DIM));
val local_cluster_counts =
PlaceLocalHandle.make[Rail[Int]](world, ()=> new Rail[Int](CLUSTERS));
val rnd = PlaceLocalHandle.make[Random](world, () => new Random(0));
val points = new DistArray_Block_2[Float](POINTS, DIM, world, (Long,Long)=>rnd().nextFloat());
val central_clusters = new Array_2[Float](CLUSTERS, DIM, (i:Long, j:Long) => {
at (points.place(i,j)) points(i,j)
});
val old_central_clusters = new Array_2[Float](CLUSTERS, DIM);
val central_cluster_counts = new Rail[Int](CLUSTERS);
for (iter in 1..ITERATIONS) {
Console.OUT.println("Iteration: "+iter);
finish {
// reset state
for (d in world) at (d) async {
for ([i,j] in central_clusters.indices()) {
local_curr_clusters()(i, j) = central_clusters(i, j);
local_new_clusters()(i, j) = 0f;
}
local_cluster_counts().clear();
}
}
finish {
// compute new clusters and counters
for (p in 0..(POINTS-1)) {
at (points.place(p,0)) async {
var closest:Long = -1;
var closest_dist:Float = Float.MAX_VALUE;
for (k in 0..(CLUSTERS-1)) {
var dist : Float = 0;
for (d in 0..(DIM-1)) {
val tmp = points(p,d) - local_curr_clusters()(k, d);
dist += tmp * tmp;
}
if (dist < closest_dist) {
closest_dist = dist;
closest = k;
}
}
atomic {
for (d in 0..(DIM-1)) {
local_new_clusters()(closest,d) += points(p,d);
}
local_cluster_counts()(closest)++;
}
}
}
}
for ([i,j] in old_central_clusters.indices()) {
old_central_clusters(i, j) = central_clusters(i, j);
central_clusters(i, j) = 0f;
}
central_cluster_counts.clear();
finish {
val central_clusters_gr = GlobalRef(central_clusters);
val central_cluster_counts_gr = GlobalRef(central_cluster_counts);
val there = here;
for (d in world) at (d) async {
// access PlaceLocalHandles 'here'; the data is then captured by the at and transferred to 'there' for accumulation
val tmp_new_clusters = local_new_clusters();
val tmp_cluster_counts = local_cluster_counts();
at (there) atomic {
for ([i,j] in tmp_new_clusters.indices()) {
central_clusters_gr()(i,j) += tmp_new_clusters(i,j);
}
for (j in 0..(CLUSTERS-1)) {
central_cluster_counts_gr()(j) += tmp_cluster_counts(j);
}
}
}
}
for (k in 0..(CLUSTERS-1)) {
for (d in 0..(DIM-1)) {
central_clusters(k, d) /= central_cluster_counts(k);
}
}
// TEST FOR CONVERGENCE
var b:Boolean = true;
for ([i,j] in old_central_clusters.indices()) {
if (Math.abs(old_central_clusters(i, j)-central_clusters(i, j))>0.0001) {
b = false;
break;
}
}
if (b) break;
}
for (d in 0..(DIM-1)) {
for (k in 0..(CLUSTERS-1)) {
if (k>0)
Console.OUT.print(" ");
Console.OUT.print(central_clusters(k,d));
}
Console.OUT.println();
}
}
}
// vim: shiftwidth=4:tabstop=4:expandtab


@@ -0,0 +1,144 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2015.
*/
import x10.array.Array;
import x10.array.Array_2;
import x10.compiler.Foreach;
import x10.util.Random;
/**
* A better formulation of distributed KMeans using coarse-grained asyncs to
* implement an allreduce pattern for cluster centers and counts.
*
* For a highly optimized and scalable version of this benchmark, see
* KMeans.x10 in the X10 Benchmarks (separate download from x10-lang.org)
*/
public class KMeansDistPlh {
static val DIM=2;
static val CLUSTERS=4;
static class ClusterState {
val clusters = new Array_2[Float](CLUSTERS, DIM);
val clusterCounts = new Rail[Int](CLUSTERS);
}
public static def main(args:Rail[String]) {
val numPoints = args.size > 0 ? Long.parse(args(0)) : 2000;
val iterations = args.size > 1 ? Long.parse(args(1)) : 50;
val world = Place.places();
val clusterStatePlh = PlaceLocalHandle.make[ClusterState](world, () => new ClusterState());
val currentClustersPlh = PlaceLocalHandle.make[Array_2[Float]](world, () => new Array_2[Float](CLUSTERS, DIM));
val pointsPlh = PlaceLocalHandle.make[Array_2[Float]](world, () => {
val rand = new Random(here.id);
return new Array_2[Float](numPoints/world.size(), DIM, (Long,Long)=>rand.nextFloat());
});
val centralCurrentClusters = new Array_2[Float](CLUSTERS, DIM);
val centralNewClusters = new Array_2[Float](CLUSTERS, DIM);
val centralClusterCounts = new Rail[Int](CLUSTERS);
// arbitrarily initialize central clusters to first few points
for ([i,j] in centralCurrentClusters.indices()) {
centralCurrentClusters(i,j) = pointsPlh()(i,j);
}
for (iter in 1..iterations) {
Console.OUT.println("Iteration: "+iter);
finish {
for (place in world) async {
val placeClusters = at(place) {
val currentClusters = currentClustersPlh();
Array.copy(centralCurrentClusters, currentClusters);
val clusterState = clusterStatePlh();
val newClusters = clusterState.clusters;
newClusters.clear();
val clusterCounts = clusterState.clusterCounts;
clusterCounts.clear();
// compute new clusters and counters
val points = pointsPlh();
for (p in 0..(points.numElems_1-1)) {
var closest:Long = -1;
var closestDist:Float = Float.MAX_VALUE;
for (k in 0..(CLUSTERS-1)) {
var dist : Float = 0;
for (d in 0..(DIM-1)) {
val tmp = points(p,d) - currentClusters(k, d);
dist += tmp * tmp;
}
if (dist < closestDist) {
closestDist = dist;
closest = k;
}
}
atomic {
for (d in 0..(DIM-1)) {
newClusters(closest,d) += points(p,d);
}
clusterCounts(closest)++;
}
}
clusterState
};
// combine place clusters to central
atomic {
for ([i,j] in centralNewClusters.indices()) {
centralNewClusters(i,j) += placeClusters.clusters(i,j);
}
for (j in 0..(CLUSTERS-1)) {
centralClusterCounts(j) += placeClusters.clusterCounts(j);
}
}
}
}
for (k in 0..(CLUSTERS-1)) {
for (d in 0..(DIM-1)) {
centralNewClusters(k, d) /= centralClusterCounts(k);
}
}
// TEST FOR CONVERGENCE
var b:Boolean = true;
for ([i,j] in centralCurrentClusters.indices()) {
if (Math.abs(centralCurrentClusters(i, j)-centralNewClusters(i, j)) > 0.0001) {
b = false;
break;
}
}
Array.copy(centralNewClusters, centralCurrentClusters);
if (b) break;
centralNewClusters.clear();
centralClusterCounts.clear();
}
for (d in 0..(DIM-1)) {
for (k in 0..(CLUSTERS-1)) {
if (k > 0)
Console.OUT.print(" ");
Console.OUT.print(centralCurrentClusters(k,d));
}
Console.OUT.println();
}
}
}
// vim: shiftwidth=4:tabstop=4:expandtab

samples/X10/KMeansSPMD.x10

@@ -0,0 +1,192 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
import x10.io.File;
import x10.io.Marshal;
import x10.io.IOException;
import x10.util.OptionsParser;
import x10.util.Option;
import x10.util.Team;
/**
* An SPMD formulation of KMeans.
*
* For a highly optimized and scalable version of this benchmark see
* KMeans.x10 in the X10 Benchmarks (separate download from x10-lang.org)
*/
public class KMeansSPMD {
public static def printClusters (clusters:Rail[Float], dims:long) {
for (d in 0..(dims-1)) {
for (k in 0..(clusters.size/dims-1)) {
if (k>0)
Console.OUT.print(" ");
Console.OUT.print(clusters(k*dims+d).toString());
}
Console.OUT.println();
}
}
public static def main (args:Rail[String]) {here == Place.FIRST_PLACE } {
val opts = new OptionsParser(args, [
Option("q","quiet","just print time taken"),
Option("v","verbose","print out each iteration"),
Option("h","help","this information")
], [
Option("p","points","location of data file"),
Option("i","iterations","quit after this many iterations"),
Option("c","clusters","number of clusters to find"),
Option("d","dim","number of dimensions"),
Option("s","slices","factor by which to oversubscribe computational resources"),
Option("n","num","quantity of points")
]);
if (opts.filteredArgs().size!=0L) {
Console.ERR.println("Unexpected arguments: "+opts.filteredArgs());
Console.ERR.println("Use -h or --help.");
System.setExitCode(1n);
return;
}
if (opts("-h")) {
Console.OUT.println(opts.usage(""));
return;
}
val fname = opts("-p", "points.dat");
val num_clusters=opts("-c",4);
val num_slices=opts("-s",1);
val num_global_points=opts("-n", 2000);
val iterations=opts("-i",50);
val dim=opts("-d", 4);
val verbose = opts("-v");
val quiet = opts("-q");
if (!quiet)
Console.OUT.println("points: "+num_global_points+" clusters: "+num_clusters+" dim: "+dim);
// file is dimension-major
val file = new File(fname);
val fr = file.openRead();
val init_points = (long) => Float.fromIntBits(Marshal.INT.read(fr).reverseBytes());
val num_file_points = (file.size() / dim / 4) as Int;
val file_points = new Rail[Float](num_file_points*dim, init_points);
val team = Team.WORLD;
val num_slice_points = num_global_points / num_slices / Place.numPlaces();
finish {
for (h in Place.places()) at(h) async {
var compute_time:Long = 0;
var comm_time:Long = 0;
var barrier_time:Long = 0;
val host_clusters = new Rail[Float](num_clusters*dim, (i:long)=>file_points(i));
val host_cluster_counts = new Rail[Int](num_clusters);
for (slice in 0..(num_slices-1)) {
// carve out local portion of points (point-major)
val offset = (slice*Place.numPlaces() + here.id) * num_slice_points;
if (verbose)
Console.OUT.println(h.toString()+" gets "+offset+" len "+num_slice_points);
val init = (i:long) => {
val p=i%num_slice_points;
val d=i/num_slice_points;
return file_points(offset+p+d*num_file_points);
};
// these are pretty big so allocate up front
val host_points = new Rail[Float](num_slice_points*dim, init);
val host_nearest = new Rail[Float](num_slice_points);
val start_time = System.currentTimeMillis();
barrier_time -= System.nanoTime();
team.barrier();
barrier_time += System.nanoTime();
main_loop: for (iter in 0..(iterations-1)) {
//if (offset==0) Console.OUT.println("Iteration: "+iter);
val old_clusters = new Rail[Float](host_clusters.size);
Rail.copy(host_clusters, 0L, old_clusters, 0L, host_clusters.size);
host_clusters.clear();
host_cluster_counts.clear();
compute_time -= System.nanoTime();
for (p in 0..(num_slice_points-1)) {
var closest:Long = -1;
var closest_dist:Float = Float.MAX_VALUE;
for (k in 0..(num_clusters-1)) {
var dist : Float = 0;
for (d in 0..(dim-1)) {
val tmp = host_points(p+d*num_slice_points) - old_clusters(k*dim+d);
dist += tmp * tmp;
}
if (dist < closest_dist) {
closest_dist = dist;
closest = k;
}
}
for (d in 0..(dim-1)) {
host_clusters(closest*dim+d) += host_points(p+d*num_slice_points);
}
host_cluster_counts(closest)++;
}
compute_time += System.nanoTime();
comm_time -= System.nanoTime();
team.allreduce(host_clusters, 0L, host_clusters, 0L, host_clusters.size, Team.ADD);
team.allreduce(host_cluster_counts, 0L, host_cluster_counts, 0L, host_cluster_counts.size, Team.ADD);
comm_time += System.nanoTime();
for (k in 0..(num_clusters-1)) {
for (d in 0..(dim-1)) host_clusters(k*dim+d) /= host_cluster_counts(k);
}
if (offset==0 && verbose) {
Console.OUT.println("Iteration: "+iter);
printClusters(host_clusters,dim);
}
// TEST FOR CONVERGENCE
for (j in 0..(num_clusters*dim-1)) {
if (true/*||Math.abs(clusters_old(j)-host_clusters(j))>0.0001*/) continue main_loop;
}
break;
} // main_loop
} // slice
Console.OUT.printf("%d: computation %.3f s communication %.3f s (barrier %.3f s)\n",
here.id, compute_time/1E9, comm_time/1E9, barrier_time/1E9);
team.barrier();
if (here.id == 0) {
Console.OUT.println("\nFinal results:");
printClusters(host_clusters,dim);
}
} // async
} // finish
}
}
// vim: shiftwidth=4:tabstop=4:expandtab

samples/X10/MontyPi.x10

@@ -0,0 +1,42 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.array.DistArray_Unique;
import x10.io.Console;
import x10.util.Random;
/**
* Calculation of an approximation to pi by using a Monte Carlo simulation
* (throwing darts into the unit square and determining the fraction that land
* in the unit circle).
*/
public class MontyPi {
public static def main(args:Rail[String]) {
if (args.size != 1L) {
Console.OUT.println("Usage: MontyPi <number of points>");
return;
}
val N = Long.parse(args(0));
val initializer = () => {
val r = new Random();
var result:Long = 0;
for(c in 1..N) {
val x = r.nextDouble();
val y = r.nextDouble();
if (x*x +y*y <= 1.0) result++;
}
result
};
val result = new DistArray_Unique[Long](Place.places(), initializer);
val pi = (4.0*result.reduce((x:Long,y:Long) => x+y, 0) as Double)/(N*Place.numPlaces());
Console.OUT.println("The value of pi is " + pi);
}
}
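Each place's initializer is an independent Monte Carlo estimate; the reduce merely combines the counts. A single-threaded editor-added Java version of that per-place loop (with an explicit seed so runs are repeatable):

```java
import java.util.Random;

public class MontyPiSketch {
    // Sample n points in the unit square; the fraction landing inside the
    // quarter circle approximates pi/4.
    static double estimate(long n, long seed) {
        Random r = new Random(seed);
        long inside = 0;
        for (long i = 0; i < n; i++) {
            double x = r.nextDouble(), y = r.nextDouble();
            if (x * x + y * y <= 1.0) inside++;
        }
        return 4.0 * inside / n;
    }

    public static void main(String[] args) {
        System.out.println("pi ~= " + estimate(1_000_000, 0));
    }
}
```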

samples/X10/NQueensDist.x10

@@ -0,0 +1,123 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
* (C) Copyright Australian National University 2011.
*/
import x10.array.DistArray_Unique;
/**
* A distributed version of NQueens. Runs over NUM_PLACES.
* Identical to NQueensPar, except that work is distributed
* over multiple places rather than shared between threads.
*/
public class NQueensDist {
public static val EXPECTED_SOLUTIONS =
[0, 1, 0, 0, 2, 10, 4, 40, 92, 352, 724, 2680, 14200, 73712, 365596, 2279184, 14772512];
val N:Long;
val P:Long;
val results:DistArray_Unique[Long];
val R:LongRange;
def this(N:Long, P:Long) {
this.N=N;
this.P=P;
this.results = new DistArray_Unique[Long]();
this.R = 0..(N-1);
}
def start() {
new Board().distSearch();
}
def run():Long {
finish start();
val result = results.reduce(((x:Long,y:Long) => x+y),0);
return result;
}
class Board {
val q: Rail[Long];
/** The number of low-rank positions that are fixed in this board for the purposes of search. */
var fixed:Long;
def this() {
q = new Rail[Long](N);
fixed = 0;
}
/**
* @return true if it is safe to put a queen in file <code>j</code>
* on the next rank after the last fixed position.
*/
def safe(j:Long) {
for (k in 0..(fixed-1)) {
if (j == q(k) || Math.abs(fixed-k) == Math.abs(j-q(k)))
return false;
}
return true;
}
/** Search all positions for the current board. */
def search() {
for (k in R) searchOne(k);
}
/**
* Modify the current board by adding a new queen
* in file <code>k</code> on rank <code>fixed</code>,
* and search for all safe positions with this prefix.
*/
def searchOne(k:Long) {
if (safe(k)) {
if (fixed==(N-1)) {
// all ranks safely filled
atomic NQueensDist.this.results(here.id)++;
} else {
q(fixed++) = k;
search();
fixed--;
}
}
}
/**
* Search this board, dividing the work between all places
* using a block distribution of the current free rank.
*/
def distSearch() {
val work = R.split(Place.numPlaces());
finish for (p in Place.places()) {
val myPiece = work(p.id);
at (p) async {
// implicit copy of 'this' made across the at divide
for (k in myPiece) {
searchOne(k);
}
}
}
}
}
public static def main(args:Rail[String]) {
val n = args.size > 0 ? Long.parse(args(0)) : 8;
Console.OUT.println("N=" + n);
//warmup
//finish new NQueensPar(12, 1).start();
val P = Place.numPlaces();
val nq = new NQueensDist(n,P);
var start:Long = -System.nanoTime();
val answer = nq.run();
val result = answer==EXPECTED_SOLUTIONS(n);
start += System.nanoTime();
start /= 1000000;
Console.OUT.println("NQueensDist " + nq.N + "(P=" + P +
") has " + answer + " solutions" +
(result? " (ok)." : " (wrong).") +
"time=" + start + "ms");
}
}

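The `safe` check in the NQueensDist sample rejects a file that shares a column or a diagonal with any already-fixed queen (same file, or rank distance equal to file distance). A sequential Python sketch of the same backtracking search (hypothetical helper names, not part of the sample) reproduces the counts in `EXPECTED_SOLUTIONS`:

```python
def safe(q, fixed, j):
    """True if a queen may go in file j on rank `fixed`, given queens
    already placed at q[0..fixed-1], one per rank."""
    for k in range(fixed):
        # same file, or same diagonal (rank distance == file distance)
        if j == q[k] or abs(fixed - k) == abs(j - q[k]):
            return False
    return True

def count_solutions(n):
    """Sequential equivalent of the sample's distributed search."""
    q = [0] * n
    def search(fixed):
        if fixed == n:
            return 1
        total = 0
        for j in range(n):
            if safe(q, fixed, j):
                q[fixed] = j
                total += search(fixed + 1)
        return total
    return search(0)
```

For n = 4, 6, and 8 this yields 2, 4, and 92, matching the sample's expected-solutions table; the X10 version merely splits the top-level ranks across places with `at (p) async`.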
117
samples/X10/NQueensPar.x10 Normal file

@@ -0,0 +1,117 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
* (C) Copyright Australian National University 2011.
*/
/**
* Compute the number of solutions to the N queens problem.
*/
public class NQueensPar {
public static val EXPECTED_SOLUTIONS =
[0, 1, 0, 0, 2, 10, 4, 40, 92, 352, 724, 2680, 14200, 73712, 365596, 2279184, 14772512];
val N:Int;
val P:Int;
var nSolutions:Int = 0n;
val R:IntRange;
def this(N:Int, P:Int) {
this.N=N;
this.P=P;
this.R = 0n..(N-1n);
}
def start() {
new Board().parSearch();
}
class Board {
val q: Rail[Int];
/** The number of low-rank positions that are fixed in this board for the purposes of search. */
var fixed:Int;
def this() {
q = new Rail[Int](N);
fixed = 0n;
}
def this(b:Board) {
this.q = new Rail[Int](b.q);
this.fixed = b.fixed;
}
/**
* @return true if it is safe to put a queen in file <code>j</code>
* on the next rank after the last fixed position.
*/
def safe(j:Int) {
for (k in 0n..(fixed-1n)) {
if (j == q(k) || Math.abs(fixed-k) == Math.abs(j-q(k)))
return false;
}
return true;
}
/** Search all positions for the current board. */
def search() {
for (k in R) searchOne(k);
}
/**
* Modify the current board by adding a new queen
* in file <code>k</code> on rank <code>fixed</code>,
* and search for all safe positions with this prefix.
*/
def searchOne(k:Int) {
if (safe(k)) {
if (fixed==(N-1n)) {
// all ranks safely filled
atomic NQueensPar.this.nSolutions++;
} else {
q(fixed++) = k;
search();
fixed--;
}
}
}
/**
* Search this board, dividing the work between threads
* using a block distribution of the current free rank.
*/
def parSearch() {
for (work in R.split(P)) async {
val board = new Board(this);
for (w in work) {
board.searchOne(w);
}
}
}
}
public static def main(args:Rail[String]) {
val n = args.size > 0 ? Int.parse(args(0)) : 8n;
Console.OUT.println("N=" + n);
//warmup
//finish new NQueensPar(12, 1).start();
val ps = [1n,2n,4n];
for (numTasks in ps) {
Console.OUT.println("starting " + numTasks + " tasks");
val nq = new NQueensPar(n,numTasks);
var start:Long = -System.nanoTime();
finish nq.start();
val result = (nq.nSolutions as Long)==EXPECTED_SOLUTIONS(nq.N);
start += System.nanoTime();
start /= 1000000;
Console.OUT.println("NQueensPar " + nq.N + "(P=" + numTasks +
") has " + nq.nSolutions + " solutions" +
(result? " (ok)." : " (wrong).") + "time=" + start + "ms");
}
}
}

73
samples/X10/QSort.x10 Normal file

@@ -0,0 +1,73 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
/**
* Straightforward quicksort implementation using
* naive partition-in-the-middle and not bothering with
* well-known optimizations such as using insertion sort
* once the partitions get small. This is only intended
* as a simple example of an array-based program that
* combines a recursive divide and conquer algorithm
* with async and finish, not as a highly efficient
* sorting procedure.
*/
public class QSort {
private static def partition(data:Rail[int], left:long, right:long) {
var i:long = left;
var j:long = right;
var tmp:int;
var pivot:long = data((left + right) / 2);
while (i <= j) {
while (data(i) < pivot) i++;
while (data(j) > pivot) j--;
if (i <= j) {
tmp = data(i);
data(i) = data(j);
data(j) = tmp;
i++;
j--;
}
}
return i;
}
public static def qsort(data:Rail[int], left:long, right:long) {
val index:long = partition(data, left, right);
finish {
if (left < index - 1)
async qsort(data, left, index - 1);
if (index < right)
qsort(data, index, right);
}
}
public static def main(args:Rail[String]) {
val N = args.size>0 ? Long.parse(args(0)) : 100;
val r = new x10.util.Random();
val data = new Rail[int](N, (long)=>r.nextInt(9999n));
qsort(data, 0, N-1);
for (i in 0..(N-1)) {
Console.OUT.print(data(i));
if (i%10 == 9) {
Console.OUT.println();
} else {
Console.OUT.print(", ");
}
}
Console.OUT.println();
}
}

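The QSort sample's partition walks two indices inward, swapping elements that sit on the wrong side of the middle element, and returns the split point at which the two recursive calls can proceed independently (hence the `async` on the left half). A sequential Python sketch of the same logic:

```python
def partition(data, left, right):
    # naive partition around the middle element, as in the X10 sample
    i, j = left, right
    pivot = data[(left + right) // 2]
    while i <= j:
        while data[i] < pivot:
            i += 1
        while data[j] > pivot:
            j -= 1
        if i <= j:
            data[i], data[j] = data[j], data[i]
            i += 1
            j -= 1
    return i

def qsort(data, left, right):
    index = partition(data, left, right)
    # the X10 version runs the left half in an `async` under `finish`;
    # here both recursive calls are sequential
    if left < index - 1:
        qsort(data, left, index - 1)
    if index < right:
        qsort(data, index, right)
```

The two halves never touch overlapping index ranges after `partition` returns, which is what makes the fork-join parallelism in the X10 version safe.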

@@ -0,0 +1,123 @@
/*
* This file is part of the X10 project (http://x10-lang.org).
*
* This file is licensed to You under the Eclipse Public License (EPL);
* You may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.opensource.org/licenses/eclipse-1.0.php
*
* (C) Copyright IBM Corporation 2006-2014.
*/
import x10.io.Console;
import x10.util.Random;
/**
* This class represents a real-world problem in graphics engines --
* determining which objects in a large sprawling world are close enough to the
* camera to be considered for rendering.
*
* It illustrates the usage of X10 structs to define new primitive types.
* In Native X10, structs are allocated within their containing object/stack frame
* and thus using structs instead of classes for Vector3 and WorldObject greatly
* improves the memory efficiency of the computation.
*
* @Author Dave Cunningham
* @Author Vijay Saraswat
*/
class StructSpheres {
static type Real = Float;
static struct Vector3(x:Real, y:Real, z:Real) {
public def getX () = x;
public def getY () = y;
public def getZ () = z;
public def add (other:Vector3)
= Vector3(this.x+other.x, this.y+other.y, this.z+other.z);
public def neg () = Vector3(-this.x, -this.y, -this.z);
public def sub (other:Vector3) = add(other.neg());
public def length () = Math.sqrtf(length2());
public def length2 () = x*x + y*y + z*z;
}
static struct WorldObject {
def this (x:Real, y:Real, z:Real, r:Real) {
pos = Vector3(x,y,z);
renderingDistance = r;
}
public def intersects (home:Vector3)
= home.sub(pos).length2() < renderingDistance*renderingDistance;
protected val pos:Vector3;
protected val renderingDistance:Real;
}
public static def compute():boolean {
val reps = 7500;
// The following correspond to a modern out-door computer game:
val num_objects = 50000;
val world_size = 6000;
val obj_max_size = 400;
val ran = new Random(0);
// the array can go on the heap
// but the elements ought to be /*inlined*/ in the array
val spheres =
new Rail[WorldObject](num_objects, (i:long) => {
val x = (ran.nextDouble()*world_size) as Real;
val y = (ran.nextDouble()*world_size) as Real;
val z = (ran.nextDouble()*world_size) as Real;
val r = (ran.nextDouble()*obj_max_size) as Real;
return WorldObject(x,y,z,r);
});
val time_start = System.nanoTime();
var counter : Long = 0;
// HOT LOOP BEGINS
for (c in 1..reps) {
val x = (ran.nextDouble()*world_size) as Real;
val y = (ran.nextDouble()*world_size) as Real;
val z = (ran.nextDouble()*world_size) as Real;
val pos = Vector3(x,y,z);
for (i in spheres.range()) {
if (spheres(i).intersects(pos)) {
counter++;
}
}
}
// HOT LOOP ENDS
val time_taken = System.nanoTime() - time_start;
Console.OUT.println("Total time: "+time_taken/1E9);
val expected = 109702;
val ok = counter == expected;
if (!ok) {
Console.ERR.println("number of intersections: "+counter
+" (expected "+expected+")");
}
return ok;
}
public static def main (Rail[String]) {
compute();
}
}

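The `intersects` test in StructSpheres compares squared distance against squared rendering distance, avoiding a square root per object per frame. A minimal Python sketch of the same value types (names chosen to mirror the sample, not part of it):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vector3:
    x: float
    y: float
    z: float
    def sub(self, other):
        return Vector3(self.x - other.x, self.y - other.y, self.z - other.z)
    def length2(self):
        # squared length; comparing squared distances avoids a sqrt per test
        return self.x * self.x + self.y * self.y + self.z * self.z

@dataclass(frozen=True)
class WorldObject:
    pos: Vector3
    rendering_distance: float
    def intersects(self, home):
        d = home.sub(self.pos)
        return d.length2() < self.rendering_distance ** 2
```

In Python these are ordinary heap objects; the point of the X10 sample is precisely that `struct` lets Vector3 and WorldObject be inlined into the containing Rail, which frozen dataclasses only imitate semantically.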

@@ -0,0 +1,96 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="14.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|AnyCPU">
<Configuration>Debug</Configuration>
<Platform>AnyCPU</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|ARM">
<Configuration>Debug</Configuration>
<Platform>ARM</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x86">
<Configuration>Debug</Configuration>
<Platform>x86</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|AnyCPU">
<Configuration>Release</Configuration>
<Platform>AnyCPU</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|ARM">
<Configuration>Release</Configuration>
<Platform>ARM</Platform>
<UseDotNetNativeToolchain>true</UseDotNetNativeToolchain>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
<UseDotNetNativeToolchain>true</UseDotNetNativeToolchain>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x86">
<Configuration>Release</Configuration>
<Platform>x86</Platform>
<UseDotNetNativeToolchain>true</UseDotNetNativeToolchain>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>42fc11d8-64c6-4967-a15a-dfd787f68766</ProjectGuid>
</PropertyGroup>
<Import Project="$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props" Condition="Exists('$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props')" />
<PropertyGroup Condition="'$(VisualStudioVersion)' == '' or '$(VisualStudioVersion)' &lt; '14.0'">
<VisualStudioVersion>14.0</VisualStudioVersion>
</PropertyGroup>
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\$(WMSJSProjectDirectory)\Microsoft.VisualStudio.$(WMSJSProject).Default.props" />
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\$(WMSJSProjectDirectory)\Microsoft.VisualStudio.$(WMSJSProject).props" />
<PropertyGroup>
<EnableDotNetNativeCompatibleProfile>true</EnableDotNetNativeCompatibleProfile>
<TargetPlatformIdentifier>UAP</TargetPlatformIdentifier>
<TargetPlatformVersion>10.0.10240.0</TargetPlatformVersion>
<TargetPlatformMinVersion>10.0.10240.0</TargetPlatformMinVersion>
<MinimumVisualStudioVersion>$(VersionNumberMajor).$(VersionNumberMinor)</MinimumVisualStudioVersion>
<DefaultLanguage>en-US</DefaultLanguage>
</PropertyGroup>
<ItemGroup>
<AppxManifest Include="package.appxmanifest">
<SubType>Designer</SubType>
</AppxManifest>
<Content Include="css\browser.css" />
<Content Include="default.html" />
<Content Include="images\icons.png" />
<Content Include="images\logo_150x150.png" />
<Content Include="images\logo_310x150.png" />
<Content Include="images\logo_310x310.png" />
<Content Include="images\logo_44x44.png" />
<Content Include="images\logo_71x71.png" />
<Content Include="images\logo_badge.png" />
<Content Include="images\logo_bg.png" />
<Content Include="images\logo_splash.png" />
<Content Include="images\logo_store.png" />
<Content Include="js\components\address-bar.js" />
<Content Include="js\browser.js" />
<Content Include="js\components\favorites.js" />
<Content Include="js\components\navigation.js" />
<Content Include="js\components\settings.js" />
<Content Include="js\components\title-bar.js" />
<Content Include="js\components\webview.js" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\NativeListener\NativeListener.vcxproj" />
</ItemGroup>
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\$(WMSJSProjectDirectory)\Microsoft.VisualStudio.$(WMSJSProject).targets" />
<!-- To modify your build process, add your task inside one of the targets below then uncomment
that target and the DisableFastUpToDateCheck PropertyGroup.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
<PropertyGroup>
<DisableFastUpToDateCheck>true</DisableFastUpToDateCheck>
</PropertyGroup>
-->
</Project>


@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<faces-config>
<faces-config-extension>
<namespace-uri>http://www.ibm.com/xsp/custom</namespace-uri>
<default-prefix>xc</default-prefix>
</faces-config-extension>
<composite-component>
<component-type>navbar</component-type>
<composite-name>navbar</composite-name>
<composite-file>/navbar.xsp</composite-file>
<composite-extension>
<designer-extension>
<in-palette>true</in-palette>
</designer-extension>
</composite-extension>
</composite-component>
</faces-config>


@@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8"?><note class="form" maintenanceversion="1.0" replicaid="88257E000001FF59" version="9.0" xmlns="http://www.lotus.com/dxl">
<noteinfo noteid="1aa" sequence="14" unid="47D707801D48026E85257C48007E06C7">
<created><datetime>20131221T175632,71-05</datetime></created>
<modified><datetime>20150305T194407,22-08</datetime></modified>
<revised><datetime>20150305T194407,21-08</datetime></revised>
<lastaccessed><datetime>20150305T194407,22-08</datetime></lastaccessed>
<addedtofile><datetime>20150305T162153,30-08</datetime></addedtofile></noteinfo>
<updatedby><name>CN=Eric McCormick/O=Eric McCormick</name></updatedby>
<wassignedby><name>CN=Eric McCormick/O=Eric McCormick</name></wassignedby>
<item name="$Flags"><text>gC~4;</text></item>
<item name="$TITLE"><text>navbar.xsp</text></item>
<item name="$FileNames" sign="true"><text>navbar.xsp</text></item>
<item name="$DesignerVersion"><text>8.5.3</text></item>
</note>


@@ -0,0 +1,38 @@
# [PackageDev] target_format: plist, ext: tmLanguage
---
name: Ansible
scopeName: source.ansible
fileTypes: []
uuid: 787ae642-b4ae-48b1-94e9-f935bec43a8f
patterns:
- name: comment.line.number-sign.ansible
match: (?:^ *|\G *)((#).*)
captures:
'1': {name: comment.line.number-sign.ansible}
'2': {name: punctuation.definition.comment.line.ansible}
- name: storage.type.ansible
match: (\{\{ *[^\{\}]+ *\}\})|(\$\{[^\{\}]+\})
- name: keyword.other.ansible
match: \- (name\:|include\:) (.*)|(^(- |\s*)\w+\:)
captures:
'2': {name: string.quoted.double.ansible}
- name: variable.complex.ansible
contentName: string.other.ansible
begin: (\w+)(=)\"?
beginCaptures:
'1': {name: entity.other.attribute-name.ansible}
'2': {name: text}
end: \"?\s
patterns:
- include: $self
- name: constant.other.ansible
match: .
- name: string.quoted.double.ansible
match: ^(\[[0-9a-zA-Z_-]+(((\:)children)*)\])
captures:
'2': {name: variable.parameter.ansible}

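Two of the patterns in this Ansible grammar are plain regexes and can be exercised directly: the `variable.complex.ansible` begin pattern captures `key=` attribute names, and the `storage.type.ansible` pattern matches `{{ ... }}` / `${ ... }` template expressions. A quick Python check against a sample task line (the line itself is an illustrative example, not from the grammar):

```python
import re

# begin pattern of variable.complex.ansible and the storage.type.ansible
# pattern, copied unchanged from the grammar above
BEGIN = re.compile(r'(\w+)(=)\"?')
TEMPLATE = re.compile(r'(\{\{ *[^\{\}]+ *\}\})|(\$\{[^\{\}]+\})')

line = 'copy: src={{ item }} dest=/etc/fstab'
keys = [m.group(1) for m in BEGIN.finditer(line)]
templates = [m.group(0) for m in TEMPLATE.finditer(line)]
```

Here `keys` comes out as `['src', 'dest']` and `templates` as `['{{ item }}']`, which is how the grammar ends up scoping attribute names and Jinja-style expressions differently.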

@@ -0,0 +1,16 @@
---
name: R Console
fileTypes: []
scopeName: source.r-console
uuid: F629C7F3-823B-4A4C-8EEE-9971490C5710
patterns:
- name: source.r.embedded.r-console
begin: "^> "
beginCaptures:
"0":
name: punctuation.section.embedded.r-console
end: \n|\z
patterns:
- include: source.r
keyEquivalent: ^~R

30
samples/xBase/sample.ch Normal file

@@ -0,0 +1,30 @@
#ifndef __HARBOUR__
#ifndef __XPP__
#ifndef __CLIP__
#ifndef FlagShip
#define __CLIPPER__
#endif
#endif
#endif
#endif
/* File create flags */
#define FC_NORMAL 0 /* No file attributes are set */
#define FC_READONLY 1
#define FC_HIDDEN 2
#define FC_SYSTEM 4
// New-style comment
#command SET DELETED <x:ON,OFF,&> => Set( _SET_DELETED, <(x)> )
#command SET DELETED (<x>) => Set( _SET_DELETED, <x> )
#command @ <row>, <col> SAY <exp> [PICTURE <pic>] [COLOR <clr>] => ;
DevPos( <row>, <col> ) ; DevOutPict( <exp>, <pic> [, <clr>] )
#command ENDIF <*x*> => endif
#ifdef __CLIPPER__
#xtranslate hb_MemoWrit( [<x,...>] ) => MemoWrit( <x> )
#xtranslate hb_dbExists( <t> ) => File( <t> )
#xtranslate hb_dbPack() => __dbPack()
#xtranslate hb_default( @<v>, <x> ) => iif( StrTran( ValType( <v> ), "M", "C" ) == StrTran( ValType( <x> ), "M", "C" ),, <v> := <x>, )
#endif

512
samples/xBase/sample.prw Executable file

@@ -0,0 +1,512 @@
/**
* This is a sample file for Linguist.
* It's written in AdvPL, an xBase language.
*
* Author: Arthur Helfstein Fragoso
*
* This script is used specifically to integrate a financial institution
* with two other companies in the process of creating installment bills for
* customers.
*
* The functions are called from the ERP Protheus TOTVS.
*
**/
#Include "TOPCONN.ch"
#include "tbiconn.ch"
#Include "Protheus.ch"
#Include "rwmake.ch"
#Include "FileIO.ch"
#Include "json.ch"
#Include "utils.ch"
////////////////////////
// Faturando (Reparcelando)
// FA280
// FA280_01
//
User Function FA280()
//Executado uma vez para cada parcela
If cEmpAnt == '06'
SE5->(dbSelectArea("SE5"))
cSet3Filter := "SE5->E5_FATURA == SE1->E1_NUM"
SE5->(dbSetFilter( {|| &cSet3Filter }, cSet3Filter ))
SE5->(dbGoTOP())
aOrig06Tit := {} // = Todos os Titulos que serão reparcelados
nTotal := 0
While SE5->(!EOF())
AADD(aOrig06Tit, {SE5->E5_PREFIXO, SE5->E5_NUMERO, SE5->E5_VALOR})
nTotal += SE5->E5_VALOR
SE5->(dbSkip())
End
aNovoTitulo:= {;//{"E1_FILIAL" ,SE1ORIG->E1_FILIAL ,Nil},;
;//{"E1_PREFIXO" ,SE1->E1_PREFIXO ,Nil},;
{"E1_NUM" ,SE1->E1_NUM ,Nil},;
{"E1_TIPO" ,SE1->E1_TIPO ,Nil},;
{"E1_PARCELA" ,SE1->E1_PARCELA ,Nil},;
{"E1_NATUREZ" ,SE1->E1_NATUREZ ,Nil},;
{"E1_CLIENTE" ,SE1->E1_CLIENTE ,Nil},;
{"E1_LOJA" ,SE1->E1_LOJA ,Nil},;
{"E1_NRDOC" ,SE1->E1_NRDOC ,Nil},;
;//{"E1_X_COD" ,SE1->E1_NATUREZ ,Nil},;
{"E1_EMISSAO" ,SE1->E1_EMISSAO ,Nil},;
{"E1_VENCTO" ,SE1->E1_VENCTO ,Nil},;
{"E1_VENCREA" ,SE1->E1_VENCREA ,Nil},;
;//{"E1_VALOR" ,SE1->E1_VALOR ,Nil},;
;//{"E1_SALDO" ,SE1->E1_SALDO ,Nil},;
;//{"E1_VLCRUZ" ,SE1->E1_VLCRUZ ,Nil},;
{"E1_PORTADO" ,SE1->E1_PORTADO ,Nil},;
{"E1_FATURA" ,SE1->E1_FATURA ,Nil},;
{"E1_X_DTPAV" ,SE1->E1_X_DTPAV ,Nil},;
{"E1_X_DTSAV" ,SE1->E1_X_DTSAV ,Nil},;
{"E1_X_DTTAV" ,SE1->E1_X_DTTAV ,Nil},;
{"E1_X_DTSPC" ,SE1->E1_X_DTSPC ,Nil},;
{"E1_X_DTPRO" ,SE1->E1_X_DTPRO ,Nil},;
{"E1_NUMBCO" ,SE1->E1_NUMBCO ,Nil},;
{"E1_X_DUDME" ,SE1->E1_X_DUDME ,Nil},;
{"E1_X_TIPOP" ,SE1->E1_X_TIPOP ,Nil},;
{"E1_X_DTCAN" ,SE1->E1_X_DTCAN ,Nil},;
{"E1_X_MOTIV" ,SE1->E1_X_MOTIV ,Nil},;
{"E1_X_DESPC" ,SE1->E1_X_DESPC ,Nil},;
{"E1_NUMNOTA" ,SE1->E1_NUMNOTA ,Nil},;
{"E1_SERIE" ,SE1->E1_SERIE ,Nil},;
{"E1_X_DEPRO" ,SE1->E1_X_DEPRO ,Nil},;
{"E1_X_TPPAI" ,SE1->E1_X_TPPAI ,Nil},;
{"E1_X_CGC" ,SE1->E1_X_CGC ,Nil},;
{"E1_XTPEMP" ,SE1->E1_XTPEMP ,Nil},;
{"E1_X_CTRIM" ,SE1->E1_X_CTRIM ,Nil}}
StartJob("U_FA280_01",getenvserver(),.T., SE1->E1_PREFIXO ,SE1->E1_NUM, SE1->E1_TIPO, SE1->E1_VALOR, aOrig06Tit, nTotal, SE1->E1_PARCELA, aNovoTitulo)
SE5->(dbClearFilter())
EndIf
Return nil
User Function FA280_01(cE1PREFIXO, cE1NUM, cE1TIPO, nE1Valor, aOrig06Tit, nTotal, cE1PARCELA, aNovoTitulo)
Local nValPar := nil
Local aTit05 := {}
RpcSetType(3) // Nao consome licensa
//Prepare Environment Empresa "01" Filial '0102'
// Muda de empresa
While !RpcSetEnv('01', '0102',,,,GetEnvServer(),{})
Sleep(400)
End
nFileLog := u_OpenLog("\Logs\FA280_"+dToS(dDataBase)+".log")
fWrite(nFileLog,"----- FA280 -----"+CRLF)
fWrite(nFileLog,cE1NUM+CRLF)
nParcelas := round(nTotal/nE1Valor, 0)
cUltima := '0'+ chr(64+nParcelas)
fWrite(nFileLog,"valor das parcelas: "+ cvaltochar(nE1Valor) +CRLF)
fWrite(nFileLog,"parcelas: "+ cvaltochar(nParcelas) +CRLF)
fWrite(nFileLog,"parcela atual: "+ cE1PARCELA +CRLF)
fWrite(nFileLog,"ultima parcela: "+ cUltima +CRLF)
n0102total := 0
n0105total := 0
//Loop entre todos os Titulos que serão Reparcelados
For nI := 1 To len(aOrig06Tit)
fWrite(nFileLog,"E5_NUMERO: "+aOrig06Tit[nI][2] +CRLF)
cQuery := "select * from SE1010 where E1_PREFIXO = '"+ aOrig06Tit[nI][1] +"' and E1_NUM = '"+ aOrig06Tit[nI][2] +"' and E1_TIPO = 'FAT' and D_E_L_E_T_ <> '*'"
fWrite(nFileLog,cQuery +CRLF)
If select("SE1ORIG") > 0
SE1ORIG->(DbCloseArea())
endif
TcQuery cQuery New Alias 'SE1ORIG'
dbSelectArea("SE1ORIG")
SE1ORIG->(DBGOTOP())
While SE1ORIG->(!EOF()) //Loop entre as duas filiais: 0102, 0105
fWrite(nFileLog,"SE1ORIG loop: "+SE1ORIG->E1_FILIAL +CRLF)
cFilAnt := SE1ORIG->E1_FILIAL
//Faz a baixa
if alltrim(SE1ORIG->E1_STATUS) == 'A'
fWrite(nFileLog, SE1ORIG->E1_FILIAL+" : Fazendo baixa" +CRLF)
aBaixa := {{"E1_FILIAL" ,SE1ORIG->E1_FILIAL ,Nil},;
{"E1_PREFIXO" ,SE1ORIG->E1_PREFIXO ,Nil},;
{"E1_NUM" ,SE1ORIG->E1_NUM ,Nil},;
{"E1_TIPO" ,SE1ORIG->E1_TIPO ,Nil},;
{"E1_PARCELA" ,SE1ORIG->E1_PARCELA ,Nil},;
{"E1_DESCONT" ,SE1ORIG->E1_DESCONT ,Nil},;
{"E1_JUROS" ,SE1ORIG->E1_JUROS ,Nil},;
{"E1_MULTA" ,SE1ORIG->E1_MULTA ,Nil},;
{"E1_VLRREAL" ,SE1ORIG->E1_VLRREAL ,Nil},;
{"AUTMOTBX" ,"FAT" ,Nil},;
{"AUTDTBAIXA" ,date() ,Nil},;
{"AUTDTCREDITO",date() ,Nil},;
{"AUTHIST" ,"Bx.Emis.Fat."+cE1NUM,Nil},;
{"AUTVALREC" ,SE1ORIG->E1_VALOR ,Nil}}
lMsErroAuto:=.F. //reseta lMsErroAuto
MSExecAuto ({|x,y| FINA070(x,y)},aBaixa, 3)
If lMsErroAuto
fWrite(nFileLog,SE1ORIG->E1_FILIAL+" : Não foi efetuada a baixa do titulo : "+CRLF+ MSErroString()+ CRLF + tojson(aBaixa) + CRLF)
return
else
RECLOCK('SE5',.F.)
E5_FATURA := cE1NUM
E5_FATPREF:= cE1PREFIXO
//E5_LA = S
//E5_MOEDA = ''
//E5_TXMOEDA = 1
MSUNLOCK()
RECLOCK('SE1',.F.)
E1_FATURA := cE1NUM
E1_FATPREF:= cE1PREFIXO
E1_TIPOFAT:= cE1TIPO
E1_FLAGFAT:= 'S'
E1_DTFATUR:= dDataBase
MSUNLOCK()
fWrite(nFileLog,SE1ORIG->E1_FILIAL+" : baixa feita" +CRLF)
endif
endif
//calcula valor total de cada filial para poder calcular a Fatura
if SE1ORIG->E1_FILIAL == '0102'
n0102total += SE1ORIG->E1_VALOR
elseif SE1ORIG->E1_FILIAL == '0105'
n0105total += SE1ORIG->E1_VALOR
else
fWrite(nFileLog,"Programa nao preparado para a filial "+SE1ORIG->E1_FILIAL +CRLF)
endif
SE1ORIG->(dbskip())
End
Next nI
cFilAnt := '0102'
fWrite(nFileLog,"Total 0102: "+cvaltochar(n0102total) +CRLF)
fWrite(nFileLog,"Total 0105: "+cvaltochar(n0105total) +CRLF)
n0102val := round(nE1Valor * n0102total/nTotal, 2)
n0105val := nE1Valor - n0102val
aFili := {}
if n0102total > 0
AADD(aFili,'0102')
endif
if n0105total > 0
AADD(aFili,'0105')
endif
For nI := 1 To len(aFili)
cQuery := "select COUNT(*) as QUANT, SUM(E1_VALOR) as TOTALINC from SE1010 where E1_NUM = '"+ cE1NUM +"' and E1_FILIAL='"+ aFili[nI] +"' and E1_PREFIXO = '"+ cE1PREFIXO +"' and D_E_L_E_T_ <> '*'"
If select("PARC") > 0
PARC->(DbCloseArea())
endif
TcQuery cQuery New Alias 'PARC'
dbSelectArea("PARC")
//verificamos se estamos na ultima parcela
if PARC->QUANT == nParcelas -1 //QUANT = quantidade de parcelas incluida
fWrite(nFileLog,"Ultima Parcela"+CRLF)
//o valor desta será o valor que resta
nValPar := SE1ORIG->E1_VALOR - PARC->TOTALINC
if aFili[nI] == '0102'
n0102val := n0102total - PARC->TOTALINC
elseif aFili[nI] == '0105'
n0105val := n0105total - PARC->TOTALINC
endif
endif
Next nI
fWrite(nFileLog,"Total 0102: "+cvaltochar(n0102total) + " -> Parcela de: "+cvaltochar(n0102val) +CRLF)
fWrite(nFileLog,"Total 0105: "+cvaltochar(n0105total) + " -> Parcela de: "+cvaltochar(n0105val) +CRLF)
/////////////////
For nI := 1 To len(aFili)
if aFili[nI] == '0102'
nValPar := n0102val
elseif aFili[nI] == '0105'
nValPar := n0105val
endif
aTitulo := ACLONE(aNovoTitulo)
AADD(aTitulo, {"E1_PREFIXO" ,cE1PREFIXO ,Nil})
AADD(aTitulo, {"E1_FILIAL" ,aFili[nI] ,Nil})
AADD(aTitulo, {"E1_VALOR" ,nValPar ,Nil})
AADD(aTitulo, {"E1_SALDO" ,nValPar ,Nil})
AADD(aTitulo, {"E1_VLCRUZ" ,nValPar ,Nil})
lMsErroAuto := .F.
if aFili[nI] == '0102'
MSExecAuto({|x,y| FINA040(x,y)},aTitulo,3) //Inclusao
If lMsErroAuto
fWrite(nFileLog,"Erro " + CRLF)
fWrite(nFileLog,"Erro ao incluir titulo: "+CRLF+ MSErroString()+ CRLF + tojson(aTitulo) + CRLF)
return
else
fWrite(nFileLog,"Sucesso "+ CRLF)
fWrite(nFileLog,"Titulo incluido: "+ aFili[nI] +" : " + cValToChar(nValPar) +CRLF)
endif
elseif aFili[nI] == '0105'
fWrite(nFileLog,"Salvando titulos 05 para o final "+aFili[nI]+CRLF)
//StartJob("U_JOBF040",getenvserver(),.T., SE1ORIG->E1_FILIAL, aTitulo)
AADD(aTit05, aTitulo)
//fWrite(nFileLog,"passou pela thread "+CRLF)
else
fWrite(nFileLog,"Erro, filial nao tratada "+aFili[nI]+CRLF)
endif
Next nI
Reset Environment
While !RpcSetEnv('01', '0105',,,,GetEnvServer(),{})
Sleep(400)
End
For nI := 1 To len(aTit05)
lMsErroAuto := .F.
MSExecAuto({|x,y| FINA040(x,y)},aTit05[nI],3) //Inclusao
If lMsErroAuto
fWrite(nFileLog,"Erro " + CRLF)
fWrite(nFileLog,"Erro ao incluir titulo: "+CRLF+ MSErroString()+ CRLF + tojson(aTit05[nI]) + CRLF)
return
else
fWrite(nFileLog,"Sucesso "+ CRLF)
fWrite(nFileLog,"Titulo incluido: "+CRLF)
endif
Next nI
Reset Environment
fClose(nFileLog)
Return
////////////////////////
// Cancelamento da Fatura (Cancelamento do Reparcelamento)
// F280PCAN
// JOBF280C
//
User Function F280PCAN()
/**
* cFatCan - numero da fatura
* cPrefCan - prefixo
* cTipoCan - tipo
**/
If cEmpAnt == '06'
StartJob("U_JOBF280C",getenvserver(),.T., cPrefCan, cFatCan, cTipoCan)
EndIf
Return .T.
User Function JOBF280C(cPrefCan, cFatCan, cTipoCan)
RpcSetType(3) // Nao consome licensa
While !RpcSetEnv('01', '0102',,,,GetEnvServer(),{})
Sleep(400)
End
nFileLog := u_OpenLog("\Logs\F280PCAN_"+dToS(dDataBase)+".log")
fWrite(nFileLog,"----- F280PCAN -----"+CRLF)
fWrite(nFileLog,"E1_PREFIXO = '"+ cPrefCan +"' and E1_NUM = '"+ cFatCan +"' and E1_TIPO = '"+ cTipoCan +"'"+CRLF)
cQuery := "select * from SE1010 where E1_PREFIXO = '"+ cPrefCan +"' and E1_NUM = '"+ cFatCan +"' and E1_TIPO = '"+ cTipoCan +"' and D_E_L_E_T_ <> '*'"
If select("SE1ORIG") > 0
SE1ORIG->(DbCloseArea())
endif
TcQuery cQuery New Alias 'SE1ORIG'
dbSelectArea("SE1ORIG")
SE1ORIG->(DBGOTOP())
While SE1ORIG->(!EOF()) //Loop entre todas as parcelas e filiais
SE1->(dbselectarea("SE1"))
SE1->(dbSetOrder(1))
fWrite(nFileLog,"dbseek" + CRLF)
if ! SE1->(dbSeek(SE1ORIG->E1_FILIAL+ SE1ORIG->E1_PREFIXO+ SE1ORIG->E1_NUM+ SE1ORIG->E1_PARCELA+ SE1ORIG->E1_TIPO))
fWrite(nFileLog,"Erro dbseek" + CRLF)
Alert("Erro. Verificar F280PCAN() - dbseek")
fWrite(nFileLog,"Erro dbseek("+SE1ORIG->E1_FILIAL+ SE1ORIG->E1_PREFIXO+ SE1ORIG->E1_NUM+ SE1ORIG->E1_PARCELA+ SE1ORIG->E1_TIPO+")" + CRLF)
return .F.
endif
cFilAnt := SE1ORIG->E1_FILIAL
aFatura:= {{"E1_FILIAL" ,SE1ORIG->E1_FILIAL ,Nil},;
{"E1_PREFIXO" ,SE1ORIG->E1_PREFIXO ,Nil},;
{"E1_NUM" ,SE1ORIG->E1_NUM ,Nil},;
{"E1_PARCELA" ,SE1ORIG->E1_PARCELA ,Nil},;
{"E1_TIPO" ,SE1ORIG->E1_TIPO ,Nil}}
lMsErroAuto := .F.
MSExecAuto({|x,y| FINA040(x,y)},aFatura,5) //Exclusão
If lMsErroAuto
fWrite(nFileLog,"Erro " + CRLF)
fWrite(nFileLog,"Erro ao remover o titulo: "+CRLF+ MSErroString()+ CRLF + tojson(aFatura) + CRLF)
Alert("Erro ao remover o titulo. Verificar F280PCAN()")
return .F.
else
fWrite(nFileLog,"Sucesso "+ CRLF)
fWrite(nFileLog,"Titulo removido" +CRLF)
endif
SE1ORIG->(dbskip())
end
/////////////////////////////////////////////
/////// Cancela as baixas
///
fWrite(nFileLog,"- cancela baixas" + CRLF)
cQuery := "select * from SE1060 where E1_FATURA = '"+ cFatCan +"' and D_E_L_E_T_ <> '*'"
If select("SE1ORIG") > 0
SE1ORIG->(DbCloseArea())
endif
TcQuery cQuery New Alias 'SE1ORIG'
dbSelectArea("SE1ORIG")
SE1ORIG->(DBGOTOP())
aFili := {"0102", "0105"}
While SE1ORIG->(!EOF()) //Loop entre todas as parcelas e filiais
SE1->(dbselectarea("SE1"))
SE1->(dbSetOrder(1))
For nI := 1 To len(aFili)
cFilAnt := aFili[nI]
fWrite(nFileLog,"dbseek" + CRLF)
if ! SE1->(dbSeek(aFili[nI]+ SE1ORIG->E1_PREFIXO+ SE1ORIG->E1_NUM+ SE1ORIG->E1_PARCELA+ SE1ORIG->E1_TIPO))
fWrite(nFileLog,"dbseek nao encontrou titulo para filial "+aFili[nI] + CRLF)
fWrite(nFileLog,"dbseek('"+aFili[nI]+ SE1ORIG->E1_PREFIXO+ SE1ORIG->E1_NUM+ SE1ORIG->E1_PARCELA+ SE1ORIG->E1_TIPO+"')" + CRLF)
LOOP
endif
nSE5Recno := u_RetSQLOne("select R_E_C_N_O_ from SE5010 where E5_FILIAL = '"+SE1->E1_FILIAL+"' and E5_PREFIXO = '"+SE1->E1_PREFIXO+"' and E5_TIPO = '"+SE1->E1_TIPO+"' and E5_NUMERO = '"+SE1->E1_NUM+"' "+;
" and E5_FATURA = '"+SE1->E1_FATURA+"' and E5_FATPREF='"+SE1->E1_FATPREF+"' and D_E_L_E_T_ <> '*' ", "R_E_C_N_O_")
//Removemos os Flags de Fatura para conseguirmos cancelar a baixa pelo FINA070
RECLOCK('SE1',.F.)
E1_FATURA := ''
E1_FATPREF:= ''
E1_TIPOFAT:= ''
E1_FLAGFAT:= ''
E1_DTFATUR:= StoD('')
MSUNLOCK()
SE5->(DbGoTo(nSE5Recno))
RECLOCK('SE5',.F.)
E5_MOTBX := 'NOR'
//E5_FATURA := ''
//E5_FATPREF:= ''
MSUNLOCK()
aBaixa := {{"E1_FILIAL" ,SE1->E1_FILIAL ,Nil},;
{"E1_PREFIXO" ,SE1->E1_PREFIXO ,Nil},;
{"E1_NUM" ,SE1->E1_NUM ,Nil},;
{"E1_TIPO" ,SE1->E1_TIPO ,Nil},;
{"E1_PARCELA" ,SE1->E1_PARCELA ,Nil},;
{"E1_DESCONT" ,SE1->E1_DESCONT ,Nil},;
{"E1_JUROS" ,SE1->E1_JUROS ,Nil},;
{"E1_MULTA" ,SE1->E1_MULTA ,Nil},;
{"E1_VLRREAL" ,SE1->E1_VLRREAL ,Nil},;
{"AUTMOTBX" ,"NOR" ,Nil},;
{"AUTDTBAIXA" ,date() ,Nil},;
{"AUTDTCREDITO",date() ,Nil},;
{"AUTHIST" ,"" ,Nil},;
{"AUTVALREC" ,SE1->E1_VALOR ,Nil}}
lMsErroAuto:=.F. //reseta lMsErroAuto
MSExecAuto ({|x,y| FINA070(x,y)},aBaixa, 5)
If lMsErroAuto
fWrite(nFileLog,SE1->E1_FILIAL+" : Não foi efetuada o cancelamento de baixa : "+CRLF+ MSErroString()+ CRLF + tojson(aBaixa) + CRLF)
return
else
fWrite(nFileLog,SE1->E1_FILIAL+" : cancelamento de baixa feito" +CRLF)
endif
Next nI
SE1ORIG->(dbskip())
end
Reset Environment
Return

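The core arithmetic in `FA280_01` splits each installment between branches 0102 and 0105 in proportion to each branch's share of the combined original total: one side is rounded to cents and the other takes the remainder, so the two parts always sum exactly to the installment. A hedged Python sketch of that split (function name is mine, not from the sample):

```python
def split_installment(installment, total_0102, total_0105):
    """Split one installment between two branches in proportion to each
    branch's share of the combined total, as FA280_01 does: round one
    side to cents, give the remainder to the other so the parts always
    sum exactly to the installment."""
    total = total_0102 + total_0105
    part_0102 = round(installment * total_0102 / total, 2)
    part_0105 = round(installment - part_0102, 2)
    return part_0102, part_0105
```

The sample also corrects the final installment against the amounts already posted (the `PARC->QUANT == nParcelas - 1` branch), which absorbs any accumulated rounding drift across installments.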
3
test/fixtures/Data/Modelines/ruby2 vendored Normal file

@@ -0,0 +1,3 @@
/* vim: set ts=8 sw=4 filetype=ruby tw=0: */
# Please help how do I into setting vim modlines

3
test/fixtures/Data/Modelines/ruby3 vendored Normal file

@@ -0,0 +1,3 @@
/* vim: set ft=ruby ts=8 sw=4 tw=0: */
# I am not good at humor

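Both fixtures declare their filetype through a vim `set` modeline, one with `filetype=ruby` and one with the `ft=ruby` shorthand, which is exactly what a modeline-based language detector has to pull out. A minimal Python sketch of that extraction (the regex is an illustration, not linguist's actual implementation):

```python
import re

# matches either spelling of the option inside a `vim: set ... :` modeline
MODELINE = re.compile(r'vim:\s*set\s+[^:]*?\b(?:filetype|ft)=(\w+)')

def modeline_filetype(line):
    """Return the filetype named in a vim modeline, or None."""
    m = MODELINE.search(line)
    return m.group(1) if m else None
```

Run against the two fixture lines above, this returns `'ruby'` for both, and `None` for lines without a modeline.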

@@ -3,6 +3,7 @@ require "minitest/autorun"
require "mocha/setup"
require "linguist"
require 'color-proximity'
require 'licensee'
def fixtures_path
File.expand_path("../fixtures", __FILE__)


@@ -301,6 +301,7 @@ class TestBlob < Minitest::Test
# Codemirror deps
assert sample_blob("codemirror/mode/blah.js").vendored?
assert sample_blob("codemirror/5.0/mode/blah.js").vendored?
# Debian packaging
assert sample_blob("debian/cron.d").vendored?
@@ -308,6 +309,12 @@ class TestBlob < Minitest::Test
# Erlang
assert sample_blob("rebar").vendored?
# git config files
assert_predicate fixture_blob("some/path/.gitattributes"), :vendored?
assert_predicate fixture_blob(".gitignore"), :vendored?
assert_predicate fixture_blob("special/path/.gitmodules"), :vendored?
# Minified JavaScript and CSS
assert sample_blob("foo.min.js").vendored?
assert sample_blob("foo.min.css").vendored?
@@ -326,9 +333,6 @@ class TestBlob < Minitest::Test
assert sample_blob("public/javascripts/controls.js").vendored?
assert sample_blob("public/javascripts/dragdrop.js").vendored?
# Samples
assert sample_blob("Samples/Ruby/foo.rb").vendored?
# jQuery
assert sample_blob("jquery.js").vendored?
assert sample_blob("public/javascripts/jquery.js").vendored?
@@ -358,6 +362,26 @@ class TestBlob < Minitest::Test
assert sample_blob("ui/minified/jquery.effects.blind.min.js").vendored?
assert sample_blob("ui/minified/jquery.ui.accordion.min.js").vendored?
# jQuery Gantt
assert sample_blob("web-app/jquery-gantt/js/jquery.fn.gantt.js").vendored?
# jQuery fancyBox
assert sample_blob("web-app/fancybox/jquery.fancybox.js").vendored?
# Fuel UX
assert sample_blob("web-app/fuelux/js/fuelux.js").vendored?
# jQuery File Upload
assert sample_blob("fileupload-9.0.0/jquery.fileupload-process.js").vendored?
# Slick
assert sample_blob("web-app/slickgrid/controls/slick.columnpicker.js").vendored?
# Leaflet plugins
assert sample_blob("leaflet-plugins/Leaflet.Coordinates-0.5.0.src.js").vendored?
assert sample_blob("leaflet-plugins/leaflet.draw-src.js").vendored?
assert sample_blob("leaflet-plugins/leaflet.spin.js").vendored?
# MooTools
assert sample_blob("public/javascripts/mootools-core-1.3.2-full-compat.js").vendored?
assert sample_blob("public/javascripts/mootools-core-1.3.2-full-compat-yc.js").vendored?
@@ -489,6 +513,11 @@ class TestBlob < Minitest::Test
# Crashlytics
assert sample_blob("Crashlytics.framework/Crashlytics.h").vendored?
# Xcode
assert sample_blob("myapp/My Template.xctemplate/___FILEBASENAME___.h").vendored?
assert sample_blob("myapp/My Images.xcassets/some/stuff.imageset/Contents.json").vendored?
assert !sample_blob("myapp/MyData.json").vendored?
end
def test_documentation
@@ -511,15 +540,36 @@ class TestBlob < Minitest::Test
assert_predicate fixture_blob("README"), :documentation?
assert_predicate fixture_blob("README.md"), :documentation?
assert_predicate fixture_blob("README.txt"), :documentation?
assert_predicate fixture_blob("Readme"), :documentation?
assert_predicate fixture_blob("readme"), :documentation?
assert_predicate fixture_blob("foo/README"), :documentation?
assert_predicate fixture_blob("CHANGE"), :documentation?
assert_predicate fixture_blob("CHANGE.md"), :documentation?
assert_predicate fixture_blob("CHANGE.txt"), :documentation?
assert_predicate fixture_blob("foo/CHANGE"), :documentation?
assert_predicate fixture_blob("CHANGELOG"), :documentation?
assert_predicate fixture_blob("CHANGELOG.md"), :documentation?
assert_predicate fixture_blob("CHANGELOG.txt"), :documentation?
assert_predicate fixture_blob("foo/CHANGELOG"), :documentation?
assert_predicate fixture_blob("CHANGES"), :documentation?
assert_predicate fixture_blob("CHANGES.md"), :documentation?
assert_predicate fixture_blob("CHANGES.txt"), :documentation?
assert_predicate fixture_blob("foo/CHANGES"), :documentation?
assert_predicate fixture_blob("CONTRIBUTING"), :documentation?
assert_predicate fixture_blob("CONTRIBUTING.md"), :documentation?
assert_predicate fixture_blob("CONTRIBUTING.txt"), :documentation?
assert_predicate fixture_blob("foo/CONTRIBUTING"), :documentation?
assert_predicate fixture_blob("examples/some-file.pl"), :documentation?
assert_predicate fixture_blob("Examples/some-example-file.rb"), :documentation?
assert_predicate fixture_blob("LICENSE"), :documentation?
assert_predicate fixture_blob("LICENCE.md"), :documentation?
assert_predicate fixture_blob("License.txt"), :documentation?
assert_predicate fixture_blob("LICENSE.txt"), :documentation?
assert_predicate fixture_blob("foo/LICENSE"), :documentation?
@@ -534,6 +584,11 @@ class TestBlob < Minitest::Test
assert_predicate fixture_blob("foo/INSTALL"), :documentation?
refute_predicate fixture_blob("foo.md"), :documentation?
# Samples
assert sample_blob("Samples/Ruby/foo.rb").documentation?
assert_predicate fixture_blob("INSTALL.txt"), :documentation?
end
def test_language

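The hunks above extend the vendored-path tests (CodeMirror copies, git config files, Xcode templates, jQuery plugins). A minimal sketch of how such regex-based vendored detection can work — the pattern list here is illustrative only, not Linguist's actual `vendor.yml`:

```ruby
# Illustrative vendored-path check: a path counts as "vendored" if it
# matches any pattern in a regex list (Linguist compiles a similar list
# from vendor.yml; these three patterns are just examples).
VENDORED_PATTERNS = [
  %r{(^|/)codemirror/},                              # CodeMirror, incl. versioned copies
  %r{(^|/)\.(gitattributes|gitignore|gitmodules)$},  # git config files
  /\.min\.(js|css)$/                                 # minified assets
].freeze

def vendored?(path)
  VENDORED_PATTERNS.any? { |re| path =~ re }
end
```

With this sketch, `vendored?("codemirror/5.0/mode/blah.js")` and `vendored?("foo.min.js")` hold, while a regular file like `myapp/MyData.json` does not match.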
View File

@@ -3,12 +3,13 @@ require_relative "./helper"
class TestGrammars < Minitest::Test
ROOT = File.expand_path("../..", __FILE__)
LICENSE_WHITELIST = [
PROJECT_WHITELIST = [
# This grammar's MIT license is inside a subdirectory.
"vendor/grammars/SublimePapyrus",
# This grammar has a nonstandard but acceptable license.
"vendor/grammars/gap-tmbundle",
"vendor/grammars/factor",
# These grammars have no license but have been grandfathered in. New grammars
# must have a license that allows redistribution.
@@ -16,6 +17,23 @@ class TestGrammars < Minitest::Test
"vendor/grammars/x86-assembly-textmate-bundle"
].freeze
# List of allowed SPDX license names
LICENSE_WHITELIST = %w[
apache-2.0
bsd-2-clause
bsd-3-clause
cc-by-sa-3.0
gpl-2.0
gpl-3.0
lgpl-3.0
mit
mpl-2.0
textmate
unlicense
wtfpl
zlib
].freeze
def setup
@grammars = YAML.load(File.read(File.join(ROOT, "grammars.yml")))
end
@@ -62,48 +80,38 @@ class TestGrammars < Minitest::Test
end
end
def test_submodules_have_recognized_licenses
unrecognized = submodule_licenses.select { |k,v| v.nil? && Licensee::Project.new(k).license_file }
unrecognized.reject! { |k,v| PROJECT_WHITELIST.include?(k) }
message = "The following submodules have unrecognized licenses:\n* #{unrecognized.keys.join("\n* ")}\n"
message << "Please ensure that the project's LICENSE file contains the full text of the license."
assert_equal Hash.new, unrecognized, message
end
def test_submodules_have_licenses
categories = submodule_paths.group_by do |submodule|
files = Dir[File.join(ROOT, submodule, "*")]
license = files.find { |path| File.basename(path) =~ /\b(un)?licen[cs]e\b/i } || files.find { |path| File.basename(path) =~ /\bcopying\b/i }
if license.nil?
if readme = files.find { |path| File.basename(path) =~ /\Areadme\b/i }
license = readme if File.read(readme) =~ /\blicen[cs]e\b/i
end
end
if license.nil?
:unlicensed
elsif classify_license(license)
:licensed
else
:unrecognized
end
end
unlicensed = submodule_licenses.select { |k,v| v.nil? }.reject { |k,v| PROJECT_WHITELIST.include?(k) }
message = "The following submodules don't have licenses:\n* #{unlicensed.keys.join("\n* ")}\n"
message << "Please ensure that the project has a LICENSE file, and that the LICENSE file contains the full text of the license."
assert_equal Hash.new, unlicensed, message
end
unlicensed = categories[:unlicensed] || []
unrecognized = categories[:unrecognized] || []
disallowed_unlicensed = unlicensed - LICENSE_WHITELIST
disallowed_unrecognized = unrecognized - LICENSE_WHITELIST
extra_whitelist_entries = LICENSE_WHITELIST - (unlicensed | unrecognized)
def test_submodules_have_approved_licenses
unapproved = submodule_licenses.reject { |k,v| LICENSE_WHITELIST.include?(v) || PROJECT_WHITELIST.include?(k) }.map { |k,v| "#{k}: #{v}"}
message = "The following submodules have unapproved licenses:\n* #{unapproved.join("\n* ")}\n"
message << "The license must be added to the LICENSE_WHITELIST in /test/test_grammars.rb once approved."
assert_equal [], unapproved, message
end
message = ""
if disallowed_unlicensed.any?
message << "The following grammar submodules don't seem to have a license. All grammars must have a license that permits redistribution.\n"
message << disallowed_unlicensed.sort.join("\n")
end
if disallowed_unrecognized.any?
message << "\n\n" unless message.empty?
message << "The following grammar submodules have an unrecognized license. Please update #{__FILE__} to recognize the license.\n"
message << disallowed_unrecognized.sort.join("\n")
end
if extra_whitelist_entries.any?
message << "\n\n" unless message.empty?
message << "The following grammar submodules are listed in LICENSE_WHITELIST but either have a license (yay!)\n"
message << "or have been removed from the repository. Please remove them from the whitelist.\n"
message << extra_whitelist_entries.sort.join("\n")
end
def test_submodules_whitelist_has_no_extra_entries
extra_whitelist_entries = PROJECT_WHITELIST - submodule_licenses.select { |k,v| v.nil? }.keys
not_present = extra_whitelist_entries.reject { |k,v| Dir.exists?(k) }
licensed = extra_whitelist_entries.select { |k,v| submodule_licenses[k] }
assert disallowed_unlicensed.empty? && disallowed_unrecognized.empty? && extra_whitelist_entries.empty?, message
msg = "The following whitelisted submodules don't appear to be part of the project:\n* #{not_present.join("\n* ")}"
assert_equal [], not_present, msg
msg = "The following whitelisted submodules actually have licenses and don't need to be whitelisted:\n* #{licensed.join("\n* ")}"
assert_equal [], licensed, msg
end
private
@@ -112,30 +120,57 @@ class TestGrammars < Minitest::Test
@submodule_paths ||= `git config --list --file "#{File.join(ROOT, ".gitmodules")}"`.lines.grep(/\.path=/).map { |line| line.chomp.split("=", 2).last }
end
# Returns a hash of submodules in the form of submodule_path => license
def submodule_licenses
@@submodule_licenses ||= begin
submodules = {}
submodule_paths.each { |submodule| submodules[submodule] = submodule_license(submodule) }
submodules
end
end
# Given the path to a submodule, return its SPDX-compliant license key
def submodule_license(submodule)
# Prefer Licensee to detect a submodule's license
project = Licensee::Project.new(submodule)
return project.license.key if project.license
# We know a license file exists, but Licensee wasn't able to detect the license,
# Let's try our own more permissive regex method
if project.license_file
path = File.expand_path project.license_file.path, submodule
license = classify_license(path)
return license if license
end
# Neither Licensee nor our own regex was able to detect the license, let's check the readme
files = Dir[File.join(ROOT, submodule, "*")]
if readme = files.find { |path| File.basename(path) =~ /\Areadme\b/i }
classify_license(readme)
end
end
def classify_license(path)
content = File.read(path)
return unless content =~ /\blicen[cs]e\b/i
if content.include?("Apache License") && content.include?("2.0")
"Apache 2.0"
"apache-2.0"
elsif content.include?("GNU") && content =~ /general/i && content =~ /public/i
if content =~ /version 2/i
"GPLv2"
"gpl-2.0"
elsif content =~ /version 3/i
"GPLv3"
"gpl-3.0"
end
elsif content.include?("GPL") && content.include?("http://www.gnu.org/licenses/gpl.html")
"GPLv3"
elsif content.include?("Creative Commons")
"CC"
"gpl-3.0"
elsif content.include?("Creative Commons Attribution-Share Alike 3.0")
"cc-by-sa-3.0"
elsif content.include?("tidy-license.txt") || content.include?("If not otherwise specified (see below)")
"textmate"
elsif content =~ /^\s*[*-]\s+Redistribution/ || content.include?("Redistributions of source code")
"BSD"
elsif content.include?("Permission is hereby granted") || content =~ /\bMIT\b/
"MIT"
elsif content.include?("unlicense.org")
"unlicense"
"mit"
elsif content.include?("http://www.wtfpl.net/txt/copying/")
"WTFPL"
"wtfpl"
elsif content.include?("zlib") && content.include?("license") && content.include?("2. Altered source versions must be plainly marked as such")
"zlib"
end

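The rewritten `classify_license` above now returns lowercase SPDX-style keys that line up with the new `LICENSE_WHITELIST`. A trimmed, standalone sketch of the same content-matching idea (only a few branches shown, operating on a string rather than a file path):

```ruby
# Standalone sketch of SPDX-style license classification by content
# matching, mirroring a few branches of the classify_license rewrite
# shown in the diff above.
def classify_license_text(content)
  return unless content =~ /\blicen[cs]e\b/i
  if content.include?("Apache License") && content.include?("2.0")
    "apache-2.0"
  elsif content.include?("Permission is hereby granted") || content =~ /\bMIT\b/
    "mit"
  elsif content.include?("http://www.wtfpl.net/txt/copying/")
    "wtfpl"
  end
end
```

Text that never mentions a license at all short-circuits to `nil`, which is what lets the callers above treat `nil` as "unrecognized".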
View File

@@ -22,20 +22,15 @@ class TestHeuristcs < Minitest::Test
assert_equal [], results
end
# Candidate languages = ["C++", "Objective-C"]
def test_obj_c_by_heuristics
# Only calling out '.h' filenames as these are the ones causing issues
assert_heuristics({
"Objective-C" => all_fixtures("Objective-C", "*.h"),
"C++" => ["C++/render_adapter.cpp", "C++/ThreadedQueue.h"],
"C" => nil
})
end
def assert_heuristics(hash)
candidates = hash.keys.map { |l| Language[l] }
def test_c_by_heuristics
languages = [Language["C++"], Language["Objective-C"], Language["C"]]
results = Heuristics.call(file_blob("C/ArrowLeft.h"), languages)
assert_equal [], results
hash.each do |language, blobs|
Array(blobs).each do |blob|
result = Heuristics.call(file_blob(blob), candidates)
assert_equal [Language[language]], result, "Failed for #{blob}"
end
end
end
def test_detect_still_works_if_nothing_matches
@@ -44,20 +39,119 @@ class TestHeuristcs < Minitest::Test
assert_equal Language["Objective-C"], match
end
# Candidate languages = ["Perl", "Prolog"]
def test_pl_prolog_perl_by_heuristics
# Candidate languages = ["AGS Script", "AsciiDoc", "Public Key"]
def test_asc_by_heuristics
assert_heuristics({
"Prolog" => all_fixtures("Prolog/*.pl"),
"Perl" => all_fixtures("Perl/*.pl") + ["Perl/perl-test.t"],
"Perl6" => all_fixtures("Perl6/*.pl")
"AsciiDoc" => all_fixtures("AsciiDoc", "*.asc"),
"AGS Script" => all_fixtures("AGS Script", "*.asc"),
"Public Key" => all_fixtures("Public Key", "*.asc")
})
end
# Candidate languages = ["ECL", "Prolog"]
def test_ecl_prolog_by_heuristics
def test_bb_by_heuristics
assert_heuristics({
"ECL" => "ECL/sample.ecl",
"Prolog" => "Prolog/or-constraint.ecl"
"BitBake" => all_fixtures("BitBake", "*.bb"),
"BlitzBasic" => all_fixtures("BlitzBasic", "*.bb")
})
end
def test_ch_by_heuristics
assert_heuristics({
"xBase" => all_fixtures("xBase", ".ch")
})
end
def test_cl_by_heuristics
assert_heuristics({
"Common Lisp" => all_fixtures("Common Lisp", "*.cl"),
"OpenCL" => all_fixtures("OpenCL", "*.cl")
})
end
def test_cs_by_heuristics
assert_heuristics({
"C#" => all_fixtures("C#", "*.cs"),
"Smalltalk" => all_fixtures("Smalltalk", "*.cs")
})
end
# Candidate languages = ["ECL", "ECLiPSe"]
def test_ecl_by_heuristics
assert_heuristics({
"ECL" => all_fixtures("ECL", "*.ecl"),
"ECLiPSe" => all_fixtures("ECLiPSe", "*.ecl")
})
end
def test_f_by_heuristics
assert_heuristics({
"FORTRAN" => all_fixtures("FORTRAN", "*.f") + all_fixtures("FORTRAN", "*.for"),
"Forth" => all_fixtures("Forth", "*.f") + all_fixtures("Forth", "*.for")
})
end
def test_fr_by_heuristics
assert_heuristics({
"Frege" => all_fixtures("Frege", "*.fr"),
"Forth" => all_fixtures("Forth", "*.fr"),
"Text" => all_fixtures("Text", "*.fr")
})
end
def test_fs_by_heuristics
assert_heuristics({
"F#" => all_fixtures("F#", "*.fs"),
"Forth" => all_fixtures("Forth", "*.fs"),
"GLSL" => all_fixtures("GLSL", "*.fs")
})
end
# Candidate languages = ["Hack", "PHP"]
def test_hack_by_heuristics
assert_heuristics({
"Hack" => all_fixtures("Hack", "*.php"),
"PHP" => all_fixtures("PHP", "*.php")
})
end
def test_ls_by_heuristics
assert_heuristics({
"LiveScript" => all_fixtures("LiveScript", "*.ls"),
"LoomScript" => all_fixtures("LoomScript", "*.ls")
})
end
def test_lsp_by_heuristics
assert_heuristics({
"Common Lisp" => all_fixtures("Common Lisp", "*.lsp") + all_fixtures("Common Lisp", "*.lisp"),
"NewLisp" => all_fixtures("NewLisp", "*.lsp") + all_fixtures("NewLisp", "*.lisp")
})
end
# Candidate languages = ["C++", "Objective-C"]
def test_obj_c_by_heuristics
# Only calling out '.h' filenames as these are the ones causing issues
assert_heuristics({
"Objective-C" => all_fixtures("Objective-C", "*.h"),
"C++" => ["C++/scanner.h", "C++/qscicommand.h", "C++/v8.h", "C++/gdsdbreader.h"],
"C" => nil
})
end
# Candidate languages = ["Perl", "Perl6", "Prolog"]
def test_pl_prolog_perl_by_heuristics
assert_heuristics({
"Prolog" => all_fixtures("Prolog", "*.pl"),
"Perl" => ["Perl/oo1.pl", "Perl/oo2.pl", "Perl/oo3.pl", "Perl/fib.pl", "Perl/use5.pl"],
"Perl6" => all_fixtures("Perl6", "*.pl")
})
end
# Candidate languages = ["Perl", "Perl6"]
def test_pm_perl_by_heuristics
assert_heuristics({
"Perl" => all_fixtures("Perl", "*.pm"),
"Perl6" => all_fixtures("Perl6", "*.pm")
})
end
@@ -71,97 +165,28 @@ class TestHeuristcs < Minitest::Test
})
end
# Candidate languages = ["AGS Script", "AsciiDoc", "Public Key"]
def test_asc_by_heuristics
def test_r_by_heuristics
assert_heuristics({
"AsciiDoc" => "AsciiDoc/list.asc",
"AGS Script" => "AGS Script/GlobalScript.asc",
"Public Key" => all_fixtures("Public Key", "*.asc")
})
end
def test_cl_by_heuristics
assert_heuristics({
"Common Lisp" => all_fixtures("Common Lisp"),
"OpenCL" => all_fixtures("OpenCL")
})
end
def test_f_by_heuristics
assert_heuristics({
"FORTRAN" => all_fixtures("FORTRAN"),
"Forth" => all_fixtures("Forth")
})
end
# Candidate languages = ["Hack", "PHP"]
def test_hack_by_heuristics
assert_heuristics({
"Hack" => "Hack/funs.php",
"PHP" => "PHP/Model.php"
"R" => all_fixtures("R", "*.r") + all_fixtures("R", "*.R"),
"Rebol" => all_fixtures("Rebol", "*.r")
})
end
# Candidate languages = ["Scala", "SuperCollider"]
def test_sc_supercollider_scala_by_heuristics
assert_heuristics({
"SuperCollider" => "SuperCollider/WarpPreset.sc",
"Scala" => "Scala/node11.sc"
"SuperCollider" => all_fixtures("SuperCollider", "*.sc"),
"Scala" => all_fixtures("Scala", "*.sc")
})
end
def test_fs_by_heuristics
# Candidate languages = ["Perl", "Perl6"]
def test_t_perl_by_heuristics
assert_heuristics({
"F#" => all_fixtures("F#"),
"Forth" => all_fixtures("Forth"),
"GLSL" => all_fixtures("GLSL")
})
end
def test_fr_by_heuristics
assert_heuristics({
"Frege" => all_fixtures("Frege"),
"Forth" => all_fixtures("Forth"),
"Text" => all_fixtures("Text")
})
end
def test_bb_by_heuristics
assert_heuristics({
"BitBake" => all_fixtures("BitBake"),
"BlitzBasic" => all_fixtures("BlitzBasic")
})
end
def test_lsp_by_heuristics
assert_heuristics({
"Common Lisp" => all_fixtures("Common Lisp"),
"NewLisp" => all_fixtures("NewLisp")
})
end
def test_cs_by_heuristics
assert_heuristics({
"C#" => all_fixtures("C#", "*.cs"),
"Smalltalk" => all_fixtures("Smalltalk", "*.cs")
})
end
def assert_heuristics(hash)
candidates = hash.keys.map { |l| Language[l] }
hash.each do |language, blobs|
Array(blobs).each do |blob|
result = Heuristics.call(file_blob(blob), candidates)
assert_equal [Language[language]], result, "Failed for #{blob}"
end
end
end
def test_ls_by_heuristics
assert_heuristics({
"LiveScript" => "LiveScript/hello.ls",
"LoomScript" => "LoomScript/HelloWorld.ls"
"Perl" => all_fixtures("Perl", "*.t"),
"Perl6" => ["Perl6/01-dash-uppercase-i.t", "Perl6/01-parse.t", "Perl6/advent2009-day16.t",
"Perl6/basic-open.t", "Perl6/calendar.t", "Perl6/for.t", "Perl6/hash.t",
"Perl6/listquote-whitespace.t"]
})
end

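The reorganized tests above exercise per-extension disambiguation through `all_fixtures`. The underlying idea — content regexes choosing between candidate languages that share one extension — can be sketched with a toy `.pl` disambiguator; the patterns here are illustrative, not Linguist's actual heuristics:

```ruby
# Toy .pl disambiguator: pick Prolog when the content looks like a
# Prolog clause (head :- body), otherwise fall back to Perl markers.
# Patterns are illustrative only.
def disambiguate_pl(content)
  if content =~ /^[a-z][a-zA-Z0-9_]*\(.*\)\s*:-/
    "Prolog"
  elsif content =~ /\buse strict\b|^\s*#!.*perl/
    "Perl"
  end
end
```

Returning `nil` when nothing matches mirrors how `Heuristics.call` can yield an empty result, as `test_no_match` above asserts.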
View File

@@ -263,6 +263,24 @@ class TestLanguage < Minitest::Test
assert_equal 'AGS Script', Language.find_by_alias('AGS').name
end
def test_find_ignores_comma
assert_equal 'Rust', Language['rust,no_run'].name
end
def test_find_by_name_ignores_comma
assert_equal Language['Rust'], Language.find_by_name('rust,no_run')
end
def test_find_by_alias_ignores_comma
assert_equal Language['Rust'], Language.find_by_alias('rust,no_run')
end
def test_doesnt_blow_up_with_blank_lookup
assert_equal nil, Language.find_by_alias('')
assert_equal nil, Language.find_by_name(nil)
assert_equal nil, Language[""]
end
def test_name
assert_equal 'Perl', Language['Perl'].name
assert_equal 'Python', Language['Python'].name

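The new tests cover lookups like `Language['rust,no_run']`, where extras after a comma in a code-fence info string are ignored. A minimal sketch of that normalization — this helper name is hypothetical, not the actual implementation:

```ruby
# Hypothetical normalization for code-fence info strings such as
# "rust,no_run": keep only the token before the first comma, and
# return nil for blank input so lookups don't blow up.
def normalize_fence_language(name)
  return nil if name.nil? || name.empty?
  name.split(',', 2).first
end
```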
View File

@@ -9,6 +9,8 @@ class TestModelines < Minitest::Test
def test_modeline_strategy
assert_modeline Language["Ruby"], fixture_blob("Data/Modelines/ruby")
assert_modeline Language["Ruby"], fixture_blob("Data/Modelines/ruby2")
assert_modeline Language["Ruby"], fixture_blob("Data/Modelines/ruby3")
assert_modeline Language["C++"], fixture_blob("Data/Modelines/seeplusplus")
assert_modeline Language["C++"], fixture_blob("Data/Modelines/seeplusplusEmacs1")
assert_modeline Language["C++"], fixture_blob("Data/Modelines/seeplusplusEmacs2")
@@ -27,6 +29,8 @@ class TestModelines < Minitest::Test
def test_modeline_languages
assert_equal Language["Ruby"], fixture_blob("Data/Modelines/ruby").language
assert_equal Language["Ruby"], fixture_blob("Data/Modelines/ruby2").language
assert_equal Language["Ruby"], fixture_blob("Data/Modelines/ruby3").language
assert_equal Language["C++"], fixture_blob("Data/Modelines/seeplusplus").language
assert_equal Language["C++"], fixture_blob("Data/Modelines/seeplusplusEmacs1").language
assert_equal Language["C++"], fixture_blob("Data/Modelines/seeplusplusEmacs2").language

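A sketch of the kind of Vim-modeline extraction the new `ruby3` fixture exercises; the regex here is a simplified assumption, not Linguist's full modeline grammar (which also handles Emacs `-*- mode -*-` lines, as the `seeplusplusEmacs` fixtures show):

```ruby
# Simplified Vim modeline parser: pull the filetype out of lines like
# "# vim: set ft=ruby:" or "// vim: ft=cpp". Illustrative only.
VIM_MODELINE = /vim:\s*(?:set\s+)?(?:ft|filetype)=(\w+)/

def modeline_language(line)
  line[VIM_MODELINE, 1]
end
```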
View File

@@ -9,7 +9,7 @@ class TestPedantic < Minitest::Test
assert_sorted LANGUAGES.keys
end
def test_extensions_are_sorted
def test_nonprimary_extensions_are_sorted
LANGUAGES.each do |name, language|
extensions = language['extensions']
assert_sorted extensions[1..-1].map(&:downcase) if extensions && extensions.size > 1
@@ -32,6 +32,18 @@ class TestPedantic < Minitest::Test
end
end
def test_heuristics_are_sorted
file = File.expand_path("../../lib/linguist/heuristics.rb", __FILE__)
heuristics = open(file).each.grep(/^ *disambiguate/)
assert_sorted heuristics
end
def test_heuristics_tests_are_sorted
file = File.expand_path("../test_heuristics.rb", __FILE__)
tests = open(file).each.grep(/^ *def test_[a-z_]+_by_heuristics/)
assert_sorted tests
end
def assert_sorted(list)
list.each_cons(2) do |previous, item|
flunk "#{previous} should come after #{item}" if previous > item

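The new sortedness tests all funnel through `assert_sorted`, whose pairwise `each_cons(2)` logic (shown at the bottom of the hunk) can be seen in isolation:

```ruby
# Pairwise sortedness check, as in assert_sorted above: every element
# must be <= its successor for the list to count as sorted.
def sorted?(list)
  list.each_cons(2).all? { |previous, item| previous <= item }
end
```

An empty or single-element list yields no pairs, so it is trivially sorted — the same reason the test file guards with `extensions.size > 1`.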
View File

@@ -111,4 +111,14 @@ class TestRepository < Minitest::Test
refute_predicate readme, :documentation?
assert_predicate arduino, :documentation?
end
def test_linguist_override_generated?
attr_commit = "351c1cc8fd57340839bdb400d7812332af80e9bd"
repo = linguist_repo(attr_commit).read_index
rakefile = Linguist::LazyBlob.new(rugged_repository, attr_commit, "Rakefile")
# overridden .gitattributes
assert rakefile.generated?
end
end

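The new test asserts that a `linguist-generated` override in `.gitattributes` marks `Rakefile` as generated. A minimal sketch of reading such an override — simplified to exact-match patterns, whereas real gitattributes matching uses git's full pattern rules:

```ruby
# Simplified .gitattributes lookup: true if a line sets
# linguist-generated=true for the given path. Exact-match patterns
# only; real git pattern matching (globs, directory rules) is richer.
def linguist_generated?(gitattributes, path)
  gitattributes.each_line.any? do |line|
    pattern, *attrs = line.split
    pattern == path && attrs.include?("linguist-generated=true")
  end
end
```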
View File

@@ -25,6 +25,10 @@ class TestTokenizer < Minitest::Test
assert_equal %w(add \( \)), tokenize('add(123, 456)')
assert_equal %w(|), tokenize('0x01 | 0x10')
assert_equal %w(*), tokenize('500.42 * 1.0')
assert_equal %w(), tokenize('1.23e-04')
assert_equal %w(), tokenize('1.0f')
assert_equal %w(), tokenize('1234ULL')
assert_equal %w(G1 X55 Y5 F2000), tokenize('G1 X55 Y5 F2000')
end
def test_skip_comments

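The added assertions show numeric literals — scientific notation, float and integer suffixes — being dropped entirely, while operators and G-code-like tokens (`G1`, `X55`) survive because their digits are glued to letters. A toy sketch of that number-stripping step; the regex is an assumption, not the actual `Tokenizer`:

```ruby
# Toy pre-tokenization pass: strip free-standing numeric literals,
# including scientific notation (1.23e-04) and C-style suffixes
# (1.0f, 1234ULL), then split on whitespace. Tokens like "G1" keep
# their digits because no word boundary precedes them.
NUMBER = /\b\d+\.?\d*(?:e[+-]?\d+)?(?:[fF]|U?LL)?\b/

def strip_numbers(code)
  code.gsub(NUMBER, ' ').split
end
```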
Some files were not shown because too many files have changed in this diff.