Compare commits

...

197 Commits

Author SHA1 Message Date
Arfon Smith
b8f3078966 Bumping version to v4.3.1 2015-02-04 08:31:36 -06:00
Arfon Smith
d496aaae55 Grammar update 2015-02-04 08:28:55 -06:00
Brandon Keepers
87e60cfd78 Merge pull request #2063 from larsbrinkhoff/modeline
'Text' shouldn't qualify as a valid modeline language.
2015-02-04 09:18:38 -05:00
Lars Brinkhoff
2077fa3837 'Text' doesn't qualify as a valid modeline language. 2015-02-04 08:20:19 +01:00
Arfon Smith
95bedf0bfc Merge pull request #2072 from larsbrinkhoff/4TH
Add .4TH Forth extension.
2015-02-03 19:49:46 -06:00
Lars Brinkhoff
3a1b17f1f9 Add .4TH Forth extension. 2015-02-03 13:04:06 +01:00
Arfon Smith
bdec1ac64d Merge pull request #2064 from pchaigno/new-grammars
Grammars for 8 languages
2015-02-01 14:09:42 -08:00
Paul Chaignon
36a0d760e9 Grammar for J from Sublime Text package 2015-02-01 21:52:24 +01:00
Paul Chaignon
a901e85c3c Sample for J 2015-02-01 21:50:34 +01:00
Paul Chaignon
6e9dc2339d Grammar for Golo from Sublime Text package 2015-02-01 21:41:22 +01:00
Paul Chaignon
3864e712ef Grammar for GDScript from Sublime Text package 2015-02-01 21:37:07 +01:00
Paul Chaignon
8376f1e4a4 Grammar for Creole from Sublime Text package 2015-02-01 21:34:10 +01:00
Paul Chaignon
1b0fd752d3 Grammar for CLIPS from Sublime Text package 2015-02-01 21:29:03 +01:00
Paul Chaignon
bef473a48b Samples for CLIPS 2015-02-01 21:27:14 +01:00
Paul Chaignon
0c60078d27 Grammar for APL from Sublime Text package 2015-02-01 21:16:45 +01:00
Arfon Smith
2f65462ce0 Trailing slash fix. 2015-02-01 11:39:59 -08:00
Arfon Smith
ace6156c65 Merge pull request #2058 from pchaigno/move-autohotkey-grammar
Update URL for AutoHotkey grammar
2015-02-01 11:24:54 -08:00
Paul Chaignon
ada8feba34 Merge branch 'master' into move-autohotkey-grammar 2015-02-01 20:12:22 +01:00
Arfon Smith
75d685a7f4 Merge pull request #2000 from jayphelps/patch-2
Added `htmlbars` as an alias for Handlebars
2015-02-01 11:03:13 -08:00
Arfon Smith
6b7f20323b Merge pull request #2057 from steinwaywhw/master
Adding ATS language support by converting existing SublimeText syntax def
2015-01-31 19:03:05 -06:00
Steinway Wu
c2ab5bc09d Merge remote-tracking branch 'upstream/master'
Conflicts:
	.gitmodules
2015-01-31 17:24:23 -05:00
Arfon Smith
95d5b8bdbc Merge pull request #2059 from pchaigno/modelica
Support of Modelica language
2015-01-31 15:17:38 -06:00
Steinway Wu
da7b3182e8 update to newer ats-mode-sublime 2015-01-31 16:00:00 -05:00
Paul Chaignon
08790f2f0a Grammar for Modelica 2015-01-31 13:44:11 +01:00
Paul Chaignon
896270e617 Support for Modelica 2015-01-31 13:37:42 +01:00
Paul Chaignon
fb40ee986f Update URL for AutoHotkey grammar 2015-01-31 12:16:41 +01:00
Paul Chaignon
20b82e4bc9 Remove grammar for AutoHotkey 2015-01-31 12:14:36 +01:00
Steinway Wu
513347911e update ats grammar to the latest commits 2015-01-30 13:25:22 -05:00
Arfon Smith
1a3960e95d Merge pull request #2056 from github/cut-release-v4.3.0
Cut release v4.3.0
2015-01-30 12:15:55 -06:00
Steinway Wu
7d9a47b7c3 remove atxt support 2015-01-30 13:02:37 -05:00
Steinway Wu
c80d085e33 revise git module 2015-01-30 12:56:17 -05:00
Steinway Wu
98518e5c8c add ats mode from sublime package 2015-01-30 12:52:38 -05:00
Arfon Smith
2b7a488d64 Bumping version to 4.3.0 2015-01-30 11:25:50 -06:00
Arfon Smith
25aa6669be Updating grammars 2015-01-30 11:17:13 -06:00
Arfon Smith
ef9e1c4e4f Merge pull request #2055 from github/cp-cpp
Adding cp as a C++ extension
2015-01-30 11:04:21 -06:00
Arfon Smith
cf483c28e3 Adding cp as a C++ extension 2015-01-30 10:56:06 -06:00
Arfon Smith
fecc39d97d Merge pull request #1652 from github/objc-mercury
Disambiguate Matlab, Objective-C, Mathematica, M, and Mercury
2015-01-30 08:51:51 -06:00
Arfon Smith
339370a703 Extracting ObjectiveC regex into a constant 2015-01-30 08:48:26 -06:00
Arfon Smith
359e5157a8 Merge branch 'master' into objc-mercury 2015-01-30 08:34:50 -06:00
Arfon Smith
207bd8d77c Merge pull request #2001 from rusthon/master
added Rusthon to languages list for syntax highlighting in md.
2015-01-30 08:28:27 -06:00
Arfon Smith
ba5454808e Merge pull request #1916 from Mailaender/dot-desktop-files
Added support for Linux .desktop and .emacs.desktop files
2015-01-30 08:24:27 -06:00
Arfon Smith
9196ba91bb Merge pull request #2041 from github/emacs-vim-mode-lines
Emacs vim modelines
2015-01-29 20:12:47 -06:00
Arfon Smith
5ff1b02e49 1.9.3 reprieve 2015-01-29 20:03:18 -06:00
Brandon Keepers
4f92d620eb Simplify detect 2015-01-29 16:28:54 -06:00
Brandon Keepers
e7f5779659 Break modelines into two regular expressions
This makes them easier to read and maintains Ruby 1.9 compatibility
2015-01-29 16:28:54 -06:00
Arfon Smith
512cfc4858 Dropping 1.9.3 2015-01-29 15:45:07 -06:00
Brandon Keepers
437ba70b9e Find modeline anywhere in the data 2015-01-29 13:14:06 -06:00
Brandon Keepers
fadca563bc Move regex to a constant 2015-01-29 13:09:12 -06:00
Brandon Keepers
7a601b196e Fix regex syntax 2015-01-29 13:07:18 -06:00
Arfon Smith
bf6bd246fd Syntax tweak 2015-01-28 16:52:26 -06:00
Arfon Smith
168ff4c050 Merge pull request #2048 from infininight/master
Update Swift grammar
2015-01-28 07:02:49 -06:00
Arfon Smith
d6fdbafa3c Merge pull request #2047 from vors/powershell
Replace vendored powershell.tmBundle with SublimeText powershell
2015-01-28 07:00:18 -06:00
Michael Sheets
3e1570a716 Update Swift grammar 2015-01-28 04:12:05 -06:00
Sergei Vorobev
160c0b4ac0 Replace vendored powershell.tmBundle with SublimeText powershell. Improve PowerShell samples. 2015-01-27 18:31:17 -08:00
Arfon Smith
cf0bc3914f Merge pull request #2044 from pchaigno/liquid-grammar
Grammar for Liquid
2015-01-27 09:45:24 -06:00
Arfon Smith
96154627d3 Clearer regex 2015-01-27 09:42:24 -06:00
Arfon Smith
6f07b62a3f New year. 2015-01-27 08:18:34 -06:00
Paul Chaignon
cae17b91b8 Grammar (TextMate bundle) for Liquid 2015-01-27 13:02:13 +01:00
Arfon Smith
69b68f3a44 Extracting common methods into helper. 2015-01-26 16:22:55 -06:00
Arfon Smith
20a3e7e4b8 Update docs 2015-01-26 16:12:20 -06:00
Arfon Smith
119a8fff1e Emacs modeline fixtures 2015-01-26 15:38:19 -06:00
Arfon Smith
8094b1bd92 Test strategy and language 2015-01-26 15:38:07 -06:00
Arfon Smith
98fc4d78aa Slightly reworked regex. 2015-01-26 15:37:45 -06:00
Arfon Smith
d773c2e90d Escape the * 2015-01-26 15:18:40 -06:00
Arfon Smith
7929e7ab9c Adding Emacs modes 2015-01-26 15:11:55 -06:00
Arfon Smith
e8e95f113c Modeline should come first (as it's an override) 2015-01-26 15:03:22 -06:00
Arfon Smith
429c791377 Testing Vim modeline support 2015-01-26 14:39:07 -06:00
Arfon Smith
e536eea5b6 Basic Vim modeline detection strategy 2015-01-26 14:22:09 -06:00
Adam Roben
0a5b5eadeb Merge pull request #1986 from pchaigno/remove-lexer
Remove last mentions of lexer
2015-01-26 10:28:11 -05:00
Arfon Smith
12351d3a8a Merge pull request #2037 from pchaigno/asp-tm_scope
Change TextMate scope for ASP
2015-01-25 19:24:15 -06:00
Paul Chaignon
7421b2e553 Change TextMate scope for ASP 2015-01-25 19:04:00 +01:00
Arfon Smith
8aa4dce6f4 Merge pull request #1723 from techhat/pythonmultiline
Python also supports triple single-quotes for comments
2015-01-24 10:54:30 -06:00
Joseph Hall
feeceefe99 Merge pull request #1 from pchaigno/pythonmultiline
Multiline comments for Python during tokenization
2015-01-24 09:00:00 -07:00
Arfon Smith
60483e3216 Merge pull request #1143 from pchaigno/newlisp
NewLisp language added with some heuristics
2015-01-24 08:58:07 -06:00
Arfon Smith
49837e0c20 Merge pull request #2031 from pchaigno/fix-pascal-samples
Better samples for Pascal
2015-01-24 08:51:26 -06:00
Paul Chaignon
c7668ad882 Better samples for Pascal 2015-01-24 01:32:56 +01:00
Paul Chaignon
4f37563be1 Remove mention of lexer for GAP in languages.yml 2015-01-24 01:02:58 +01:00
Paul Chaignon
9c3ab95048 Merge branch 'master' into remove-lexer 2015-01-24 00:57:20 +01:00
Paul Chaignon
bf5651e127 Merge branch 'master' into newlisp 2015-01-24 00:15:52 +01:00
Arfon Smith
f854a12043 Merge pull request #2025 from hyperair/scad
Specify ace_mode for OpenSCAD
2015-01-23 15:29:57 -06:00
Arfon Smith
dd09f02f53 Adding some samples to verify new heuristics 2015-01-23 15:25:36 -06:00
Arfon Smith
268f43d668 Merge pull request #2030 from github/tst
GAP and Scilab
2015-01-23 14:50:31 -06:00
Arfon Smith
d95b7504ab Merge branch 'master' into tst
Conflicts:
	lib/linguist/heuristics.rb
	lib/linguist/language.rb
	lib/linguist/languages.yml
2015-01-23 14:35:27 -06:00
Arfon Smith
4d2b6ee99e Updating heuristic order 2015-01-23 13:22:45 -06:00
Arfon Smith
6ad6984fe7 Merge branch 'master' into mmmmmm
Conflicts:
	lib/linguist/heuristics.rb
2015-01-23 11:02:25 -06:00
Chow Loong Jin
97d48a204a Specify ace_mode for OpenSCAD 2015-01-23 14:14:15 +08:00
Arfon Smith
cc56ddb354 Merge pull request #2015 from github/cut-release-v4.2.7
Cut release v4.2.7
2015-01-20 10:40:38 -06:00
Arfon Smith
3ce527b0b9 Updating version.rb 2015-01-20 09:29:26 -06:00
Arfon Smith
94d4d92cc0 Merge branch 'master' into cut-release-v4.2.7 2015-01-20 09:22:55 -06:00
Arfon Smith
72b268b253 Updating grammars 2015-01-20 09:17:46 -06:00
Arfon Smith
2c7885bbc1 Merge pull request #2014 from github/revert-1976-path-for-fileblob
Revert "Use path for Generated?"
2015-01-20 09:12:33 -06:00
Arfon Smith
36120a9122 Revert "Use path for Generated?" 2015-01-20 08:58:11 -06:00
Arfon Smith
6305ec3f31 Merge pull request #2005 from JJ/master
Adds Rexfile
2015-01-18 08:44:00 -06:00
Juan Julián Merelo Guervós
b319731a2d Adds Rexfile
That's a Vagrantfile-like file for the Rex provisioning tool.
2015-01-18 13:16:33 +01:00
Arfon Smith
885740dad6 Merge pull request #1944 from pchaigno/revert-1438
Revert #1438: add file extensions with multiple segments
2015-01-17 16:12:56 -06:00
hartsantler
b178268cbc changed to make rusthon a python alias.
https://github.com/github/linguist/pull/2001
2015-01-17 13:34:12 -08:00
Arfon Smith
3ae556893f Merge pull request #1900 from dalehenrich/master
Use source.smalltalk to highlight STON files
2015-01-17 09:15:57 -06:00
Arfon Smith
43b297636d Merge pull request #2003 from pchaigno/cs-heuristics
Heuristics for .cs files: C# and Smalltalk
2015-01-17 09:10:58 -06:00
Paul Chaignon
8cd17698fe Slightly improve heuristic for C# 2015-01-17 13:34:06 +01:00
hartsantler
7abcc39c8c added Rusthon to languages list for syntax highlighting in md. 2015-01-16 04:32:52 -08:00
Jay Phelps
a5b915d571 Added htmlbars as an alias for Handlebars
See tildeio/htmlbars
2015-01-15 22:13:06 -08:00
Arfon Smith
0fcdca653a Merge pull request #1995 from phuicy/master
Support for Web Ontology Language (OWL)
2015-01-15 20:26:19 -06:00
Adam Roben
9ec801d495 Merge pull request #1976 from github/path-for-fileblob
Use path for Generated?
2015-01-15 21:20:00 -05:00
Arfon Smith
4ccbdcb93c Merge pull request #1999 from github/more-relative-paths
More relative paths
2015-01-15 16:41:36 -06:00
Arfon Smith
53f909f2a1 Merge pull request #1998 from Frigolit/master
Added interpreter "pike" for Pike.
2015-01-15 16:07:22 -06:00
Pontus Rodling
f8603705a8 Added shebang sample for Pike. 2015-01-16 09:56:13 +13:00
Pontus Rodling
3bc1b97a68 Added interpreter "pike" for Pike. 2015-01-16 09:18:55 +13:00
Arfon Smith
27ed17e62e Merge pull request #1996 from aoetk/fxml
Add FXML to languages.yml
2015-01-15 09:30:02 -06:00
Arfon Smith
f3d5090d51 Merge pull request #1841 from danmichaelo/turtle
Add support for Turtle and SPARQL
2015-01-15 09:28:26 -06:00
AOE Takashi
d030f9be99 Add support for FXML files. 2015-01-15 17:42:47 +09:00
Dan Michael O. Heggø
774d18ed8f Add support for Turtle and SPARQL 2015-01-14 23:45:25 +01:00
Guy Burroughes
d39f5eedf1 Fixed issues for web ontology to pass tests 2015-01-14 19:18:25 +00:00
phuicy
1a1e21f344 Added Web Ontology Language Support
As the syntax is only XML, it is a very simple fix.
2015-01-14 18:11:33 +00:00
Adam Roben
96c7bc30d8 Simplify blob tests
Now that FileBlobs with relative paths can still access their files on
disk, we can use relative paths for all FileBlobs in the test. This more
closely matches the behavior in github.com's codebase, where all blobs
use relative paths.
2015-01-14 08:57:53 -05:00
Adam Roben
0328b1cb3c Use the original FileBlob path for filesystem access
FileBlob now remembers the full path that was passed to its constructor,
and uses that for performing filesystem access. FileBlob#path continues
to return a relative path as before. This ensures that you can call
methods like #size and #mode on FileBlobs with relative paths,
regardless of the current working directory.
2015-01-14 08:54:49 -05:00
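Taken together, these two commits mean a FileBlob constructed with an absolute path and a base path keeps working against the filesystem while still reporting a relative #path. A minimal sketch of that behaviour (the file name is just an example, not taken from the test suite):

require 'linguist'

# Build a blob for a file that exists in the working directory.
# The blob remembers the absolute path for disk access, while #path
# stays relative to the base directory, exactly as before.
blob = Linguist::FileBlob.new(File.join(Dir.pwd, "Gemfile"), Dir.pwd)

blob.path  # => "Gemfile"
blob.size  # still reads the file on disk, regardless of the current working directory
blob.mode  # likewise uses the remembered full path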
Arfon Smith
ad0cc7f39d Merge pull request #1989 from davidabian/patch-1
Update languages.yml with *.sagews
2015-01-12 08:12:45 -06:00
David Abián
42a491ab8b Merge branch 'patch-1' of https://github.com/davidabian/linguist into patch-1 2015-01-11 22:09:48 +01:00
David Abián
ef4b25591b Sample sagews file, as requested 2015-01-11 22:08:56 +01:00
David Abián
fbc99cf7e6 Update languages.yml with *.sagews 2015-01-11 19:43:20 +01:00
Paul Chaignon
5d0e9484ce Remove last mentions of lexer 2015-01-11 10:02:52 +01:00
Brandon Keepers
1bc6a6dfe5 Merge pull request #1955 from pchaigno/zephir-generated-samples
Reclassify three sample files generated from Zephir code
2015-01-10 10:43:00 -08:00
Arfon Smith
30be3265fb Merge pull request #1982 from pchaigno/racket-grammar
New grammar for Racket
2015-01-10 09:18:45 -06:00
Paul Chaignon
ecaad7979f New grammar for Racket 2015-01-10 14:40:58 +01:00
Paul Chaignon
d638edbeae Remove grammar for Racket 2015-01-10 13:10:04 +01:00
Brandon Keepers
91779b6de9 Merge pull request #1978 from larsbrinkhoff/cmake_samples
More CMake samples
2015-01-09 15:53:25 -08:00
Adam Roben
3abb0e80d5 Merge pull request #1963 from github/aroben-patch-1
Recommend updating grammars when releasing
2015-01-09 16:20:30 -05:00
Arfon Smith
f4c1cc576b Modifying BlobHelper and FileBlob to use path 2015-01-09 15:15:34 -06:00
Paul Chaignon
986235dce7 Sample file for .cmake.in 2015-01-09 20:02:01 +01:00
Lars Brinkhoff
1f0c88a934 Restore the .cmake.in extension. 2015-01-09 20:02:01 +01:00
Lars Brinkhoff
94f7dd2238 More CMake samples. 2015-01-09 20:02:01 +01:00
Arfon Smith
79fd12eb75 Merge branch 'master' into path-for-fileblob 2015-01-09 11:56:11 -06:00
Arfon Smith
05a98be1e5 Merge pull request #1977 from github/auto
Auto
2015-01-09 11:50:01 -06:00
Arfon Smith
24eb1d3fe2 Updating file regex to support unlicense.txt 2015-01-09 11:43:34 -06:00
Brandon Keepers
75d1bcdc69 Merge pull request #1912 from 0a-/master
vendor.yml: improved and added more regexes for auto-generated stylesheets
2015-01-09 08:09:57 -08:00
Arfon Smith
7549eff9c1 Merge branch 'master' into auto
Conflicts:
	.gitmodules
2015-01-08 17:00:45 -06:00
Arfon Smith
6e2b4f7514 Updating ref to include license 2015-01-08 16:59:43 -06:00
Arfon Smith
846cff5721 Remove pry 2015-01-08 15:10:05 -06:00
Arfon Smith
efd25ec4d2 Start using path with LazyBlob 2015-01-08 15:08:28 -06:00
Arfon Smith
5c94b50386 Merge pull request #1975 from github/http
Http
2015-01-08 14:19:38 -06:00
Arfon Smith
c0fbc9ef8c Updating Sublime-HTTP reference 2015-01-08 14:11:04 -06:00
Arfon Smith
1f429fb488 Whitespace 2015-01-08 14:06:00 -06:00
Arfon Smith
ec28ea299f Use path for Generated? 2015-01-08 14:03:35 -06:00
Arfon Smith
08558aa118 Merge branch 'master' into http
Conflicts:
	.gitmodules
2015-01-08 13:13:07 -06:00
Adam Roben
7e319b797f Merge pull request #1970 from larsbrinkhoff/fr
Add missing ^ in regexp for Frege heuristic
2015-01-08 13:43:14 -05:00
Lars Brinkhoff
3957a11f25 Add to sample to show that a false positive goes away. 2015-01-08 19:35:02 +01:00
Adam Roben
743922d45a Merge pull request #1974 from github/hy-grammar
Update Hy support by adding a reference to an Atom grammar.
2015-01-08 11:13:59 -05:00
Bob Tolbert
5f70776cf3 Update Hy support by adding a reference to an Atom grammar. 2015-01-08 11:09:28 -05:00
Adam Roben
289f91997c Merge pull request #1973 from github/nit-grammar
Add a language grammar for Nit
2015-01-08 11:01:05 -05:00
Lucas Bajolet
163ea9ecdd Added a few samples for the Nit language
Signed-off-by: Lucas Bajolet <r4pass@hotmail.com>
2015-01-08 10:52:45 -05:00
Lucas Bajolet
9be941acc8 Added support for the Nit language in grammars.yml, updated languages.yml for .nit source file highlighting
Signed-off-by: Lucas Bajolet <r4pass@hotmail.com>
2015-01-08 10:51:51 -05:00
Adam Roben
e95314f072 Recognize WTFPL-licensed grammars 2015-01-08 10:49:54 -05:00
Adam Roben
6fef6b578a Fix script/convert-grammars --add 2015-01-08 10:49:01 -05:00
Adam Roben
dd59814563 Merge pull request #1905 from joaquincasares/cql_support
Add support for cql and ddl files
2015-01-08 10:45:15 -05:00
Adam Roben
b704b20695 Merge pull request #1888 from MattDMo/master
reorganized Sublime Text extensions, added some
2015-01-08 10:42:56 -05:00
Lars Brinkhoff
71885b8a79 Add missing ^ in Frege heuristic regexp. 2015-01-08 13:02:04 +01:00
Adam Roben
59a6963a89 Say that you should commit the submodule update 2015-01-07 14:26:23 -05:00
Adam Roben
6e9dfdff30 Recommend updating grammars when releasing
This will help ensure we keep pulling in fixes from the grammar repos.
2015-01-07 14:16:23 -05:00
Paul Chaignon
96e6b3f53e Reclassify samples generated from Zephir code 2015-01-06 17:00:16 +01:00
Paul Chaignon
14740e8a89 Grammar for HTTP 2015-01-06 11:57:28 +01:00
Paul Chaignon
b357257f4d Grammar for AutoHotkey 2015-01-06 09:51:51 +01:00
Joaquin Casares
5ad9deb199 Added sample files 2015-01-05 13:50:54 -06:00
MattDMo
e99f6edb56 Put .sublime_* after .sublime-* 2015-01-05 13:41:58 -05:00
MattDMo
3149d1232b Moved .sublime_metrics after .sublime-mousemap due to failing Travis test 2015-01-05 13:36:57 -05:00
Paul Chaignon
3c6218f20e Heuristics for .cs files: Smalltalk and C# 2015-01-04 22:07:29 +01:00
Matthias Mailänder
68f04a50aa add support for Emacs desktop files 2015-01-03 19:22:54 +01:00
Matthias Mailänder
dc96f62f9e add support for Linux .desktop files 2015-01-03 14:00:07 +01:00
Paul Chaignon
2f86bd8bda Sample file for .html.hl 2015-01-03 09:37:50 +01:00
Paul Chaignon
fbe43b61d4 Sample file for .cmake.in 2015-01-03 09:37:40 +01:00
Paul Chaignon
546d4163a9 Remove unnecessary file extensions 2015-01-03 09:27:43 +01:00
Paul Chaignon
401067f637 Revert #1438: add file extensions with multiple segments 2015-01-02 22:57:20 +01:00
archy
084a9ab976 vendor.yml: added stylesheets imported from packages 2014-12-26 04:45:37 +08:00
archy
356b942114 vendor.yml: added less, scss, styl suffixes for popular stylesheets 2014-12-26 04:45:30 +08:00
archy
2c5d720146 vendor.yml: added imported bootstrap files 2014-12-26 04:45:18 +08:00
archy
64f83eee07 vendor.yml: added styl 2014-12-26 04:45:09 +08:00
archy
542cf9c52b vendor.yml: added custom bootstrap 2014-12-26 04:44:47 +08:00
Joaquin Casares
0bbccc1bc1 Properly order extensions 2014-12-22 16:48:22 -06:00
Joaquin Casares
f4208cb27d Add support for cql and ddl files 2014-12-19 17:51:59 -06:00
Dale Henrichs
034137f533 Use source.smalltalk to highlight STON files.
While not perfect, source.smalltalk is a better fit for highlighting STON files than source.json. When STON departs from pure JSON (which it often does), the highlighting is pretty bad.
2014-12-18 21:19:38 -08:00
Dale Henrichs
6f75e18bfa Merge pull request #1 from github/master
update with latest master
2014-12-18 20:29:28 -08:00
MattDMo
97cd1e3886 Reorganized Sublime files, added more extensions. Moved those in JSON to JavaScript, as comments are allowed, and added several. Added 2 to XML. 2014-12-17 11:49:54 -05:00
Paul Chaignon
1363af0317 Remove defactor keyword for .lsp heuristic 2014-12-16 11:02:23 -05:00
Paul Chaignon
2418356eff Merge branch 'master' into newlisp 2014-12-16 10:52:17 -05:00
Paul Chaignon
b63423ce37 Merge branch 'master' into newlisp 2014-12-06 19:56:29 -05:00
Paul Chaignon
0b02b68538 Heuristic for .lsp and .lisp (Common Lisp, NewLisp) 2014-12-06 19:51:45 -05:00
Paul Chaignon
bbd1646ae5 Add .lisp as a NewLisp file extension 2014-12-06 19:51:20 -05:00
Paul Chaignon
c4da2dd557 Merge branch 'master' into newlisp 2014-12-06 17:50:37 -05:00
Paul Chaignon
c5a654e692 Tests for Python multiline comments during tokenization 2014-11-25 20:01:24 -05:00
Paul Chaignon
3ac69ed4e0 Merge branch 'master' into pythonmultiline 2014-11-25 19:53:40 -05:00
Max Horn
f9ad5dda56 Add heuristic distinguishing GAP and Scilab .tst files 2014-11-20 12:52:13 -07:00
Max Horn
ff6a10698e Pass name of file being analyzed to find_by_heuristics
Some languages are sensitive to file names in the sense that
different kinds of files contain somewhat different data.
Example: GAP .tst files contain test cases, which add some
extra data compared to regular code, and as a consequence are
not directly interchangeable with regular source code.

Heuristics may need to take this into account, thus may need
to know the name of the file being analyzed.
2014-11-20 12:48:28 -07:00
Max Horn
6072a63f99 Add GAP .tst extension plus two sample files 2014-11-20 12:47:15 -07:00
Max Horn
1bd935b2b4 Set 'lexer: GAP' for GAP
My Pygments patch adding support for GAP was merged in April, and I
verified that it is live on GitHub.
2014-11-20 12:47:15 -07:00
Joseph Hall
7702583314 Python also supports triple single-quotes for comments 2014-11-16 07:19:55 -07:00
Arfon Smith
5ffc4c0158 Starting work on the 'Disambiguate Matlab, Objective-C, Mathematica, M, and Mercury' method 2014-10-31 17:22:32 -05:00
Paul Chaignon
7b44baa417 Merge branch 'master' into newlisp 2014-09-25 10:47:16 -04:00
Paul Chaignon
e4975fc476 Heuristics for Common Lisp and NewLisp 2014-04-28 11:45:24 +02:00
Paul Chaignon
e1064b13c0 NewLisp language added 2014-04-28 11:30:12 +02:00
121 changed files with 17599 additions and 414 deletions

57
.gitmodules vendored
View File

@@ -112,9 +112,6 @@
[submodule "vendor/grammars/nesC.tmbundle"]
path = vendor/grammars/nesC.tmbundle
url = https://github.com/cdwilson/nesC.tmbundle
[submodule "vendor/grammars/racket-tmbundle"]
path = vendor/grammars/racket-tmbundle
url = https://github.com/christophevg/racket-tmbundle
[submodule "vendor/grammars/haxe-sublime-bundle"]
path = vendor/grammars/haxe-sublime-bundle
url = https://github.com/clemos/haxe-sublime-bundle
@@ -124,9 +121,9 @@
[submodule "vendor/grammars/Handlebars"]
path = vendor/grammars/Handlebars
url = https://github.com/daaain/Handlebars
[submodule "vendor/grammars/powershell.tmbundle"]
path = vendor/grammars/powershell.tmbundle
url = https://github.com/davidpeckham/powershell.tmbundle
[submodule "vendor/grammars/powershell"]
path = vendor/grammars/powershell
url = https://github.com/SublimeText/PowerShell
[submodule "vendor/grammars/jade-tmbundle"]
path = vendor/grammars/jade-tmbundle
url = https://github.com/davidrios/jade-tmbundle
@@ -340,6 +337,9 @@
[submodule "vendor/grammars/ini.tmbundle"]
path = vendor/grammars/ini.tmbundle
url = https://github.com/textmate/ini.tmbundle
[submodule "vendor/grammars/desktop.tmbundle"]
path = vendor/grammars/desktop.tmbundle
url = https://github.com/Mailaender/desktop.tmbundle.git
[submodule "vendor/grammars/io.tmbundle"]
path = vendor/grammars/io.tmbundle
url = https://github.com/textmate/io.tmbundle
@@ -528,9 +528,54 @@
[submodule "vendor/grammars/sublime-bsv"]
path = vendor/grammars/sublime-bsv
url = https://github.com/thotypous/sublime-bsv
[submodule "vendor/grammars/Sublime-HTTP"]
path = vendor/grammars/Sublime-HTTP
url = https://github.com/httpspec/sublime-highlighting
[submodule "vendor/grammars/sass-textmate-bundle"]
path = vendor/grammars/sass-textmate-bundle
url = https://github.com/nathos/sass-textmate-bundle
[submodule "vendor/grammars/carto-atom"]
path = vendor/grammars/carto-atom
url = https://github.com/yohanboniface/carto-atom
[submodule "vendor/grammars/Sublime-Nit"]
path = vendor/grammars/Sublime-Nit
url = https://github.com/R4PaSs/Sublime-Nit
[submodule "vendor/grammars/language-hy"]
path = vendor/grammars/language-hy
url = https://github.com/rwtolbert/language-hy
[submodule "vendor/grammars/Racket"]
path = vendor/grammars/Racket
url = https://github.com/soegaard/racket-highlight-for-github
[submodule "vendor/grammars/turtle.tmbundle"]
path = vendor/grammars/turtle.tmbundle
url = https://github.com/peta/turtle.tmbundle
[submodule "vendor/grammars/liquid.tmbundle"]
path = vendor/grammars/liquid.tmbundle
url = https://github.com/bastilian/validcode-textmate-bundles
[submodule "vendor/grammars/AutoHotkey"]
path = vendor/grammars/AutoHotkey
url = https://github.com/ahkscript/AutoHotkey
[submodule "vendor/grammars/ats.sublime"]
path = vendor/grammars/ats.sublime
url = https://github.com/steinwaywhw/ats-mode-sublimetext
[submodule "vendor/grammars/Modelica"]
path = vendor/grammars/Modelica
url = https://github.com/BorisChumichev/modelicaSublimeTextPackage
[submodule "vendor/grammars/sublime-apl"]
path = vendor/grammars/sublime-apl
url = https://github.com/StoneCypher/sublime-apl
[submodule "vendor/grammars/CLIPS-sublime"]
path = vendor/grammars/CLIPS-sublime
url = https://github.com/psicomante/CLIPS-sublime
[submodule "vendor/grammars/Creole"]
path = vendor/grammars/Creole
url = https://github.com/Siddley/Creole
[submodule "vendor/grammars/GDScript-sublime"]
path = vendor/grammars/GDScript-sublime
url = https://github.com/beefsack/GDScript-sublime
[submodule "vendor/grammars/sublime-golo"]
path = vendor/grammars/sublime-golo
url = https://github.com/TypeUnsafe/sublime-golo
[submodule "vendor/grammars/JSyntax"]
path = vendor/grammars/JSyntax
url = https://github.com/bcj/JSyntax

View File

@@ -1,4 +1,4 @@
Copyright (c) 2011-2014 GitHub, Inc.
Copyright (c) 2011-2015 GitHub, Inc.
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation

View File

@@ -182,6 +182,7 @@ If you are the current maintainer of this gem:
0. Create a branch for the release: `git checkout -b cut-release-vxx.xx.xx`
0. Make sure your local dependencies are up to date: `script/bootstrap`
0. If grammar submodules have not been updated recently, update them: `git submodule update --remote && git commit -a`
0. Ensure that samples are updated: `bundle exec rake samples`
0. Ensure that tests are green: `bundle exec rake test`
0. Bump gem version in `lib/linguist/version.rb`. For example, [like this](https://github.com/github/linguist/commit/8d2ea90a5ba3b2fe6e1508b7155aa4632eea2985).

View File

@@ -24,15 +24,23 @@ vendor/grammars/Agda.tmbundle:
- source.agda
vendor/grammars/Alloy.tmbundle:
- source.alloy
vendor/grammars/AutoHotkey:
- source.ahk
vendor/grammars/CLIPS-sublime:
- source.clips
vendor/grammars/ColdFusion:
- source.cfscript
- source.cfscript.cfc
- text.cfml.basic
- text.html.cfm
vendor/grammars/Creole:
- text.html.creole
vendor/grammars/Docker.tmbundle:
- source.dockerfile
vendor/grammars/Elm.tmLanguage:
- source.elm
vendor/grammars/GDScript-sublime/:
- source.gdscript
vendor/grammars/Handlebars:
- text.html.handlebars
vendor/grammars/IDL-Syntax:
@@ -40,10 +48,14 @@ vendor/grammars/IDL-Syntax:
vendor/grammars/Isabelle.tmbundle:
- source.isabelle.root
- source.isabelle.theory
vendor/grammars/JSyntax/:
- source.j
vendor/grammars/Julia.tmbundle:
- source.julia
vendor/grammars/LiveScript.tmbundle:
- source.livescript
vendor/grammars/Modelica/:
- source.modelica
vendor/grammars/NSIS:
- source.nsis
vendor/grammars/NimLime:
@@ -54,6 +66,8 @@ vendor/grammars/PHP-Twig.tmbundle:
- text.html.twig
vendor/grammars/RDoc.tmbundle:
- text.rdoc
vendor/grammars/Racket:
- source.racket
vendor/grammars/SCSS.tmbundle:
- source.scss
vendor/grammars/Scalate.tmbundle:
@@ -66,6 +80,8 @@ vendor/grammars/Stata.tmbundle:
- source.stata
vendor/grammars/Sublime-Coq:
- source.coq
vendor/grammars/Sublime-HTTP:
- source.httpspec
vendor/grammars/Sublime-Inform:
- source.Inform7
vendor/grammars/Sublime-Lasso:
@@ -74,6 +90,8 @@ vendor/grammars/Sublime-Logos:
- source.logos
vendor/grammars/Sublime-Loom:
- source.loomscript
vendor/grammars/Sublime-Nit:
- source.nit
vendor/grammars/Sublime-QML:
- source.qml
vendor/grammars/Sublime-REBOL:
@@ -120,6 +138,8 @@ vendor/grammars/assembly.tmbundle:
vendor/grammars/atom-salt:
- source.python.salt
- source.yaml.salt
vendor/grammars/ats.sublime:
- source.ats
vendor/grammars/autoitv3-tmbundle:
- source.autoit.3
vendor/grammars/awk-sublime:
@@ -162,6 +182,8 @@ vendor/grammars/dart-sublime-bundle:
- source.dart
- source.pubspec
- text.dart-doccomments
vendor/grammars/desktop.tmbundle:
- source.desktop
vendor/grammars/diff.tmbundle:
- source.diff
vendor/grammars/dylan.tmbundle:
@@ -255,6 +277,8 @@ vendor/grammars/language-csharp:
- source.nant-build
vendor/grammars/language-gfm:
- source.gfm
vendor/grammars/language-hy:
- source.hy
vendor/grammars/language-javascript:
- source.js
- source.js.regexp
@@ -279,6 +303,8 @@ vendor/grammars/less.tmbundle:
- source.css.less
vendor/grammars/lilypond.tmbundle:
- source.lilypond
vendor/grammars/liquid.tmbundle:
- text.html.liquid
vendor/grammars/lisp.tmbundle:
- source.lisp
vendor/grammars/llvm.tmbundle:
@@ -338,7 +364,7 @@ vendor/grammars/pike-textmate:
- source.pike
vendor/grammars/postscript.tmbundle:
- source.postscript
vendor/grammars/powershell.tmbundle:
vendor/grammars/powershell:
- source.powershell
vendor/grammars/processing.tmbundle:
- source.processing
@@ -354,8 +380,6 @@ vendor/grammars/python-django.tmbundle:
vendor/grammars/r.tmbundle:
- source.r
- text.tex.latex.rd
vendor/grammars/racket-tmbundle:
- source.racket
vendor/grammars/restructuredtext.tmbundle:
- text.restructuredtext
vendor/grammars/ruby-haml.tmbundle:
@@ -392,6 +416,8 @@ vendor/grammars/standard-ml.tmbundle:
- source.ml
vendor/grammars/sublime-MuPAD:
- source.mupad
vendor/grammars/sublime-apl/:
- source.apl
vendor/grammars/sublime-befunge:
- source.befunge
vendor/grammars/sublime-better-typescript:
@@ -403,6 +429,8 @@ vendor/grammars/sublime-cirru:
vendor/grammars/sublime-glsl:
- source.essl
- source.glsl
vendor/grammars/sublime-golo/:
- source.golo
vendor/grammars/sublime-idris:
- source.idris
vendor/grammars/sublime-mask:
@@ -444,6 +472,9 @@ vendor/grammars/thrift.tmbundle:
- source.thrift
vendor/grammars/toml.tmbundle:
- source.toml
vendor/grammars/turtle.tmbundle:
- source.sparql
- source.turtle
vendor/grammars/verilog.tmbundle:
- source.verilog
vendor/grammars/x86-assembly-textmate-bundle:

View File

@@ -61,6 +61,9 @@ module Linguist
@heuristic.call(data)
end
# Common heuristics
ObjectiveCRegex = /^[ \t]*@(interface|class|protocol|property|end|synchronised|selector|implementation)\b/
disambiguate "BitBake", "BlitzBasic" do |data|
if /^\s*; /.match(data) || data.include?("End Function")
Language["BlitzBasic"]
@@ -69,8 +72,16 @@ module Linguist
end
end
disambiguate "C#", "Smalltalk" do |data|
if /![\w\s]+methodsFor: /.match(data)
Language["Smalltalk"]
elsif /^\s*namespace\s*[\w\.]+\s*{/.match(data) || /^\s*\/\//.match(data)
Language["C#"]
end
end
disambiguate "Objective-C", "C++", "C" do |data|
if (/^[ \t]*@(interface|class|protocol|property|end|synchronised|selector|implementation)\b/.match(data))
if ObjectiveCRegex.match(data)
Language["Objective-C"]
elsif (/^\s*#\s*include <(cstdint|string|vector|map|list|array|bitset|queue|stack|forward_list|unordered_map|unordered_set|(i|o|io)stream)>/.match(data) ||
/^\s*template\s*</.match(data) || /^[ \t]*try/.match(data) || /^[ \t]*catch\s*\(/.match(data) || /^[ \t]*(class|(using[ \t]+)?namespace)\s+\w+/.match(data) || /^[ \t]*(private|public|protected):$/.match(data) || /std::\w+/.match(data))
@@ -104,6 +115,15 @@ module Linguist
end
end
disambiguate "GAP", "Scilab" do |data|
if (data.include?("gap> "))
Language["GAP"]
# Heads up - we don't usually write heuristics like this (with no regex match)
else
Language["Scilab"]
end
end
disambiguate "Common Lisp", "OpenCL", "Cool" do |data|
if data.include?("(defun ")
Language["Common Lisp"]
@@ -152,6 +172,20 @@ module Linguist
end
end
disambiguate "M", "Mathematica", "Matlab", "Mercury", "Objective-C" do |data|
if ObjectiveCRegex.match(data)
Language["Objective-C"]
elsif data.include?(":- module")
Language["Mercury"]
elsif /^\s*;/.match(data)
Language["M"]
elsif /^\s*\(\*/.match(data)
Language["Mathematica"]
elsif /^\s*%/.match(data)
Language["Matlab"]
end
end
disambiguate "Gosu", "JavaScript" do |data|
Language["Gosu"] if /^uses java\./.match(data)
end
@@ -164,6 +198,14 @@ module Linguist
end
end
disambiguate "Common Lisp", "NewLisp" do |data|
if /^\s*\((defun|in-package|defpackage) /.match(data)
Language["Common Lisp"]
elsif /^\s*\(define /.match(data)
Language["NewLisp"]
end
end
disambiguate "TypeScript", "XML" do |data|
if data.include?("<TS ")
Language["XML"]
@@ -175,7 +217,7 @@ module Linguist
disambiguate "Frege", "Forth", "Text" do |data|
if /^(: |also |new-device|previous )/.match(data)
Language["Forth"]
elsif /\s*(import|module|package|data|type) /.match(data)
elsif /^\s*(import|module|package|data|type) /.match(data)
Language["Frege"]
else
Language["Text"]

View File

@@ -11,6 +11,7 @@ require 'linguist/samples'
require 'linguist/file_blob'
require 'linguist/blob_helper'
require 'linguist/strategy/filename'
require 'linguist/strategy/modeline'
require 'linguist/shebang'
module Linguist
@@ -94,6 +95,7 @@ module Linguist
end
STRATEGIES = [
Linguist::Strategy::Modeline,
Linguist::Strategy::Filename,
Linguist::Shebang,
Linguist::Heuristics,
@@ -155,7 +157,7 @@ module Linguist
# Language.find_by_alias('cpp')
# # => #<Language name="C++">
#
# Returns the Lexer or nil if none was found.
# Returns the Language or nil if none was found.
def self.find_by_alias(name)
name && @alias_index[name.downcase]
end
@@ -219,7 +221,7 @@ module Linguist
end
# Public: Look up Language by its name or lexer.
# Public: Look up Language by its name.
#
# name - The String name of the Language
#
@@ -243,7 +245,7 @@ module Linguist
#
# This list is configured in "popular.yml".
#
# Returns an Array of Lexers.
# Returns an Array of Languages.
def self.popular
@popular ||= all.select(&:popular?).sort_by { |lang| lang.name.downcase }
end
@@ -255,7 +257,7 @@ module Linguist
#
# This list is created from all the languages not listed in "popular.yml".
#
# Returns an Array of Lexers.
# Returns an Array of Languages.
def self.unpopular
@unpopular ||= all.select(&:unpopular?).sort_by { |lang| lang.name.downcase }
end
@@ -375,11 +377,6 @@ module Linguist
# Returns the name String
attr_reader :search_term
# Public: Get Lexer
#
# Returns the Lexer
attr_reader :lexer
# Public: Get the name of a TextMate-compatible scope
#
# Returns the scope
@@ -495,16 +492,6 @@ module Linguist
@searchable
end
# Public: Highlight syntax of text
#
# text - String of code to be highlighted
# options - A Hash of options (defaults to {})
#
# Returns html String
def colorize(text, options = {})
lexer.highlight(text, options)
end
# Public: Return name as String representation
def to_s
name
@@ -580,7 +567,6 @@ module Linguist
:color => options['color'],
:type => options['type'],
:aliases => options['aliases'],
:lexer => options['lexer'],
:tm_scope => options['tm_scope'],
:ace_mode => options['ace_mode'],
:wrap => options['wrap'],

View File

@@ -54,13 +54,14 @@ APL:
extensions:
- .apl
- .dyalog
tm_scope: none
tm_scope: source.apl
ace_mode: text
ASP:
type: programming
color: "#6a40fd"
search_term: aspx-vb
tm_scope: text.html.asp
aliases:
- aspx
- aspx-vb
@@ -81,10 +82,9 @@ ATS:
- ats2
extensions:
- .dats
- .atxt
- .hats
- .sats
tm_scope: source.ocaml
tm_scope: source.ats
ace_mode: ocaml
ActionScript:
@@ -224,7 +224,7 @@ AutoHotkey:
extensions:
- .ahk
- .ahkl
tm_scope: none
tm_scope: source.ahk
ace_mode: autohotkey
AutoIt:
@@ -381,6 +381,7 @@ C++:
- .cpp
- .c++
- .cc
- .cp
- .cxx
- .h
- .h++
@@ -413,13 +414,13 @@ CLIPS:
type: programming
extensions:
- .clp
tm_scope: none
tm_scope: source.clips
ace_mode: text
CMake:
extensions:
- .cmake
- .in
- .cmake.in
filenames:
- CMakeLists.txt
ace_mode: text
@@ -504,10 +505,10 @@ Clojure:
- .cl2
- .cljc
- .cljs
- .cljs.hl
- .cljscm
- .cljx
- .hic
- .hl
filenames:
- riemann.config
@@ -622,7 +623,7 @@ Creole:
wrap: true
extensions:
- .creole
tm_scope: none
tm_scope: text.html.creole
ace_mode: text
Crystal:
@@ -811,9 +812,11 @@ Emacs Lisp:
- emacs
filenames:
- .emacs
- .emacs.desktop
extensions:
- .el
- .emacs
- .emacs.desktop
ace_mode: lisp
EmberScript:
@@ -915,6 +918,7 @@ Forth:
color: "#341708"
extensions:
- .fth
- .4TH
- .4th
- .F
- .f
@@ -956,6 +960,7 @@ GAP:
- .gap
- .gd
- .gi
- .tst
tm_scope: none
ace_mode: text
@@ -972,7 +977,7 @@ GDScript:
type: programming
extensions:
- .gd
tm_scope: none
tm_scope: source.gdscript
ace_mode: text
GLSL:
@@ -1070,7 +1075,7 @@ Golo:
color: "#f6a51f"
extensions:
- .golo
tm_scope: none
tm_scope: source.golo
ace_mode: text
Gosu:
@@ -1172,6 +1177,7 @@ HTML:
extensions:
- .html
- .htm
- .html.hl
- .st
- .xht
- .xhtml
@@ -1197,7 +1203,7 @@ HTML+ERB:
- erb
extensions:
- .erb
- .deface
- .erb.deface
ace_mode: html_ruby
HTML+PHP:
@@ -1212,7 +1218,7 @@ HTTP:
type: data
extensions:
- .http
tm_scope: none
tm_scope: source.httpspec
ace_mode: text
Hack:
@@ -1228,13 +1234,14 @@ Haml:
type: markup
extensions:
- .haml
- .deface
- .haml.deface
ace_mode: haml
Handlebars:
type: markup
aliases:
- hbs
- htmlbars
extensions:
- .handlebars
- .hbs
@@ -1268,13 +1275,13 @@ Haxe:
Hy:
type: programming
ace_mode: clojure
ace_mode: text
color: "#7891b1"
extensions:
- .hy
aliases:
- hylang
tm_scope: none
tm_scope: source.hy
IDL:
type: programming
@@ -1370,7 +1377,7 @@ J:
type: programming
extensions:
- .ijs
tm_scope: none
tm_scope: source.j
ace_mode: text
JSON:
@@ -1382,13 +1389,6 @@ JSON:
extensions:
- .json
- .lock
- .sublime-keymap
- .sublime-mousemap
- .sublime-project
- .sublime-settings
- .sublime-workspace
- .sublime_metrics
- .sublime_session
filenames:
- .jshintrc
- composer.lock
@@ -1472,6 +1472,19 @@ JavaScript:
- .pac
- .sjs
- .ssjs
- .sublime-build
- .sublime-commands
- .sublime-completions
- .sublime-keymap
- .sublime-macro
- .sublime-menu
- .sublime-mousemap
- .sublime-project
- .sublime-settings
- .sublime-theme
- .sublime-workspace
- .sublime_metrics
- .sublime_session
- .xsjs
- .xsjslib
filenames:
@@ -1589,7 +1602,7 @@ Liquid:
type: markup
extensions:
- .liquid
tm_scope: none
tm_scope: text.html.liquid
ace_mode: liquid
Literate Agda:
@@ -1828,6 +1841,13 @@ Mirah:
tm_scope: source.ruby
ace_mode: ruby
Modelica:
type: programming
extensions:
- .mo
tm_scope: source.modelica
ace_mode: text
Monkey:
type: programming
extensions:
@@ -1876,6 +1896,19 @@ NetLogo:
tm_scope: source.lisp
ace_mode: lisp
NewLisp:
type: programming
lexer: NewLisp
color: "#eedd66"
extensions:
- .nl
- .lisp
- .lsp
interpreters:
- newlisp
tm_scope: source.lisp
ace_mode: lisp
Nginx:
type: markup
extensions:
@@ -1906,7 +1939,7 @@ Nit:
color: "#0d8921"
extensions:
- .nit
tm_scope: none
tm_scope: source.nit
ace_mode: text
Nix:
@@ -2048,7 +2081,7 @@ OpenSCAD:
extensions:
- .scad
tm_scope: none
ace_mode: text
ace_mode: scad
Org:
type: prose
@@ -2205,6 +2238,8 @@ Perl6:
- .pm
- .pm6
- .t
filenames:
- Rexfile
interpreters:
- perl6
tm_scope: none
@@ -2224,6 +2259,8 @@ Pike:
extensions:
- .pike
- .pmod
interpreters:
- pike
ace_mode: text
Pod:
@@ -2368,6 +2405,8 @@ Python:
- python
- python2
- python3
aliases:
- rusthon
Python traceback:
type: data
@@ -2415,7 +2454,6 @@ R:
RAML:
type: data
lexer: YAML
ace_mode: yaml
tm_scope: source.yaml
color: "#77d9fb"
@@ -2607,6 +2645,14 @@ SCSS:
extensions:
- .scss
SPARQL:
type: data
tm_scope: source.sparql
ace_mode: text
extensions:
- .sparql
- .rq
SQF:
type: programming
color: "#FFCB1F"
@@ -2622,6 +2668,8 @@ SQL:
ace_mode: sql
extensions:
- .sql
- .cql
- .ddl
- .prc
- .tab
- .udf
@@ -2632,14 +2680,15 @@ STON:
group: Smalltalk
extensions:
- .ston
tm_scope: source.json
ace_mode: lisp
tm_scope: source.smalltalk
ace_mode: text
Sage:
type: programming
group: Python
extensions:
- .sage
- .sagews
tm_scope: source.python
ace_mode: python
@@ -2961,6 +3010,13 @@ Turing:
tm_scope: none
ace_mode: text
Turtle:
type: data
extensions:
- .ttl
tm_scope: source.turtle
ace_mode: text
Twig:
type: markup
group: PHP
@@ -3075,6 +3131,14 @@ Volt:
tm_scope: source.d
ace_mode: d
Web Ontology Language:
type: markup
color: "#3994bc"
extensions:
- .owl
tm_scope: text.xml
ace_mode: xml
WebIDL:
type: programming
extensions:
@@ -3111,6 +3175,7 @@ XML:
- .dll.config
- .filters
- .fsproj
- .fxml
- .glade
- .grxml
- .ivy
@@ -3131,6 +3196,8 @@ XML:
- .rss
- .scxml
- .srdf
- .stTheme
- .sublime-snippet
- .svg
- .targets
- .tmCommand
@@ -3251,6 +3318,14 @@ Zimpl:
tm_scope: none
ace_mode: text
desktop:
type: data
extensions:
- .desktop
- .desktop.in
tm_scope: source.desktop
ace_mode: text
eC:
type: programming
search_term: ec

View File

@@ -0,0 +1,30 @@
module Linguist
module Strategy
class Modeline
EmacsModeline = /-\*-\s*mode:\s*(\w+);?\s*-\*-/
VimModeline = /\/\*\s*vim:\s*set\s*(?:ft|filetype)=(\w+):\s*\*\//
# Public: Detects language based on Vim and Emacs modelines
#
# blob - An object that quacks like a blob.
#
# Examples
#
# Modeline.call(FileBlob.new("path/to/file"))
#
# Returns an Array with one Language if the blob has a Vim or Emacs modeline
# that matches a Language name or alias. Returns an empty array if no match.
def self.call(blob, _ = nil)
Array(Language.find_by_alias(modeline(blob.data)))
end
# Public: Get the modeline from the first n-lines of the file
#
# Returns a String or nil
def self.modeline(data)
match = data.match(EmacsModeline) || data.match(VimModeline)
match[1] if match
end
end
end
end
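A quick usage sketch for the new strategy. Per the comment above, the argument only needs to quack like a blob, so a stand-in object that responds to #data is enough (the source snippet is invented):

require 'linguist'
require 'ostruct'

source = "# -*- mode: ruby -*-\nputs 'hello'\n"

Linguist::Strategy::Modeline.modeline(source)
# => "ruby"

Linguist::Strategy::Modeline.call(OpenStruct.new(data: source))
# => [#<Language name="Ruby">]  -- an empty array when no modeline matches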

View File

@@ -33,7 +33,8 @@ module Linguist
['<!--', '-->'], # XML
['{-', '-}'], # Haskell
['(*', '*)'], # Coq
['"""', '"""'] # Python
['"""', '"""'], # Python
["'''", "'''"] # Python
]
START_SINGLE_LINE_COMMENT = Regexp.compile(SINGLE_LINE_COMMENTS.map { |c|
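With the extra delimiter pair, Python comments wrapped in triple single-quotes are stripped during tokenization just like the existing triple double-quote form. A small sketch (the snippet is invented):

require 'linguist'

src = "'''\na module-level comment in triple single-quotes\n'''\ndef f():\n    return 1\n"

# The quoted block no longer contributes tokens; only the code does,
# matching how """...""" comments were already handled.
Linguist::Tokenizer.tokenize(src)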

View File

@@ -40,24 +40,27 @@
# Minified JavaScript and CSS
- (\.|-)min\.(js|css)$
#Stylesheets imported from packages
- ([^\s]*)import\.(css|less|scss|styl)$
# Bootstrap css and js
- (^|/)bootstrap([^.]*)\.(js|css)$
- (^|/)bootstrap([^.]*)\.(js|css|less|scss|styl)$
- (^|/)custom\.bootstrap([^\s]*)(js|css|less|scss|styl)$
# Font Awesome
- font-awesome.css
- (^|/)font-awesome\.(css|less|scss|styl)$
# Foundation css
- foundation.css
- (^|/)foundation\.(css|less|scss|styl)$
# Normalize.css
- normalize.css
- (^|/)normalize\.(css|less|scss|styl)$
# Bourbon SCSS
- (^|/)[Bb]ourbon/.*\.css$
- (^|/)[Bb]ourbon/.*\.scss$
# Bourbon css
- (^|/)[Bb]ourbon/.*\.(css|less|scss|styl)$
# Animate.css
- animate.css
- (^|/)animate\.(css|less|scss|styl)$
# Vendored dependencies
- third[-_]?party/
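A quick sanity check of the broadened Bootstrap pattern above (the paths are made up):

bootstrap = %r{(^|/)bootstrap([^.]*)\.(js|css|less|scss|styl)$}

bootstrap.match("assets/bootstrap.less")        # matches: Less builds are now vendored too
bootstrap.match("vendor/bootstrap-theme.css")   # matches, as before
bootstrap.match("lib/my-bootstrap-helpers.rb")  # nil: not a stylesheet or script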

View File

@@ -1,3 +1,3 @@
module Linguist
VERSION = "4.2.6"
VERSION = "4.3.1"
end

View File

@@ -1,215 +0,0 @@
%{
#include "./../ATEXT/atextfun.hats"
%}
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8" />
<title>EFFECTIVATS-DiningPhil2</title>
#patscode_style()
</head>
<body>
<h1>
Effective ATS: Dining Philosophers
</h1>
In this article, I present an implementation of a slight variant of the
famous problem of 5-Dining-Philosophers by Dijkstra that makes simple but
convincing use of linear types.
<h2>
The Original Problem
</h2>
There are five philosophers sitting around a table and there are also 5
forks placed on the table such that each fork is located between the left
hand of a philosopher and the right hand of another philosopher. Each
philosopher does the following routine repeatedly: thinking and dining. In
order to dine, a philosopher needs to first acquire two forks: one located
on his left-hand side and the other on his right-hand side. After
finishing dining, a philosopher puts the two acquired forks onto the table:
one on his left-hand side and the other on his right-hand side.
<h2>
A Variant of the Original Problem
</h2>
The following twist is added to the original version:
<p>
After a fork is used, it becomes a "dirty" fork and needs to be put in a
tray for dirty forks. There is a cleaner who cleans dirty forks and then
puts them back on the table.
<h2>
Channels for Communication
</h2>
A channel is just a shared queue of fixed capacity. The following two
functions are for inserting an element into and taking an element out of a
given channel:
<pre
class="patsyntax">
#pats2xhtml_sats("\
fun{a:vt0p} channel_insert (channel (a), a): void
fun{a:vt0p} channel_takeout (chan: channel (a)): (a)
")</pre>
If [channel_insert] is called on a channel that is full, then the caller is
blocked until an element is taken out of the channel. If [channel_takeout]
is called on a channel that is empty, then the caller is blocked until an
element is inserted into the channel.
<h2>
A Channel for Each Fork
</h2>
Forks are resources given a linear type. Each fork is initially stored in a
channel, which can be obtained by calling the following function:
<pre
class="patsyntax">
#pats2xhtml_sats("\
fun fork_changet (n: nphil): channel(fork)
")</pre>
where the type [nphil] is defined to be [natLt(5)] (for natural numbers
less than 5). The channels for storing forks are chosen to be of capacity
2. The reason that channels of capacity 2 are chosen to store at most one
element (in each of them) is to guarantee that these channels can never be
full (so that there is no attempt made to send signals to awake callers
supposedly being blocked due to channels being full).
<h2>
A Channel for the Fork Tray
</h2>
A tray for storing "dirty" forks is also a channel, which can be obtained
by calling the following function:
<pre
class="patsyntax">
#pats2xhtml_sats("\
fun forktray_changet ((*void*)): channel(fork)
")</pre>
The capacity chosen for the channel is 6 (instead of 5) so that it can
never become full (as there are only 5 forks in total).
<h2>
Philosopher Loop
</h2>
Each philosopher is implemented as a loop:
<pre
class="patsyntax">
#pats2xhtml_dats('\
implement
phil_loop (n) = let
//
val () = phil_think (n)
//
val nl = phil_left (n) // = n
val nr = phil_right (n) // = (n+1) % 5
//
val ch_lfork = fork_changet (nl)
val ch_rfork = fork_changet (nr)
//
val lf = channel_takeout (ch_lfork)
val () = println! ("phil_loop(", n, ") picks left fork")
//
val () = randsleep (2) // sleep up to 2 seconds
//
val rf = channel_takeout (ch_rfork)
val () = println! ("phil_loop(", n, ") picks right fork")
//
val () = phil_dine (n, lf, rf)
//
val ch_forktray = forktray_changet ()
val () = channel_insert (ch_forktray, lf) // left fork to dirty tray
val () = channel_insert (ch_forktray, rf) // right fork to dirty tray
//
in
phil_loop (n)
end // end of [phil_loop]
')</pre>
It should be straightforward to follow the code for [phil_loop].
<h2>
Fork Cleaner Loop
</h2>
A cleaner is implemented as a loop:
<pre
class="patsyntax">
#pats2xhtml_dats('\
implement
cleaner_loop () = let
//
val ch = forktray_changet ()
val f0 = channel_takeout (ch) // [f0] is dirty
//
val () = cleaner_wash (f0) // washes dirty [f0]
val () = cleaner_return (f0) // puts back cleaned [f0]
//
in
cleaner_loop ()
end // end of [cleaner_loop]
')</pre>
The function [cleaner_return] first finds out the number of a given fork
and then uses the number to locate the channel for storing the fork. Its
actual implementation is given as follows:
<pre
class="patsyntax">
#pats2xhtml_dats('\
implement
cleaner_return (f) =
{
val n = fork_get_num (f)
val ch = fork_changet (n)
val () = channel_insert (ch, f)
}
')</pre>
It should now be straightforward to follow the code for [cleaner_loop].
<h2>
Testing
</h2>
The entire code of this implementation is stored in the following files:
<pre>
DiningPhil2.sats
DiningPhil2.dats
DiningPhil2_fork.dats
DiningPhil2_thread.dats
</pre>
There is also a Makefile available for compiling the ATS source code into
an executable for testing. One should be able to encounter a deadlock after
running the simulation for a while.
<hr size="2">
This article is written by <a href="http://www.cs.bu.edu/~hwxi/">Hongwei Xi</a>.
</body>
</html>
%{
implement main () = fprint_filsub (stdout_ref, "main_atxt.txt")
%}

116
samples/C++/qsciprinter.cp Normal file
View File

@@ -0,0 +1,116 @@
// This module defines interface to the QsciPrinter class.
//
// Copyright (c) 2011 Riverbank Computing Limited <info@riverbankcomputing.com>
//
// This file is part of QScintilla.
//
// This file may be used under the terms of the GNU General Public
// License versions 2.0 or 3.0 as published by the Free Software
// Foundation and appearing in the files LICENSE.GPL2 and LICENSE.GPL3
// included in the packaging of this file. Alternatively you may (at
// your option) use any later version of the GNU General Public
// License if such license has been publicly approved by Riverbank
// Computing Limited (or its successors, if any) and the KDE Free Qt
// Foundation. In addition, as a special exception, Riverbank gives you
// certain additional rights. These rights are described in the Riverbank
// GPL Exception version 1.1, which can be found in the file
// GPL_EXCEPTION.txt in this package.
//
// If you are unsure which license is appropriate for your use, please
// contact the sales department at sales@riverbankcomputing.com.
//
// This file is provided AS IS with NO WARRANTY OF ANY KIND, INCLUDING THE
// WARRANTY OF DESIGN, MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
#ifndef QSCIPRINTER_H
#define QSCIPRINTER_H
#ifdef __APPLE__
extern "C++" {
#endif
#include <qprinter.h>
#include <Qsci/qsciglobal.h>
#include <Qsci/qsciscintilla.h>
QT_BEGIN_NAMESPACE
class QRect;
class QPainter;
QT_END_NAMESPACE
class QsciScintillaBase;
//! \brief The QsciPrinter class is a sub-class of the Qt QPrinter class that
//! is able to print the text of a Scintilla document.
//!
//! The class can be further sub-classed to alter to layout of the text, adding
//! headers and footers for example.
class QSCINTILLA_EXPORT QsciPrinter : public QPrinter
{
public:
//! Constructs a printer paint device with mode \a mode.
QsciPrinter(PrinterMode mode = ScreenResolution);
//! Destroys the QsciPrinter instance.
virtual ~QsciPrinter();
//! Format a page, by adding headers and footers for example, before the
//! document text is drawn on it. \a painter is the painter to be used to
//! add customised text and graphics. \a drawing is true if the page is
//! actually being drawn rather than being sized. \a painter drawing
//! methods must only be called when \a drawing is true. \a area is the
//! area of the page that will be used to draw the text. This should be
//! modified if it is necessary to reserve space for any customised text or
//! graphics. By default the area is relative to the printable area of the
//! page. Use QPrinter::setFullPage() because calling printRange() if you
//! want to try and print over the whole page. \a pagenr is the number of
//! the page. The first page is numbered 1.
virtual void formatPage(QPainter &painter, bool drawing, QRect &area,
int pagenr);
//! Return the number of points to add to each font when printing.
//!
//! \sa setMagnification()
int magnification() const {return mag;}
//! Sets the number of points to add to each font when printing to \a
//! magnification.
//!
//! \sa magnification()
virtual void setMagnification(int magnification);
//! Print a range of lines from the Scintilla instance \a qsb. \a from is
//! the first line to print and a negative value signifies the first line
//! of text. \a to is the last line to print and a negative value
//! signifies the last line of text. true is returned if there was no
//! error.
virtual int printRange(QsciScintillaBase *qsb, int from = -1, int to = -1);
//! Return the line wrap mode used when printing. The default is
//! QsciScintilla::WrapWord.
//!
//! \sa setWrapMode()
QsciScintilla::WrapMode wrapMode() const {return wrap;}
//! Sets the line wrap mode used when printing to \a wmode.
//!
//! \sa wrapMode()
virtual void setWrapMode(QsciScintilla::WrapMode wmode);
private:
int mag;
QsciScintilla::WrapMode wrap;
QsciPrinter(const QsciPrinter &);
QsciPrinter &operator=(const QsciPrinter &);
};
#ifdef __APPLE__
}
#endif
#endif

343
samples/CLIPS/demo.clp Normal file
View File

@@ -0,0 +1,343 @@
;;;***************************
;;;* DEFFACTS KNOWLEDGE BASE *
;;;***************************
(deffacts MAIN::knowledge-base
(welcome (message WelcomeMessage))
(goal (variable type.animal))
(legalanswers (values yes no))
(displayanswers (values "Yes" "No"))
(rule (if backbone is yes)
(then superphylum is backbone))
(rule (if backbone is no)
(then superphylum is jellyback))
(question (variable backbone)
(query backbone.query))
(rule (if superphylum is backbone and
warm.blooded is yes)
(then phylum is warm))
(rule (if superphylum is backbone and
warm.blooded is no)
(then phylum is cold))
(question (variable warm.blooded)
(query warm.blooded.query))
(rule (if superphylum is jellyback and
live.prime.in.soil is yes)
(then phylum is soil))
(rule (if superphylum is jellyback and
live.prime.in.soil is no)
(then phylum is elsewhere))
(question (variable live.prime.in.soil)
(query live.prime.in.soil.query))
(rule (if phylum is warm and
has.breasts is yes)
(then class is breasts))
(rule (if phylum is warm and
has.breasts is no)
(then type.animal is bird))
(question (variable has.breasts)
(query has.breasts.query))
(rule (if phylum is cold and
always.in.water is yes)
(then class is water))
(rule (if phylum is cold and
always.in.water is no)
(then class is dry))
(question (variable always.in.water)
(query always.in.water.query))
(rule (if phylum is soil and
flat.bodied is yes)
(then type.animal is flatworm))
(rule (if phylum is soil and
flat.bodied is no)
(then type.animal is worm.leech))
(question (variable flat.bodied)
(query flat.bodied.query))
(rule (if phylum is elsewhere and
body.in.segments is yes)
(then class is segments))
(rule (if phylum is elsewhere and
body.in.segments is no)
(then class is unified))
(question (variable body.in.segments)
(query body.in.segments.query))
(rule (if class is breasts and
can.eat.meat is yes)
(then order is meat))
(rule (if class is breasts and
can.eat.meat is no)
(then order is vegy))
(question (variable can.eat.meat)
(query can.eat.meat.query))
(rule (if class is water and
boney is yes)
(then type.animal is fish))
(rule (if class is water and
boney is no)
(then type.animal is shark.ray))
(question (variable boney)
(query boney.query))
(rule (if class is dry and
scaly is yes)
(then order is scales))
(rule (if class is dry and
scaly is no)
(then order is soft))
(question (variable scaly)
(query scaly.query))
(rule (if class is segments and
shell is yes)
(then order is shell))
(rule (if class is segments and
shell is no)
(then type.animal is centipede.millipede.insect))
(question (variable shell)
(query shell.query))
(rule (if class is unified and
digest.cells is yes)
(then order is cells))
(rule (if class is unified and
digest.cells is no)
(then order is stomach))
(question (variable digest.cells)
(query digest.cells.query))
(rule (if order is meat and
fly is yes)
(then type.animal is bat))
(rule (if order is meat and
fly is no)
(then family is nowings))
(question (variable fly)
(query fly.query))
(rule (if order is vegy and
hooves is yes)
(then family is hooves))
(rule (if order is vegy and
hooves is no)
(then family is feet))
(question (variable hooves)
(query hooves.query))
(rule (if order is scales and
rounded.shell is yes)
(then type.animal is turtle))
(rule (if order is scales and
rounded.shell is no)
(then family is noshell))
(question (variable rounded.shell)
(query rounded.shell.query))
(rule (if order is soft and
jump is yes)
(then type.animal is frog))
(rule (if order is soft and
jump is no)
(then type.animal is salamander))
(question (variable jump)
(query jump.query))
(rule (if order is shell and
tail is yes)
(then type.animal is lobster))
(rule (if order is shell and
tail is no)
(then type.animal is crab))
(question (variable tail)
(query tail.query))
(rule (if order is cells and
stationary is yes)
(then family is stationary))
(rule (if order is cells and
stationary is no)
(then type.animal is jellyfish))
(question (variable stationary)
(query stationary.query))
(rule (if order is stomach and
multicelled is yes)
(then family is multicelled))
(rule (if order is stomach and
multicelled is no)
(then type.animal is protozoa))
(question (variable multicelled)
(query multicelled.query))
(rule (if family is nowings and
opposing.thumb is yes)
(then genus is thumb))
(rule (if family is nowings and
opposing.thumb is no)
(then genus is nothumb))
(question (variable opposing.thumb)
(query opposing.thumb.query))
(rule (if family is hooves and
two.toes is yes)
(then genus is twotoes))
(rule (if family is hooves and
two.toes is no)
(then genus is onetoe))
(question (variable two.toes)
(query two.toes.query))
(rule (if family is feet and
live.in.water is yes)
(then genus is water))
(rule (if family is feet and
live.in.water is no)
(then genus is dry))
(question (variable live.in.water)
(query live.in.water.query))
(rule (if family is noshell and
limbs is yes)
(then type.animal is crocodile.alligator))
(rule (if family is noshell and
limbs is no)
(then type.animal is snake))
(question (variable limbs)
(query limbs.query))
(rule (if family is stationary and
spikes is yes)
(then type.animal is sea.anemone))
(rule (if family is stationary and
spikes is no)
(then type.animal is coral.sponge))
(question (variable spikes)
(query spikes.query))
(rule (if family is multicelled and
spiral.shell is yes)
(then type.animal is snail))
(rule (if family is multicelled and
spiral.shell is no)
(then genus is noshell))
(question (variable spiral.shell)
(query spiral.shell.query))
(rule (if genus is thumb and
prehensile.tail is yes)
(then type.animal is monkey))
(rule (if genus is thumb and
prehensile.tail is no)
(then species is notail))
(question (variable prehensile.tail)
(query prehensile.tail.query))
(rule (if genus is nothumb and
over.400 is yes)
(then species is 400))
(rule (if genus is nothumb and
over.400 is no)
(then species is under400))
(question (variable over.400)
(query over.400.query))
(rule (if genus is twotoes and
horns is yes)
(then species is horns))
(rule (if genus is twotoes and
horns is no)
(then species is nohorns))
(question (variable horns)
(query horns.query))
(rule (if genus is onetoe and
plating is yes)
(then type.animal is rhinoceros))
(rule (if genus is onetoe and
plating is no)
(then type.animal is horse.zebra))
(question (variable plating)
(query plating.query))
(rule (if genus is water and
hunted is yes)
(then type.animal is whale))
(rule (if genus is water and
hunted is no)
(then type.animal is dolphin.porpoise))
(question (variable hunted)
(query hunted.query))
(rule (if genus is dry and
front.teeth is yes)
(then species is teeth))
(rule (if genus is dry and
front.teeth is no)
(then species is noteeth))
(question (variable front.teeth)
(query front.teeth.query))
(rule (if genus is noshell and
bivalve is yes)
(then type.animal is clam.oyster))
(rule (if genus is noshell and
bivalve is no)
(then type.animal is squid.octopus))
(question (variable bivalve)
(query bivalve.query))
(rule (if species is notail and
nearly.hairless is yes)
(then type.animal is man))
(rule (if species is notail and
nearly.hairless is no)
(then subspecies is hair))
(question (variable nearly.hairless)
(query nearly.hairless.query))
(rule (if species is 400 and
land.based is yes)
(then type.animal is bear.tiger.lion))
(rule (if species is 400 and
land.based is no)
(then type.animal is walrus))
(question (variable land.based)
(query land.based.query))
(rule (if species is under400 and
thintail is yes)
(then type.animal is cat))
(rule (if species is under400 and
thintail is no)
(then type.animal is coyote.wolf.fox.dog))
(question (variable thintail)
(query thintail.query))
(rule (if species is nohorns and
lives.in.desert is yes)
(then type.animal is camel))
(rule (if species is nohorns and
lives.in.desert is no and
semi.aquatic is no)
(then type.animal is giraffe))
(rule (if species is nohorns and
lives.in.desert is no and
semi.aquatic is yes)
(then type.animal is hippopotamus))
(question (variable lives.in.desert)
(query lives.in.desert.query))
(question (variable semi.aquatic)
(query semi.aquatic.query))
(rule (if species is teeth and
large.ears is yes)
(then type.animal is rabbit))
(rule (if species is teeth and
large.ears is no)
(then type.animal is rat.mouse.squirrel.beaver.porcupine))
(question (variable large.ears)
(query large.ears.query))
(rule (if species is noteeth and
pouch is yes)
(then type.animal is kangaroo.koala.bear))
(rule (if species is noteeth and
pouch is no)
(then type.animal is mole.shrew.elephant))
(question (variable pouch)
(query pouch.query))
(rule (if subspecies is hair and
long.powerful.arms is yes)
(then type.animal is orangutan.gorilla.chimpanzee))
(rule (if subspecies is hair and
long.powerful.arms is no)
(then type.animal is baboon))
(question (variable long.powerful.arms)
(query long.powerful.arms.query))
(rule (if species is horns and
fleece is yes)
(then type.animal is sheep.goat))
(rule (if species is horns and
fleece is no)
(then subsubspecies is nofleece))
(question (variable fleece)
(query fleece.query))
(rule (if subsubspecies is nofleece and
domesticated is yes)
(then type.animal is cow))
(rule (if subsubspecies is nofleece and
domesticated is no)
(then type.animal is deer.moose.antelope))
(question (variable domesticated)
(query domesticated.query))
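; The answer form below reports the final value of type.animal, wrapped in the
; given prefix and postfix text.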
(answer (prefix "I think your animal is a ") (variable type.animal) (postfix ".")))

samples/CLIPS/sudoku.clp Normal file

@@ -0,0 +1,281 @@
;;; http://www.angusj.com/sudoku/hints
;;; http://www.scanraid.com/BasicStrategies.htm
;;; http://www.sudokuoftheday.com/pages/techniques-overview
;;; http://www.sudokuonline.us/sudoku_solving_techniques
;;; http://www.sadmansoftware.com/sudoku/techniques.htm
;;; http://www.krazydad.com/blog/2005/09/29/an-index-of-sudoku-strategies/
;;; #######################
;;; DEFTEMPLATES & DEFFACTS
;;; #######################
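;;; A "possible" fact records one candidate value for a cell (keyed by row,
;;; column, group and cell id); an "impossible" fact flags a candidate for
;;; elimination together with the technique (reason) and priority that found it.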
(deftemplate possible
(slot row)
(slot column)
(slot value)
(slot group)
(slot id))
(deftemplate impossible
(slot id)
(slot value)
(slot priority)
(slot reason))
(deftemplate technique-employed
(slot reason)
(slot priority))
(deftemplate technique
(slot name)
(slot priority))
(deffacts startup
(phase grid-values))
(deftemplate size-value
(slot size)
(slot value))
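;;; size-value facts enumerate the legal cell values for each supported grid
;;; size (size 3 covers values 1-9, i.e. the classic 9x9 puzzle).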
(deffacts values
(size-value (size 1) (value 1))
(size-value (size 2) (value 2))
(size-value (size 2) (value 3))
(size-value (size 2) (value 4))
(size-value (size 3) (value 5))
(size-value (size 3) (value 6))
(size-value (size 3) (value 7))
(size-value (size 3) (value 8))
(size-value (size 3) (value 9))
(size-value (size 4) (value 10))
(size-value (size 4) (value 11))
(size-value (size 4) (value 12))
(size-value (size 4) (value 13))
(size-value (size 4) (value 14))
(size-value (size 4) (value 15))
(size-value (size 4) (value 16))
(size-value (size 5) (value 17))
(size-value (size 5) (value 18))
(size-value (size 5) (value 19))
(size-value (size 5) (value 20))
(size-value (size 5) (value 21))
(size-value (size 5) (value 22))
(size-value (size 5) (value 23))
(size-value (size 5) (value 24))
(size-value (size 5) (value 25)))
;;; ###########
;;; SETUP RULES
;;; ###########
;;; ***********
;;; stress-test
;;; ***********
(defrule stress-test
(declare (salience 10))
(phase match)
(stress-test)
(priority ?last)
(not (priority ?p&:(> ?p ?last)))
(technique (priority ?next&:(> ?next ?last)))
(not (technique (priority ?p&:(> ?p ?last)&:(< ?p ?next))))
=>
(assert (priority ?next)))
;;; *****************
;;; enable-techniques
;;; *****************
(defrule enable-techniques
(declare (salience 10))
(phase match)
(size ?)
(not (possible (value any)))
=>
(assert (priority 1)))
;;; **********
;;; expand-any
;;; **********
(defrule expand-any
(declare (salience 10))
(phase expand-any)
?f <- (possible (row ?r) (column ?c) (value any) (group ?g) (id ?id))
(not (possible (value any) (id ?id2&:(< ?id2 ?id))))
(size ?s)
(size-value (size ?as&:(<= ?as ?s)) (value ?v))
(not (possible (row ?r) (column ?c) (value ?v)))
(not (and (size-value (value ?v2&:(< ?v2 ?v)))
(not (possible (row ?r) (column ?c) (value ?v2)))))
=>
(assert (possible (row ?r) (column ?c) (value ?v) (group ?g) (id ?id))))
;;; *****************
;;; position-expanded
;;; *****************
(defrule position-expanded
(declare (salience 10))
(phase expand-any)
?f <- (possible (row ?r) (column ?c) (value any) (group ?g) (id ?id))
(size ?s)
(not (and (size-value (size ?as&:(<= ?as ?s)) (value ?v))
(not (possible (row ?r) (column ?c) (value ?v)))))
=>
(retract ?f))
;;; ###########
;;; PHASE RULES
;;; ###########
;;; ***************
;;; expand-any-done
;;; ***************
(defrule expand-any-done
(declare (salience 10))
?f <- (phase expand-any)
(not (possible (value any)))
=>
(retract ?f)
(assert (phase initial-output))
(assert (print-position 1 1)))
;;; ***********
;;; begin-match
;;; ***********
(defrule begin-match
(declare (salience -20))
?f <- (phase initial-output)
=>
(retract ?f)
(assert (phase match)))
;;; *****************
;;; begin-elimination
;;; *****************
(defrule begin-elimination
(declare (salience -20))
?f <- (phase match)
(not (not (impossible)))
=>
(retract ?f)
(assert (phase elimination)))
;;; *************
;;; next-priority
;;; *************
(defrule next-priority
(declare (salience -20))
(phase match)
(not (impossible))
(priority ?last)
(not (priority ?p&:(> ?p ?last)))
(technique (priority ?next&:(> ?next ?last)))
(not (technique (priority ?p&:(> ?p ?last)&:(< ?p ?next))))
=>
(assert (priority ?next)))
;;; ************
;;; begin-output
;;; ************
(defrule begin-output
(declare (salience -20))
?f <- (phase match)
(not (impossible))
(priority ?last)
(not (priority ?p&:(> ?p ?last)))
(not (technique (priority ?next&:(> ?next ?last))))
=>
(retract ?f)
(assert (phase final-output))
(assert (print-position 1 1)))


@@ -0,0 +1,15 @@
cmake_minimum_required(VERSION 2.6)
enable_testing()
set(CMAKE_BUILD_TYPE debug)
include_directories("/usr/local/include")
find_library(ssl_LIBRARY NAMES ssl PATHS "/usr/local/lib")
add_custom_command(OUTPUT "ver.c" "ver.h" COMMAND ./ver.sh)
add_executable(foo foo.c bar.c baz.c ver.c)
target_link_libraries(foo ${ssl_LIBRARY})


@@ -0,0 +1,25 @@
cmake_minimum_required(VERSION 2.8 FATAL_ERROR)
project(PCLVisualizer)
target_link_libraries (PCLVisualizer ${PCL_LIBRARIES})
#it seems it's needed only on OS X 10.9
find_package(GLEW REQUIRED)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -I/usr/include -v")
find_package(PCL 1.7 REQUIRED)
include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})
set(PCL_BUILD_TYPE Release)
file(GLOB PCL_openni_viewer_SRC
"src/*.h"
"src/*.cpp"
)
add_executable(PCLVisualizer ${PCL_openni_viewer_SRC})
# add this line to solve a problem on Mac OS X 10.9
target_link_libraries(PCLVisualizer ${PCL_COMMON_LIBRARIES} ${PCL_IO_LIBRARIES} ${PCL_VISUALIZATION_LIBRARIES} ${PCL_FEATURES_LIBRARIES})


@@ -0,0 +1,33 @@
# Specifications for building user and development documentation.
#
# ====================================================================
# Copyright (c) 2009 Ian Blumel. All rights reserved.
#
# This software is licensed as described in the file LICENSE, which
# you should have received as part of this distribution.
# ====================================================================
CMAKE_MINIMUM_REQUIRED(VERSION 2.6)
FIND_FILE( SPHINX sphinx-build.exe)
# If we are windows call to the make.bat file, otherwise rely on the Makefile
# to handle the processing.
IF(WIN32)
SET(SPHINX_MAKE make.bat)
ELSE(WIN32)
SET(SPHINX_MAKE make)
ENDIF(WIN32)
ADD_CUSTOM_TARGET(
doc_usr
COMMAND ${SPHINX_MAKE} html
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/usr
)
ADD_CUSTOM_TARGET(
doc_dev
COMMAND ${SPHINX_MAKE} html
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/dev
)


@@ -0,0 +1,33 @@
cmake_minimum_required (VERSION 2.6)
set (CMAKE_RUNTIME_OUTPUT_DIRECTORY "bin")
list(APPEND CMAKE_MODULE_PATH ${CMAKE_SOURCE_DIR}/cmake/vala)
find_package(Vala REQUIRED)
include(ValaPrecompile)
include(ValaVersion)
ensure_vala_version("0.11.0" MINIMUM)
project (template C)
find_package(PkgConfig)
pkg_check_modules(GOBJECT REQUIRED gobject-2.0)
add_definitions(${GOBJECT_CFLAGS} ${GOBJECT_CFLAGS_OTHER})
link_libraries(${GOBJECT_LIBRARIES})
link_directories(${GOBJECT_LIBRARY_DIRS})
vala_precompile(VALA_C
src/template.vala
PACKAGES
OPTIONS
--thread
CUSTOM_VAPIS
GENERATE_VAPI
GENERATE_HEADER
DIRECTORY
gen
)
add_executable("template" ${VALA_C})


@@ -0,0 +1,89 @@
# - Check if the STDCALL function exists.
# This works for non-cdecl functions (kernel32 functions, for example)
# CHECK_STDCALL_FUNCTION_EXISTS(FUNCTION_DECLARATION VARIABLE)
# - macro which checks if the stdcall function exists
# FUNCTION_DECLARATION - the definition of the function ( e.g.: Sleep(500) )
# VARIABLE - variable to store the result
#
# The following variables may be set before calling this macro to
# modify the way the check is run:
#
# CMAKE_REQUIRED_FLAGS = string of compile command line flags
# CMAKE_REQUIRED_DEFINITIONS = list of macros to define (-DFOO=bar)
# CMAKE_REQUIRED_INCLUDES = list of include directories
# CMAKE_REQUIRED_LIBRARIES = list of libraries to link
# CMAKE_EXTRA_INCLUDE_FILES = list of extra includes to check in
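#
# A hypothetical call (names invented here for illustration) might look like:
#   CHECK_STDCALL_FUNCTION_EXISTS("Sleep(500)" HAVE_STDCALL_SLEEP)
# which tries to compile a call to Sleep(500) and caches the result in HAVE_STDCALL_SLEEP.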
MACRO(CHECK_STDCALL_FUNCTION_EXISTS FUNCTION_DECLARATION VARIABLE)
IF("${VARIABLE}" MATCHES "^${VARIABLE}$")
#get includes
SET(CHECK_STDCALL_FUNCTION_PREMAIN)
FOREACH(def ${CMAKE_EXTRA_INCLUDE_FILES})
SET(CHECK_STDCALL_FUNCTION_PREMAIN "${CHECK_STDCALL_FUNCTION_PREMAIN}#include \"${def}\"\n")
ENDFOREACH(def)
#add some default includes
IF ( HAVE_WINDOWS_H )
SET(CHECK_STDCALL_FUNCTION_PREMAIN "${CHECK_STDCALL_FUNCTION_PREMAIN}#include \"windows.h\"\n")
ENDIF ( HAVE_WINDOWS_H )
IF ( HAVE_UNISTD_H )
SET(CHECK_STDCALL_FUNCTION_PREMAIN "${CHECK_STDCALL_FUNCTION_PREMAIN}#include \"unistd.h\"\n")
ENDIF ( HAVE_UNISTD_H )
IF ( HAVE_DIRECT_H )
SET(CHECK_STDCALL_FUNCTION_PREMAIN "${CHECK_STDCALL_FUNCTION_PREMAIN}#include \"direct.h\"\n")
ENDIF ( HAVE_DIRECT_H )
IF ( HAVE_IO_H )
SET(CHECK_STDCALL_FUNCTION_PREMAIN "${CHECK_STDCALL_FUNCTION_PREMAIN}#include \"io.h\"\n")
ENDIF ( HAVE_IO_H )
IF ( HAVE_SYS_TIMEB_H )
SET(CHECK_STDCALL_FUNCTION_PREMAIN "${CHECK_STDCALL_FUNCTION_PREMAIN}#include \"sys/timeb.h\"\n")
ENDIF ( HAVE_SYS_TIMEB_H )
STRING(REGEX REPLACE "(\\(.*\\))" "" CHECK_STDCALL_FUNCTION_EXISTS_FUNCTION ${FUNCTION_DECLARATION} )
SET(MACRO_CHECK_STDCALL_FUNCTION_DEFINITIONS "${CMAKE_REQUIRED_FLAGS}")
MESSAGE(STATUS "Looking for ${CHECK_STDCALL_FUNCTION_EXISTS_FUNCTION}")
IF(CMAKE_REQUIRED_LIBRARIES)
SET(CHECK_STDCALL_FUNCTION_EXISTS_ADD_LIBRARIES
"-DLINK_LIBRARIES:STRING=${CMAKE_REQUIRED_LIBRARIES}")
ELSE(CMAKE_REQUIRED_LIBRARIES)
SET(CHECK_STDCALL_FUNCTION_EXISTS_ADD_LIBRARIES)
ENDIF(CMAKE_REQUIRED_LIBRARIES)
IF(CMAKE_REQUIRED_INCLUDES)
SET(CHECK_STDCALL_FUNCTION_EXISTS_ADD_INCLUDES
"-DINCLUDE_DIRECTORIES:STRING=${CMAKE_REQUIRED_INCLUDES}")
ELSE(CMAKE_REQUIRED_INCLUDES)
SET(CHECK_STDCALL_FUNCTION_EXISTS_ADD_INCLUDES)
ENDIF(CMAKE_REQUIRED_INCLUDES)
SET(CHECK_STDCALL_FUNCTION_DECLARATION ${FUNCTION_DECLARATION})
CONFIGURE_FILE("${clucene-shared_SOURCE_DIR}/cmake/CheckStdCallFunctionExists.cpp.in"
"${CMAKE_BINARY_DIR}${CMAKE_FILES_DIRECTORY}/CMakeTmp/CheckStdCallFunctionExists.cpp" IMMEDIATE @ONLY)
FILE(READ "${CMAKE_BINARY_DIR}${CMAKE_FILES_DIRECTORY}/CMakeTmp/CheckStdCallFunctionExists.cpp"
CHECK_STDCALL_FUNCTION_CONTENT)
TRY_COMPILE(${VARIABLE}
${CMAKE_BINARY_DIR}
"${CMAKE_BINARY_DIR}${CMAKE_FILES_DIRECTORY}/CMakeTmp/CheckStdCallFunctionExists.cpp"
COMPILE_DEFINITIONS ${CMAKE_REQUIRED_DEFINITIONS}
CMAKE_FLAGS -DCOMPILE_DEFINITIONS:STRING=${MACRO_CHECK_STDCALL_FUNCTION_DEFINITIONS}
"${CHECK_STDCALL_FUNCTION_EXISTS_ADD_LIBRARIES}"
"${CHECK_STDCALL_FUNCTION_EXISTS_ADD_INCLUDES}"
OUTPUT_VARIABLE OUTPUT)
IF(${VARIABLE})
SET(${VARIABLE} 1 CACHE INTERNAL "Have function ${FUNCTION_DECLARATION}")
MESSAGE(STATUS "Looking for ${FUNCTION_DECLARATION} - found")
FILE(APPEND ${CMAKE_BINARY_DIR}${CMAKE_FILES_DIRECTORY}/CMakeOutput.log
"Determining if the stdcall function ${FUNCTION_DECLARATION} exists passed with the following output:\n"
"${OUTPUT}\nCheckStdCallFunctionExists.cpp:\n${CHECK_STDCALL_FUNCTION_CONTENT}\n\n")
ELSE(${VARIABLE})
MESSAGE(STATUS "Looking for ${FUNCTION_DECLARATION} - not found")
SET(${VARIABLE} "" CACHE INTERNAL "Have function ${FUNCTION_DECLARATION}")
FILE(APPEND ${CMAKE_BINARY_DIR}${CMAKE_FILES_DIRECTORY}/CMakeError.log
"Determining if the stdcall function ${FUNCTION_DECLARATION} exists failed with the following output:\n"
"${OUTPUT}\nCheckStdCallFunctionExists.cpp:\n${CHECK_STDCALL_FUNCTION_CONTENT}\n\n")
ENDIF(${VARIABLE})
ENDIF("${VARIABLE}" MATCHES "^${VARIABLE}$")
ENDMACRO(CHECK_STDCALL_FUNCTION_EXISTS)


@@ -0,0 +1,22 @@
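# Uninstall helper (evidently a configure_file() template, given the @...@
# placeholders): it reads install_manifest.txt and removes each recorded file
# via "cmake -E remove".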
IF (NOT EXISTS "@PROJECT_BINARY_DIR@/install_manifest.txt")
MESSAGE (FATAL_ERROR "Cannot find install manifest: \"@PROJECT_BINARY_DIR@/install_manifest.txt\"")
ENDIF (NOT EXISTS "@PROJECT_BINARY_DIR@/install_manifest.txt")
FILE (READ "@PROJECT_BINARY_DIR@/install_manifest.txt" files)
STRING (REGEX REPLACE "\n" ";" files "${files}")
FOREACH (file ${files})
MESSAGE (STATUS "Uninstalling \"$ENV{DESTDIR}${file}\"")
IF (EXISTS "$ENV{DESTDIR}${file}")
EXEC_PROGRAM (
"@CMAKE_COMMAND@" ARGS "-E remove \"$ENV{DESTDIR}${file}\""
OUTPUT_VARIABLE rm_out
RETURN_VALUE rm_retval
)
IF (NOT "${rm_retval}" STREQUAL 0)
MESSAGE (FATAL_ERROR "Problem when removing \"$ENV{DESTDIR}${file}\"")
ENDIF (NOT "${rm_retval}" STREQUAL 0)
ELSE (EXISTS "$ENV{DESTDIR}${file}")
MESSAGE (STATUS "File \"$ENV{DESTDIR}${file}\" does not exist.")
ENDIF (EXISTS "$ENV{DESTDIR}${file}")
ENDFOREACH (file)


@@ -0,0 +1,21 @@
;;;; -*- lisp -*-
(in-package :foo)
;;; Header comment.
(defvar *foo*)
(eval-when (:execute :compile-toplevel :load-toplevel)
(defun add (x &optional y &key z)
(declare (ignore z))
;; Inline comment.
(+ x (or y 1))))
#|
Multi-line comment.
|#
(defmacro foo (x &body b)
(if x
`(1+ ,x) ;After-line comment.
42))


@@ -0,0 +1,29 @@
;; -*- mode: emacs-lisp; coding: emacs-mule; -*-
;; --------------------------------------------------------------------------
;; Desktop File for Emacs
;; --------------------------------------------------------------------------
;; Created Sat Jan 3 12:46:35 2015
;; Desktop file format version 206
;; Emacs version 24.3.1
;; Global section:
(setq desktop-missing-file-warning nil)
(setq tags-file-name nil)
(setq tags-table-list nil)
(setq search-ring nil)
(setq regexp-search-ring nil)
(setq register-alist nil)
(setq file-name-history nil)
;; Buffer section -- buffers listed in same order as in buffer list:
(desktop-create-buffer 206
"/home/foo/bar"
"bar"
'fundamental-mode
nil
11572
'(11554 nil)
nil
nil
'((buffer-file-coding-system . undecided-unix)))

samples/Forth/tools.4TH Normal file

@@ -0,0 +1,133 @@
\ -*- forth -*- Copyright 2004, 2013 Lars Brinkhoff
( Tools words. )
: .s ( -- )
[char] < emit depth (.) ." > "
'SP @ >r r@ depth 1- cells +
begin
dup r@ <>
while
dup @ .
/cell -
repeat r> 2drop ;
: ? @ . ;
: c? c@ . ;
: dump bounds do i ? /cell +loop cr ;
: cdump bounds do i c? loop cr ;
: again postpone branch , ; immediate
: see-find ( caddr -- end xt )
>r here lastxt @
begin
dup 0= abort" Undefined word"
dup r@ word= if r> drop exit then
nip dup >nextxt
again ;
: cabs ( char -- |char| ) dup 127 > if 256 swap - then ;
: xt. ( xt -- )
( >name ) count cabs type ;
: xt? ( xt -- flag )
>r lastxt @ begin
?dup
while
dup r@ = if r> 2drop -1 exit then
>nextxt
repeat r> drop 0 ;
: disassemble ( x -- )
dup xt? if
( >name ) count
dup 127 > if ." postpone " then
cabs type
else
.
then ;
: .addr dup . ;
: see-line ( addr -- )
cr ." ( " .addr ." ) " @ disassemble ;
: see-word ( end xt -- )
>r ." : " r@ xt.
r@ >body do i see-line /cell +loop
." ;" r> c@ 127 > if ." immediate" then ;
: see bl word see-find see-word cr ;
: #body bl word see-find >body - ;
: type-word ( end xt -- flag )
xt. space drop 0 ;
: traverse-dictionary ( in.. xt -- out.. )
\ xt execution: ( in.. end xt2 -- in.. 0 | in.. end xt2 -- out.. true )
>r here lastxt @ begin
?dup
while
r> 2dup >r >r execute
if r> r> 2drop exit then
r> dup >nextxt
repeat r> 2drop ;
: words ( -- )
['] type-word traverse-dictionary cr ;
\ ----------------------------------------------------------------------
( Tools extension words. )
\ ;code
\ assembler
\ in kernel: bye
\ code
\ cs-pick
\ cs-roll
\ editor
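\ forget ( "name" -- ) discards the named word and everything defined after it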
: forget ' dup >nextxt lastxt ! 'here ! reveal ;
\ Kernel: state
\ [else]
\ [if]
\ [then]
\ ----------------------------------------------------------------------
( Forth2012 tools extension words. )
\ TODO: n>r
\ TODO: nr>
\ TODO: synonym
: [undefined] bl-word find nip 0= ; immediate
: [defined] postpone [undefined] invert ; immediate
\ ----------------------------------------------------------------------
: @+ ( addr -- addr+/cell x ) dup cell+ swap @ ;
: !+ ( x addr -- addr+/cell ) tuck ! cell+ ;
: -rot swap >r swap r> ;

samples/GAP/bugfix.tst Normal file

@@ -0,0 +1,161 @@
gap> START_TEST("Test for various former bugs");
gap> # The following used to trigger an error starting with:
gap> # "SolutionMat: matrix and vector incompatible called from"
gap> K:=AbelianPcpGroup([3,3,3]);;
gap> A:=Subgroup(K,[K.1]);;
gap> cr:=CRRecordBySubgroup(K,A);;
gap> ExtensionsCR(cr);;
# Comparing homomorphisms used to be broken
gap> K:=AbelianPcpGroup(1,[3]);;
gap> hom1:=GroupHomomorphismByImages(K,K,[K.1],[K.1]);;
gap> hom2:=GroupHomomorphismByImages(K,K,[K.1^2],[K.1^2]);;
gap> hom1=hom2;
true
gap> hom1=IdentityMapping(K);
true
gap> hom2=IdentityMapping(K);
true
gap> # The following incorrectly triggered an error at some point
gap> IsTorsionFree(ExamplesOfSomePcpGroups(5));
true
gap> # Verify IsGeneratorsOfMagmaWithInverses warnings are silenced
gap> IsGeneratorsOfMagmaWithInverses(GeneratorsOfGroup(ExamplesOfSomePcpGroups(5)));
true
gap> # Check for a bug reported 2012-01-19 by Robert Morse
gap> g := PcGroupToPcpGroup(SmallGroup(48,1));
Pcp-group with orders [ 2, 2, 2, 2, 3 ]
gap> # The next two commands used to trigger errors
gap> NonAbelianTensorSquare(Centre(g));
Pcp-group with orders [ 8 ]
gap> NonAbelianExteriorSquare(Centre(g));
Pcp-group with orders [ ]
gap> # Check for a bug reported 2012-01-19 by Robert Morse
gap> F := FreeGroup("x","y");
<free group on the generators [ x, y ]>
gap> x := F.1;; y := F.2;;
gap> G := F/[x^2/y^24, y^24, y^x/y^23];
<fp group on the generators [ x, y ]>
gap> iso := IsomorphismPcGroup(G);
[ x, y ] -> [ f1, f2*f5 ]
gap> iso1 := IsomorphismPcpGroup(Image(iso));
[ f1, f2, f3, f4, f5 ] -> [ g1, g2, g3, g4, g5 ]
gap> G := Image(iso*iso1);
Pcp-group with orders [ 2, 2, 2, 2, 3 ]
gap> # The next command used to trigger an error
gap> NonAbelianTensorSquare(Image(iso*iso1));
Pcp-group with orders [ 2, 2, 3, 2, 2, 2, 2 ]
gap> # The problem with the previous example is/was that Igs(G)
gap> # is set to a non-standard value:
gap> Igs(G);
[ g1, g2*g5, g3*g4*g5^2, g4*g5, g5 ]
gap> # Unfortunately, it seems that a lot of code that
gap> # really should be using Ngs or Cgs is using Igs incorrectly.
gap> # For example, direct products could return *invalid* embeddings:
gap> D := DirectProduct(G, G);
Pcp-group with orders [ 2, 2, 2, 2, 3, 2, 2, 2, 2, 3 ]
gap> hom:=Embedding(D,1);;
gap> mapi:=MappingGeneratorsImages(hom);;
gap> GroupHomomorphismByImages(Source(hom),Range(hom),mapi[1],mapi[2]) <> fail;
true
gap> hom:=Projection(D,1);;
gap> mapi:=MappingGeneratorsImages(hom);;
gap> GroupHomomorphismByImages(Source(hom),Range(hom),mapi[1],mapi[2]) <> fail;
true
gap> # Check for bug computing Schur extension of infinite cyclic groups,
gap> # found by Max Horn 2012-05-25
gap> G:=AbelianPcpGroup(1,[0]);
Pcp-group with orders [ 0 ]
gap> # The next command used to trigger an error
gap> SchurExtension(G);
Pcp-group with orders [ 0 ]
gap> # Check for bug computing Schur extensions of subgroups, found by MH 2012-05-25.
gap> G:=HeisenbergPcpGroup(2);
Pcp-group with orders [ 0, 0, 0, 0, 0 ]
gap> H:=Subgroup(G,[G.2^3*G.3^2, G.1^9]);
Pcp-group with orders [ 0, 0, 0 ]
gap> # The next command used to trigger an error
gap> SchurExtension(H);
Pcp-group with orders [ 0, 0, 0, 0, 0, 0 ]
gap> # Check for bug computing Schur extensions of subgroups, found by MH 2012-05-25.
gap> G:=HeisenbergPcpGroup(2);
Pcp-group with orders [ 0, 0, 0, 0, 0 ]
gap> H:=Subgroup(G,[G.1, G.2]);
Pcp-group with orders [ 0, 0 ]
gap> # The next command used to trigger an error
gap> SchurExtension(H);
Pcp-group with orders [ 0, 0, 0 ]
gap> # Check for bug computing normalizer of two subgroups, found by MH 2012-05-30.
gap> # The problem was caused by incorrect resp. overly restrictive use of Parent().
gap> G:=HeisenbergPcpGroup(2);
Pcp-group with orders [ 0, 0, 0, 0, 0 ]
gap> A:=Subgroup(Subgroup(G,[G.2,G.3,G.4,G.5]), [G.3]);
Pcp-group with orders [ 0 ]
gap> B:=Subgroup(Subgroup(G,[G.1,G.4,G.5]), [G.4]);
Pcp-group with orders [ 0 ]
gap> Normalizer(A,B);
Pcp-group with orders [ 0 ]
gap> # The following used to trigger the error "arguments must have a common parent group"
gap> Normalizer(B,A);
Pcp-group with orders [ 0 ]
gap> # In polycyclic 2.9 and 2.10, the code for 2-cohomology computations was broken.
gap> G := UnitriangularPcpGroup(3,0);
Pcp-group with orders [ 0, 0, 0 ]
gap> mats := G!.mats;
[ [ [ 1, 1, 0 ], [ 0, 1, 0 ], [ 0, 0, 1 ] ],
[ [ 1, 0, 0 ], [ 0, 1, 1 ], [ 0, 0, 1 ] ],
[ [ 1, 0, 1 ], [ 0, 1, 0 ], [ 0, 0, 1 ] ] ]
gap> C := CRRecordByMats(G,mats);;
gap> cc := TwoCohomologyCR(C);;
gap> cc.factor.rels;
[ 2, 0, 0 ]
gap> c := cc.factor.prei[2];
[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, -1, 1 ]
gap> cc.gcb;
[ [ 0, 0, 1, 0, 0, -1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 ],
[ 0, 0, -1, 0, 0, 0, 0, 0, 1, 0, 0, -1, 0, 0, 0, 0, 0, 0 ],
[ 0, -1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, -1 ],
[ -1, 0, 1, 1, 0, 0, 0, -1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0 ],
[ 0, -1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, 0, 1 ] ]
gap> cc.gcc;
[ [ 1, 0, 0, 0, 0, -2, -1, 0, 1, 1, -1, -1, 0, 0, 0, 0, 0, 0 ],
[ 0, 1, 0, 0, -1, -1, 0, 0, 1, 0, 0, -1, 0, 0, 0, 0, 0, 0 ],
[ 0, 0, 1, 0, 0, -2, 0, 0, 1, 0, 0, -1, 0, 0, 0, 0, 0, 0 ],
[ 0, 0, 0, 1, 0, 0, -1, -1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0 ],
[ 0, 0, 0, 0, 0, 1, 0, 0, -1, 0, 0, 1, 0, 0, 0, 0, 0, 0 ],
[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, -1, 1 ],
[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, -1 ] ]
gap> # LowerCentralSeriesOfGroup for non-nilpotent pcp-groups used to trigger
gap> # an infinite recursion
gap> G := PcGroupToPcpGroup(SmallGroup(6,1));
Pcp-group with orders [ 2, 3 ]
gap> LowerCentralSeriesOfGroup(G);
[ Pcp-group with orders [ 2, 3 ], Pcp-group with orders [ 3 ] ]
gap> STOP_TEST( "bugfix.tst", 10000000);

samples/GAP/factor.tst Normal file

@@ -0,0 +1,21 @@
gap> START_TEST("Test of factor groups and natural homomorphisms");
gap> G:=HeisenbergPcpGroup(2);
Pcp-group with orders [ 0, 0, 0, 0, 0 ]
gap> H:=Subgroup(G,[G.2,G.3,G.4,G.5]);
gap> K:=G/H;
gap> NaturalHomomorphism(K);
gap> A:=Subgroup(H, [G.3]);
Pcp-group with orders [ 0 ]
gap> B:=Subgroup(Subgroup(G,[G.1,G.4,G.5]), [G.4]);
Pcp-group with orders [ 0 ]
gap> Normalizer(A,B);
Pcp-group with orders [ 0 ]
gap> # The following used to trigger the error "arguments must have a common parent group"
gap> Normalizer(B,A);
Pcp-group with orders [ 0 ]
gap> STOP_TEST( "factor.tst", 10000000);

samples/HTML/index.html.hl Normal file

@@ -0,0 +1,328 @@
<script type="text/hoplon">
(page "index.html")
(defn mouse-loc->vec
"Given a Google Closure normalized DOM mouse event return the
mouse x and y position as a two element vector."
[e]
[(.-clientX e) (.-clientY e)])
;; =============================================================================
;; Example 1
(defc ex1-content ["Waiting for a click ...."])
(defc ex1-click-count 0)
(defn ex1 []
(when (< @ex1-click-count 1)
(swap! ex1-click-count inc)
(swap! ex1-content conj "Got a click!")))
;; =============================================================================
;; Example 2
(defc ex2-content ["Waiting for a click ...."])
(defc ex2-click-count 0)
(defn ex2 []
(when (= @ex2-click-count 1)
(swap! ex2-click-count inc)
(swap! ex2-content conj "Done"))
(when (= @ex2-click-count 0)
(swap! ex2-click-count inc)
(swap! ex2-content conj "Got a Click!" "Waiting for another click ....")))
;; =============================================================================
;; Example 3
(defc ex3-content ["Waiting for a click from Button A ....."])
(defc ex3-click-count-a 0)
(defc ex3-click-count-b 0)
(defn ex3a []
(when (= @ex3-click-count-a 0)
(swap! ex3-click-count-a inc)
(swap! ex3-content conj "Got a click!" "Waiting for a click from Button B ....")) )
(defn ex3b []
(when (and (= @ex3-click-count-a 1) (= @ex3-click-count-b 0))
(swap! ex3-click-count-b inc)
(swap! ex3-content conj "Done!")))
;; =============================================================================
;; Example 6
(defc ex6-content ["Click the button to start tracking the mouse."])
(defc ex6-button-name "GO!")
(defn ex6-toggle []
(let [new-name (if (= @ex6-button-name "GO!") "STOP!" "GO!")]
(reset! ex6-button-name new-name)))
(defn ex6 [e]
(when (= @ex6-button-name "STOP!")
(swap! ex6-content conj (str (mouse-loc->vec e)))))
;; =============================================================================
;; Example 7
(defc ex7-content ["Click the button to start tracking the mouse."])
(defc ex7-button-name "GO!")
(defn ex7-toggle []
(let [new-name (if (= @ex7-button-name "GO!") "STOP!" "GO!")]
(reset! ex7-button-name new-name)))
(defn ex7 [e]
(when (= @ex7-button-name "STOP!")
(let [[x y :as m] (mouse-loc->vec e)]
(when (zero? (mod y 5))
(swap! ex7-content conj (str m))))))
;; =============================================================================
;; Example 8
(defc ex8-content ["Click the button ten times."])
(defc ex8-click-count 0)
(defn ex8 []
(when (< @ex8-click-count 10)
(swap! ex8-click-count inc)
(when (= @ex8-click-count 1)
(swap! ex8-content conj "1 Click!"))
(when (> @ex8-click-count 1)
(swap! ex8-content conj (str @ex8-click-count " clicks!")))
(when (= @ex8-click-count 10)
(swap! ex8-content conj "Done."))))
;; =============================================================================
;; Example 9
(defc ex9-index 0)
(defc ex9-animals [:aardvark :beetle :cat :dog :elk :ferret
:goose :hippo :ibis :jellyfish :kangaroo])
(defc= ex9-card (nth ex9-animals ex9-index))
(defn ex9-prev []
(when (> @ex9-index 0)
(swap! ex9-index dec)))
(defn ex9-next []
(when (< @ex9-index (dec (count @ex9-animals)))
(swap! ex9-index inc)))
;; =============================================================================
;; Example 10
(defc ex10-button-name "START!")
(defc ex10-index 0)
(defn ex10 []
(let [the-name @ex10-button-name]
(when (= the-name "START!")
(reset! ex10-button-name "STOP!"))
(when (= the-name "STOP!")
(reset! ex10-button-name "DONE!"))))
(defc ex10-animals [:aardvark :beetle :cat :dog :elk :ferret
:goose :hippo :ibis :jellyfish :kangaroo])
(defc= ex10-max (dec (count ex10-animals)))
(defc= ex10-card (nth ex10-animals ex10-index))
(defn ex10-prev []
(if (> @ex10-index 0)
(swap! ex10-index dec)
(reset! ex10-index @ex10-max)))
(defn ex10-next []
(if (< @ex10-index @ex10-max)
(swap! ex10-index inc)
(reset! ex10-index 0)))
(defn ex10-nav [k]
(when (= @ex10-button-name "STOP!")
(when (= k :next)
(ex10-next))
(when (= k :prev)
(ex10-prev))))
(defn ex10-keys [e]
(when (= @ex10-button-name "STOP!")
(if (= (.-keyCode e) 39) (ex10-nav :next))
(if (= (.-keyCode e) 37) (ex10-nav :prev))
)
)
</script>
<html>
<head>
<link rel="stylesheet" type="text/css" href="css/main.css" />
</head>
<body>
<!-- Example 1 -->
<div id="ex1" class="example">
<h2>Example 1</h2>
<table>
<tr>
<td class="left">
<button id="ex1-button" on-click='{{ #(ex1) }}'>Click me</button>
</td>
<td id="ex1-display" class="display">
<div id="ex1-messages">
<loop-tpl bindings='{{ [x ex1-content] }}'>
<p><text>~{x}</text></p>
</loop-tpl>
</div>
</td>
</tr>
</table>
</div>
<!-- Example 2 -->
<div id="ex2" class="example">
<h2>Example 2</h2>
<table>
<tr>
<td class="left">
<button id="ex2-button" on-click='{{ #(ex2) }}'>Click me</button>
</td>
<td id="ex2-display" class="display">
<div id="ex2-messages">
<loop-tpl bindings='{{ [x ex2-content] }}'>
<p><text>~{x}</text></p>
</loop-tpl>
</div>
</td>
</tr>
</table>
</div>
<!-- Example 3 -->
<div id="ex3" class="example">
<h2>Example 3</h2>
<table>
<tr>
<td class="left">
<button id="ex3-button-a" on-click='{{ #(ex3a) }}'>Button A</button>
<button id="ex3-button-b" on-click='{{ #(ex3b) }}'>Button B</button>
</td>
<td id="ex3-display" class="display">
<div id="ex3-messages">
<loop-tpl bindings='{{ [x ex3-content] }}'>
<p><text>~{x}</text></p>
</loop-tpl>
</div>
</td>
</tr>
</table>
</div>
<!-- Example 4 -->
<div id="ex4" class="example">
<h2>Example 4</h2>
<table>
<tr>
<td class="left">
<button id="ex4-button-a">Go!</button>
</td>
<td id="ex4-display" class="display">
<div id="ex4-messages"></div>
</td>
</tr>
</table>
</div>
<!-- Example 5 -->
<div id="ex5" class="example">
<h2>Example 5</h2>
<table>
<tr>
<td class="left">
<button id="ex5-button">Go!</button>
</td>
<td id="ex5-display" class="display">
<div id="ex5-messages"></div>
</td>
</tr>
</table>
</div>
<!-- Example 6 -->
<div id="ex6" class="example" on-mousemove='{{ #(ex6 %) }}' >
<h2>Example 6</h2>
<table>
<tr>
<td class="left">
<button id="ex6-button" do-text='{{ ex6-button-name }}' on-click='{{ #(ex6-toggle)}}' ></button>
</td>
<td id="ex6-display" class="display">
<div class="scrolling">
<div id="ex6-messages">
<loop-tpl bindings='{{ [x ex6-content] }}'>
<p><text>~{x}</text></p>
</loop-tpl>
</div>
</div>
</td>
</tr>
</table>
</div>
<!-- Example 7 -->
<div id="ex7" class="example" on-mousemove='{{ #(ex7 %) }}'>
<h2>Example 7</h2>
<table>
<tr>
<td class="left">
<button id="ex7-button" do-text='{{ ex7-button-name }}' on-click='{{ #(ex7-toggle)}}'></button>
</td>
<td id="ex7-display" class="display">
<div class="scrolling">
<div id="ex7-messages">
<loop-tpl bindings='{{ [x ex7-content] }}'>
<p><text>~{x}</text></p>
</loop-tpl>
</div>
</div>
</td>
</tr>
</table>
</div>
<!-- Example 8 -->
<div id="ex8" class="example">
<h2>Example 8</h2>
<table>
<tr>
<td class="left">
<button id="ex8-button" on-click='{{ #(ex8) }}'>Click me!</button>
</td>
<td id="ex8-display" class="display card">
<div class="scrolling">
<div id="ex8-messages">
<loop-tpl bindings='{{ [x ex8-content] }}'>
<p><text>~{x}</text></p>
</loop-tpl>
</div>
</div>
</td>
</tr>
</table>
</div>
<!-- Example 9 -->
<div id="ex9" class="example">
<h2>Example 9</h2>
<table>
<tr>
<td class="left">
<button id="ex9-button-prev" on-click='{{ #(ex9-prev) }}' do-class='{{ (cell= {:disabled (= ex9-index 0)})}}'>Previous</button>
<button id="ex9-button-next" on-click='{{ #(ex9-next) }}' do-class='{{ (cell= {:disabled (= ex9-index (dec (count ex9-animals)))}) }}'>Next</button>
</td>
<td id="ex9-card" class="display card" do-text='{{ ex9-card }}'></td>
</tr>
</table>
</div>
<!-- Example 10 -->
<div id="ex10" class="example" on-keydown='{{ #(ex10-keys %) }}'>
<h2>Example 10</h2>
<table>
<tr>
<td class="left">
<button id="ex10-button-start-stop" do-text='{{ ex10-button-name}}' on-click='{{ #(ex10) }}'></button>
<button id="ex10-button-prev" on-click='{{ #(ex10-nav :prev) }}'
do-class='{{ (cell= {:disabled (not= ex10-button-name "STOP!")}) }}'>Previous
</button>
<button id="ex10-button-next" on-click='{{ #(ex10-nav :next) }}' do-class='{{ (cell= {:disabled (not= ex10-button-name "STOP!")}) }}'>Next</button>
</td>
<td id="ex10-card" class="display card" do-text='{{ ex10-card }}'></td>
</tr>
</table>
</div>
</body>
</html>

samples/J/stwij.ijs Normal file

@@ -0,0 +1,73 @@
NB. From "Continuing to write in J".
NB. See http://www.jsoftware.com/help/jforc/continuing_to_write_in_j.htm
empno=: 316 317 319 320
payrate=: 60 42 44 54
billrate=: 120 90 90 108
clientlist=: 10011 10012 10025
emp_client=: 10012 10025 10012 10025
hoursworked=: 4 31 $ 8 0 3 10 9 8 8 9 4 0 8 7 10 10 12 9 0 6 8 9 9 9 0 0 10 11 9 7 10 2 0 8 0 0 9 9 8 9 10 0 0 8 8 10 7 10 0 0 7 8 9 8 9 0 4 9 8 9 8 9 0 0 5 0 0 8 9 9 9 9 0 0 8 7 0 0 9 0 2 10 10 9 11 8 0 0 8 9 10 8 9 0 0 9 0 0 9 10 8 6 6 8 0 9 8 10 6 9 7 0 6 8 8 8 9 0 5 8 9 8 8 12 0 0
NB. Finds the number of hours each employee worked in the given month.
emphours=: 3 : '+/"1 hoursworked'
NB. Determines the wages earned by each employee in the given month.
empearnings=: 3 : 'payrate * +/"1 hoursworked'
NB. Determines the profit brought in by each employee.
empprofit=: 3 : 0
(billrate - payrate) * +/"1 hoursworked
)
NB. Returns the amount to bill a given client.
billclient=: 3 : 0
mask=. emp_client = y
+/ (mask # billrate) * +/"1 mask # hoursworked
)
NB. Finds for each day of the month the employee who billed the most hours.
dailydrudge=: 3 : 0
((|: hoursworked) i."1 0 >./ hoursworked) { empno
)
NB. Returns the employees, in descending order of the profit brought in by each.
producers=: 3 : 'empno \: empprofit 0'
NB. Returns the clients, in descending order of the profit generated by each.
custbyprofit=: 3 : 0
clientlist \: +/ (clientlist ="1 0 emp_client) * empprofit 0
)
NB. Calculates withholding tax on each employee's earnings.
renderuntocaesar=: 3 : 0
bktmin=. 0 6000 10000 20000 NB. Four brackets, 0..6000..10000..20000.._
bktrate=. 0.05 0.10 0.20 0.30
bktearns=. 0 >. ((1 |.!._ bktmin) <."1 0 empearnings'') -"1 bktmin
+/"1 bktrate *"1 bktearns
)
NB. Main
echo 'Problem 1'
echo emphours''
echo 'Problem 2'
echo empearnings''
echo 'Problem 3'
echo empprofit''
echo 'Problem 4'
echo billclient 10025
echo 'Problem 5'
echo dailydrudge''
echo 'Problem 6'
echo producers''
echo 'Problem 7'
echo custbyprofit''
echo 'Problem 8'
echo 0j2 ": renderuntocaesar''


@@ -0,0 +1,344 @@
(* Mathematica Package *)
(* Created with IntelliJ IDEA and the Mathematica Language plugin *)
(* :Title: Importer for the RAW data-format of the Heidelberg Eye Explorer (known as HEYEX) *)
(* :Context: HeyexImport` *)
(* :Author: Patrick Scheibe pscheibe@trm.uni-leipzig.de *)
(* :Package Version: 1.0 *)
(* :Mathematica Version: 8.0 *)
(* :Copyright: Patrick Scheibe, 2013-2015 *)
(* :Discussion: This package registers a new importer which can load the RAW data-format exported by a
Heidelberg Spectralis OCT. The import-functionality can access different information contained
in a file:
1. The file header which contains meta data like when the patient was scanned etc
2. The scanned volume data
3. Images which represent slices of the scanned volume
4. The Scanning laser ophthalmoscopy (SLO) image which is taken with every scanned patient
5. The segmentation data for different retina layers provided by the software
*)
(* :Keywords: Import, Heyex, OCT, Spectralis, Heidelberg Engineering *)
BeginPackage[ "HeyexImport`" ]
HeyexEyePosition::usage = "HeyexEyePosition[file] tries to extract which eye was scanned, left or right.";
HeyexImport::wrongHdr = "Error importing OCT data. Broken/Wrong file?";
Begin[ "`Private`" ];
(*
Registration of all import possibilities for the Heidelberg OCT.
*)
ImportExport`RegisterImport[
"Heyex" ,
{
"FileHeader" :> importHeader,
{ "Data" , n_Integer} :> (importData[n][##]&),
"Data" :> importData,
{ "Images" , n_Integer} :> (importImages[n][##]&),
"Images" :> importImages,
"SLOImage" :> importSLOImage,
"SegmentationData" :> importSegmentation,
{ "SegmentationData" , n_Integer} :> (importSegmentation[n][##]&),
"DataSize" :> importDataSize,
importData
},
{
"Image3D" :> (Image3D["Data" /. #1]&)
},
"AvailableElements" -> {"FileHeader", "Data", "DataSize", "Images", "SLOImage", "SegmentationData", "Image3D"}
];
If[Quiet[Check[TrueQ[Compile[{}, 0, CompilationTarget -> "C"][] == 0], False]],
$compileTarget = CompilationTarget -> "C",
$compileTarget = CompilationTarget -> "MVM"
];
(*
Helper function which reads data from a stream. This is
only a unification so I can map the read function over a
list.
*)
read[{id_String, type_String}, str_] :=
id -> BinaryRead[str, type];
read[{type_String, n_Integer}, str_] := BinaryReadList[str, type, n];
read[{id_String, {type_String, n_Integer}}, str_] := id -> BinaryReadList[str, type, n];
(*
Note that when reading bytes explicitly I convert them to
a string and remove any zeroes at the end.
*)
read[{id_String, { "Byte" , n_Integer}}, str_] :=
id -> StringJoin[
FromCharacterCode /@ (Rest[
NestList[BinaryRead[str, "Byte" ] &, Null,
n]] /. {chars___Integer, Longest[0 ...]} :> {chars})];
(*
The layout of a file exported with "Raw Export"
*****************
* File Header *
*****************
* SLO Image *
*****************
* B-Scan #0 *
*****************
* ..... *
*****************
* B-Scan #n-1 *
*****************
*)
With[{i = "Integer32", f = "Real32", d = "Real64", b = "Byte"},
$fileHeaderInfo = Transpose[{
{
"Version" , "SizeX" , "NumBScans" , "SizeZ" , "ScaleX" , "Distance" ,
"ScaleZ" , "SizeXSlo" , "SizeYSlo" , "ScaleXSlo" , "ScaleYSlo" ,
"FieldSizeSlo" , "ScanFocus" , "ScanPosition" , "ExamTime" ,
"ScanPattern" , "BScanHdrSize" , "ID" , "ReferenceID" , "PID" ,
"PatientID" , "Padding" , "DOB" , "VID" , "VisitID" , "VisitDate" ,
"Spare"
},
{
{b, 12}, i, i, i, d, d, d, i, i, d, d, i, d, {b, 4}, {i, 2}, i, i,
{b, 16}, {b, 16}, i, {b, 21}, {b, 3}, d, i, {b, 24}, d, {b, 1840}
}
}];
$bScanHeaderInfo = Transpose[{
{
"Version" , "BScanHdrSize" , "StartX" , "StartY" , "EndX" , "EndY" ,
"NumSeg" , "OffSeg" , "Quality" , "Spare"
},
{{b, 12}, i, d, d, d, d, i, i, f, {b, 196}}
}];
];
isHeyexRawFormat[{"Version" -> version_String, "SizeX" -> _Integer, "NumBScans" -> _Integer, _Rule..}] /; StringMatchQ[version, "HSF-OCT" ~~__] := True ;
isHeyexRawFormat[___] := False;
readFileHeader[str_InputStream] := With[{hdr = Quiet[read[#, str]] & /@ $fileHeaderInfo},
hdr /; TrueQ[isHeyexRawFormat[hdr]]
];
readFileHeader[___] := (Message[HeyexImport::wrongHdr]; Throw[$Failed]);
(* Reads the camera image of the retina. Note that you must have the
information from the fileheader and you must be at the right position
of the file stream for this.*)
readSLOImage[str_InputStream, fileHdr : {(_String -> _) ..}] :=
Image[Partition[
BinaryReadList[str, "Byte" , "SizeXSlo" * "SizeYSlo" /. fileHdr],
"SizeXSlo" /. fileHdr], "Byte" ];
skipSLOImage[str_InputStream, fileHdr : {(_String -> _) ..}] :=
Skip[str, "Byte" , "SizeXSlo" * "SizeYSlo" /. fileHdr];
(* One single BScan consists itself again of a header and a data part *)
readBScanHeader[str_InputStream, fileHdr : {(_String -> _) ..}] :=
Module[{i = "Integer32", f = "Real32", d = "Real64", b = "Byte",
bScanHdr},
bScanHdr = read[#, str] & /@ Transpose[{
{ "Version" , "BScanHdrSize" , "StartX" , "StartY" , "EndX" , "EndY" ,
"NumSeg" , "OffSeg" , "Quality" , "Spare" },
{{b, 12}, i, d, d, d, d, i, i, f, {b, 196}}}
];
AppendTo[bScanHdr,
read[{ "SegArray" , { "Real32" ,
"NumSeg" * "SizeX" /. bScanHdr /. fileHdr}}, str]
];
(*
This is horribly slow, therefore I just skip the fillbytes
AppendTo[bScanHdr,
read[{"Fillbytes", {"Byte",
"BScanHdrSize" - 256 - "NumSeg"*"SizeX"*4 /. bScanHdr /.
fileHdr}}, str]
]
*)
Skip[str, "Byte" , "BScanHdrSize" - 256 - "NumSeg" * "SizeX" * 4 /. bScanHdr /. fileHdr];
AppendTo[bScanHdr, "FillBytes" -> None]
]
skipBScanHeader[str_InputStream, fileHdr : {(_String -> _) ..}] :=
Skip[str, "Byte" , "BScanHdrSize" /. fileHdr];
readBScanData[str_InputStream, fileHdr : {(_String -> _) ..}] :=
Module[{},
Developer`ToPackedArray[
Partition[read[{ "Real32" , "SizeX" * "SizeZ" /. fileHdr}, str],
"SizeX" /. fileHdr]]
];
skipBScanData[str_InputStream, fileHdr : {(_String -> _) ..}] :=
Skip[str, "Byte" , "SizeX" * "SizeZ" * 4 /. fileHdr];
skipBScanBlocks[str_InputStream, fileHdr : {(_String -> _) ..}, n_Integer] :=
Skip[str, "Byte" , n * ("BScanHdrSize" + "SizeX" * "SizeZ" * 4) /. fileHdr];
importHeader[filename_String, ___] := Module[
{str, header},
str = OpenRead[filename, BinaryFormat -> True];
header = readFileHeader[str];
Close[str];
"FileHeader" -> header
];
(* Imports the dimension of the scanned volume. *)
importDataSize[filename_String, r___] := Module[{header = importHeader[filename]},
"DataSize" -> ({"NumBScans", "SizeZ", "SizeXSlo"} /. ("FileHeader" /. header))
]
importSLOImage[filename_String, ___] := Module[
{str, header, slo},
str = OpenRead[filename, BinaryFormat -> True];
header = readFileHeader[str];
slo = readSLOImage[str, header];
Close[str];
"SLOImage" -> slo
]
importData[filename_String, ___] := Module[
{str, header, nx, n, data},
str = OpenRead[filename, BinaryFormat -> True];
header = readFileHeader[str];
{nx, n} = { "SizeX" , "SizeX" * "SizeZ"} /. header;
skipSLOImage[str, header];
data = Table[
skipBScanHeader[str, header];
Partition[read[{ "Real32" , n}, str], nx],
{"NumBScans" /. header}
];
Close[str];
"Data" -> Developer`ToPackedArray[data]
];
importData[num_Integer][filename_String, ___] := Module[
{str, header, nx, n, data},
str = OpenRead[filename, BinaryFormat -> True];
header = readFileHeader[str];
{nx, n} = { "SizeX" , "SizeX" * "SizeZ"} /. header;
skipSLOImage[str, header];
skipBScanBlocks[str, header, Max[Min["NumBScans" /. header, num - 1], 0] ];
skipBScanHeader[str, header];
data = Partition[read[{ "Real32" , n}, str], nx];
Close[str];
{"Data" -> {num -> Developer`ToPackedArray[data]}}
];
(*
As suggested in the Heidelberg OCT Manual the importer will adjust
the graylevels when importing images. Since this is very time-consuming
for the whole scanned volume, I use an optimized version of this function.
*)
With[{$compileTarget = $compileTarget}, $adjustGraylevelFunc := ($adjustGraylevelFunc = Compile[{{values, _Real, 2}},
Map[Floor[255.0 * Min[Max[0.0, #], 1.0]^(0.25) + 0.5] &, values, {2}],
RuntimeAttributes -> {Listable},
Parallelization -> True,
RuntimeOptions -> "Speed",
$compileTarget
])];
importImages[filename_String, ___] := Module[
{data},
data = "Data" /. importData[filename];
"Images" -> (Image[#, "Byte" ]& /@ $adjustGraylevelFunc[data])
]
importImages[imageNumber_Integer][filename_String, ___] := Module[
{data},
data = {imageNumber /. ("Data" /. importData[imageNumber][filename])};
{"Images" -> {imageNumber -> (Image[#, "Byte" ]& @@ $adjustGraylevelFunc[data])}}
];
importSegmentation[filename_String, ___] := Module[
{str, header, data},
str = OpenRead[filename, BinaryFormat -> True];
header = readFileHeader[str];
skipSLOImage[str, header];
data = Table[
Module[{bScanHeader, t},
{t, bScanHeader} = Timing@readBScanHeader[str, header];
skipBScanData[str, header];
bScanHeader
], {"NumBScans" /. header}
];
Close[str];
(*
The BScanHeaderData contain the segmentation vectors as a single list
of numbers. Before returning the result, I check how many segmentations
there are inside the BScan and I transform the segmentation value list
into separate vectors and call them "ILM", "RPE" and "NFL" as described
in the manual
*)
"SegmentationData" -> Function[{bhdr},
Block[{numVecs = "NumSeg" /. bhdr, vecNames, nx = "SizeX" /. header},
If[numVecs > 0,
vecNames = Take[{ "ILM" , "RPE" , "NFL" }, numVecs];
bhdr /. ("SegArray" -> vec_) :> Sequence @@ (Rule @@@ Transpose[{vecNames, Partition[vec, nx]} ]),
bhdr
]
]] /@ data
]
importSegmentation[num_Integer][filename_String, ___] := Module[
{str, header, bhdr},
str = OpenRead[filename, BinaryFormat -> True];
header = readFileHeader[str];
skipSLOImage[str, header];
skipBScanBlocks[str, header, Max[Min["NumBScans" /. header, num - 1], 0] ];
bhdr = readBScanHeader[str, header];
Close[str];
(* See doc above *)
{"SegmentationData" -> {num -> Block[
{numVecs = "NumSeg" /. bhdr, vecNames, nx = "SizeX" /. header},
If[ numVecs > 0,
vecNames = Take[{ "ILM" , "RPE" , "NFL" }, numVecs];
bhdr /. ("SegArray" -> vec_) :> Sequence @@ (Rule @@@ Transpose[{vecNames, Partition[vec, nx]} ]),
bhdr
]
]
}}
]
(* Extracts which eye was scanned. This is stored in the header of the file *)
(* OD stands for oculus dexter which is latin for "right eye" and OS stands
for oculus sinister which is latin for "left eye" *)
HeyexEyePosition[file_String /; FileExistsQ[file]] := Module[{position},
Check[
position = "ScanPosition" /. Import[file, { "Heyex" , "FileHeader" }];
Switch[
position,
"OD" ,
Right,
"OS" ,
Left,
_,
$Failed
],
$Failed
]
];
End[]
EndPackage[]


@@ -0,0 +1,46 @@
% This is a regression test for a bug in switch detection
% where it was preferring incomplete switches to complete
% one-case switches, and hence inferring the wrong determinism.
%------------------------------------------------------------------------------%
:- module switch_detection_bug.
:- interface.
:- type note ---> note(rank, modifier, octave).
:- type rank ---> c ; d ; e ; f ; g ; a ; b .
:- type modifier ---> natural ; sharp ; flat .
:- type octave == int.
:- type qualifier ---> maj ; min .
:- pred next_topnote(note, qualifier, note).
:- mode next_topnote(in, in, out) is multi.
%------------------------------------------------------------------------------%
:- implementation.
next_topnote(note(c, _, Oct), _, note(d, natural, Oct)).
next_topnote(note(d, _, Oct), _, note(c, natural, Oct)).
next_topnote(note(d, _, Oct), maj, note(e, natural, Oct)).
next_topnote(note(d, _, Oct), min, note(e, flat, Oct)).
next_topnote(note(e, _, Oct), _, note(d, natural, Oct)).
next_topnote(note(e, _, Oct), _, note(f, natural, Oct)).
next_topnote(note(f, _, Oct), maj, note(e, natural, Oct)).
next_topnote(note(f, _, Oct), min, note(e, flat, Oct)).
next_topnote(note(g, _, Oct), _, note(f, natural, Oct)).
next_topnote(note(g, _, Oct), min, note(a, flat, Oct)).
next_topnote(note(g, _, Oct), maj, note(a, natural, Oct)).
next_topnote(note(a, _, Oct), _, note(g, natural, Oct)).
next_topnote(note(a, _, Oct), min, note(b, flat, Oct)).
next_topnote(note(a, _, Oct), maj, note(b, natural, Oct)).
next_topnote(note(b, _, Oct), maj, note(a, natural, Oct)).
next_topnote(note(b, _, Oct), min, note(a, flat, Oct)).
%------------------------------------------------------------------------------%

File diff suppressed because it is too large.


@@ -0,0 +1,285 @@
within Modelica.Electrical.Analog;
package Sensors "Potential, voltage, current, and power sensors"
extends Modelica.Icons.SensorsPackage;
model PotentialSensor "Sensor to measure the potential"
extends Modelica.Icons.RotationalSensor;
Interfaces.PositivePin p "pin to be measured" annotation (Placement(
transformation(extent={{-110,-10},{-90,10}}, rotation=0)));
Modelica.Blocks.Interfaces.RealOutput phi
"Absolute voltage potential as output signal"
annotation (Placement(transformation(extent={{100,-10},{120,10}},
rotation=0)));
equation
p.i = 0;
phi = p.v;
annotation (
Icon(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={1,1}), graphics={
Text(
extent={{-29,-11},{30,-70}},
lineColor={0,0,0},
textString="V"),
Line(points={{-70,0},{-90,0}}, color={0,0,0}),
Line(points={{100,0},{70,0}}, color={0,0,255}),
Text(
extent={{-150,80},{150,120}},
textString="%name",
lineColor={0,0,255})}),
Diagram(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={1,1}), graphics={Line(points={{-70,0},{-96,0}}, color={0,0,0}),
Line(points={{100,0},{70,0}}, color={0,0,255})}),
Documentation(revisions="<html>
<ul>
<li><i> 1998 </i>
by Christoph Clauss<br> initially implemented<br>
</li>
</ul>
</html>", info="<html>
<p>The potential sensor converts the voltage of a node (with respect to the ground node) into a real valued signal. It does not influence the current sum at the node whose voltage is measured; therefore, the electrical behavior is not influenced by the sensor.</p>
</html>"));
end PotentialSensor;
model VoltageSensor "Sensor to measure the voltage between two pins"
extends Modelica.Icons.RotationalSensor;
Interfaces.PositivePin p "positive pin" annotation (Placement(
transformation(extent={{-110,-10},{-90,10}}, rotation=0)));
Interfaces.NegativePin n "negative pin" annotation (Placement(
transformation(extent={{90,-10},{110,10}}, rotation=0)));
Modelica.Blocks.Interfaces.RealOutput v
"Voltage between pin p and n (= p.v - n.v) as output signal"
annotation (Placement(transformation(
origin={0,-100},
extent={{10,-10},{-10,10}},
rotation=90)));
equation
p.i = 0;
n.i = 0;
v = p.v - n.v;
annotation (
Icon(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={1,1}), graphics={
Text(
extent={{-29,-11},{30,-70}},
lineColor={0,0,0},
textString="V"),
Line(points={{-70,0},{-90,0}}, color={0,0,0}),
Line(points={{70,0},{90,0}}, color={0,0,0}),
Line(points={{0,-90},{0,-70}}, color={0,0,255}),
Text(
extent={{-150,80},{150,120}},
textString="%name",
lineColor={0,0,255})}),
Diagram(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={1,1}), graphics={
Line(points={{-70,0},{-96,0}}, color={0,0,0}),
Line(points={{70,0},{96,0}}, color={0,0,0}),
Line(points={{0,-90},{0,-70}}, color={0,0,255})}),
Documentation(revisions="<html>
<ul>
<li><i> 1998 </i>
by Christoph Clauss<br> initially implemented<br>
</li>
</ul>
</html>", info="<html>
<p>The voltage sensor converts the voltage between the two connectors into a real valued signal. It does not influence the current sum at the nodes between which the voltage is measured; therefore, the electrical behavior is not influenced by the sensor.</p>
</html>"));
end VoltageSensor;
model CurrentSensor "Sensor to measure the current in a branch"
extends Modelica.Icons.RotationalSensor;
Interfaces.PositivePin p "positive pin" annotation (Placement(
transformation(extent={{-110,-10},{-90,10}}, rotation=0)));
Interfaces.NegativePin n "negative pin" annotation (Placement(
transformation(extent={{90,-10},{110,10}}, rotation=0)));
Modelica.Blocks.Interfaces.RealOutput i
"current in the branch from p to n as output signal"
annotation (Placement(transformation(
origin={0,-100},
extent={{10,-10},{-10,10}},
rotation=90)));
equation
p.v = n.v;
p.i = i;
n.i = -i;
annotation (
Icon(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={1,1}), graphics={
Text(
extent={{-29,-11},{30,-70}},
lineColor={0,0,0},
textString="A"),
Line(points={{-70,0},{-90,0}}, color={0,0,0}),
Text(
extent={{-150,80},{150,120}},
textString="%name",
lineColor={0,0,255}),
Line(points={{70,0},{90,0}}, color={0,0,0}),
Line(points={{0,-90},{0,-70}}, color={0,0,255})}),
Diagram(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={1,1}), graphics={
Text(
extent={{-153,79},{147,119}},
textString="%name",
lineColor={0,0,255}),
Line(points={{-70,0},{-96,0}}, color={0,0,0}),
Line(points={{70,0},{96,0}}, color={0,0,0}),
Line(points={{0,-90},{0,-70}}, color={0,0,255})}),
Documentation(revisions="<html>
<ul>
<li><i> 1998 </i>
by Christoph Clauss<br> initially implemented<br>
</li>
</ul>
</html>", info="<html>
<p>The current sensor converts the current flowing between the two connectors into a real valued signal. Inside the sensor the two connectors are connected like a short circuit. The sensor has to be placed in series within an electrical connection. It does not influence the current sum at the connected nodes. Therefore, the electrical behavior is not influenced by the sensor.</p>
</html>"));
end CurrentSensor;
model PowerSensor "Sensor to measure the power"
Modelica.Electrical.Analog.Interfaces.PositivePin pc
"Positive pin, current path"
annotation (Placement(transformation(extent={{-90,-10},{-110,10}}, rotation=
0)));
Modelica.Electrical.Analog.Interfaces.NegativePin nc
"Negative pin, current path"
annotation (Placement(transformation(extent={{110,-10},{90,10}}, rotation=0)));
Modelica.Electrical.Analog.Interfaces.PositivePin pv
"Positive pin, voltage path"
annotation (Placement(transformation(extent={{-10,110},{10,90}}, rotation=0)));
Modelica.Electrical.Analog.Interfaces.NegativePin nv
"Negative pin, voltage path"
annotation (Placement(transformation(extent={{10,-110},{-10,-90}}, rotation=
0)));
Modelica.Blocks.Interfaces.RealOutput power
annotation (Placement(transformation(
origin={-80,-110},
extent={{-10,10},{10,-10}},
rotation=270)));
Modelica.Electrical.Analog.Sensors.VoltageSensor voltageSensor
annotation (Placement(transformation(
origin={0,-30},
extent={{10,-10},{-10,10}},
rotation=90)));
Modelica.Electrical.Analog.Sensors.CurrentSensor currentSensor
annotation (Placement(transformation(extent={{-50,-10},{-30,10}}, rotation=
0)));
Modelica.Blocks.Math.Product product
annotation (Placement(transformation(
origin={-30,-50},
extent={{-10,-10},{10,10}},
rotation=270)));
equation
connect(pv, voltageSensor.p) annotation (Line(points={{0,100},{0,-20},{
6.12323e-016,-20}}, color={0,0,255}));
connect(voltageSensor.n, nv) annotation (Line(points={{-6.12323e-016,-40},{
-6.12323e-016,-63},{0,-63},{0,-100}}, color={0,0,255}));
connect(pc, currentSensor.p)
annotation (Line(points={{-100,0},{-50,0}}, color={0,0,255}));
connect(currentSensor.n, nc)
annotation (Line(points={{-30,0},{100,0}}, color={0,0,255}));
connect(currentSensor.i, product.u2) annotation (Line(points={{-40,-10},{-40,
-30},{-36,-30},{-36,-38}}, color={0,0,127}));
connect(voltageSensor.v, product.u1) annotation (Line(points={{10,-30},{-24,
-30},{-24,-38}}, color={0,0,127}));
connect(product.y, power) annotation (Line(points={{-30,-61},{-30,-80},{-80,
-80},{-80,-110}}, color={0,0,127}));
annotation (Icon(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={2,2}), graphics={
Ellipse(
extent={{-70,70},{70,-70}},
lineColor={0,0,0},
fillColor={255,255,255},
fillPattern=FillPattern.Solid),
Line(points={{0,100},{0,70}}, color={0,0,255}),
Line(points={{0,-70},{0,-100}}, color={0,0,255}),
Line(points={{-80,-100},{-80,0}}, color={0,0,255}),
Line(points={{-100,0},{100,0}}, color={0,0,255}),
Text(
extent={{150,120},{-150,160}},
textString="%name",
lineColor={0,0,255}),
Line(points={{0,70},{0,40}}, color={0,0,0}),
Line(points={{22.9,32.8},{40.2,57.3}}, color={0,0,0}),
Line(points={{-22.9,32.8},{-40.2,57.3}}, color={0,0,0}),
Line(points={{37.6,13.7},{65.8,23.9}}, color={0,0,0}),
Line(points={{-37.6,13.7},{-65.8,23.9}}, color={0,0,0}),
Line(points={{0,0},{9.02,28.6}}, color={0,0,0}),
Polygon(
points={{-0.48,31.6},{18,26},{18,57.2},{-0.48,31.6}},
lineColor={0,0,0},
fillColor={0,0,0},
fillPattern=FillPattern.Solid),
Ellipse(
extent={{-5,5},{5,-5}},
lineColor={0,0,0},
fillColor={0,0,0},
fillPattern=FillPattern.Solid),
Text(
extent={{-29,-11},{30,-70}},
lineColor={0,0,0},
textString="P")}),
Diagram(coordinateSystem(
preserveAspectRatio=true,
extent={{-100,-100},{100,100}},
grid={2,2}), graphics),
Documentation(info="<html>
<p>This power sensor measures the instantaneous electrical power of a single-phase system and has separate voltage and current paths. The pins of the voltage path are pv and nv; the pins of the current path are pc and nc. The internal resistance of the current path is zero, and the internal resistance of the voltage path is infinite.</p>
</html>", revisions="<html>
<ul>
<li><i>January 12, 2006</i> by Anton Haumer implemented</li>
</ul>
</html>"));
end PowerSensor;
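// Illustrative usage sketch (not part of the library): the current path pc-nc
// is placed in series with the load and the voltage path pv-nv in parallel
// with it, so the output signal power equals the instantaneous load power.
// The source, resistor and ground components below are assumptions.
model PowerSensorExample "Minimal use of PowerSensor on a resistive load"
Modelica.Electrical.Analog.Sources.ConstantVoltage source(V=1);
Modelica.Electrical.Analog.Basic.Resistor load(R=10);
Modelica.Electrical.Analog.Basic.Ground ground;
Modelica.Electrical.Analog.Sensors.PowerSensor powerSensor;
equation
connect(source.p, powerSensor.pc);
connect(powerSensor.nc, load.p);
connect(powerSensor.pv, load.p);
connect(powerSensor.nv, load.n);
connect(load.n, source.n);
connect(source.n, ground.p);
end PowerSensorExample;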
annotation (
Documentation(info="<html>
<p>This package contains potential, voltage, and current sensors. The sensors can be used to convert voltages or currents into real signal values to be connected to components of the Blocks package. The sensors are designed in such a way that they do not influence the electrical behavior.</p>
</html>",
revisions="<html>
<dl>
<dt>
<b>Main Authors:</b>
<dd>
Christoph Clau&szlig;
&lt;<a href=\"mailto:Christoph.Clauss@eas.iis.fraunhofer.de\">Christoph.Clauss@eas.iis.fraunhofer.de</a>&gt;<br>
Andr&eacute; Schneider
&lt;<a href=\"mailto:Andre.Schneider@eas.iis.fraunhofer.de\">Andre.Schneider@eas.iis.fraunhofer.de</a>&gt;<br>
Fraunhofer Institute for Integrated Circuits<br>
Design Automation Department<br>
Zeunerstra&szlig;e 38<br>
D-01069 Dresden<br>
<p>
<dt>
<b>Copyright:</b>
<dd>
Copyright &copy; 1998-2010, Modelica Association and Fraunhofer-Gesellschaft.<br>
<i>The Modelica package is <b>free</b> software; it can be redistributed and/or modified
under the terms of the <b>Modelica license</b>, see the license conditions
and the accompanying <b>disclaimer</b> in the documentation of package
Modelica in file \"Modelica/package.mo\".</i><br>
<p>
</dl>
</html>"));
end Sensors;

239
samples/NewLisp/irc.lsp Normal file
View File

@@ -0,0 +1,239 @@
#!/usr/bin/env newlisp
;; @module IRC
;; @description a basic irc library
;; @version early alpha! 0.1 2013-01-02 20:11:22
;; @author cormullion
;; Usage:
;; (IRC:init "newlithper") ; a username/nick (not that one obviously :-)
;; (IRC:connect "irc.freenode.net" 6667) ; irc/server
;; (IRC:join-channel {#newlisp}) ; join a room
;; either (IRC:read-irc-loop) ; loop - monitor only, no input
;; or (IRC:session) ; a command-line session, end with /QUIT
(context 'IRC)
(define Inickname)
(define Ichannels)
(define Iserver)
(define Iconnected)
(define Icallbacks '())
(define Idle-time 400) ; seconds
(define Itime-stamp) ; time since last message was processed
(define (register-callback callback-name callback-function)
(println {registering callback for } callback-name { : } (sym (term callback-function) (prefix callback-function)))
(push (list callback-name (sym (term callback-function) (prefix callback-function))) Icallbacks))
(define (deregister-callback callback-name)
(println {deregistering callback for } callback-name)
(setf (assoc callback-name Icallbacks) nil) ; fixed: clear the named callback rather than always "idle-event"
(println {current callbacks: } Icallbacks))
(define (do-callback callback-name data)
(when (set 'func (lookup callback-name Icallbacks)) ; find first callback
(if-not (catch (apply func (list data)) 'error)
(println {error in callback } callback-name {: } error))))
(define (do-callbacks callback-name data)
(dolist (rf (ref-all callback-name Icallbacks))
(set 'callback-entry (Icallbacks (first rf)))
(when (set 'func (last callback-entry))
(if-not (catch (apply func (list data)) 'error)
(println {error in callback } callback-name {: } error)))))
(define (init str)
(set 'Inickname str)
(set 'Iconnected nil)
(set 'Ichannels '())
(set 'Itime-stamp (time-of-day)))
(define (connect server port)
(set 'Iserver (net-connect server port))
(net-send Iserver (format "USER %s %s %s :%s\r\n" Inickname Inickname Inickname Inickname))
(net-send Iserver (format "NICK %s \r\n" Inickname))
(set 'Iconnected true)
(do-callbacks "connect" (list (list "server" server) (list "port" port))))
(define (identify password)
(net-send Iserver (format "PRIVMSG nickserv :identify %s\r\n" password)))
(define (join-channel channel)
(when (net-send Iserver (format "JOIN %s \r\n" channel))
(push channel Ichannels)
(do-callbacks "join-channel" (list (list "channel" channel) (list "nickname" Inickname)))))
(define (part chan)
(if-not (empty? chan)
; leave specified
(begin
(net-send Iserver (format "PART %s\r\n" chan))
(replace chan Ichannels) ; fixed: the parameter in this branch is `chan`, not `channel`
(do-callbacks "part" (list (list "channel" chan))))
; leave all
(begin
(dolist (channel Ichannels)
(net-send Iserver (format "PART %s\r\n" channel))
(replace channel Ichannels)
(do-callbacks "part" (list (list "channel" channel)))))))
(define (do-quit message)
(do-callbacks "quit" '()) ; chance to do stuff before quit...
(net-send Iserver (format "QUIT :%s\r\n" message))
(sleep 1000)
(set 'Ichannels '())
(close Iserver)
(set 'Iconnected nil))
(define (privmsg user message)
(net-send Iserver (format "PRIVMSG %s :%s\r\n" user message)))
(define (notice user message)
(net-send Iserver (format "NOTICE %s :%s\r\n" user message)))
(define (send-to-server message (channel nil))
(cond
((starts-with message {/}) ; default command character
(set 'the-message (replace "^/" (copy message) {} 0)) ; keep original
(net-send Iserver (format "%s \r\n" the-message)) ; send it
; do a quit
(if (starts-with (lower-case the-message) "quit")
(do-quit { enough})))
(true
(if (nil? channel)
; say to all channels
(dolist (c Ichannels)
(net-send Iserver (format "PRIVMSG %s :%s\r\n" c message)))
; say to specified channel
(if (find channel Ichannels)
(net-send Iserver (format "PRIVMSG %s :%s\r\n" channel message))))))
(do-callbacks "send-to-server" (list (list "channel" channel) (list "message" message))))
(define (process-command sender command text)
(cond
((= sender "PING")
(net-send Iserver (format "PONG %s\r\n" command)))
((or (= command "NOTICE") (= command "PRIVMSG"))
(process-message sender command text))
((= command "JOIN")
(set 'username (first (clean empty? (parse sender {!|:} 0))))
(set 'channel (last (clean empty? (parse sender {!|:} 0))))
(println {username } username { joined } channel)
(do-callbacks "join" (list (list "channel" channel) (list "username" username))))
(true
nil)))
(define (process-message sender command text)
(let ((username {} target {} message {}))
(set 'username (first (clean empty? (parse sender {!|:} 0))))
(set 'target (trim (first (clean empty? (parse text {!|:} 0)))))
(set 'message (slice text (+ (find {:} text) 1)))
(cond
((starts-with message "\001")
(process-ctcp username target message))
((find target Ichannels)
(cond
((= command {PRIVMSG})
(do-callbacks "channel-message" (list (list "channel" target) (list "username" username) (list "message" message))))
((= command {NOTICE})
(do-callbacks "channel-notice" (list (list "channel" target) (list "username" username) (list "message" message))))))
((= target Inickname)
(cond
((= command {PRIVMSG})
(do-callbacks "private-message" (list (list "username" username) (list "message" message))))
((= command {NOTICE})
(do-callbacks "private-notice" (list (list "username" username) (list "message" message))))))
(true
nil))))
(define (process-ctcp username target message)
(cond
((starts-with message "\001VERSION\001")
(net-send Iserver (format "NOTICE %s :\001VERSION %s\001\r\n" username message)))
((starts-with message "\001PING")
(set 'data (first (rest (clean empty? (parse message { } 0)))))
(set 'data (trim data "\001" "\001"))
(net-send Iserver (format "NOTICE %s :\001PING %s\001\r\n" username data)))
((starts-with message "\001ACTION")
; (set 'data (first (rest (clean empty? (parse message { } 0)))))
; (set 'data (join data { }))
; (set 'data (trim data "\001" "\001"))
(if (find target Ichannels)
(do-callbacks "channel-action" (list (list "username" username) (list "message" message))))
(if (= target Inickname)
(do-callbacks "private-action" (list (list "username" username) (list "message" message)))))
((starts-with message "\001TIME\001")
(net-send Iserver (format "NOTICE %s:\001TIME :%s\001\r\n" username (date))))))
(define (parse-buffer raw-buffer)
(let ((messages (clean empty? (parse raw-buffer "\r\n" 0)))
(sender {} command {} text {}))
; check for elapsed time since last activity
(when (> (sub (time-of-day) Itime-stamp) (mul Idle-time 1000))
(do-callbacks "idle-event")
(set 'Itime-stamp (time-of-day)))
(dolist (message messages)
(set 'message-parts (parse message { }))
(unless (empty? message-parts)
(set 'sender (first message-parts))
(catch (set 'command (first (rest message-parts))) 'error)
(catch (set 'text (join (rest (rest message-parts)) { })) 'error))
(process-command sender command text))))
(define (read-irc)
(let ((buffer {}))
(when (!= (net-peek Iserver) 0)
(net-receive Iserver buffer 8192 "\n")
(unless (empty? buffer)
(parse-buffer buffer)))))
(define (read-irc-loop) ; monitoring
(let ((buffer {}))
(while Iconnected
(read-irc)
(sleep 1000))))
(define (print-raw-message data) ; example of using a callback
(set 'raw-data (lookup "message" data))
(set 'channel (lookup "channel" data))
(set 'message-text raw-data)
(println (date (date-value) 0 {%H:%M:%S }) username {> } message-text))
(define (print-outgoing-message data)
(set 'raw-data (lookup "message" data))
(set 'channel (lookup "channel" data))
(set 'message-text raw-data)
(println (date (date-value) 0 {%H:%M:%S }) Inickname {> } message-text))
(define (session); interactive terminal
; must add callbacks to display messages
(register-callback "channel-message" 'print-raw-message)
(register-callback "send-to-server" 'print-outgoing-message)
(while Iconnected
(while (zero? (peek 0))
(read-irc)
(sleep 1000))
(send-to-server (string (read-line 0))))
(println {finished session } (date))
(exit))
; end of IRC code
[text]
simple bot code:
(load (string (env {HOME}) {/projects/programming/newlisp-projects/irc.lsp}))
(context 'BOT)
(define bot-name "bot")
(define (join-channel data)
(println {in BOT:join-channel with data: } data))
(define (process-message data)
????)
(IRC:register-callback "join-channel" 'join-channel)
(IRC:register-callback "channel-message" 'process-message)
(IRC:register-callback "idle-event" 'do-idle-event)
(IRC:register-callback "send-to-server" 'do-send-event)
(IRC:init bot-name)
(IRC:connect "irc.freenode.net" 6667)
(IRC:join-channel {#newlisp})
(IRC:read-irc-loop)
[/text]

View File

@@ -0,0 +1,195 @@
(module "sqlite3.lsp") ; loads the SQLite3 database module
; FUNCTIONS-------------------------------------------------
(define (displayln)
(apply println (args)) ; accept any number of arguments; several callers pass more than one
)
(define (open-database sql-db-to-open)
(if (sql3:open (string sql-db-to-open ".db"))
(displayln "")
(displayln "There was a problem opening the database " sql-db-to-open ": " (sql3:error))))
(define (close-database)
(if (sql3:close)
(displayln "")
(displayln "There was a problem closing the database: " (sql3:error))))
;====== SAFE-FOR-SQL ===============================================================
; this function makes strings safe for inserting into SQL statements
; to avoid SQL injection issues
; it's simple right now but will add to it later
;===================================================================================
(define (safe-for-sql str-sql-query)
(if (string? str-sql-query) (begin
(replace "&" str-sql-query "&amp;")
(replace "'" str-sql-query "&apos;")
(replace "\"" str-sql-query "&quot;")
))
(set 'result str-sql-query))
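;====== illustrative usage (a sketch, not part of the original script) =============
; safe-for-sql is meant to be applied to every value before it is spliced into a
; query string; the demo function below is an assumption added for illustration
; and is never invoked.
(define (demo-safe-for-sql)
; "O'Reilly & Sons" becomes "O&apos;Reilly &amp; Sons"
(println (safe-for-sql "O'Reilly & Sons")))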
(define (query sql-text)
(set 'sqlarray (sql3:sql sql-text)) ; results of query
(if sqlarray
(setq query-return sqlarray)
(if (sql3:error)
(displayln (sql3:error) " query problem ")
(setq query-return nil))))
(define-macro (create-record)
; first save the values
(set 'temp-record-values nil)
(set 'temp-table-name (first (args)))
;(displayln "<BR>Arguments: " (args))
(dolist (s (rest (args))) (push (eval s) temp-record-values -1))
; now save the arguments as symbols under the context "DB"
(dolist (s (rest (args)))
(set 'temp-index-num (string $idx)) ; we need to number the symbols to keep them in the correct order
(if (= (length temp-index-num) 1) (set 'temp-index-num (string "0" temp-index-num))) ; leading 0 keeps the max at 100.
(sym (string temp-index-num s) 'DB))
; now create the sql query
(set 'temp-sql-query (string "INSERT INTO " temp-table-name " ("))
;(displayln "<P>TABLE NAME: " temp-table-name)
;(displayln "<P>SYMBOLS: " (symbols DB))
;(displayln "<BR>VALUES: " temp-record-values)
(dolist (d (symbols DB)) (extend temp-sql-query (rest (rest (rest (rest (rest (string d)))))) ", "))
(set 'temp-sql-query (chop (chop temp-sql-query)))
(extend temp-sql-query ") VALUES (")
(dolist (q temp-record-values)
(if (string? q) (extend temp-sql-query "'")) ; only quote if value is non-numeric
(extend temp-sql-query (string (safe-for-sql q)))
(if (string? q) (extend temp-sql-query "'")) ; close quote if value is non-numeric
(extend temp-sql-query ", ")) ; all values are sanitized to avoid SQL injection
(set 'temp-sql-query (chop (chop temp-sql-query)))
(extend temp-sql-query ");")
;(displayln "<p>***** SQL QUERY: " temp-sql-query)
(displayln (query temp-sql-query)) ; actually run the query against the database
(delete 'DB) ; we're done, so delete all symbols in the DB context.
)
(define-macro (update-record)
; first save the values
(set 'temp-record-values nil)
(set 'temp-table-name (first (args)))
(set 'continue true) ; debugging
(dolist (s (rest (args))) (push (eval s) temp-record-values -1))
; now save the arguments as symbols under the context "D2"
(dolist (st (rest (args)))
(set 'temp-index-num (string $idx)) ; we need to number the symbols to keep them in the correct order
(if (= (length temp-index-num) 1) (set 'temp-index-num (string "0" temp-index-num))) ; leading 0 keeps the max at 100.
;(displayln "<br>SYMBOL>>>>" (string temp-index-num st) "<<<") ; debugging
(sym (string temp-index-num st) 'D2)
)
(if continue (begin ; --- temporary debugging
; now create the sql query
(set 'temp-sql-query (string "UPDATE " temp-table-name " SET "))
;(displayln "<P>TABLE NAME: " temp-table-name)
;(displayln "<P>SYMBOLS: " (symbols D2))
;(displayln "<BR>VALUES: " temp-record-values)
(dolist (d (rest (symbols D2))) ; ignore the first argument, as it will be the ConditionColumn for later
(extend temp-sql-query (rest (rest (rest (rest (rest (string d)))))) "=")
(set 'q (temp-record-values (+ $idx 1)))
(if (string? q) (extend temp-sql-query "'")) ; only quote if value is non-numeric
(extend temp-sql-query (string (safe-for-sql q)))
(if (string? q) (extend temp-sql-query "'")) ; close quote if value is non-numeric
(extend temp-sql-query ", ") ; all values are sanitized to avoid SQL injection
)
(set 'temp-sql-query (chop (chop temp-sql-query)))
; okay now add the ConditionColumn value
(extend temp-sql-query (string " WHERE " (rest (rest (rest (rest (rest (string (first (symbols D2)))))))) "="))
(if (string? (first temp-record-values)) (extend temp-sql-query "'"))
(extend temp-sql-query (string (safe-for-sql (first temp-record-values))))
(if (string? (first temp-record-values)) (extend temp-sql-query "'"))
(extend temp-sql-query ";")
;(displayln "<p>***** SQL QUERY: " temp-sql-query)
(query temp-sql-query) ; actually run the query against the database
(delete 'D2) ; we're done, so delete all symbols in the DB context.
)) ; --- end temporary debugging
)
(define-macro (delete-record)
(set 'temp-table-name (first (args)))
(set 'temp-record-values nil)
(dolist (s (rest (args))) (push (eval s) temp-record-values -1)) ; only one value for NOW...
(sym (first (rest (args))) 'DB) ; put the second argument (for now) into a symbol in the DB context
; this will have to be in a dolist loop of (rest (args)) when I add more
(set 'temp-sql-query (string "DELETE FROM " temp-table-name " WHERE "))
(dolist (d (symbols DB)) (extend temp-sql-query (rest (rest (rest (string d))))))
(extend temp-sql-query "=")
; why am I doing a loop here? There should be only one value, right? But maybe for future extension...
(dolist (q temp-record-values)
(if (string? q) (extend temp-sql-query "'")) ; only quote if value is non-numeric
(extend temp-sql-query (string (safe-for-sql q)))
(if (string? q) (extend temp-sql-query "'"))) ; close quote if value is non-numeric
(extend temp-sql-query ";")
;(displayln "TEMP-DELETE-QUERY: " temp-sql-query)
(query temp-sql-query)
(delete 'DB) ; we're done, so delete all symbols in the DB context.
)
(define-macro (get-record)
(set 'temp-table-name (first (args)))
; if you have more arguments than just the table name, they become the elements of the WHERE clause
(if (> (length (args)) 1) (begin
(set 'temp-record-values nil)
(dolist (s (rest (args))) (push (eval s) temp-record-values -1)) ; only one value for NOW...
(sym (first (rest (args))) 'DB) ; put the second argument (for now) into a symbol in the DB context
; this will have to be in a dolist loop of (rest (args)) when I add more
(set 'temp-sql-query (string "SELECT * FROM " temp-table-name " WHERE "))
(dolist (d (symbols DB)) (extend temp-sql-query (rest (rest (rest (string d))))))
(extend temp-sql-query "=")
; why am I doing a loop here? There should be only one value, right? But maybe for future extension...
(dolist (q temp-record-values)
(if (string? q) (extend temp-sql-query "'")) ; only quote if value is non-numeric
(extend temp-sql-query (string (safe-for-sql q)))
(if (string? q) (extend temp-sql-query "'"))) ; close quote if value is non-numeric
(extend temp-sql-query ";")
)
; otherwise, just get everything in that table
(set 'temp-sql-query (string "SELECT * FROM " temp-table-name ";"))
)
;(displayln "TEMP-GET-QUERY: " temp-sql-query)
(delete 'DB) ; we're done, so delete all symbols in the DB context.
(set 'return-value (query temp-sql-query)) ; this returns a list of everything in the record
)
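; Illustrative usage sketch (not part of the original script), assuming symbols
; such as Id, IP and Result have already been set:
;   (get-record "Logs" IP)           ; -> SELECT * FROM Logs WHERE IP='...';
;   (update-record "Logs" Id Result) ; -> UPDATE Logs SET Result='...' WHERE Id=...;
;   (delete-record "Logs" Id)        ; -> DELETE FROM Logs WHERE Id=...;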
; END FUNCTIONS ===================
(open-database "SERVER-LOGS")
(query "CREATE TABLE Logs (Id INTEGER PRIMARY KEY, IP TEXT, UserId TEXT, UserName TEXT, Date DATE, Request TEXT, Result TEXT, Size INTEGER, Referrer TEXT, UserAgent TEXT)")
;(print (query "SELECT * from SQLITE_MASTER;"))
(set 'access-log (read-file "/var/log/apache2/access.log"))
(set 'access-list (parse access-log "\n"))
(set 'max-items (integer (first (first (query "select count(*) from Logs")))))
(println "Number of items in database: " max-items)
(println "Number of lines in log: " (length access-list))
(dolist (line access-list)
(set 'line-list (parse line))
;(println "Line# " $idx " - " line-list)
;(println "Length of line: " (length line-list))
(if (> (length line-list) 0) (begin
(++ max-items)
(set 'Id max-items) (print $idx "/" (length access-list))
(set 'IP (string (line-list 0) (line-list 1) (line-list 2)))
(set 'UserId (line-list 3))
(set 'UserName (line-list 4))
(set 'Date (line-list 5))
(set 'Date (trim Date "["))
(set 'Date (trim Date "]"))
;(println "DATE: " Date)
(set 'date-parsed (date-parse Date "%d/%b/%Y:%H:%M:%S -0700"))
;(println "DATE-PARSED: " date-parsed)
(set 'Date (date date-parsed 0 "%Y-%m-%dT%H:%M:%S"))
(println " " Date)
(set 'Request (line-list 6))
(set 'Result (line-list 7))
(set 'Size (line-list 8))
(set 'Referrer (line-list 9))
(set 'UserAgent (line-list 10))
(create-record "Logs" Id IP UserId UserName Date Request Result Size Referrer UserAgent)
))
)
(close-database)
(exit)

798
samples/Nit/file.nit Normal file
View File

@@ -0,0 +1,798 @@
# This file is part of NIT ( http://www.nitlanguage.org ).
#
# Copyright 2004-2008 Jean Privat <jean@pryen.org>
# Copyright 2008 Floréal Morandat <morandat@lirmm.fr>
# Copyright 2008 Jean-Sébastien Gélinas <calestar@gmail.com>
#
# This file is free software, which comes along with NIT. This software is
# distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
# without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE. You can modify it as you want, provided this header
# is kept unaltered, and a notification of the changes is added.
# You are allowed to redistribute it and sell it, alone or as a part of
# another product.
# File manipulations (create, read, write, etc.)
module file
intrude import stream
intrude import ropes
import string_search
import time
in "C Header" `{
#include <dirent.h>
#include <string.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <unistd.h>
#include <stdio.h>
#include <poll.h>
#include <errno.h>
`}
# File Abstract Stream
abstract class FStream
super IOS
# The path of the file.
var path: nullable String = null
# The FILE *.
private var file: nullable NativeFile = null
fun file_stat: FileStat do return _file.file_stat
# File descriptor of this file
fun fd: Int do return _file.fileno
end
# File input stream
class IFStream
super FStream
super BufferedIStream
super PollableIStream
# Misc
# Open the same file again.
# The original path is reused, therefore the reopened file can be a different file.
fun reopen
do
if not eof and not _file.address_is_null then close
_file = new NativeFile.io_open_read(path.to_cstring)
if _file.address_is_null then
last_error = new IOError("Error: Opening file at '{path.as(not null)}' failed with '{sys.errno.strerror}'")
end_reached = true
return
end
end_reached = false
_buffer_pos = 0
_buffer.clear
end
redef fun close
do
if _file.address_is_null then return
var i = _file.io_close
_buffer.clear
end_reached = true
end
redef fun fill_buffer
do
var nb = _file.io_read(_buffer.items, _buffer.capacity)
if nb <= 0 then
end_reached = true
nb = 0
end
_buffer.length = nb
_buffer_pos = 0
end
# End of file?
redef var end_reached: Bool = false
# Open the file at `path` for reading.
init open(path: String)
do
self.path = path
prepare_buffer(10)
_file = new NativeFile.io_open_read(path.to_cstring)
if _file.address_is_null then
last_error = new IOError("Error: Opening file at '{path}' failed with '{sys.errno.strerror}'")
end_reached = true
end
end
init from_fd(fd: Int) do
self.path = ""
prepare_buffer(10)
_file = fd_to_stream(fd, read_only)
if _file.address_is_null then
last_error = new IOError("Error: Converting fd {fd} to stream failed with '{sys.errno.strerror}'")
end_reached = true
end
end
end
# File output stream
class OFStream
super FStream
super OStream
redef fun write(s)
do
if last_error != null then return
if not _is_writable then
last_error = new IOError("Cannot write to non-writable stream")
return
end
if s isa FlatText then
write_native(s.to_cstring, s.length)
else
for i in s.substrings do write_native(i.to_cstring, i.length)
end
end
redef fun close
do
if _file.address_is_null then
last_error = new IOError("Cannot close non-existing write stream")
_is_writable = false
return
end
var i = _file.io_close
_is_writable = false
end
redef var is_writable = false
# Write `len` bytes from `native`.
private fun write_native(native: NativeString, len: Int)
do
if last_error != null then return
if not _is_writable then
last_error = new IOError("Cannot write to non-writable stream")
return
end
if _file.address_is_null then
last_error = new IOError("Writing on a null stream")
_is_writable = false
return
end
var err = _file.io_write(native, len)
if err != len then
# Big problem
last_error = new IOError("Problem in writing : {err} {len} \n")
end
end
# Open the file at `path` for writing.
init open(path: String)
do
_file = new NativeFile.io_open_write(path.to_cstring)
if _file.address_is_null then
last_error = new IOError("Error: Opening file at '{path}' failed with '{sys.errno.strerror}'")
self.path = path
is_writable = false
return
end
self.path = path
_is_writable = true
end
# Creates a new File stream from a file descriptor
init from_fd(fd: Int) do
self.path = ""
_file = fd_to_stream(fd, wipe_write)
_is_writable = true
if _file.address_is_null then
last_error = new IOError("Error: Opening stream from file descriptor {fd} failed with '{sys.errno.strerror}'")
_is_writable = false
end
end
end
redef interface Object
private fun read_only: NativeString do return "r".to_cstring
private fun wipe_write: NativeString do return "w".to_cstring
private fun fd_to_stream(fd: Int, mode: NativeString): NativeFile `{
return fdopen(fd, mode);
`}
# returns first available stream to read or write to
# return null on interruption (possibly a signal)
protected fun poll( streams : Sequence[FStream] ) : nullable FStream
do
var in_fds = new Array[Int]
var out_fds = new Array[Int]
var fd_to_stream = new HashMap[Int,FStream]
for s in streams do
var fd = s.fd
if s isa IFStream then in_fds.add( fd )
if s isa OFStream then out_fds.add( fd )
fd_to_stream[fd] = s
end
var polled_fd = intern_poll( in_fds, out_fds )
if polled_fd == null then
return null
else
return fd_to_stream[polled_fd]
end
end
private fun intern_poll(in_fds: Array[Int], out_fds: Array[Int]) : nullable Int is extern import Array[Int].length, Array[Int].[], Int.as(nullable Int) `{
int in_len, out_len, total_len;
struct pollfd *c_fds;
sigset_t sigmask;
int i;
int first_polled_fd = -1;
int result;
in_len = Array_of_Int_length( in_fds );
out_len = Array_of_Int_length( out_fds );
total_len = in_len + out_len;
c_fds = malloc( sizeof(struct pollfd) * total_len );
/* input streams */
for ( i=0; i<in_len; i ++ ) {
int fd;
fd = Array_of_Int__index( in_fds, i );
c_fds[i].fd = fd;
c_fds[i].events = POLLIN;
}
/* output streams */
for ( i=0; i<out_len; i ++ ) {
int fd;
fd = Array_of_Int__index( out_fds, i );
c_fds[i].fd = fd;
c_fds[i].events = POLLOUT;
}
/* poll all fds, unlimited timeout */
result = poll( c_fds, total_len, -1 );
if ( result > 0 ) {
/* analyse results */
for ( i=0; i<total_len; i++ )
if ( c_fds[i].revents & c_fds[i].events || /* awaited event */
c_fds[i].revents & POLLHUP ) /* closed */
{
first_polled_fd = c_fds[i].fd;
break;
}
return Int_as_nullable( first_polled_fd );
}
else if ( result < 0 )
fprintf( stderr, "Error in Stream:poll: %s\n", strerror( errno ) );
return null_Int();
`}
end
###############################################################################
class Stdin
super IFStream
init do
_file = new NativeFile.native_stdin
path = "/dev/stdin"
prepare_buffer(1)
end
redef fun poll_in: Bool is extern "file_stdin_poll_in"
end
class Stdout
super OFStream
init do
_file = new NativeFile.native_stdout
path = "/dev/stdout"
_is_writable = true
end
end
class Stderr
super OFStream
init do
_file = new NativeFile.native_stderr
path = "/dev/stderr"
_is_writable = true
end
end
###############################################################################
redef class Streamable
# Like `write_to` but take care of creating the file
fun write_to_file(filepath: String)
do
var stream = new OFStream.open(filepath)
write_to(stream)
stream.close
end
end
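# Illustrative usage of `write_to_file` (a sketch, not part of the library;
# assumes the receiver is a `String`, which is a `Streamable`):
#
#     "hello world".write_to_file("/tmp/hello.txt")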
redef class String
# Return true if a file with this name exists
fun file_exists: Bool do return to_cstring.file_exists
# The status of a file. see POSIX stat(2).
fun file_stat: FileStat do return to_cstring.file_stat
# The status of a file or of a symlink. see POSIX lstat(2).
fun file_lstat: FileStat do return to_cstring.file_lstat
# Remove a file, return true if success
fun file_delete: Bool do return to_cstring.file_delete
# Copy content of file at `self` to `dest`
fun file_copy_to(dest: String)
do
var input = new IFStream.open(self)
var output = new OFStream.open(dest)
while not input.eof do
var buffer = input.read(1024)
output.write buffer
end
input.close
output.close
end
# Remove the trailing extension `ext`.
#
# `ext` usually starts with a dot but could be anything.
#
# assert "file.txt".strip_extension(".txt") == "file"
# assert "file.txt".strip_extension("le.txt") == "fi"
# assert "file.txt".strip_extension("xt") == "file.t"
#
# if `ext` is not present, `self` is returned unmodified.
#
# assert "file.txt".strip_extension(".tar.gz") == "file.txt"
fun strip_extension(ext: String): String
do
if has_suffix(ext) then
return substring(0, length - ext.length)
end
return self
end
# Extract the basename of a path and remove the extension
#
# assert "/path/to/a_file.ext".basename(".ext") == "a_file"
# assert "path/to/a_file.ext".basename(".ext") == "a_file"
# assert "path/to".basename(".ext") == "to"
# assert "path/to/".basename(".ext") == "to"
# assert "path".basename("") == "path"
# assert "/path".basename("") == "path"
# assert "/".basename("") == "/"
# assert "".basename("") == ""
fun basename(ext: String): String
do
var l = length - 1 # Index of the last char
while l > 0 and self.chars[l] == '/' do l -= 1 # remove all trailing `/`
if l == 0 then return "/"
var pos = chars.last_index_of_from('/', l)
var n = self
if pos >= 0 then
n = substring(pos+1, l-pos)
end
return n.strip_extension(ext)
end
# Extract the dirname of a path
#
# assert "/path/to/a_file.ext".dirname == "/path/to"
# assert "path/to/a_file.ext".dirname == "path/to"
# assert "path/to".dirname == "path"
# assert "path/to/".dirname == "path"
# assert "path".dirname == "."
# assert "/path".dirname == "/"
# assert "/".dirname == "/"
# assert "".dirname == "."
fun dirname: String
do
var l = length - 1 # Index of the last char
while l > 0 and self.chars[l] == '/' do l -= 1 # remove all trailing `/`
var pos = chars.last_index_of_from('/', l)
if pos > 0 then
return substring(0, pos)
else if pos == 0 then
return "/"
else
return "."
end
end
# Return the canonicalized absolute pathname (see POSIX function `realpath`)
fun realpath: String do
var cs = to_cstring.file_realpath
var res = cs.to_s_with_copy
# cs.free_malloc # FIXME memory leak
return res
end
# Simplify a file path by removing useless ".", removing "//", and resolving ".."
# ".." are not resolved if they start the path
# starting "/" is not removed
# trailing "/" is removed
#
# Note that the method only works on the string:
# * no I/O access is performed
# * the validity of the path is not checked
#
# assert "some/./complex/../../path/from/../to/a////file//".simplify_path == "path/to/a/file"
# assert "../dir/file".simplify_path == "../dir/file"
# assert "dir/../../".simplify_path == ".."
# assert "dir/..".simplify_path == "."
# assert "//absolute//path/".simplify_path == "/absolute/path"
# assert "//absolute//../".simplify_path == "/"
fun simplify_path: String
do
var a = self.split_with("/")
var a2 = new Array[String]
for x in a do
if x == "." then continue
if x == "" and not a2.is_empty then continue
if x == ".." and not a2.is_empty and a2.last != ".." then
a2.pop
continue
end
a2.push(x)
end
if a2.is_empty then return "."
if a2.length == 1 and a2.first == "" then return "/"
return a2.join("/")
end
# Correctly join two path using the directory separator.
#
# Using a standard "{self}/{path}" does not work in the following cases:
#
# * `self` is empty.
# * `path` ends with `'/'`.
# * `path` starts with `'/'`.
#
# This method ensures that the join is valid.
#
# assert "hello".join_path("world") == "hello/world"
# assert "hel/lo".join_path("wor/ld") == "hel/lo/wor/ld"
# assert "".join_path("world") == "world"
# assert "hello".join_path("/world") == "/world"
# assert "hello/".join_path("world") == "hello/world"
# assert "hello/".join_path("/world") == "/world"
#
# Note: You may want to use `simplify_path` on the result.
#
# Note: This method works only with POSIX paths.
fun join_path(path: String): String
do
if path.is_empty then return self
if self.is_empty then return path
if path.chars[0] == '/' then return path
if self.last == '/' then return "{self}{path}"
return "{self}/{path}"
end
# Convert the path (`self`) to a program name.
#
# Ensure the path (`self`) will be treated as-is by POSIX shells when it is
# used as a program name. In order to do that, prepend `./` if needed.
#
# assert "foo".to_program_name == "./foo"
# assert "/foo".to_program_name == "/foo"
# assert "".to_program_name == "./" # At least, your shell will detect the error.
fun to_program_name: String do
if self.has_prefix("/") then
return self
else
return "./{self}"
end
end
# Alias for `join_path`
#
# assert "hello" / "world" == "hello/world"
# assert "hel/lo" / "wor/ld" == "hel/lo/wor/ld"
# assert "" / "world" == "world"
# assert "/hello" / "/world" == "/world"
#
# This operator is quite useful for chaining changes of path.
# The next one being relative to the previous one.
#
# var a = "foo"
# var b = "/bar"
# var c = "baz/foobar"
# assert a/b/c == "/bar/baz/foobar"
fun /(path: String): String do return join_path(path)
# Returns the relative path needed to go from `self` to `dest`.
#
# assert "/foo/bar".relpath("/foo/baz") == "../baz"
# assert "/foo/bar".relpath("/baz/bar") == "../../baz/bar"
#
# If `self` or `dest` is relative, it is considered relative to `getcwd`.
#
# In some cases, the result is still independent of the current directory:
#
# assert "foo/bar".relpath("..") == "../../.."
#
# In other cases, parts of the current directory may be exhibited:
#
# var p = "../foo/bar".relpath("baz")
# var c = getcwd.basename("")
# assert p == "../../{c}/baz"
#
# For path resolution independent of the current directory (eg. for paths in URL),
# or to use an other starting directory than the current directory,
# just force absolute paths:
#
# var start = "/a/b/c/d"
# var p2 = (start/"../foo/bar").relpath(start/"baz")
# assert p2 == "../../d/baz"
#
#
# Neither `self` nor `dest` has to be a real path or to exist in a directory, since
# the resolution is only done with string manipulations and without any access to
# the underlying file system.
#
# If `self` and `dest` are the same directory, the empty string is returned:
#
# assert "foo".relpath("foo") == ""
# assert "foo/../bar".relpath("bar") == ""
#
# The empty string and "." designate both the current directory:
#
# assert "".relpath("foo/bar") == "foo/bar"
# assert ".".relpath("foo/bar") == "foo/bar"
# assert "foo/bar".relpath("") == "../.."
# assert "/" + "/".relpath(".") == getcwd
fun relpath(dest: String): String
do
var cwd = getcwd
var from = (cwd/self).simplify_path.split("/")
if from.last.is_empty then from.pop # case for the root directory
var to = (cwd/dest).simplify_path.split("/")
if to.last.is_empty then to.pop # case for the root directory
# Remove common prefixes
while not from.is_empty and not to.is_empty and from.first == to.first do
from.shift
to.shift
end
# Result is going up in `from` with ".." then going down following `to`
var from_len = from.length
if from_len == 0 then return to.join("/")
var up = "../"*(from_len-1) + ".."
if to.is_empty then return up
var res = up + "/" + to.join("/")
return res
end
# Create a directory (and all intermediate directories if needed)
fun mkdir
do
var dirs = self.split_with("/")
var path = new FlatBuffer
if dirs.is_empty then return
if dirs[0].is_empty then
# it was a starting /
path.add('/')
end
for d in dirs do
if d.is_empty then continue
path.append(d)
path.add('/')
path.to_s.to_cstring.file_mkdir
end
end
# Delete a directory and all of its content, return `true` on success
#
# Does not go through symbolic links and may get stuck in a cycle if there
# is a cycle in the filesystem.
fun rmdir: Bool
do
var ok = true
for file in self.files do
var file_path = self.join_path(file)
var stat = file_path.file_lstat
if stat.is_dir then
ok = file_path.rmdir and ok
else
ok = file_path.file_delete and ok
end
stat.free
end
# Delete the directory itself
if ok then to_cstring.rmdir
return ok
end
# Change the current working directory
#
# "/etc".chdir
# assert getcwd == "/etc"
# "..".chdir
# assert getcwd == "/"
#
# TODO: errno
fun chdir do to_cstring.file_chdir
# Return right-most extension (without the dot)
#
# Only the last extension is returned.
# There is no special case for combined extensions.
#
# assert "file.txt".file_extension == "txt"
# assert "file.tar.gz".file_extension == "gz"
#
# For file without extension, `null` is returned.
# However, for a trailing dot, `""` is returned.
#
# assert "file".file_extension == null
# assert "file.".file_extension == ""
#
# The starting dot of hidden files is never considered.
#
# assert ".file.txt".file_extension == "txt"
# assert ".file".file_extension == null
fun file_extension: nullable String
do
var last_slash = chars.last_index_of('.')
if last_slash > 0 then
return substring( last_slash+1, length )
else
return null
end
end
# returns files contained within the directory represented by self
fun files : Set[ String ] is extern import HashSet[String], HashSet[String].add, NativeString.to_s, String.to_cstring, HashSet[String].as(Set[String]) `{
char *dir_path;
DIR *dir;
dir_path = String_to_cstring( recv );
if ((dir = opendir(dir_path)) == NULL)
{
perror( dir_path );
exit( 1 );
}
else
{
HashSet_of_String results;
String file_name;
struct dirent *de;
results = new_HashSet_of_String();
while ( ( de = readdir( dir ) ) != NULL )
if ( strcmp( de->d_name, ".." ) != 0 &&
strcmp( de->d_name, "." ) != 0 )
{
file_name = NativeString_to_s( strdup( de->d_name ) );
HashSet_of_String_add( results, file_name );
}
closedir( dir );
return HashSet_of_String_as_Set_of_String( results );
}
`}
end
redef class NativeString
private fun file_exists: Bool is extern "string_NativeString_NativeString_file_exists_0"
private fun file_stat: FileStat is extern "string_NativeString_NativeString_file_stat_0"
private fun file_lstat: FileStat `{
struct stat* stat_element;
int res;
stat_element = malloc(sizeof(struct stat));
res = lstat(recv, stat_element);
if (res == -1) return NULL;
return stat_element;
`}
private fun file_mkdir: Bool is extern "string_NativeString_NativeString_file_mkdir_0"
private fun rmdir: Bool `{ return rmdir(recv); `}
private fun file_delete: Bool is extern "string_NativeString_NativeString_file_delete_0"
private fun file_chdir is extern "string_NativeString_NativeString_file_chdir_0"
private fun file_realpath: NativeString is extern "file_NativeString_realpath"
end
# This class is system dependent ... must reify the vfs
extern class FileStat `{ struct stat * `}
# Returns the permission bits of file
fun mode: Int is extern "file_FileStat_FileStat_mode_0"
# Returns the last access time
fun atime: Int is extern "file_FileStat_FileStat_atime_0"
# Returns the last status change time
fun ctime: Int is extern "file_FileStat_FileStat_ctime_0"
# Returns the last modification time
fun mtime: Int is extern "file_FileStat_FileStat_mtime_0"
# Returns the size
fun size: Int is extern "file_FileStat_FileStat_size_0"
# Returns true if it is a regular file (not a device file, pipe, socket, ...)
fun is_reg: Bool `{ return S_ISREG(recv->st_mode); `}
# Returns true if it is a directory
fun is_dir: Bool `{ return S_ISDIR(recv->st_mode); `}
# Returns true if it is a character device
fun is_chr: Bool `{ return S_ISCHR(recv->st_mode); `}
# Returns true if it is a block device
fun is_blk: Bool `{ return S_ISBLK(recv->st_mode); `}
# Returns true if the type is fifo
fun is_fifo: Bool `{ return S_ISFIFO(recv->st_mode); `}
# Returns true if the type is a link
fun is_lnk: Bool `{ return S_ISLNK(recv->st_mode); `}
# Returns true if the type is a socket
fun is_sock: Bool `{ return S_ISSOCK(recv->st_mode); `}
end
# Instance of this class are standard FILE * pointers
private extern class NativeFile `{ FILE* `}
fun io_read(buf: NativeString, len: Int): Int is extern "file_NativeFile_NativeFile_io_read_2"
fun io_write(buf: NativeString, len: Int): Int is extern "file_NativeFile_NativeFile_io_write_2"
fun io_close: Int is extern "file_NativeFile_NativeFile_io_close_0"
fun file_stat: FileStat is extern "file_NativeFile_NativeFile_file_stat_0"
fun fileno: Int `{ return fileno(recv); `}
new io_open_read(path: NativeString) is extern "file_NativeFileCapable_NativeFileCapable_io_open_read_1"
new io_open_write(path: NativeString) is extern "file_NativeFileCapable_NativeFileCapable_io_open_write_1"
new native_stdin is extern "file_NativeFileCapable_NativeFileCapable_native_stdin_0"
new native_stdout is extern "file_NativeFileCapable_NativeFileCapable_native_stdout_0"
new native_stderr is extern "file_NativeFileCapable_NativeFileCapable_native_stderr_0"
end
redef class Sys
# Standard input
var stdin: PollableIStream = new Stdin is protected writable
# Standard output
var stdout: OStream = new Stdout is protected writable
# Standard output for errors
var stderr: OStream = new Stderr is protected writable
end
# Print `objects` on the standard output (`stdout`).
protected fun printn(objects: Object...)
do
sys.stdout.write(objects.to_s)
end
# Print an `object` on the standard output (`stdout`) and add a newline.
protected fun print(object: Object)
do
sys.stdout.write(object.to_s)
sys.stdout.write("\n")
end
# Read a character from the standard input (`stdin`).
protected fun getc: Char
do
return sys.stdin.read_char.ascii
end
# Read a line from the standard input (`stdin`).
protected fun gets: String
do
return sys.stdin.read_line
end
# Return the working (current) directory
protected fun getcwd: String do return file_getcwd.to_s
private fun file_getcwd: NativeString is extern "string_NativeString_NativeString_file_getcwd_0"

376
samples/Nit/meetup.nit Normal file
View File

@@ -0,0 +1,376 @@
# This file is part of NIT ( http://www.nitlanguage.org ).
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License
# Shows a meetup and allows modifying its participants
module meetup
import opportunity_model
import boilerplate
import welcome
import template
# Shows a meetup and allows modifying its participants
class OpportunityMeetupPage
super OpportunityPage
# Meetup the page is supposed to show
var meetup: nullable Meetup = null
# Answer mode for the meetup
var mode = 0
init from_id(id: String) do
var db = new OpportunityDB.open("opportunity")
meetup = db.find_meetup_by_id(id)
db.close
if meetup != null then mode = meetup.answer_mode
init
end
init do
header.page_js = "mode = {mode};\n"
header.page_js += """
function update_scores(){
var anss = $('.answer');
var count = {};
var scores = {};
var answers = [];
var maxscore = 0;
for(i=0; i < anss.length; i++){
var incscore = 0;
var inccount = 0;
var idparts = anss[i].id.split("_");
var ansid = idparts[1];
var html = anss[i].innerHTML;
if(html === "<center>✔</center>"){
inccount = 1;
incscore = 2;
}else if(html === "<center>❓</center>"){
incscore = 1;
}
var intansid = parseInt(ansid)
if(answers.indexOf(intansid) == -1){
answers.push(intansid);
}
if(ansid in count){
count[ansid] += inccount;
}else{
count[ansid] = inccount;
}
if(ansid in scores){
scores[ansid] += incscore;
}else{
scores[ansid] = incscore;
}
if(scores[ansid] > maxscore){
maxscore = scores[ansid];
}
}
for(i=0; i < answers.length; i++){
var ansid = answers[i].toString();
var el = $('#total'+ansid)[0];
var ins = "<center>"+count[ansid];
if(scores[ansid] >= maxscore){
ins += "<br/><span style=\\"color:blue\\">★</span>";
}
ins += "</center>";
el.innerHTML = ins;
}
}
function change_answer(ele, id){
// modify only the currently selected entry
if (in_modification_id != id) return;
var e = document.getElementById(ele.id);
var i = e.innerHTML;
var ans = true;"""
if mode == 0 then
header.page_js += """
if(i === "<center>✔</center>"){
ans = 0;
e.innerHTML = "<center>✘</center>"
e.style.color = "red";
}else{
ans = 1;
e.innerHTML = "<center>✔</center>";
e.style.color = "green";
}"""
else
header.page_js += """
if(i === "<center>✔</center>"){
ans = 1;
e.innerHTML = "<center>❓</center>"
e.style.color = "#B8860B";
}else if(i === "<center>❓</center>"){
ans = 0;
e.innerHTML = "<center>✘</center>"
e.style.color = "red";
}else{
ans = 2;
e.innerHTML = "<center>✔</center>";
e.style.color = "green";
}"""
end
header.page_js += """
var a = ele.id.split('_')
var pid = a[1]
var aid = a[2]
update_scores();
$.ajax({
type: "POST",
url: "./rest/answer",
data: {
answer_id: aid,
pers_id: pid,
answer: ans
}
});
}
function change_temp_answer(ele){
var e = document.getElementById(ele.id);
var i = e.innerHTML;"""
if mode == 0 then
header.page_js += """
if(i === "<center>✔</center>"){
e.innerHTML = "<center>✘</center>"
e.style.color = "red";
}else{
e.innerHTML = "<center>✔</center>";
e.style.color = "green";
}
"""
else
header.page_js += """
if(i === "<center>✔</center>"){
e.innerHTML = "<center>❓</center>";
e.style.color = "#B8860B";
}else if(i === "<center>❓</center>"){
e.innerHTML = "<center>✘</center>"
e.style.color = "red";
}else{
e.innerHTML = "<center>✔</center>";
e.style.color = "green";
}
"""
end
header.page_js += """
update_scores();
}
function add_part(ele){
var e = document.getElementById(ele.id);
var pname = document.getElementById("new_name").value;
var arr = e.id.split("_");
var mid = arr[1];
var ans = $('#' + ele.id).parent().parent().parent().children(".answer");
ansmap = {};
for(i=0;i<ans.length;i++){
var curr = ans.eq(i)
"""
if mode == 0 then
header.page_js += """
if(curr[0].innerHTML === "<center>✔</center>"){
ansmap[curr.attr('id')] = 1
}else{
ansmap[curr.attr('id')] = 0
}"""
else
header.page_js += """
if(curr[0].innerHTML === "<center>✔</center>"){
ansmap[curr.attr('id')] = 2
}else if(curr[0].innerHTML === "<center>❓</center>"){
ansmap[curr.attr('id')] = 1
}else{
ansmap[curr.attr('id')] = 0
}"""
end
header.page_js += """
}
$.ajax({
type: "POST",
url: "./rest/meetup/new_pers",
data: {
meetup_id: mid,
persname: pname,
answers: $.param(ansmap)
}
})
.done(function(data){
location.reload();
})
.fail(function(data){
//TODO: Notify of failure
});
}
function remove_people(ele){
var arr = ele.id.split("_")
var pid = arr[1]
$('#' + ele.id).parent().parent().parent().remove();
update_scores();
$.ajax({
type: "POST",
url: "./rest/people",
data: {
method: "DELETE",
p_id: pid
}
});
}
// ID of line currently open for modification
var in_modification_id = null;
function modify_people(ele, id){
if (in_modification_id != null) {
// reset to normal values
$('#modify_'+in_modification_id).text("Modify or delete");
$('#modify_'+in_modification_id).attr("class", "btn btn-xs btn-warning");
$('#line_'+in_modification_id).css("background-color", "");
$('#delete_'+in_modification_id).css("display", "none");
}
if (in_modification_id != id) {
// activate modifiable mode
$('#modify_'+id).text("Done");
$('#modify_'+id).attr("class", "btn btn-xs btn-success");
$('#line_'+id).css("background-color", "LightYellow");
$('#delete_'+id).show();
in_modification_id = id;
} else {
in_modification_id = null;
}
}
"""
end
redef fun rendering do
if meetup == null then
add((new OpportunityHomePage).write_to_string)
return
end
add header
var db = new OpportunityDB.open("opportunity")
add meetup.to_html(db)
db.close
add footer
end
end
redef class Meetup
# Build the HTML for `self`
fun to_html(db: OpportunityDB): Streamable do
var t = new Template
t.add """
<div class="container">
<div class="page-header">
<center><h1>{{{name}}}</h1></center>
"""
if not date.is_empty then t.add """
<center><h4>When: {{{date}}}</h4></center>"""
if not place.is_empty then t.add """
<center><h4>Where: {{{place}}}</h4></center>"""
t.add """
</div>
<table class="table">
"""
t.add "<th>Participant name</th>"
for i in answers(db) do
t.add "<th class=\"text-center\">"
t.add i.to_s
t.add "</th>"
end
t.add "<th></th>"
t.add "</tr>"
for i in participants(db) do
i.load_answers(db, self)
t.add "<tr id=\"line_{i.id}\">"
t.add "<td>"
t.add i.to_s
t.add "</td>"
for j, k in i.answers do
var color
if answer_mode == 0 then
if k == 1 then
color = "green"
else
color = "red"
end
else
if k == 2 then
color = "green"
else if k == 1 then
color = "#B8860B"
else
color = "red"
end
end
t.add """<td class="answer" onclick="change_answer(this, {{{i.id}}})" id="answer_{{{j.id}}}_{{{i.id}}}" style="color:{{{color}}}">"""
t.add "<center>"
if answer_mode == 0 then
if k == 1 then
t.add "✔"
else
t.add "✘"
end
else
if k == 2 then
t.add "✔"
else if k == 1 then
t.add "❓"
else
t.add "✘"
end
end
t.add "</center></td>"
end
t.add """<td class="opportunity-action"><center><button class="btn btn-xs btn-warning" type="button" onclick="modify_people(this, {{{i.id}}})" id="modify_{{{i.id}}}">Modify or delete</button>&nbsp;"""
t.add """<button class="btn btn-xs btn-danger" type="button" onclick="remove_people(this)" id="delete_{{{i.id}}}" style="display: none;">Delete</button></center></td>"""
t.add "</tr>"
end
t.add """
<tr id="newrow" style="background-color: LightYellow">
<td><input id="new_name" type="text" placeholder="Your name" class="input-large"></td>
"""
for i in answers(db) do
t.add "<td class=\"answer\" id=\"newans_{i.id}\" onclick=\"change_temp_answer(this)\" style=\"color:red;\"><center>✘</center></td>"
end
t.add """
<td><center><span id="add_{{{id}}}" onclick="add_part(this)" style="color:green;" class="action"><button class="btn btn-xs btn-success" type="button">Done</button></span></center></td>"""
t.add "</tr>"
# Compute score for each answer
var scores = new HashMap[Int, Int]
var maxsc = 0
for i in answers(db) do
scores[i.id] = i.score(db)
if scores[i.id] > maxsc then maxsc = scores[i.id]
end
t.add """
<tr id="total">
<th>Total</th>
"""
for i in answers(db) do
t.add """<th id="total{{{i.id}}}"><center>{{{i.count(db)}}}"""
if scores.has_key(i.id) and scores[i.id] >= maxsc then
t.add """<br/><span style="color:blue">★</span>"""
end
t.add "</center></th>"
end
t.add "</th>"
t.add """
<th></th>
</tr>"""
t.add "</table>"
t.add "</div>"
return t
end
end

121
samples/Pascal/cwindirs.pp Normal file
View File

@@ -0,0 +1,121 @@
unit cwindirs;
interface
uses
windows,
strings;
Const
CSIDL_PROGRAMS = $0002;
CSIDL_PERSONAL = $0005;
CSIDL_FAVORITES = $0006;
CSIDL_STARTUP = $0007;
CSIDL_RECENT = $0008;
CSIDL_SENDTO = $0009;
CSIDL_STARTMENU = $000B;
CSIDL_MYMUSIC = $000D;
CSIDL_MYVIDEO = $000E;
CSIDL_DESKTOPDIRECTORY = $0010;
CSIDL_NETHOOD = $0013;
CSIDL_TEMPLATES = $0015;
CSIDL_COMMON_STARTMENU = $0016;
CSIDL_COMMON_PROGRAMS = $0017;
CSIDL_COMMON_STARTUP = $0018;
CSIDL_COMMON_DESKTOPDIRECTORY = $0019;
CSIDL_APPDATA = $001A;
CSIDL_PRINTHOOD = $001B;
CSIDL_LOCAL_APPDATA = $001C;
CSIDL_COMMON_FAVORITES = $001F;
CSIDL_INTERNET_CACHE = $0020;
CSIDL_COOKIES = $0021;
CSIDL_HISTORY = $0022;
CSIDL_COMMON_APPDATA = $0023;
CSIDL_WINDOWS = $0024;
CSIDL_SYSTEM = $0025;
CSIDL_PROGRAM_FILES = $0026;
CSIDL_MYPICTURES = $0027;
CSIDL_PROFILE = $0028;
CSIDL_PROGRAM_FILES_COMMON = $002B;
CSIDL_COMMON_TEMPLATES = $002D;
CSIDL_COMMON_DOCUMENTS = $002E;
CSIDL_COMMON_ADMINTOOLS = $002F;
CSIDL_ADMINTOOLS = $0030;
CSIDL_COMMON_MUSIC = $0035;
CSIDL_COMMON_PICTURES = $0036;
CSIDL_COMMON_VIDEO = $0037;
CSIDL_CDBURN_AREA = $003B;
CSIDL_PROFILES = $003E;
CSIDL_FLAG_CREATE = $8000;
Function GetWindowsSpecialDir(ID : Integer) : String;
implementation
uses
sysutils;
Type
PFNSHGetFolderPath = Function(Ahwnd: HWND; Csidl: Integer; Token: THandle; Flags: DWord; Path: PChar): HRESULT; stdcall;
var
SHGetFolderPath : PFNSHGetFolderPath = Nil;
CFGDLLHandle : THandle = 0;
Procedure InitDLL;
Var
pathBuf: array[0..MAX_PATH-1] of char;
pathLength: Integer;
begin
{ Load shfolder.dll using a full path, in order to prevent spoofing (Mantis #18185)
Don't bother loading shell32.dll because shfolder.dll itself redirects SHGetFolderPath
to shell32.dll whenever possible. }
pathLength:=GetSystemDirectory(pathBuf, MAX_PATH);
if (pathLength>0) and (pathLength<MAX_PATH-14) then
begin
StrLCopy(@pathBuf[pathLength],'\shfolder.dll',MAX_PATH-pathLength-1);
CFGDLLHandle:=LoadLibrary(pathBuf);
if (CFGDLLHandle<>0) then
begin
Pointer(ShGetFolderPath):=GetProcAddress(CFGDLLHandle,'SHGetFolderPathA');
If @ShGetFolderPath=nil then
begin
FreeLibrary(CFGDLLHandle);
CFGDllHandle:=0;
end;
end;
end;
If (@ShGetFolderPath=Nil) then
Raise Exception.Create('Could not determine SHGetFolderPath Function');
end;
Function GetWindowsSpecialDir(ID : Integer) : String;
Var
APath : Array[0..MAX_PATH] of char;
begin
Result:='';
if (CFGDLLHandle=0) then
InitDLL;
If (SHGetFolderPath<>Nil) then
begin
if SHGetFolderPath(0,ID or CSIDL_FLAG_CREATE,0,0,@APATH[0])=S_OK then
Result:=IncludeTrailingPathDelimiter(StrPas(@APath[0]));
end;
end;
Initialization
Finalization
if CFGDLLHandle<>0 then
FreeLibrary(CFGDllHandle);
end.
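{ Illustrative usage sketch (not part of the unit): a separate program that
prints the local application-data directory. CSIDL_LOCAL_APPDATA is one of
the constants declared above; any other CSIDL_* value works the same way. }
program showappdata;
uses cwindirs;
begin
Writeln(GetWindowsSpecialDir(CSIDL_LOCAL_APPDATA));
end.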

View File

@@ -1,51 +0,0 @@
{ $Id$ }
{
---------------------------------------------------------------------------
gtkextra.pp - GTK(2) widgetset - additional gdk/gtk functions
---------------------------------------------------------------------------
This unit contains missing gdk/gtk functions and defines for certain
versions of gtk or fpc.
---------------------------------------------------------------------------
@created(Sun Jan 28th WET 2006)
@lastmod($Date$)
@author(Marc Weustink <marc@@dommelstein.nl>)
*****************************************************************************
This file is part of the Lazarus Component Library (LCL)
See the file COPYING.modifiedLGPL.txt, included in this distribution,
for details about the license.
*****************************************************************************
}
unit GtkExtra;
{$mode objfpc}{$H+}
interface
{$I gtkdefines.inc}
{$ifdef gtk1}
{$I gtk1extrah.inc}
{$endif}
{$ifdef gtk2}
{$I gtk2extrah.inc}
{$endif}
implementation
{$ifdef gtk1}
{$I gtk1extra.inc}
{$endif}
{$ifdef gtk2}
{$I gtk2extra.inc}
{$endif}
end.

22
samples/Pascal/large.pp Normal file
View File

@@ -0,0 +1,22 @@
program large;
const
max = 100000000;
type
tlist = array[1..max] of longint;
var
data : tlist;
i : longint;
begin
i := 1;
while (i <= max) do
begin
data[i] := 0;
Writeln(data[i]);
i := i + 1
end
end.

26
samples/Pascal/tw27294.pp Normal file
View File

@@ -0,0 +1,26 @@
uses
uw27294;
var
p : procedure;
procedure test;
begin
p:=@test;
writeln('OK');
end;
procedure global;
begin
p:=nil;
test;
p();
end;
begin
global;
uw27294.global;
end.

View File

@@ -0,0 +1,6 @@
#!/usr/bin/env pike
int main(int argc, array argv) {
return 0;
}

View File

@@ -0,0 +1,116 @@
#
# Module manifest for module 'ZLocation'
#
# Generated by: sevoroby
#
# Generated on: 12/10/2014
#
@{
# Script module or binary module file associated with this manifest.
RootModule = 'ZLocation.psm1'
# Version number of this module.
ModuleVersion = '0.1'
# ID used to uniquely identify this module
GUID = '18e8ca17-7f67-4f1c-85ff-159373bf66f5'
# Author of this module
Author = 'Sergei Vorobev'
# Company or vendor of this module
CompanyName = 'Microsoft'
# Copyright statement for this module
Copyright = '(c) 2014 Sergei Vorobev. All rights reserved.'
# Description of the functionality provided by this module
# Description = ''
# Minimum version of the Windows PowerShell engine required by this module
# PowerShellVersion = ''
# Name of the Windows PowerShell host required by this module
# PowerShellHostName = ''
# Minimum version of the Windows PowerShell host required by this module
# PowerShellHostVersion = ''
# Minimum version of Microsoft .NET Framework required by this module
# DotNetFrameworkVersion = ''
# Minimum version of the common language runtime (CLR) required by this module
# CLRVersion = ''
# Processor architecture (None, X86, Amd64) required by this module
# ProcessorArchitecture = ''
# Modules that must be imported into the global environment prior to importing this module
# RequiredModules = @()
# Assemblies that must be loaded prior to importing this module
# RequiredAssemblies = @()
# Script files (.ps1) that are run in the caller's environment prior to importing this module.
# ScriptsToProcess = @()
# Type files (.ps1xml) to be loaded when importing this module
# TypesToProcess = @()
# Format files (.ps1xml) to be loaded when importing this module
# FormatsToProcess = @()
# Modules to import as nested modules of the module specified in RootModule/ModuleToProcess
NestedModules = @("ZLocation.Storage.psm1", "ZLocation.Search.psm1")
# Functions to export from this module
FunctionsToExport = '*'
# Cmdlets to export from this module
CmdletsToExport = '*'
# Variables to export from this module
VariablesToExport = '*'
# Aliases to export from this module
AliasesToExport = '*'
# List of all modules packaged with this module
# ModuleList = @()
# List of all files packaged with this module
# FileList = @()
# Private data to pass to the module specified in RootModule/ModuleToProcess. This may also contain a PSData hashtable with additional module metadata used by PowerShell.
PrivateData = @{
PSData = @{
# Tags applied to this module. These help with module discovery in online galleries.
# Tags = @()
# A URL to the license for this module.
# LicenseUri = ''
# A URL to the main website for this project.
# ProjectUri = ''
# A URL to an icon representing this module.
# IconUri = ''
# ReleaseNotes of this module
# ReleaseNotes = ''
} # End of PSData hashtable
} # End of PrivateData hashtable
# HelpInfo URI of this module
# HelpInfoURI = ''
# Default prefix for commands exported from this module. Override the default prefix using Import-Module -Prefix.
# DefaultCommandPrefix = ''
}


@@ -0,0 +1,91 @@
#
# Weight function.
#
function Update-ZLocation([string]$path)
{
$now = [datetime]::Now
if (Test-Path variable:global:__zlocation_current)
{
$prev = $global:__zlocation_current
$weight = $now.Subtract($prev.Time).TotalSeconds
Add-ZWeight ($prev.Location) $weight
}
$global:__zlocation_current = @{
Location = $path
Time = [datetime]::Now
}
# populate the folder immediately after the first cd
Add-ZWeight $path 0
}
# This approach hurts `cd` performance (0.0008 sec vs 0.025 sec).
# Consider replacing it with an OnIdle event.
(Get-Variable pwd).attributes.Add((new-object ValidateScript { Update-ZLocation $_.Path; return $true }))
#
# End of weight function.
#
#
# Tab completion.
#
if (Test-Path Function:\TabExpansion) {
Rename-Item Function:\TabExpansion PreZTabExpansion
}
function Get-EscapedPath
{
param(
[Parameter(
Position=0,
Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true)
]
[string]$path
)
process {
if ($path.Contains(' '))
{
return '"' + $path + '"'
}
return $path
}
}
function global:TabExpansion($line, $lastWord) {
switch -regex ($line) {
"^(Set-ZLocation|z) .*" {
$arguments = $line -split ' ' | Where { $_.length -gt 0 } | select -Skip 1
Find-Matches (Get-ZLocation) $arguments | Get-EscapedPath
}
default {
if (Test-Path Function:\PreZTabExpansion) {
PreZTabExpansion $line $lastWord
}
}
}
}
#
# End of tab completion.
#
function Set-ZLocation()
{
if (-not $args) {
$args = @()
}
$matches = Find-Matches (Get-ZLocation) $args
if ($matches) {
Push-Location ($matches | Select-Object -First 1)
} else {
Write-Warning "Cannot find matching location"
}
}
Set-Alias -Name z -Value Set-ZLocation
Export-ModuleMember -Function Set-ZLocation, Get-ZLocation -Alias z


@@ -1,2 +0,0 @@
# Hello world in powershell
Write-Host 'Hello World'


@@ -1,5 +0,0 @@
# Hello World powershell module
function hello() {
Write-Host 'Hello World'
}


@@ -0,0 +1,65 @@
function Save-HistoryAll() {
$history = Get-History -Count $MaximumHistoryCount
[array]::Reverse($history)
$history = $history | Group CommandLine | Foreach {$_.Group[0]}
[array]::Reverse($history)
$history | Export-Csv $historyPath
}
function Save-HistoryIncremental() {
# Get-History -Count $MaximumHistoryCount | Group CommandLine | Foreach {$_.Group[0]} | Export-Csv $historyPath
Get-History -Count 1 | Export-Csv -Append $historyPath
}
# Hook PowerShell's exiting event & hide the registration with -SupportEvent.
#Register-EngineEvent -SourceIdentifier powershell.exiting -SupportEvent -Action { Save-History }
$oldPrompt = Get-Content function:\prompt
if( $oldPrompt -notlike '*Save-HistoryIncremental*' )
{
$newPrompt = @'
Save-HistoryIncremental
'@
$newPrompt += $oldPrompt
$function:prompt = [ScriptBlock]::Create($newPrompt)
}
# load previous history, if it exists
if ((Test-Path $historyPath)) {
$loadTime =
(
Measure-Command {
Import-Csv $historyPath | Add-History
Save-HistoryAll
Clear-History
Import-Csv $historyPath | ? {$count++;$true} | Add-History
}
).totalseconds
Write-Host -Fore Green "`nLoaded $count history item(s) in $loadTime seconds.`n"
}
function Search-History()
{
<#
.SYNOPSIS
Retrieve and filter history based on a query
.DESCRIPTION
.PARAMETER Name
.EXAMPLE
.LINK
#>
param(
[string[]] $query
)
$history = Get-History -Count $MaximumHistoryCount
foreach ($item in $query){
$item = $item.ToLower()
$history = $history | where {$_.CommandLine.ToLower().Contains($item)}
}
$history
}


@@ -0,0 +1,7 @@
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?name ?email
WHERE {
?person a foaf:Person.
?person foaf:name ?name.
?person foaf:mbox ?email.
}


@@ -0,0 +1,40 @@
PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT DISTINCT ?s ?label
WHERE {
SERVICE <http://api.finto.fi/sparql>
{
SELECT DISTINCT ?s ?label ?plabel ?alabel ?hlabel (GROUP_CONCAT(DISTINCT STR(?type)) as ?types)
WHERE {
GRAPH <http://www.yso.fi/onto/kauno/>
{
?s rdf:type <http://www.w3.org/2004/02/skos/core#Concept>
{
?s rdf:type ?type .
?s ?prop ?match .
FILTER (
strstarts(lcase(str(?match)), "test") && !(?match != ?label && strstarts(lcase(str(?label)), "test"))
)
OPTIONAL {
?s skos:prefLabel ?label .
FILTER (langMatches(lang(?label), "en"))
}
OPTIONAL { # in case previous OPTIONAL block gives no labels
?s ?prop ?match .
?s skos:prefLabel ?label .
FILTER (langMatches(lang(?label), lang(?match))) }
}
FILTER NOT EXISTS { ?s owl:deprecated true }
}
BIND(IF(?prop = skos:prefLabel && ?match != ?label, ?match, "") as ?plabel)
BIND(IF(?prop = skos:altLabel, ?match, "") as ?alabel)
BIND(IF(?prop = skos:hiddenLabel, ?match, "") as ?hlabel)
VALUES (?prop) { (skos:prefLabel) (skos:altLabel) (skos:hiddenLabel) }
}
GROUP BY ?match ?s ?label ?plabel ?alabel ?hlabel ?prop
ORDER BY lcase(str(?match)) lang(?match)
LIMIT 10
}
}

samples/SQL/videodb.cql Normal file

@@ -0,0 +1,85 @@
CREATE KEYSPACE videodb WITH REPLICATION = { 'class' : 'SimpleStrategy', 'replication_factor' : 1 };
use videodb;
// Basic entity table
// Object mapping ?
CREATE TABLE users (
username varchar,
firstname varchar,
lastname varchar,
email varchar,
password varchar,
created_date timestamp,
total_credits int,
credit_change_date timeuuid,
PRIMARY KEY (username)
);
// One-to-many entity table
CREATE TABLE videos (
videoid uuid,
videoname varchar,
username varchar,
description varchar,
tags list<varchar>,
upload_date timestamp,
PRIMARY KEY (videoid)
);
// One-to-many from the user point of view
// Also known as a lookup table
CREATE TABLE username_video_index (
username varchar,
videoid uuid,
upload_date timestamp,
videoname varchar,
PRIMARY KEY (username, videoid)
);
// Counter table
CREATE TABLE video_rating (
videoid uuid,
rating_counter counter,
rating_total counter,
PRIMARY KEY (videoid)
);
// Creating index tables for tag keywords
CREATE TABLE tag_index (
tag varchar,
videoid uuid,
timestamp timestamp,
PRIMARY KEY (tag, videoid)
);
// Comments as a many-to-many
// Looking from the video side to many users
CREATE TABLE comments_by_video (
videoid uuid,
username varchar,
comment_ts timestamp,
comment varchar,
PRIMARY KEY (videoid,comment_ts,username)
) WITH CLUSTERING ORDER BY (comment_ts DESC, username ASC);
// looking from the user side to many videos
CREATE TABLE comments_by_user (
username varchar,
videoid uuid,
comment_ts timestamp,
comment varchar,
PRIMARY KEY (username,comment_ts,videoid)
) WITH CLUSTERING ORDER BY (comment_ts DESC, videoid ASC);
// Time series wide row with reverse comparator
CREATE TABLE video_event (
videoid uuid,
username varchar,
event varchar,
event_timestamp timeuuid,
video_timestamp bigint,
PRIMARY KEY ((videoid,username), event_timestamp,event)
) WITH CLUSTERING ORDER BY (event_timestamp DESC,event ASC);

samples/SQL/videodb.ddl Normal file

@@ -0,0 +1,85 @@
CREATE KEYSPACE videodb WITH REPLICATION = { 'class' : 'SimpleStrategy', 'replication_factor' : 1 };
use videodb;
// Basic entity table
// Object mapping ?
CREATE TABLE users (
username varchar,
firstname varchar,
lastname varchar,
email varchar,
password varchar,
created_date timestamp,
total_credits int,
credit_change_date timeuuid,
PRIMARY KEY (username)
);
// One-to-many entity table
CREATE TABLE videos (
videoid uuid,
videoname varchar,
username varchar,
description varchar,
tags list<varchar>,
upload_date timestamp,
PRIMARY KEY (videoid)
);
// One-to-many from the user point of view
// Also known as a lookup table
CREATE TABLE username_video_index (
username varchar,
videoid uuid,
upload_date timestamp,
videoname varchar,
PRIMARY KEY (username, videoid)
);
// Counter table
CREATE TABLE video_rating (
videoid uuid,
rating_counter counter,
rating_total counter,
PRIMARY KEY (videoid)
);
// Creating index tables for tag keywords
CREATE TABLE tag_index (
tag varchar,
videoid uuid,
timestamp timestamp,
PRIMARY KEY (tag, videoid)
);
// Comments as a many-to-many
// Looking from the video side to many users
CREATE TABLE comments_by_video (
videoid uuid,
username varchar,
comment_ts timestamp,
comment varchar,
PRIMARY KEY (videoid,comment_ts,username)
) WITH CLUSTERING ORDER BY (comment_ts DESC, username ASC);
// looking from the user side to many videos
CREATE TABLE comments_by_user (
username varchar,
videoid uuid,
comment_ts timestamp,
comment varchar,
PRIMARY KEY (username,comment_ts,videoid)
) WITH CLUSTERING ORDER BY (comment_ts DESC, videoid ASC);
// Time series wide row with reverse comparator
CREATE TABLE video_event (
videoid uuid,
username varchar,
event varchar,
event_timestamp timeuuid,
video_timestamp bigint,
PRIMARY KEY ((videoid,username), event_timestamp,event)
) WITH CLUSTERING ORDER BY (event_timestamp DESC,event ASC);


@@ -0,0 +1,136 @@
# -*- coding: utf-8 -*-
#
# Python/Sage functions for working with polynomials in a single
# unknown (x).
#
# Copyright (C) 2014-2015, David Abián <davidabian [at] davidabian.com>
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation, either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
def pols (grado=-1, K=GF(2), mostrar=False):
    """Return the list of constant and non-constant polynomials with monic
    coefficients and degree less than or equal to the specified one.
    If the given degree is not valid, return an empty list.
    """
    lpols = []
    if not grado.is_integer():
        grado = grado.round()
    if grado >= 0:
        var('x')
        xs = vector([(x^i) for i in range(grado+1)])
        V = VectorSpace(K,grado+1)
        lpols = [cs*xs for cs in V]
    if mostrar:
        for pol in lpols:
            print pol
    return lpols
def polsNoCtes (grado=-1, K=GF(2), mostrar=False):
    """Return the list of non-constant polynomials with monic coefficients and
    degree less than or equal to the specified one.
    If the given degree is not valid, return an empty list.
    """
    lpols = []
    if not grado.is_integer():
        grado = grado.round()
    if grado >= 0:
        var('x')
        xs = vector([(x^i) for i in range(grado+1)])
        for cs in K^(grado+1):
            if cs[:grado] != vector(grado*[0]): # non-constant only
                lpols += [cs*xs]
    if mostrar:
        for pol in lpols:
            print pol
    return lpols
def polsMismoGrado (grado=-1, K=GF(2), mostrar=False):
    """Return the list of polynomials with monic coefficients of exactly the
    specified degree.
    If the given degree is not valid, return an empty list.
    """
    lpols = []
    if not grado.is_integer():
        grado = grado.round()
    if grado >= 0:
        var('x')
        xs = vector([(x^(grado-i)) for i in [0..grado]])
        for cs in K^(grado+1):
            if cs[0] != 0: # polynomials of exactly this degree
                lpols += [cs*xs]
    if mostrar:
        for pol in lpols:
            print pol
    return lpols
def excluirReducibles (lpols=[], mostrar=False):
    """Filter a given list of polynomials with monic coefficients and return
    the irreducible ones.
    """
    var('x')
    irreds = []
    for p in lpols:
        fp = (p.factor_list())
        if len(fp) == 1 and fp[0][1] == 1:
            irreds += [p]
    if mostrar:
        for pol in irreds:
            print pol
    return irreds
def vecPol (vec=random_vector(GF(2),0)):
    """Turn the given coefficients, in vector form, into the polynomial they
    represent.
    For example, vecPol(vector([1,0,3,1])) yields x³ + 3*x + 1.
    For the inverse function, see polVec().
    """
    var('x')
    xs = vector([x^(len(vec)-1-i) for i in range(len(vec))])
    return vec*xs
def polVec (p=None):
    """Return the vector of coefficients of the given polynomial that accompany
    the unknown x, from highest to lowest degree.
    For example, polVec(x^3 + 3*x + 1) yields the vector (1, 0, 3, 1).
    For the inverse function, see vecPol().
    """
    cs = []
    if p != None:
        var('x')
        p(x) = p
        for i in [0..p(x).degree(x)]:
            cs.append(p(x).coefficient(x,i))
        cs = list(reversed(cs))
    return vector(cs)
def completar2 (p=0):
    """Apply the completing-the-square method for parabolas to the given
    degree-2 polynomial and return it in its new form.
    If the given polynomial is not valid, return 0.
    For example, completar2(3*x^2 + 12*x + 5) yields 3*(x + 2)^2 - 7.
    """
    var('x')
    p(x) = p.expand()
    if p(x).degree(x) != 2:
        p(x) = 0
    else:
        cs = polVec(p(x))
        p(x) = cs[0]*(x+(cs[1]/(2*cs[0])))^2+(4*cs[0]*cs[2]-cs[1]^2)/(4*cs[0])
    return p(x)


@@ -1 +1,2 @@
the green potato=la pomme de terre verte
le nouveau type de musique=the new type of music


@@ -0,0 +1,183 @@
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix gndo: <http://d-nb.info/standards/elementset/gnd#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
<http://d-nb.info/gnd/118514768>
a <http://d-nb.info/standards/elementset/gnd#Pseudonym> ;
foaf:page <http://de.wikipedia.org/wiki/Bertolt_Brecht> ;
owl:sameAs <http://dbpedia.org/resource/Bertolt_Brecht>, <http://viaf.org/viaf/2467372>, <http://www.filmportal.de/person/261E2D3A93D54134BF8AB5F21F0B2399> ;
gndo:gndIdentifier "118514768" ;
gndo:oldAuthorityNumber "(DE-588)1022091077", "(DE-588a)118514768", "(DE-588a)141399074", "(DE-588a)139089691", "(DE-588a)141300248", "(DE-588a)136949541", "(DE-588a)134336232", "(DE-588a)12794544X", "(DE-588a)12736630X", "(DE-588a)12722811X", "(DE-588a)127228098", "(DE-588a)127228101" ;
gndo:variantNameForThePerson "Brêcht, Becton", "Brecht, Bert", "Brecht, Bertolʹ", "Brecht, Berthold", "Brecht, Bertholt", "Brecht, Bertold", "Brecht, B.", "Brecht, Eugen Berthold Friedrich", "Brecht, ...", "Brecht-Eisler, ...", "Becht, Bertolt", "Beituo'erte-Bulaixite", "Berchito, B.", "Brechtas, B.", "Brechts, Bertolts", "Brehd, Berd", "Breht, Bertolt", "Brehts, Bertolts", "Breḳhṭ, Bārṭolṭ", "Brekt, Berṭolṭ", "Brekṭ, Berṭōlṭ", "Breḳṭ, Berṭôlṭ", "Breśṭ, Berṭalṭa", "Breṣṭa, Barṭolṭa", "Brišt, Bartūlt", "Brišt, Birtūld", "Brišt, Birtult", "Buchito, Berutorutu", "Bulaixite, Beituo'erte", "Bulaixite, ...", "Burehito, Berutoruto", "Burehito, ...", "B. B.", "Larsen, Berthold", "Mprecht, Mpertolt", "Mprecht, ...", "Pulaihsit'ê, Peit'oĉrht'ê", "Pulaihsit'ê, ...", "Pŭrehit'ŭ, Peŏt'olt'ŭ", "Bŭrehit'ŭ, Beŏt'olt'ŭ", "برشت، برتولد", "브레히트, 베르톨트", "ברכט, ברטולט", "贝·布莱希特", "布莱希特, 贝", "ブレヒト, ベルトルト" ;
gndo:variantNameEntityForThePerson [
gndo:forename "Becton" ;
gndo:surname "Brêcht"
], [
gndo:forename "Bert" ;
gndo:surname "Brecht"
], [
gndo:forename "Bertolʹ" ;
gndo:surname "Brecht"
], [
gndo:forename "Berthold" ;
gndo:surname "Brecht"
], [
gndo:forename "Bertholt" ;
gndo:surname "Brecht"
], [
gndo:forename "Bertold" ;
gndo:surname "Brecht"
], [
gndo:forename "B." ;
gndo:surname "Brecht"
], [
gndo:forename "Eugen Berthold Friedrich" ;
gndo:surname "Brecht"
], [
gndo:forename "..." ;
gndo:surname "Brecht"
], [
gndo:forename "..." ;
gndo:surname "Brecht-Eisler"
], [
gndo:forename "Bertolt" ;
gndo:surname "Becht"
], [ gndo:personalName "Beituo'erte-Bulaixite" ], [
gndo:forename "B." ;
gndo:surname "Berchito"
], [
gndo:forename "B." ;
gndo:surname "Brechtas"
], [
gndo:forename "Bertolts" ;
gndo:surname "Brechts"
], [
gndo:forename "Berd" ;
gndo:surname "Brehd"
], [
gndo:forename "Bertolt" ;
gndo:surname "Breht"
], [
gndo:forename "Bertolts" ;
gndo:surname "Brehts"
], [
gndo:forename "Bārṭolṭ" ;
gndo:surname "Breḳhṭ"
], [
gndo:forename "Berṭolṭ" ;
gndo:surname "Brekt"
], [
gndo:forename "Berṭōlṭ" ;
gndo:surname "Brekṭ"
], [
gndo:forename "Berṭôlṭ" ;
gndo:surname "Breḳṭ"
], [
gndo:forename "Berṭalṭa" ;
gndo:surname "Breśṭ"
], [
gndo:forename "Barṭolṭa" ;
gndo:surname "Breṣṭa"
], [
gndo:forename "Bartūlt" ;
gndo:surname "Brišt"
], [
gndo:forename "Birtūld" ;
gndo:surname "Brišt"
], [
gndo:forename "Birtult" ;
gndo:surname "Brišt"
], [
gndo:forename "Berutorutu" ;
gndo:surname "Buchito"
], [
gndo:forename "Beituo'erte" ;
gndo:surname "Bulaixite"
], [
gndo:forename "..." ;
gndo:surname "Bulaixite"
], [
gndo:forename "Berutoruto" ;
gndo:surname "Burehito"
], [
gndo:forename "..." ;
gndo:surname "Burehito"
], [ gndo:personalName "B. B." ], [
gndo:forename "Berthold" ;
gndo:surname "Larsen"
], [
gndo:forename "Mpertolt" ;
gndo:surname "Mprecht"
], [
gndo:forename "..." ;
gndo:surname "Mprecht"
], [
gndo:forename "Peit'oĉrht'ê" ;
gndo:surname "Pulaihsit'ê"
], [
gndo:forename "..." ;
gndo:surname "Pulaihsit'ê"
], [
gndo:forename "Peŏt'olt'ŭ" ;
gndo:surname "Pŭrehit'ŭ"
], [
gndo:forename "Beŏt'olt'ŭ" ;
gndo:surname "Bŭrehit'ŭ"
], [ gndo:personalName "برشت، برتولد" ], [
gndo:forename "베르톨트" ;
gndo:surname "브레히트"
], [
gndo:forename "ברטולט" ;
gndo:surname "ברכט"
], [ gndo:personalName "贝·布莱希特" ], [
gndo:forename "" ;
gndo:surname "布莱希特"
], [
gndo:forename "ベルトルト" ;
gndo:surname "ブレヒト"
] ;
gndo:preferredNameForThePerson "Brecht, Bertolt" ;
gndo:preferredNameEntityForThePerson [
gndo:forename "Bertolt" ;
gndo:surname "Brecht"
] ;
gndo:familialRelationship <http://d-nb.info/gnd/121608557>, <http://d-nb.info/gnd/119056011>, <http://d-nb.info/gnd/118738348>, <http://d-nb.info/gnd/137070411>, <http://d-nb.info/gnd/118809849>, <http://d-nb.info/gnd/119027615>, <http://d-nb.info/gnd/118940163>, <http://d-nb.info/gnd/118630091>, <http://d-nb.info/gnd/123783283>, <http://d-nb.info/gnd/118940155>, <http://d-nb.info/gnd/110005449>, <http://d-nb.info/gnd/13612495X>, <http://d-nb.info/gnd/123757398>, <http://d-nb.info/gnd/1030496250>, <http://d-nb.info/gnd/1030496366> ;
gndo:professionOrOccupation <http://d-nb.info/gnd/4185053-1>, <http://d-nb.info/gnd/4140241-8>, <http://d-nb.info/gnd/4052154-0>, <http://d-nb.info/gnd/4168391-2>, <http://d-nb.info/gnd/4053309-8>, <http://d-nb.info/gnd/4049050-6>, <http://d-nb.info/gnd/4294338-3> ;
gndo:playedInstrument <http://d-nb.info/gnd/4057587-1> ;
gndo:gndSubjectCategory <http://d-nb.info/standards/vocab/gnd/gnd-sc#12.2p>, <http://d-nb.info/standards/vocab/gnd/gnd-sc#15.1p>, <http://d-nb.info/standards/vocab/gnd/gnd-sc#15.3p> ;
gndo:geographicAreaCode <http://d-nb.info/standards/vocab/gnd/geographic-area-code#XA-DE> ;
gndo:languageCode <http://id.loc.gov/vocabulary/iso639-2/ger> ;
gndo:placeOfBirth <http://d-nb.info/gnd/4003614-5> ;
gndo:placeOfDeath <http://d-nb.info/gnd/4005728-8> ;
gndo:placeOfExile <http://d-nb.info/gnd/4010877-6>, <http://d-nb.info/gnd/4077258-5> ;
gndo:gender <http://d-nb.info/standards/vocab/gnd/Gender#male> ;
gndo:dateOfBirth "1898-02-10"^^xsd:date ;
gndo:dateOfDeath "1956-08-14"^^xsd:date .
<http://d-nb.info/gnd/121608557> gndo:preferredNameForThePerson "Brecht, Berthold Friedrich" .
<http://d-nb.info/gnd/119056011> gndo:preferredNameForThePerson "Banholzer, Paula" .
<http://d-nb.info/gnd/118738348> gndo:preferredNameForThePerson "Neher, Carola" .
<http://d-nb.info/gnd/137070411> gndo:preferredNameForThePerson "Banholzer, Frank" .
<http://d-nb.info/gnd/118809849> gndo:preferredNameForThePerson "Berlau, Ruth" .
<http://d-nb.info/gnd/119027615> gndo:preferredNameForThePerson "Steffin, Margarete" .
<http://d-nb.info/gnd/118940163> gndo:preferredNameForThePerson "Zoff, Marianne" .
<http://d-nb.info/gnd/118630091> gndo:preferredNameForThePerson "Weigel, Helene" .
<http://d-nb.info/gnd/123783283> gndo:preferredNameForThePerson "Reichel, Käthe" .
<http://d-nb.info/gnd/118940155> gndo:preferredNameForThePerson "Hiob, Hanne" .
<http://d-nb.info/gnd/110005449> gndo:preferredNameForThePerson "Brecht, Stefan" .
<http://d-nb.info/gnd/13612495X> gndo:preferredNameForThePerson "Brecht-Schall, Barbara" .
<http://d-nb.info/gnd/123757398> gndo:preferredNameForThePerson "Schall, Ekkehard" .
<http://d-nb.info/gnd/1030496250> gndo:preferredNameForThePerson "Brezing, Joseph Friedrich" .
<http://d-nb.info/gnd/1030496366> gndo:preferredNameForThePerson "Brezing, Friederike" .
<http://d-nb.info/gnd/4185053-1> gndo:preferredNameForTheSubjectHeading "Theaterregisseur" .
<http://d-nb.info/gnd/4140241-8> gndo:preferredNameForTheSubjectHeading "Dramatiker" .
<http://d-nb.info/gnd/4052154-0> gndo:preferredNameForTheSubjectHeading "Schauspieler" .
<http://d-nb.info/gnd/4168391-2> gndo:preferredNameForTheSubjectHeading "Lyriker" .
<http://d-nb.info/gnd/4053309-8> gndo:preferredNameForTheSubjectHeading "Schriftsteller" .
<http://d-nb.info/gnd/4049050-6> gndo:preferredNameForTheSubjectHeading "Regisseur" .
<http://d-nb.info/gnd/4294338-3> gndo:preferredNameForTheSubjectHeading "Drehbuchautor" .
<http://d-nb.info/gnd/4003614-5> gndo:preferredNameForThePlaceOrGeographicName "Augsburg" .
<http://d-nb.info/gnd/4005728-8> gndo:preferredNameForThePlaceOrGeographicName "Berlin" .
<http://d-nb.info/gnd/4010877-6> gndo:preferredNameForThePlaceOrGeographicName "Dänemark" .
<http://d-nb.info/gnd/4077258-5> gndo:preferredNameForThePlaceOrGeographicName "Schweden" .


@@ -0,0 +1,10 @@
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix dc: <http://purl.org/dc/elements/1.1/> .
@prefix ex: <http://example.org/stuff/1.0/> .
<http://www.w3.org/TR/rdf-syntax-grammar>
dc:title "RDF/XML Syntax Specification (Revised)" ;
ex:editor [
ex:fullname "Dave Beckett";
ex:homePage <http://purl.org/net/dajobe/>
] .

File diff suppressed because it is too large


@@ -0,0 +1,30 @@
<?xml version="1.0" encoding="UTF-8"?>
<?import javafx.geometry.*?>
<?import javafx.scene.control.*?>
<?import java.lang.*?>
<?import javafx.scene.layout.*?>
<BorderPane maxHeight="-Infinity" maxWidth="-Infinity" minHeight="-Infinity" minWidth="-Infinity" prefHeight="400.0" prefWidth="600.0" xmlns="http://javafx.com/javafx/8" xmlns:fx="http://javafx.com/fxml/1">
<center>
<TableView prefHeight="200.0" prefWidth="200.0" BorderPane.alignment="CENTER">
<columns>
<TableColumn prefWidth="114.0" text="Column 1" />
<TableColumn minWidth="0.0" prefWidth="243.0" text="Column 2" />
<TableColumn prefWidth="214.0" text="Column 3" />
</columns>
</TableView>
</center>
<bottom>
<HBox alignment="CENTER_RIGHT" prefWidth="200.0" spacing="10.0" BorderPane.alignment="CENTER">
<children>
<Button mnemonicParsing="false" text="Button">
<HBox.margin>
<Insets bottom="10.0" left="10.0" right="10.0" top="10.0" />
</HBox.margin>
</Button>
</children>
</HBox>
</bottom>
</BorderPane>


@@ -0,0 +1,21 @@
# http://standards.freedesktop.org/desktop-entry-spec/latest/apa.html
[Desktop Entry]
Version=1.0
Type=Application
Name=Foo Viewer
Comment=The best viewer for Foo objects available!
TryExec=fooview
Exec=fooview %F
Icon=fooview
MimeType=image/x-foo;
Actions=Gallery;Create;
[Desktop Action Gallery]
Exec=fooview --gallery
Name=Browse Gallery
[Desktop Action Create]
Exec=fooview --create-new
Name=Create a new Foo!
Icon=fooview-new


@@ -250,9 +250,10 @@ def main(sources)
all_scopes = {}
if $options[:add]
if source = $options[:add]
Dir.mktmpdir do |tmpdir|
install_grammar(tmpdir, ARGV[0], all_scopes)
grammars = load_grammars(tmpdir, source, all_scopes)
install_grammars(grammars, source) if $options[:install]
end
generate_yaml(all_scopes, sources)
else


@@ -0,0 +1 @@
; -*-mode:Smalltalk-*-


@@ -0,0 +1 @@
; -*- mode: php;-*-


@@ -0,0 +1,3 @@
/* vim: set filetype=prolog: */
# I am Prolog

test/fixtures/Data/Modelines/ruby vendored Normal file

@@ -0,0 +1,3 @@
/* vim: set filetype=ruby: */
# I am Ruby


@@ -0,0 +1,3 @@
/* vim: set ft=cpp: */
I would like to be C++ please.


@@ -2,3 +2,21 @@ require "bundler/setup"
require "minitest/autorun"
require "mocha/setup"
require "linguist"
def fixtures_path
  File.expand_path("../fixtures", __FILE__)
end

def fixture_blob(name)
  name = File.join(fixtures_path, name) unless name =~ /^\//
  Linguist::FileBlob.new(name, fixtures_path)
end

def samples_path
  File.expand_path("../../samples", __FILE__)
end

def sample_blob(name)
  name = File.join(samples_path, name) unless name =~ /^\//
  Linguist::FileBlob.new(name, samples_path)
end
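
The helpers above are shared by every test class touched in this diff. As a quick illustration, here is a minimal, hypothetical test case (not part of this changeset; the class name is illustrative) that uses fixture_blob together with the Language API exercised in test_modelines.rb below:

require_relative "./helper"

# Sketch only: shows how the shared helpers are meant to be used.
class TestHelperUsageSketch < Minitest::Test
  include Linguist

  def test_fixture_blob_resolves_and_detects
    # fixture_blob resolves a relative name against test/fixtures and wraps
    # it in a Linguist::FileBlob, so language detection works directly.
    blob = fixture_blob("Data/Modelines/ruby")
    assert_equal Language["Ruby"], blob.language
  end
end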


@@ -14,24 +14,6 @@ class TestBlob < Minitest::Test
Encoding.default_external = @original_external
end
def samples_path
File.expand_path("../../samples", __FILE__)
end
def fixtures_path
File.expand_path("../fixtures", __FILE__)
end
def sample_blob(name)
name = File.join(samples_path, name) unless name =~ /^\//
FileBlob.new(name, samples_path)
end
def fixture_blob(name)
name = File.join(fixtures_path, name) unless name =~ /^\//
FileBlob.new(name, fixtures_path)
end
def script_blob(name)
blob = sample_blob(name)
blob.instance_variable_set(:@name, 'script')


@@ -3,10 +3,6 @@ require_relative "./helper"
class TestClassifier < Minitest::Test
include Linguist
def samples_path
File.expand_path("../../samples", __FILE__)
end
def fixture(name)
File.read(File.join(samples_path, name))
end


@@ -3,10 +3,6 @@ require_relative "./helper"
class TestGenerated < Minitest::Test
include Linguist
def samples_path
File.expand_path("../../samples", __FILE__)
end
class DataLoadedError < StandardError; end
def generated_without_loading_data(name)
@@ -45,9 +41,9 @@ class TestGenerated < Minitest::Test
generated_without_loading_data("Godeps/_workspace/src/github.com/kr/s3/sign.go")
# Generated by Zephir
generated_without_loading_data("Zephir/filenames/exception.zep.c")
generated_without_loading_data("Zephir/filenames/exception.zep.h")
generated_without_loading_data("Zephir/filenames/exception.zep.php")
generated_without_loading_data("C/exception.zep.c")
generated_without_loading_data("C/exception.zep.h")
generated_without_loading_data("PHP/exception.zep.php")
# Minified files
generated_loading_data("JavaScript/jquery-1.6.1.min.js")


@@ -60,7 +60,7 @@ class TestGrammars < Minitest::Test
def test_submodules_have_licenses
categories = submodule_paths.group_by do |submodule|
files = Dir[File.join(ROOT, submodule, "*")]
license = files.find { |path| File.basename(path) =~ /\blicen[cs]e\b/i } || files.find { |path| File.basename(path) =~ /\bcopying\b/i }
license = files.find { |path| File.basename(path) =~ /\b(un)?licen[cs]e\b/i } || files.find { |path| File.basename(path) =~ /\bcopying\b/i }
if license.nil?
if readme = files.find { |path| File.basename(path) =~ /\Areadme\b/i }
license = readme if File.read(readme) =~ /\blicen[cs]e\b/i
@@ -129,6 +129,8 @@ class TestGrammars < Minitest::Test
"MIT"
elsif content.include?("unlicense.org")
"unlicense"
elsif content.include?("http://www.wtfpl.net/txt/copying/")
"WTFPL"
end
end
end


@@ -3,10 +3,6 @@ require_relative "./helper"
class TestHeuristcs < Minitest::Test
include Linguist
def samples_path
File.expand_path("../../samples", __FILE__)
end
def fixture(name)
File.read(File.join(samples_path, name))
end
@@ -133,6 +129,20 @@ class TestHeuristcs < Minitest::Test
})
end
def test_lsp_by_heuristics
assert_heuristics({
"Common Lisp" => all_fixtures("Common Lisp"),
"NewLisp" => all_fixtures("NewLisp")
})
end
def test_cs_by_heuristics
assert_heuristics({
"C#" => all_fixtures("C#", "*.cs"),
"Smalltalk" => all_fixtures("Smalltalk", "*.cs")
})
end
def assert_heuristics(hash)
candidates = hash.keys.map { |l| Language[l] }

test/test_modelines.rb Normal file

@@ -0,0 +1,25 @@
require_relative "./helper"

class TestModelines < Minitest::Test
  include Linguist

  def assert_modeline(language, blob)
    assert_equal language, Linguist::Strategy::Modeline.call(blob).first
  end

  def test_modeline_strategy
    assert_modeline Language["Ruby"], fixture_blob("Data/Modelines/ruby")
    assert_modeline Language["C++"], fixture_blob("Data/Modelines/seeplusplus")
    assert_modeline Language["Prolog"], fixture_blob("Data/Modelines/not_perl.pl")
    assert_modeline Language["Smalltalk"], fixture_blob("Data/Modelines/example_smalltalk.md")
    assert_modeline Language["PHP"], fixture_blob("Data/Modelines/iamphp.inc")
  end

  def test_modeline_languages
    assert_equal Language["Ruby"], fixture_blob("Data/Modelines/ruby").language
    assert_equal Language["C++"], fixture_blob("Data/Modelines/seeplusplus").language
    assert_equal Language["Prolog"], fixture_blob("Data/Modelines/not_perl.pl").language
    assert_equal Language["Smalltalk"], fixture_blob("Data/Modelines/example_smalltalk.md").language
    assert_equal Language["PHP"], fixture_blob("Data/Modelines/iamphp.inc").language
  end
end


@@ -3,10 +3,6 @@ require_relative "./helper"
class TestTokenizer < Minitest::Test
include Linguist
def samples_path
File.expand_path("../../samples", __FILE__)
end
def tokenize(data)
data = File.read(File.join(samples_path, data.to_s)) if data.is_a?(Symbol)
Tokenizer.tokenize(data)
@@ -41,6 +37,8 @@ class TestTokenizer < Minitest::Test
assert_equal %w(foo), tokenize("foo {- Comment -}")
assert_equal %w(foo), tokenize("foo (* Comment *)")
assert_equal %w(%), tokenize("2 % 10\n% Comment")
assert_equal %w(foo bar), tokenize("foo\n\"\"\"\nComment\n\"\"\"\nbar")
assert_equal %w(foo bar), tokenize("foo\n'''\nComment\n'''\nbar")
end
def test_sgml_tags

vendor/grammars/AutoHotkey vendored Submodule

vendor/grammars/Creole vendored Submodule

Submodule vendor/grammars/Creole added at bac4656c8d

vendor/grammars/JSyntax vendored Submodule

Submodule vendor/grammars/JSyntax added at 74971149b5

vendor/grammars/Modelica vendored Submodule

vendor/grammars/Racket vendored Submodule

Submodule vendor/grammars/Racket added at 02739c25ae

vendor/grammars/Sublime-HTTP vendored Submodule

vendor/grammars/Sublime-Nit vendored Submodule

vendor/grammars/ats.sublime vendored Submodule

vendor/grammars/language-hy vendored Submodule

Some files were not shown because too many files have changed in this diff.