Mirror of https://github.com/KevinMidboe/linguist.git (synced 2025-10-29 17:50:22 +00:00)

Merge branch 'master' into 1036-local

Conflicts:
    lib/linguist/heuristics.rb
    lib/linguist/samples.json
.gitattributes (vendored, 0 changes)
.gitignore (vendored, 3 changes)
@@ -1,3 +1,4 @@
Gemfile.lock
.bundle/
vendor/
benchmark/
lib/linguist/samples.json
@@ -1,8 +1,8 @@
before_install:
before_install:
- git fetch origin master:master
- git fetch origin v2.0.0:v2.0.0
- git fetch origin test/attributes:test/attributes
- sudo apt-get install libicu-dev -y
- gem update --system 2.1.11
rvm:
- 1.9.3
- 2.0.0
README.md (87 changes)
@@ -32,33 +32,57 @@ The Language stats bar that you see on every repository is built by aggregating

The repository stats API, accessed through `#languages`, can be used on a directory:

***API UPDATE***

Since [Version 3.0.0](https://github.com/github/linguist/releases/tag/v3.0.0) Linguist expects a git repository (in the form of a [Rugged::Repository](https://github.com/libgit2/rugged#repositories)) to be passed when initializing `Linguist::Repository`.

```ruby
project = Linguist::Repository.from_directory(".")
project.language.name #=> "Ruby"
project.languages #=> { "Ruby" => 0.98, "Shell" => 0.02 }
require 'rugged'
require 'linguist'

repo = Rugged::Repository.new('.')
project = Linguist::Repository.new(repo, repo.head.target_id)
project.language #=> "Ruby"
project.languages #=> { "Ruby" => 119387 }
```
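Note that the new API reports raw byte counts per language rather than ratios. If you want the percentage view used by the stats bar, you can derive it yourself; a minimal sketch (not part of the Linguist API, illustrative values only):

```ruby
# Sketch: turning the byte counts returned by #languages into percentages.
total = project.languages.values.reduce(:+).to_f
percentages = Hash[project.languages.map { |lang, bytes| [lang, (100.0 * bytes / total).round(2)] }]
percentages #=> { "Ruby" => 98.0, "Shell" => 2.0 }  (illustrative)
```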
These stats are also printed out by the `linguist` binary. You can use the
`--breakdown` flag, and the binary will also output the breakdown of files by language.

You can try running `linguist` on the `lib/` directory in this repository itself:
You can try running `linguist` on the root directory in this repository itself:

    $ bundle exec linguist lib/ --breakdown
    $ bundle exec linguist --breakdown

    100.00% Ruby

    Ruby:
    linguist/blob_helper.rb
    linguist/classifier.rb
    linguist/file_blob.rb
    linguist/generated.rb
    linguist/heuristics.rb
    linguist/language.rb
    linguist/md5.rb
    linguist/repository.rb
    linguist/samples.rb
    linguist/tokenizer.rb
    linguist.rb
    Gemfile
    Rakefile
    bin/linguist
    github-linguist.gemspec
    lib/linguist.rb
    lib/linguist/blob_helper.rb
    lib/linguist/classifier.rb
    lib/linguist/file_blob.rb
    lib/linguist/generated.rb
    lib/linguist/heuristics.rb
    lib/linguist/language.rb
    lib/linguist/lazy_blob.rb
    lib/linguist/md5.rb
    lib/linguist/repository.rb
    lib/linguist/samples.rb
    lib/linguist/tokenizer.rb
    lib/linguist/version.rb
    test/test_blob.rb
    test/test_classifier.rb
    test/test_heuristics.rb
    test/test_language.rb
    test/test_md5.rb
    test/test_pedantic.rb
    test/test_repository.rb
    test/test_samples.rb
    test/test_tokenizer.rb

#### Ignore vendored files

@@ -80,9 +104,32 @@ Linguist::FileBlob.new("underscore.min.js").generated? # => true

See [Linguist::Generated#generated?](https://github.com/github/linguist/blob/master/lib/linguist/generated.rb).

## Overrides

Linguist supports custom overrides for language definitions and vendored paths. Add a `.gitattributes` file to your project using the keys `linguist-language` and `linguist-vendored` with the standard git-style path matchers for the files you want to override.

```
$ cat .gitattributes
*.rb linguist-language=Java

$ linguist --breakdown
100.00% Java

Java:
ruby_file.rb
```

By default, Linguist treats all of the paths defined in [lib/linguist/vendor.yml](https://github.com/github/linguist/blob/master/lib/linguist/vendor.yml) as vendored and therefore doesn't include them in the language statistics for a repository. Use the `linguist-vendored` attribute to vendor or un-vendor paths.

```
$ cat .gitattributes
special-vendored-path/* linguist-vendored
jquery.js linguist-vendored=false
```

## Installation

github.com is usually running the latest version of the `github-linguist` gem that is released on [RubyGems.org](http://rubygems.org/gems/github-linguist).
Github.com is usually running the latest version of the `github-linguist` gem that is released on [RubyGems.org](http://rubygems.org/gems/github-linguist).

But for development you are going to want to check out the source. To get it, clone the repo and run [Bundler](http://gembundler.com/) to install its dependencies.

@@ -102,10 +149,6 @@ We try to only add languages once they have some usage on GitHub, so please note

Almost all bug fixes or new language additions should come with some additional code samples. Just drop them under [`samples/`](https://github.com/github/linguist/tree/master/samples) in the correct subdirectory and our test suite will automatically test them. In most cases you shouldn't need to add any new assertions.

To update the `samples.json` after adding new files to [`samples/`](https://github.com/github/linguist/tree/master/samples):

    bundle exec rake samples

### A note on language extensions

Linguist has a number of methods available to it for identifying the language of a particular file. The initial lookup is based upon the extension of the file; possible file extensions are defined in an array called `extensions`. Take a look at this example for `Perl`:
@@ -145,7 +188,7 @@ If you are the current maintainer of this gem:

0. Ensure that tests are green: `bundle exec rake test`
0. Bump gem version in `lib/linguist/version.rb`. For example, [like this](https://github.com/github/linguist/commit/8d2ea90a5ba3b2fe6e1508b7155aa4632eea2985).
0. Make a PR to github/linguist. For example, [#1238](https://github.com/github/linguist/pull/1238).
0. Build a local gem: `gem build github-linguist.gemspec`
0. Build a local gem: `bundle exec rake build_gem`
0. Testing:
  0. Bump the Gemfile and Gemfile.lock versions for an app which relies on this gem
  0. Install the new gem locally

Rakefile (83 changes)
@@ -1,3 +1,4 @@
require 'bundler/setup'
require 'json'
require 'rake/clean'
require 'rake/testtask'
@@ -7,6 +8,16 @@ task :default => :test

Rake::TestTask.new

# Extend test task to check for samples
task :test => :check_samples

desc "Check that we have samples.json generated"
task :check_samples do
  unless File.exist?('lib/linguist/samples.json')
    Rake::Task[:samples].invoke
  end
end

task :samples do
  require 'linguist/samples'
  require 'yajl'
@@ -15,13 +26,74 @@ task :samples do
  File.open('lib/linguist/samples.json', 'w') { |io| io.write json }
end

task :build_gem do
task :build_gem => :samples do
  languages = YAML.load_file("lib/linguist/languages.yml")
  File.write("lib/linguist/languages.json", JSON.dump(languages))
  `gem build github-linguist.gemspec`
  File.delete("lib/linguist/languages.json")
end

namespace :benchmark do
  benchmark_path = "benchmark/results"

  # $ bundle exec rake benchmark:generate CORPUS=path/to/samples
  desc "Generate results for"
  task :generate do
    ref = `git rev-parse HEAD`.strip[0,8]

    corpus = File.expand_path(ENV["CORPUS"] || "samples")

    require 'linguist/language'

    results = Hash.new
    Dir.glob("#{corpus}/**/*").each do |file|
      next unless File.file?(file)
      filename = file.gsub("#{corpus}/", "")
      results[filename] = Linguist::FileBlob.new(file).language
    end

    # Ensure results directory exists
    FileUtils.mkdir_p("benchmark/results")

    # Write results
    if `git status`.include?('working directory clean')
      result_filename = "benchmark/results/#{File.basename(corpus)}-#{ref}.json"
    else
      result_filename = "benchmark/results/#{File.basename(corpus)}-#{ref}-unstaged.json"
    end

    File.write(result_filename, results.to_json)
    puts "wrote #{result_filename}"
  end

  # $ bundle exec rake benchmark:compare REFERENCE=path/to/reference.json CANDIDATE=path/to/candidate.json
  desc "Compare results"
  task :compare do
    reference_file = ENV["REFERENCE"]
    candidate_file = ENV["CANDIDATE"]

    reference = JSON.parse(File.read(reference_file))
    reference_counts = Hash.new(0)
    reference.each { |filename, language| reference_counts[language] += 1 }

    candidate = JSON.parse(File.read(candidate_file))
    candidate_counts = Hash.new(0)
    candidate.each { |filename, language| candidate_counts[language] += 1 }

    changes = diff(reference_counts, candidate_counts)

    if changes.any?
      changes.each do |language, (before, after)|
        before_percent = 100 * before / reference.size.to_f
        after_percent = 100 * after / candidate.size.to_f
        puts "%s changed from %.1f%% to %.1f%%" % [language || 'unknown', before_percent, after_percent]
      end
    else
      puts "No changes"
    end
  end
end

namespace :classifier do
  LIMIT = 1_000

@@ -37,7 +109,7 @@ namespace :classifier do
    next if file_language.nil? || file_language == 'Text'
    begin
      data = open(file_url).read
      guessed_language, score = Linguist::Classifier.classify(Linguist::Samples::DATA, data).first
      guessed_language, score = Linguist::Classifier.classify(Linguist::Samples.cache, data).first

      total += 1
      guessed_language == file_language ? correct += 1 : incorrect += 1
@@ -71,3 +143,10 @@ namespace :classifier do
    end
  end
end


def diff(a, b)
  (a.keys | b.keys).each_with_object({}) do |key, diff|
    diff[key] = [a[key], b[key]] unless a[key] == b[key]
  end
end

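For reference, the `diff` helper above only reports languages whose counts differ between the two benchmark runs; a quick illustrative call with made-up counts:

```ruby
# Illustrative input/output for the diff helper defined above.
diff({ "Ruby" => 10, "Perl" => 2 }, { "Ruby" => 10, "Perl" => 1, "Prolog" => 1 })
#=> { "Perl" => [2, 1], "Prolog" => [nil, 1] }
```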
@@ -17,10 +17,11 @@ Gem::Specification.new do |s|
  s.add_dependency 'escape_utils', '~> 1.0.1'
  s.add_dependency 'mime-types', '~> 1.19'
  s.add_dependency 'pygments.rb', '~> 0.6.0'
  s.add_dependency 'rugged', '~> 0.21.0'
  s.add_dependency 'rugged', '~> 0.21.1b2'

  s.add_development_dependency 'json'
  s.add_development_dependency 'mocha'
  s.add_development_dependency 'pry'
  s.add_development_dependency 'rake'
  s.add_development_dependency 'yajl-ruby'
end

@@ -321,6 +321,11 @@ module Linguist
    language ? language.lexer : Pygments::Lexer.find_by_name('Text only')
  end

  # Internal: Get the TextMate compatible scope for the blob
  def tm_scope
    language && language.tm_scope
  end

  # Public: Highlight syntax of blob
  #
  # options - A Hash of options (defaults to {})

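A quick illustration of the new accessor; the file path is hypothetical and the exact scope comes from the language's `tm_scope` in languages.yml (or the derived `source.`/`text.` fallback):

```ruby
# Sketch: tm_scope for a Ruby blob resolves via Language#tm_scope.
blob = Linguist::FileBlob.new("lib/linguist.rb")
blob.tm_scope #=> "source.ruby"
```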
@@ -63,6 +63,7 @@ module Linguist
    generated_jni_header? ||
    composer_lock? ||
    node_modules? ||
    godeps? ||
    vcr_cassette? ||
    generated_by_zephir?
  end
@@ -231,6 +232,14 @@ module Linguist
    !!name.match(/node_modules\//)
  end

  # Internal: Is the blob part of Godeps/,
  # which are not meant for humans in pull requests.
  #
  # Returns true or false.
  def godeps?
    !!name.match(/Godeps\//)
  end

  # Internal: Is the blob a generated php composer lock file?
  #
  # Returns true or false.

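With this change, anything under a `Godeps/` directory is treated as generated, so it no longer counts toward language statistics. A hedged example with a hypothetical path:

```ruby
# Sketch: Godeps vendor trees are now flagged as generated (name-based check).
Linguist::FileBlob.new("Godeps/_workspace/src/github.com/foo/bar.go").generated? #=> true
```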
@@ -14,27 +14,25 @@ module Linguist
  def self.find_by_heuristics(data, languages)
    if active?
      if languages.all? { |l| ["Objective-C", "C++", "C"].include?(l) }
        disambiguate_c(data, languages)
        result = disambiguate_c(data, languages)
      end
      if languages.all? { |l| ["Perl", "Prolog"].include?(l) }
        disambiguate_pl(data, languages)
        result = disambiguate_pl(data, languages)
      end
      if languages.all? { |l| ["ECL", "Prolog"].include?(l) }
        disambiguate_ecl(data, languages)
        result = disambiguate_ecl(data, languages)
      end
      if languages.all? { |l| ["TypeScript", "XML"].include?(l) }
        disambiguate_ts(data, languages)
      if languages.all? { |l| ["IDL", "Prolog"].include?(l) }
        result = disambiguate_pro(data, languages)
      end
      if languages.all? { |l| ["Common Lisp", "OpenCL"].include?(l) }
        disambiguate_cl(data, languages)
      end
      if languages.all? { |l| ["Rebol", "R"].include?(l) }
        disambiguate_r(data, languages)
        result = disambiguate_cl(data, languages)
      end
      return result
    end
  end

  # .h extensions are ambigious between C, C++, and Objective-C.
  # .h extensions are ambiguous between C, C++, and Objective-C.
  # We want to shortcut look for Objective-C _and_ now C++ too!
  #
  # Returns an array of Languages or []
@@ -64,6 +62,16 @@ module Linguist
    matches
  end

  def self.disambiguate_pro(data, languages)
    matches = []
    if (data.include?(":-"))
      matches << Language["Prolog"]
    else
      matches << Language["IDL"]
    end
    matches
  end

  def self.disambiguate_ts(data, languages)
    matches = []
    if (data.include?("</translation>"))

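The new `disambiguate_pro` hook means an ambiguous `.pro` file containing the Prolog rule operator `:-` resolves to Prolog, and otherwise to IDL. A hedged illustration of the calls it makes (inputs are made up):

```ruby
# Sketch: how the new .pro disambiguation behaves, per the code above.
Linguist::Heuristics.disambiguate_pro("foo :- bar.", ["IDL", "Prolog"])
#=> [Language["Prolog"]]
Linguist::Heuristics.disambiguate_pro("pro = read_csv('data.csv')", ["IDL", "Prolog"])
#=> [Language["IDL"]]
```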
@@ -135,8 +135,8 @@ module Linguist
      # No shebang. Still more work to do. Try to find it with our heuristics.
      elsif (determined = Heuristics.find_by_heuristics(data, possible_language_names)) && !determined.empty?
        determined.first
      # Lastly, fall back to the probablistic classifier.
      elsif classified = Classifier.classify(Samples::DATA, data, possible_language_names ).first
      # Lastly, fall back to the probabilistic classifier.
      elsif classified = Classifier.classify(Samples.cache, data, possible_language_names).first
        # Return the actual Language object based of the string language name (i.e., first element of `#classify`)
        Language[classified[0]]
      end
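The classifier fallback now reads its training data through `Samples.cache`; invoked directly it looks roughly like this (file and candidate list are hypothetical):

```ruby
# Sketch: the probabilistic fallback used above, called by hand.
data = File.read("some_ambiguous_file.pl")   # hypothetical file
candidates = ["Perl", "Prolog"]
name, score = Linguist::Classifier.classify(Linguist::Samples.cache, data, candidates).first
name #=> "Perl" or "Prolog", whichever scores higher against the samples
```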
@@ -290,6 +290,16 @@ module Linguist
    @lexer = Pygments::Lexer.find_by_name(attributes[:lexer] || name) ||
      raise(ArgumentError, "#{@name} is missing lexer")

    @tm_scope = attributes[:tm_scope] || begin
      context = case @type
                when :data, :markup, :prose
                  'text'
                when :programming, nil
                  'source'
                end
      "#{context}.#{@name.downcase}"
    end

    @ace_mode = attributes[:ace_mode]
    @wrap = attributes[:wrap] || false

@@ -363,6 +373,11 @@ module Linguist
    # Returns the Lexer
    attr_reader :lexer

    # Public: Get the name of a TextMate-compatible scope
    #
    # Returns the scope
    attr_reader :tm_scope

    # Public: Get Ace mode
    #
    # Examples
@@ -510,9 +525,9 @@ module Linguist
    end
  end

  extensions = Samples::DATA['extnames']
  interpreters = Samples::DATA['interpreters']
  filenames = Samples::DATA['filenames']
  extensions = Samples.cache['extnames']
  interpreters = Samples.cache['interpreters']
  filenames = Samples.cache['filenames']
  popular = YAML.load_file(File.expand_path("../popular.yml", __FILE__))

  languages_yml = File.expand_path("../languages.yml", __FILE__)
@@ -564,6 +579,7 @@ module Linguist
    :type => options['type'],
    :aliases => options['aliases'],
    :lexer => options['lexer'],
    :tm_scope => options['tm_scope'],
    :ace_mode => options['ace_mode'],
    :wrap => options['wrap'],
    :group_name => options['group'],

@@ -83,6 +83,7 @@ ATS:
ActionScript:
  type: programming
  lexer: ActionScript 3
  tm_scope: source.actionscript.3
  color: "#e3491a"
  search_term: as3
  aliases:
@@ -119,7 +120,7 @@ ApacheConf:

Apex:
  type: programming
  lexer: Text only
  lexer: Java
  extensions:
  - .cls

@@ -173,6 +174,7 @@ Assembly:
  - nasm
  extensions:
  - .asm
  - .a51

Augeas:
  type: programming
@@ -284,8 +286,9 @@ C:
C#:
  type: programming
  ace_mode: csharp
  tm_scope: source.cs
  search_term: csharp
  color: "#5a25a2"
  color: "#178600"
  aliases:
  - csharp
  extensions:
@@ -411,6 +414,7 @@ Clojure:

CoffeeScript:
  type: programming
  tm_scope: source.coffee
  ace_mode: coffee
  color: "#244776"
  aliases:
@@ -453,6 +457,7 @@ ColdFusion CFC:

Common Lisp:
  type: programming
  tm_scope: source.lisp
  color: "#3fb68b"
  aliases:
  - lisp
@@ -648,6 +653,7 @@ Elm:
Emacs Lisp:
  type: programming
  lexer: Common Lisp
  tm_scope: source.lisp
  color: "#c065db"
  aliases:
  - elisp
@@ -748,6 +754,7 @@ Forth:
  - .fth
  - .4th
  - .forth
  - .frt

Frege:
  type: programming
@@ -756,6 +763,14 @@ Frege:
  extensions:
  - .fr

G-code:
  type: data
  lexer: Text only
  extensions:
  - .g
  - .gco
  - .gcode

Game Maker Language:
  type: programming
  color: "#8ad353"
@@ -785,6 +800,12 @@ GAS:
  - .s
  - .S

GDScript:
  type: programming
  lexer: Text only
  extensions:
  - .gd

GLSL:
  group: C
  type: programming
@@ -877,6 +898,12 @@ Grammatical Framework:
  searchable: true
  color: "#ff0000"

Graph Modeling Language:
  type: data
  lexer: Text only
  extensions:
  - .gml

Groff:
  extensions:
  - .man
@@ -911,6 +938,7 @@ Groovy Server Pages:

HTML:
  type: markup
  tm_scope: text.html.basic
  ace_mode: html
  aliases:
  - xhtml
@@ -922,6 +950,7 @@ HTML:

HTML+Django:
  type: markup
  tm_scope: text.html.django
  group: HTML
  lexer: HTML+Django/Jinja
  extensions:
@@ -930,6 +959,7 @@ HTML+Django:

HTML+ERB:
  type: markup
  tm_scope: text.html.ruby
  group: HTML
  lexer: RHTML
  aliases:
@@ -940,6 +970,7 @@ HTML+ERB:

HTML+PHP:
  type: markup
  tm_scope: text.html.php
  group: HTML
  extensions:
  - .phtml
@@ -959,6 +990,8 @@ Haml:
Handlebars:
  type: markup
  lexer: Handlebars
  aliases:
  - hbs
  extensions:
  - .handlebars
  - .hbs
@@ -1075,6 +1108,7 @@ J:

JSON:
  type: data
  tm_scope: source.json
  group: JavaScript
  ace_mode: json
  searchable: false
@@ -1137,6 +1171,7 @@ Java Server Pages:

JavaScript:
  type: programming
  tm_scope: source.js
  ace_mode: javascript
  color: "#f1e05a"
  aliases:
@@ -1149,6 +1184,7 @@ JavaScript:
  - .es6
  - .frag
  - .jake
  - .jsb
  - .jsfl
  - .jsm
  - .jss
@@ -1202,7 +1238,17 @@ LFE:
LLVM:
  extensions:
  - .ll


LSL:
  type: programming
  lexer: LSL
  ace_mode: lsl
  extensions:
  - .lsl
  interpreters:
  - lsl
  color: '#3d9970'

LabVIEW:
  type: programming
  lexer: Text only
@@ -1254,6 +1300,7 @@ Literate Agda:

Literate CoffeeScript:
  type: programming
  tm_scope: source.litcoffee
  group: CoffeeScript
  lexer: Text only
  ace_mode: markdown
@@ -1310,6 +1357,7 @@ Lua:
  color: "#fa1fa1"
  extensions:
  - .lua
  - .fcgi
  - .nse
  - .pd_lua
  - .rbxs
@@ -1537,6 +1585,7 @@ ObjDump:

Objective-C:
  type: programming
  tm_scope: source.objc
  color: "#438eff"
  aliases:
  - obj-c
@@ -1547,6 +1596,7 @@ Objective-C:

Objective-C++:
  type: programming
  tm_scope: source.objc++
  color: "#4886FC"
  aliases:
  - obj-c++
@@ -1637,12 +1687,14 @@ PAWN:

PHP:
  type: programming
  tm_scope: text.html.php
  ace_mode: php
  color: "#4F5D95"
  extensions:
  - .php
  - .aw
  - .ctp
  - .fcgi
  - .module
  - .php3
  - .php4
@@ -1694,6 +1746,7 @@ Pascal:
  - .dfm
  - .dpr
  - .lpr
  - .pp

Perl:
  type: programming
@@ -1782,11 +1835,13 @@ Processing:

Prolog:
  type: programming
  lexer: Logtalk
  color: "#74283c"
  extensions:
  - .prolog
  - .ecl
  - .pl
  - .ecl
  - .pro
  - .prolog

Propeller Spin:
  type: programming
@@ -1831,6 +1886,8 @@ Python:
  color: "#3581ba"
  extensions:
  - .py
  - .cgi
  - .fcgi
  - .gyp
  - .lmi
  - .pyde
@@ -1992,6 +2049,7 @@ Ruby:
  extensions:
  - .rb
  - .builder
  - .fcgi
  - .gemspec
  - .god
  - .irbrc
@@ -2039,6 +2097,7 @@ SAS:

SCSS:
  type: markup
  tm_scope: source.scss
  group: CSS
  ace_mode: scss
  extensions:
@@ -2054,6 +2113,7 @@ SQF:

SQL:
  type: data
  tm_scope: source.sql
  ace_mode: sql
  extensions:
  - .sql
@@ -2078,6 +2138,7 @@ Sage:

Sass:
  type: markup
  tm_scope: source.sass
  group: CSS
  extensions:
  - .sass
@@ -2140,6 +2201,8 @@ Shell:
  - .sh
  - .bash
  - .bats
  - .cgi
  - .fcgi
  - .tmux
  - .zsh
  interpreters:
@@ -2271,6 +2334,9 @@ Tcl:
  - .tcl
  - .adp
  - .tm
  interpreters:
  - tclsh
  - wish

Tcsh:
  type: programming
@@ -2402,6 +2468,7 @@ VimL:
  - .vim
  filenames:
  - .vimrc
  - _vimrc
  - vimrc
  - gvimrc

@@ -2551,6 +2618,7 @@ Xtend:

YAML:
  type: data
  tm_scope: source.yaml
  aliases:
  - yml
  extensions:

@@ -1,8 +1,13 @@
require 'linguist/blob_helper'
require 'linguist/language'
require 'rugged'

module Linguist
  class LazyBlob
    GIT_ATTR = ['linguist-language', 'linguist-vendored']
    GIT_ATTR_OPTS = { :priority => [:index], :skip_system => true }
    GIT_ATTR_FLAGS = Rugged::Repository::Attributes.parse_opts(GIT_ATTR_OPTS)

    include BlobHelper

    MAX_SIZE = 128 * 1024
@@ -19,6 +24,29 @@ module Linguist
      @mode = mode
    end

    def git_attributes
      @git_attributes ||= repository.fetch_attributes(
        name, GIT_ATTR, GIT_ATTR_FLAGS)
    end

    def vendored?
      if attr = git_attributes['linguist-vendored']
        return boolean_attribute(attr)
      else
        return super
      end
    end

    def language
      return @language if defined?(@language)

      @language = if lang = git_attributes['linguist-language']
        Language.find_by_name(lang)
      else
        super
      end
    end

    def data
      load_blob!
      @data
@@ -30,6 +58,12 @@ module Linguist
    end

    protected

    # Returns true if the attribute is present and not the string "false".
    def boolean_attribute(attr)
      attr != "false"
    end

    def load_blob!
      @data, @size = Rugged::Blob.to_buffer(repository, oid, MAX_SIZE) if @data.nil?
    end

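In other words, a blob first consults the repository's `.gitattributes` (via Rugged's attribute API) and only falls back to normal detection when no override is set. A rough usage sketch; the constructor arguments and path are assumptions inferred from the fields above, not a documented API:

```ruby
# Sketch: .gitattributes overrides resolved through LazyBlob (assumed constructor args).
repo = Rugged::Repository.new('.')
oid  = repo.head.target.tree['ruby_file.rb'][:oid]   # hypothetical path
blob = Linguist::LazyBlob.new(repo, oid, 'ruby_file.rb')
blob.language  # Language["Java"] if .gitattributes contains "*.rb linguist-language=Java",
               # otherwise whatever normal detection returns
```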
@@ -110,18 +110,37 @@ module Linguist
        if @old_commit_oid == @commit_oid
          @old_stats
        else
          compute_stats(@old_commit_oid, @commit_oid, @old_stats)
          compute_stats(@old_commit_oid, @old_stats)
        end
      end
    end

    protected
    def compute_stats(old_commit_oid, commit_oid, cache = nil)
      file_map = cache ? cache.dup : {}
      old_tree = old_commit_oid && Rugged::Commit.lookup(repository, old_commit_oid).tree
      new_tree = Rugged::Commit.lookup(repository, commit_oid).tree
    def read_index
      attr_index = Rugged::Index.new
      attr_index.read_tree(current_tree)
      repository.index = attr_index
    end

      diff = Rugged::Tree.diff(repository, old_tree, new_tree)
    def current_tree
      @tree ||= Rugged::Commit.lookup(repository, @commit_oid).tree
    end

    protected

    def compute_stats(old_commit_oid, cache = nil)
      old_tree = old_commit_oid && Rugged::Commit.lookup(repository, old_commit_oid).tree

      read_index

      diff = Rugged::Tree.diff(repository, old_tree, current_tree)

      # Clear file map and fetch full diff if any .gitattributes files are changed
      if cache && diff.each_delta.any? { |delta| File.basename(delta.new_file[:path]) == ".gitattributes" }
        diff = Rugged::Tree.diff(repository, old_tree = nil, current_tree)
        file_map = {}
      else
        file_map = cache ? cache.dup : {}
      end

      diff.each_delta do |delta|
        old = delta.old_file[:path]

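The key behaviour here: cached per-file stats are reused across commits unless the incoming diff touches a `.gitattributes` file, in which case the whole map is rebuilt, since changed overrides can reclassify any file. A standalone sketch of that rule (not Linguist API, just the decision logic):

```ruby
# Sketch of the invalidation rule used above: reuse the cached file map
# only when no .gitattributes file changed between the two trees.
def reuse_cached_stats?(changed_paths)
  changed_paths.none? { |path| File.basename(path) == ".gitattributes" }
end

reuse_cached_stats?(["lib/foo.rb", "README.md"])      #=> true
reuse_cached_stats?(["docs/.gitattributes", "a.rb"])  #=> false
```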
(File diff suppressed because it is too large.)
@@ -17,9 +17,11 @@ module Linguist
    PATH = File.expand_path('../samples.json', __FILE__)

    # Hash of serialized samples object
    if File.exist?(PATH)
      serializer = defined?(JSON) ? JSON : YAML
      DATA = serializer.load(File.read(PATH))
    def self.cache
      @cache ||= begin
        serializer = defined?(JSON) ? JSON : YAML
        serializer.load(File.read(PATH))
      end
    end

    # Public: Iterate over each sample.

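Replacing the `DATA` constant with a memoized `Samples.cache` means `samples.json` is no longer parsed at require time; it is read once, on first use, and reused afterwards. A small illustration:

```ruby
# Sketch: the memoized accessor parses samples.json only on the first call.
first  = Linguist::Samples.cache
second = Linguist::Samples.cache
first.equal?(second) #=> true  (same parsed Hash, no re-read)
```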
@@ -33,6 +33,9 @@
# Erlang bundles
- ^rebar$

# Go dependencies
- Godeps/_workspace/

# Bootstrap minified css and js
- (^|/)bootstrap([^.]*)(\.min)?\.(js|css)$

@@ -235,3 +238,7 @@
- octicons.css
- octicons.min.css
- sprockets-octicons.scss

# Typesafe Activator
- (^|/)activator$
- (^|/)activator\.bat$

@@ -1,3 +1,3 @@
module Linguist
  VERSION = "3.1.2"
  VERSION = "3.4.0"
end

samples/Assembly/External Interrupt.a51 (new file, 66 additions)
@@ -0,0 +1,66 @@
|
||||
ORG 0000h
|
||||
SJMP START
|
||||
ORG 0003h
|
||||
LCALL INT0_ISR
|
||||
RETI
|
||||
ORG 000Bh
|
||||
LCALL T0_ISR
|
||||
RETI
|
||||
ORG 0013h
|
||||
LCALL INT1_ISR
|
||||
RETI
|
||||
ORG 001Bh
|
||||
LCALL T1_ISR
|
||||
RETI
|
||||
ORG 0023h
|
||||
LCALL UART_ISR
|
||||
RETI
|
||||
ORG 0030h
|
||||
START:
|
||||
MOV A,#11111110b
|
||||
SETB IT0 ; Set External Interrupt 0 to be falling edge triggered
|
||||
SETB EX0 ; Enable External Interrut 0
|
||||
SETB EA ; Enable Interrupt
|
||||
LEFT:
|
||||
CJNE A,#01111111b,LOOP1
|
||||
JMP RIGHT
|
||||
LOOP1:
|
||||
MOV P1,A
|
||||
RL A
|
||||
LCALL DELAY
|
||||
SJMP LEFT
|
||||
RIGHT:
|
||||
CJNE A,#11111110b,LOOP2
|
||||
JMP LEFT
|
||||
LOOP2:
|
||||
MOV P1,A
|
||||
RR A
|
||||
LCALL DELAY
|
||||
SJMP RIGHT
|
||||
|
||||
INT0_ISR:
|
||||
MOV R1,#3
|
||||
FLASH:
|
||||
MOV P1,#00h
|
||||
LCALL DELAY
|
||||
MOV P1,#0FFh
|
||||
LCALL DELAY
|
||||
DJNZ R1,FLASH
|
||||
RET
|
||||
T0_ISR:
|
||||
RET
|
||||
INT1_ISR:
|
||||
RET
|
||||
T1_ISR:
|
||||
RET
|
||||
UART_ISR:
|
||||
RET
|
||||
|
||||
DELAY: MOV R5,#20 ;R5*20 mS
|
||||
D1: MOV R6,#40
|
||||
D2: MOV R7,#249
|
||||
DJNZ R7,$
|
||||
DJNZ R6,D2
|
||||
DJNZ R5,D1
|
||||
RET
|
||||
END
|
||||
@@ -1,13 +1,13 @@
|
||||
doc "Test function for Ceylon"
|
||||
by "Enrique"
|
||||
"Test function for Ceylon"
|
||||
by ("Enrique")
|
||||
shared void test() {
|
||||
print("test");
|
||||
print("test");
|
||||
}
|
||||
|
||||
doc "Test class for Ceylon"
|
||||
"Test class for Ceylon"
|
||||
shared class Test(name) satisfies Comparable<Test> {
|
||||
shared String name;
|
||||
shared actual String string = "Test " name ".";
|
||||
shared actual String string = "Test ``name``.";
|
||||
|
||||
shared actual Comparison compare(Test other) {
|
||||
return name<=>other.name;
|
||||
|
||||
samples/Clean/GenHylo.dcl (new file, 11 additions)
@@ -0,0 +1,11 @@
|
||||
definition module GenHylo
|
||||
|
||||
import StdGeneric, GenMap
|
||||
|
||||
:: Fix f = In (f .(Fix f))
|
||||
Out :: !u:(Fix v:a) -> v:(a w:(Fix v:a)), [u <= w]
|
||||
|
||||
hylo :: ((.f .b) -> .b) (.a -> (.f .a)) -> (.a -> .b) | gMap{|*->*|} f
|
||||
cata :: (u:(f .a) -> .a) -> (Fix u:f) -> .a | gMap{|*->*|} f
|
||||
ana :: (.a -> u:(f .a)) -> .a -> (Fix u:f) | gMap{|*->*|} f
|
||||
|
||||
samples/Clean/GenMap.dcl (new file, 9 additions)
@@ -0,0 +1,9 @@
|
||||
definition module GenMap
|
||||
|
||||
import StdGeneric
|
||||
|
||||
generic gMap a b :: .a -> .b
|
||||
derive gMap c, UNIT, PAIR, EITHER, CONS, FIELD, OBJECT, {}, {!}
|
||||
|
||||
derive gMap [], (,), (,,), (,,,), (,,,,), (,,,,,), (,,,,,,), (,,,,,,,)
|
||||
|
||||
samples/Clean/GenMap.icl (new file, 19 additions)
@@ -0,0 +1,19 @@
|
||||
implementation module GenMap
|
||||
|
||||
import StdClass, StdArray, StdInt, StdFunc
|
||||
import StdGeneric, _Array
|
||||
|
||||
generic gMap a b :: .a -> .b
|
||||
gMap{|c|} x = x
|
||||
gMap{|UNIT|} x = x
|
||||
gMap{|PAIR|} fx fy (PAIR x y) = PAIR (fx x) (fy y)
|
||||
gMap{|EITHER|} fl fr (LEFT x) = LEFT (fl x)
|
||||
gMap{|EITHER|} fl fr (RIGHT x) = RIGHT (fr x)
|
||||
gMap{|CONS|} f (CONS x) = CONS (f x)
|
||||
gMap{|FIELD|} f (FIELD x) = FIELD (f x)
|
||||
gMap{|OBJECT|} f (OBJECT x) = OBJECT (f x)
|
||||
gMap{|{}|} f xs = mapArray f xs
|
||||
gMap{|{!}|} f xs = mapArray f xs
|
||||
|
||||
derive gMap [], (,), (,,), (,,,), (,,,,), (,,,,,), (,,,,,,), (,,,,,,,)
|
||||
|
||||
samples/Clean/fsieve.icl (new file, 54 additions)
@@ -0,0 +1,54 @@
|
||||
module fsieve
|
||||
|
||||
/*
|
||||
The Fast Sieve of Eratosthenes.
|
||||
|
||||
A sequential and optimized version of the sieve of Eratosthenes.
|
||||
The program calculates a list of the first NrOfPrime primes.
|
||||
The result of the program is the NrOfPrimes'th prime.
|
||||
|
||||
Strictness annotations have been added because the strictness analyser
|
||||
is not able to deduce all strictness information. Removal of these !'s
|
||||
will make the program about 20% slower.
|
||||
|
||||
On a machine without a math coprocessor the execution of this
|
||||
program might take a (very) long time. Set NrOfPrimes to a smaller value.
|
||||
*/
|
||||
|
||||
import StdClass; // RWS
|
||||
import StdInt, StdReal
|
||||
|
||||
NrOfPrimes :== 3000
|
||||
|
||||
// The sieve algorithm: generate an infinite list of all primes.
|
||||
|
||||
Primes::[Int]
|
||||
Primes = pr where pr = [5 : Sieve 7 4 pr]
|
||||
|
||||
Sieve::Int !Int [Int] -> [Int]
|
||||
Sieve g i prs
|
||||
| IsPrime prs g (toInt (sqrt (toReal g))) = [g : Sieve` g i prs]
|
||||
= Sieve (g + i) (6 - i) prs
|
||||
|
||||
Sieve`::Int Int [Int] -> [Int]
|
||||
Sieve` g i prs = Sieve (g + i) (6 - i) prs
|
||||
|
||||
IsPrime::[Int] !Int Int -> Bool
|
||||
IsPrime [f:r] pr bd | f>bd = True
|
||||
| pr rem f==0 = False
|
||||
= IsPrime r pr bd
|
||||
|
||||
// Select is used to get the NrOfPrimes'th prime from the infinite list.
|
||||
|
||||
Select::[x] Int -> x
|
||||
Select [f:r] 1 = f
|
||||
Select [f:r] n = Select r (n - 1)
|
||||
|
||||
|
||||
/* The Start rule: Select the NrOfPrimes'th prime from the list of primes
|
||||
generated by Primes.
|
||||
*/
|
||||
|
||||
Start::Int
|
||||
Start = Select [2, 3 : Primes] NrOfPrimes
|
||||
|
||||
samples/Clean/sem.icl (new file, 99 additions)
@@ -0,0 +1,99 @@
|
||||
module monadicSemantics
|
||||
|
||||
import StdEnv, StdGeneric, GenMap, GenHylo
|
||||
|
||||
/* For fun I implemented the recursive datastructre Exp and Stm as fixpoints
|
||||
This helps us define recursive functions on them (only a little bit though)
|
||||
However deriving gMap for Fix did not works out of the box
|
||||
I had to remove some uniqueness typing in GenMap and GenHylo */
|
||||
:: Op = Plus | Minus | Times | Rem | Equal | LessThan
|
||||
:: Var :== String
|
||||
|
||||
:: ExpP a = Int Int | Var Var | Op Op a a
|
||||
:: Exp :== Fix ExpP
|
||||
|
||||
:: StmP a = Assign Var Exp | If Exp a a | While Exp a | Seq a a | Cont
|
||||
:: Stm :== Fix StmP
|
||||
|
||||
derive gMap ExpP, StmP, Fix
|
||||
|
||||
// Environment. Semantics is basically Env -> Env
|
||||
:: Env :== Var -> Int
|
||||
:: Sem :== Env -> (Int, Env)
|
||||
empty = \v . 0
|
||||
|
||||
// return
|
||||
rtn :: Int -> Sem
|
||||
rtn i = \e. (i, e)
|
||||
|
||||
// the usual bind
|
||||
(>>=) infixl 1 :: Sem (Int->Sem) -> Sem
|
||||
(>>=) x y = \e. (\(i,e2).y i e2) (x e)
|
||||
(>>|) infixl 1 :: Sem Sem -> Sem
|
||||
(>>|) x y = x >>= \_. y
|
||||
|
||||
// read variable from environment
|
||||
read :: Var -> Sem
|
||||
read v = \e. (e v, e)
|
||||
|
||||
// assign value to give variable in environment
|
||||
write :: Var Int -> Sem
|
||||
write v i = \e. (i, \w. if (w==v) i (e w))
|
||||
|
||||
// semantics
|
||||
class sem a :: a -> Sem
|
||||
|
||||
operator :: Op -> Int -> Int -> Int
|
||||
operator Plus = (+)
|
||||
operator Minus = (-)
|
||||
operator Times = (*)
|
||||
operator Rem = rem
|
||||
operator Equal = \x y . if (x==y) 1 0
|
||||
operator LessThan = \x y . if (x< y) 1 0
|
||||
|
||||
// semantics of expressions
|
||||
instance sem Exp where
|
||||
sem x = cata phi x where
|
||||
phi (Int n) = rtn n
|
||||
phi (Var v) = read v
|
||||
phi (Op op x y) = x >>= \v1. y >>= return o (operator op v1)
|
||||
|
||||
// semantics of statments
|
||||
// NOTE: while will always return 0, as it might not even be executed
|
||||
instance sem Stm where
|
||||
sem x = cata phi x where
|
||||
phi (Assign v e) = sem e >>= write v
|
||||
phi (If e s1 s2) = sem e >>= \b . if (b<>0) s1 s2
|
||||
phi stm=:(While e s) = sem e >>= \b . if (b<>0) (s >>| phi stm) (phi Cont)
|
||||
phi (Seq s1 s2) = s1 >>| s2 // Here the cata *finally* pays off :D
|
||||
phi Cont = rtn 0
|
||||
|
||||
// convenience functions
|
||||
int = In o Int
|
||||
var = In o Var
|
||||
op o = In o2 (Op o)
|
||||
assign = In o2 Assign
|
||||
ifte e = In o2 (If e)
|
||||
while = In o2 While
|
||||
seq = In o2 Seq
|
||||
cont = In Cont
|
||||
|
||||
// test case, also testing the new operator <
|
||||
pEuclides =
|
||||
while (op LessThan (int 0) (var "b"))(
|
||||
seq (assign "r" (op Rem (var "a") (var "b")))
|
||||
(seq (assign "a" (var "b"))
|
||||
( (assign "b" (var "r")))
|
||||
)
|
||||
)
|
||||
|
||||
Start = fst (program start) where
|
||||
program = sem pEuclides >>| read "a"
|
||||
start "a" = 9
|
||||
start "b" = 12
|
||||
start _ = 0
|
||||
|
||||
// Helper
|
||||
(o2) infixr 9
|
||||
(o2) f g x :== f o (g x)
|
||||
|
||||
samples/Clean/stack.dcl (new file, 14 additions)
@@ -0,0 +1,14 @@
|
||||
definition module stack
|
||||
|
||||
:: Stack a
|
||||
|
||||
newStack :: (Stack a)
|
||||
push :: a (Stack a) -> Stack a
|
||||
pushes :: [a] (Stack a) -> Stack a
|
||||
pop :: (Stack a) -> Stack a
|
||||
popn :: Int (Stack a) -> Stack a
|
||||
top :: (Stack a) -> a
|
||||
topn :: Int (Stack a) -> [a]
|
||||
elements :: (Stack a) -> [a]
|
||||
count :: (Stack a) -> Int
|
||||
|
||||
samples/Clean/stack.icl (new file, 33 additions)
@@ -0,0 +1,33 @@
|
||||
implementation module stack
|
||||
import StdEnv
|
||||
|
||||
:: Stack a :== [a]
|
||||
|
||||
newStack :: (Stack a)
|
||||
newStack = []
|
||||
|
||||
push :: a (Stack a) -> Stack a
|
||||
push x s = [x:s]
|
||||
|
||||
pushes :: [a] (Stack a) -> Stack a
|
||||
pushes x s = x ++ s
|
||||
|
||||
pop :: (Stack a) -> Stack a
|
||||
pop [] = abort "Cannot use pop on an empty stack"
|
||||
pop [e:s] = s
|
||||
|
||||
popn :: Int (Stack a) -> Stack a
|
||||
popn n s = drop n s
|
||||
|
||||
top :: (Stack a) -> a
|
||||
top [] = abort "Cannot use top on an empty stack"
|
||||
top [e:s] = e
|
||||
|
||||
topn :: Int (Stack a) -> [a]
|
||||
topn n s = take n s
|
||||
elements :: (Stack a) -> [a]
|
||||
elements s = s
|
||||
|
||||
count :: (Stack a) -> Int
|
||||
count s = length s
|
||||
|
||||
samples/Clean/streams.dcl (new file, 16 additions)
@@ -0,0 +1,16 @@
|
||||
definition module streams
|
||||
|
||||
import StdEnv
|
||||
|
||||
instance zero [Real]
|
||||
instance one [Real]
|
||||
instance + [Real]
|
||||
instance - [Real]
|
||||
instance * [Real]
|
||||
instance / [Real]
|
||||
|
||||
X :: [Real]
|
||||
invert :: [Real] -> [Real]
|
||||
pow :: [Real] Int -> [Real]
|
||||
(shuffle) infixl 7 :: [Real] [Real] -> [Real]
|
||||
|
||||
samples/Clean/streams.icl (new file, 49 additions)
@@ -0,0 +1,49 @@
|
||||
implementation module streams
|
||||
|
||||
import StdEnv
|
||||
|
||||
instance zero [Real]
|
||||
where
|
||||
zero = [] //Infinite row of zeroes represented as empty list to ease computation
|
||||
|
||||
instance one [Real]
|
||||
where
|
||||
one = [1.0:zero]
|
||||
|
||||
instance + [Real]
|
||||
where
|
||||
(+) [s:s`] [t:t`] = [s+t:s`+t`]
|
||||
(+) [s:s`] [] = [s:s`]
|
||||
(+) [] [t:t`] = [t:t`]
|
||||
(+) [] [] = []
|
||||
|
||||
instance - [Real]
|
||||
where
|
||||
(-) [s:s`] [t:t`] = [s-t:s`-t`]
|
||||
(-) [s:s`] [] = [s:s`]
|
||||
(-) [] [t:t`] = [-1.0] * [t:t`]
|
||||
(-) [] [] = []
|
||||
|
||||
instance * [Real]
|
||||
where
|
||||
(*) [s:s`] [t:t`] = [s*t:s`*[t:t`]+[s]*t`]
|
||||
(*) _ _ = []
|
||||
|
||||
instance / [Real]
|
||||
where
|
||||
(/) s t = s * (invert t)
|
||||
|
||||
X :: [Real]
|
||||
X = [0.0:one]
|
||||
|
||||
invert :: [Real] -> [Real]
|
||||
invert [s:s`] = [1.0/s:(invert [s:s`]) * s` * [-1.0/s]]
|
||||
|
||||
pow :: [Real] Int -> [Real]
|
||||
pow s 0 = one
|
||||
pow s n = s * pow s (n-1)
|
||||
|
||||
(shuffle) infixl 7 :: [Real] [Real] -> [Real]
|
||||
(shuffle) [s:s`] [t:t`] = [s*t:s` shuffle [t:t`] + [s:s`] shuffle t`]
|
||||
(shuffle) _ _ = []
|
||||
|
||||
samples/Forth/bitmap.frt (new file, 8 additions)
@@ -0,0 +1,8 @@
|
||||
\ Bit arrays.
|
||||
: bits ( u1 -- u2 ) 7 + 3 rshift ;
|
||||
: bitmap ( u "name" -- ) create bits here over erase allot
|
||||
does> ( u -- a x ) over 3 rshift + 1 rot 7 and lshift ;
|
||||
: bit@ ( a x -- f ) swap c@ and ;
|
||||
: 1bit ( a x -- ) over c@ or swap c! ;
|
||||
: 0bit ( a x -- ) invert over c@ and swap c! ;
|
||||
: bit! ( f a x -- ) rot if 1bit else 0bit then ;
|
||||
samples/Forth/enum.frt (new file, 7 additions)
@@ -0,0 +1,7 @@
|
||||
\ Implements ENUM.
|
||||
|
||||
\ Double DOES>!
|
||||
: enum create 0 , does> create dup @ 1 rot +! , does> @ ;
|
||||
|
||||
\ But this is simpler.
|
||||
: enum create 0 , does> dup @ constant 1 swap +! ;
|
||||
samples/Forth/macros.frt (new file, 8 additions)
@@ -0,0 +1,8 @@
|
||||
\ Simplifies compiling words.
|
||||
|
||||
: [[ ; immediate
|
||||
: '<> >in @ ' swap >in ! <> ;
|
||||
: (]]) begin dup '<> while postpone postpone repeat drop ;
|
||||
: ]] ['] [[ (]]) ; immediate
|
||||
|
||||
( Usage: : foo ]] dup * [[ ; immediate : bar 42 foo . ; )
|
||||
samples/G-code/duettest.g (new file, 57 additions)
@@ -0,0 +1,57 @@
|
||||
; RepRapPro Ormerod
|
||||
; Board test GCodes
|
||||
M111 S1; Debug on
|
||||
G21 ; mm
|
||||
G90 ; Absolute positioning
|
||||
M83 ; Extrusion relative
|
||||
M906 X800 Y800 Z800 E800 ; Motor currents (mA)
|
||||
T0 ; Extruder 0
|
||||
G1 X50 F500
|
||||
G1 X0
|
||||
G4 P500
|
||||
G1 Y50 F500
|
||||
G1 Y0
|
||||
G4 P500
|
||||
G1 Z20 F200
|
||||
G1 Z0
|
||||
G4 P500
|
||||
G1 E20 F200
|
||||
G1 E-20
|
||||
G4 P500
|
||||
M106 S255
|
||||
G4 P500
|
||||
M106 S0
|
||||
G4 P500
|
||||
M105
|
||||
G10 P0 S100
|
||||
T0
|
||||
M140 S100
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
G4 P5000
|
||||
M105
|
||||
M0
|
||||
|
||||
|
||||
|
||||
|
||||
samples/G-code/lm.g (new file, 25912 additions; diff suppressed because it is too large)
samples/G-code/rm.g (new file, 29735 additions; diff suppressed because it is too large)
samples/G-code/square.g (new file, 13 additions)
@@ -0,0 +1,13 @@
|
||||
G28 X0 Y0
|
||||
G1 X55 Y5 F2000
|
||||
G1 Y180
|
||||
G1 X180
|
||||
G1 Y5
|
||||
G1 X55
|
||||
G1 Y180
|
||||
G1 X180
|
||||
G1 Y5
|
||||
G1 X55
|
||||
M0
|
||||
|
||||
|
||||
samples/GDScript/example.gd (new file, 57 additions)
@@ -0,0 +1,57 @@
|
||||
# Taken from https://github.com/okamstudio/godot/wiki/gdscript
|
||||
# a file is a class!
|
||||
|
||||
# inheritance
|
||||
|
||||
extends BaseClass
|
||||
|
||||
# member variables
|
||||
|
||||
var a = 5
|
||||
var s = "Hello"
|
||||
var arr = [1, 2, 3]
|
||||
var dict = {"key":"value", 2:3}
|
||||
|
||||
# constants
|
||||
|
||||
const answer = 42
|
||||
const thename = "Charly"
|
||||
|
||||
# built-in vector types
|
||||
|
||||
var v2 = Vector2(1, 2)
|
||||
var v3 = Vector3(1, 2, 3)
|
||||
|
||||
# function
|
||||
|
||||
func some_function(param1, param2):
|
||||
var local_var = 5
|
||||
|
||||
if param1 < local_var:
|
||||
print(param1)
|
||||
elif param2 > 5:
|
||||
print(param2)
|
||||
else:
|
||||
print("fail!")
|
||||
|
||||
for i in range(20):
|
||||
print(i)
|
||||
|
||||
while(param2 != 0):
|
||||
param2 -= 1
|
||||
|
||||
var local_var2 = param1+3
|
||||
return local_var2
|
||||
|
||||
|
||||
# subclass
|
||||
|
||||
class Something:
|
||||
var a = 10
|
||||
|
||||
# constructor
|
||||
|
||||
func _init():
|
||||
print("constructed!")
|
||||
var lv = Something.new()
|
||||
print(lv.a)
|
||||
samples/GDScript/grid.gd (new file, 216 additions)
@@ -0,0 +1,216 @@
|
||||
|
||||
|
||||
extends Control
|
||||
|
||||
# Simple Tetris-like demo, (c) 2012 Juan Linietsky
|
||||
# Implemented by using a regular Control and drawing on it during the _draw() callback.
|
||||
# The drawing surface is updated only when changes happen (by calling update())
|
||||
|
||||
|
||||
var score = 0
|
||||
var score_label=null
|
||||
|
||||
const MAX_SHAPES = 7
|
||||
|
||||
var block = preload("block.png")
|
||||
|
||||
var block_colors=[
|
||||
Color(1,0.5,0.5),
|
||||
Color(0.5,1,0.5),
|
||||
Color(0.5,0.5,1),
|
||||
Color(0.8,0.4,0.8),
|
||||
Color(0.8,0.8,0.4),
|
||||
Color(0.4,0.8,0.8),
|
||||
Color(0.7,0.7,0.7)]
|
||||
|
||||
var block_shapes=[
|
||||
[ Vector2(0,-1),Vector2(0,0),Vector2(0,1),Vector2(0,2) ], # I
|
||||
[ Vector2(0,0),Vector2(1,0),Vector2(1,1),Vector2(0,1) ], # O
|
||||
[ Vector2(-1,1),Vector2(0,1),Vector2(0,0),Vector2(1,0) ], # S
|
||||
[ Vector2(1,1),Vector2(0,1),Vector2(0,0),Vector2(-1,0) ], # Z
|
||||
[ Vector2(-1,1),Vector2(-1,0),Vector2(0,0),Vector2(1,0) ], # L
|
||||
[ Vector2(1,1),Vector2(1,0),Vector2(0,0),Vector2(-1,0) ], # J
|
||||
[ Vector2(0,1),Vector2(1,0),Vector2(0,0),Vector2(-1,0) ]] # T
|
||||
|
||||
|
||||
var block_rotations=[
|
||||
Matrix32( Vector2(1,0),Vector2(0,1), Vector2() ),
|
||||
Matrix32( Vector2(0,1),Vector2(-1,0), Vector2() ),
|
||||
Matrix32( Vector2(-1,0),Vector2(0,-1), Vector2() ),
|
||||
Matrix32( Vector2(0,-1),Vector2(1,0), Vector2() )
|
||||
]
|
||||
|
||||
|
||||
var width=0
|
||||
var height=0
|
||||
|
||||
var cells={}
|
||||
|
||||
var piece_active=false
|
||||
var piece_shape=0
|
||||
var piece_pos=Vector2()
|
||||
var piece_rot=0
|
||||
|
||||
|
||||
func piece_cell_xform(p,er=0):
|
||||
var r = (4+er+piece_rot)%4
|
||||
return piece_pos+block_rotations[r].xform(p)
|
||||
|
||||
func _draw():
|
||||
|
||||
var sb = get_stylebox("bg","Tree") # use line edit bg
|
||||
draw_style_box(sb,Rect2(Vector2(),get_size()).grow(3))
|
||||
|
||||
var bs = block.get_size()
|
||||
for y in range(height):
|
||||
for x in range(width):
|
||||
if (Vector2(x,y) in cells):
|
||||
draw_texture_rect(block,Rect2(Vector2(x,y)*bs,bs),false,block_colors[cells[Vector2(x,y)]])
|
||||
|
||||
if (piece_active):
|
||||
|
||||
for c in block_shapes[piece_shape]:
|
||||
draw_texture_rect(block,Rect2(piece_cell_xform(c)*bs,bs),false,block_colors[piece_shape])
|
||||
|
||||
|
||||
func piece_check_fit(ofs,er=0):
|
||||
|
||||
for c in block_shapes[piece_shape]:
|
||||
var pos = piece_cell_xform(c,er)+ofs
|
||||
if (pos.x < 0):
|
||||
return false
|
||||
if (pos.y < 0):
|
||||
return false
|
||||
if (pos.x >= width):
|
||||
return false
|
||||
if (pos.y >= height):
|
||||
return false
|
||||
if (pos in cells):
|
||||
return false
|
||||
|
||||
return true
|
||||
|
||||
func new_piece():
|
||||
|
||||
piece_shape = randi() % MAX_SHAPES
|
||||
piece_pos = Vector2(width/2,0)
|
||||
piece_active=true
|
||||
piece_rot=0
|
||||
if (piece_shape==0):
|
||||
piece_pos.y+=1
|
||||
|
||||
if (not piece_check_fit(Vector2())):
|
||||
#game over
|
||||
#print("GAME OVER!")
|
||||
game_over()
|
||||
|
||||
update()
|
||||
|
||||
|
||||
func test_collapse_rows():
|
||||
var accum_down=0
|
||||
for i in range(height):
|
||||
var y = height - i - 1
|
||||
var collapse = true
|
||||
for x in range(width):
|
||||
if (Vector2(x,y) in cells):
|
||||
if (accum_down):
|
||||
cells[ Vector2(x,y+accum_down) ] = cells[Vector2(x,y)]
|
||||
else:
|
||||
collapse=false
|
||||
if (accum_down):
|
||||
cells.erase( Vector2(x,y+accum_down) )
|
||||
|
||||
if (collapse):
|
||||
accum_down+=1
|
||||
|
||||
|
||||
score+=accum_down*100
|
||||
score_label.set_text(str(score))
|
||||
|
||||
|
||||
func game_over():
|
||||
|
||||
piece_active=false
|
||||
get_node("gameover").set_text("Game Over")
|
||||
update()
|
||||
|
||||
|
||||
func restart_pressed():
|
||||
|
||||
score=0
|
||||
score_label.set_text("0")
|
||||
cells.clear()
|
||||
get_node("gameover").set_text("")
|
||||
piece_active=true
|
||||
update()
|
||||
|
||||
|
||||
|
||||
func piece_move_down():
|
||||
|
||||
if (!piece_active):
|
||||
return
|
||||
if (piece_check_fit(Vector2(0,1))):
|
||||
piece_pos.y+=1
|
||||
update()
|
||||
else:
|
||||
|
||||
for c in block_shapes[piece_shape]:
|
||||
var pos = piece_cell_xform(c)
|
||||
cells[pos]=piece_shape
|
||||
test_collapse_rows()
|
||||
new_piece()
|
||||
|
||||
|
||||
func piece_rotate():
|
||||
|
||||
var adv = 1
|
||||
if (not piece_check_fit(Vector2(),1)):
|
||||
return
|
||||
piece_rot = (piece_rot + adv) % 4
|
||||
update()
|
||||
|
||||
|
||||
|
||||
func _input(ie):
|
||||
|
||||
|
||||
if (not piece_active):
|
||||
return
|
||||
if (!ie.is_pressed()):
|
||||
return
|
||||
|
||||
if (ie.is_action("move_left")):
|
||||
if (piece_check_fit(Vector2(-1,0))):
|
||||
piece_pos.x-=1
|
||||
update()
|
||||
elif (ie.is_action("move_right")):
|
||||
if (piece_check_fit(Vector2(1,0))):
|
||||
piece_pos.x+=1
|
||||
update()
|
||||
elif (ie.is_action("move_down")):
|
||||
piece_move_down()
|
||||
elif (ie.is_action("rotate")):
|
||||
piece_rotate()
|
||||
|
||||
|
||||
func setup(w,h):
|
||||
width=w
|
||||
height=h
|
||||
set_size( Vector2(w,h)*block.get_size() )
|
||||
new_piece()
|
||||
get_node("timer").start()
|
||||
|
||||
|
||||
func _ready():
|
||||
# Initalization here
|
||||
|
||||
setup(10,20)
|
||||
score_label = get_node("../score")
|
||||
|
||||
set_process_input(true)
|
||||
|
||||
|
||||
|
||||
|
||||
samples/GDScript/player.gd (new file, 243 additions)
@@ -0,0 +1,243 @@
|
||||
|
||||
extends RigidBody
|
||||
|
||||
# member variables here, example:
|
||||
# var a=2
|
||||
# var b="textvar"
|
||||
|
||||
#var dir=Vector3()
|
||||
|
||||
const ANIM_FLOOR = 0
|
||||
const ANIM_AIR_UP = 1
|
||||
const ANIM_AIR_DOWN = 2
|
||||
|
||||
const SHOOT_TIME = 1.5
|
||||
const SHOOT_SCALE = 2
|
||||
|
||||
const CHAR_SCALE = Vector3(0.3,0.3,0.3)
|
||||
|
||||
var facing_dir = Vector3(1, 0, 0)
|
||||
var movement_dir = Vector3()
|
||||
|
||||
var jumping=false
|
||||
|
||||
var turn_speed=40
|
||||
var keep_jump_inertia = true
|
||||
var air_idle_deaccel = false
|
||||
var accel=19.0
|
||||
var deaccel=14.0
|
||||
var sharp_turn_threshhold = 140
|
||||
|
||||
var max_speed=3.1
|
||||
var on_floor = false
|
||||
|
||||
var prev_shoot = false
|
||||
|
||||
var last_floor_velocity = Vector3()
|
||||
|
||||
var shoot_blend = 0
|
||||
|
||||
func adjust_facing(p_facing, p_target,p_step, p_adjust_rate,current_gn):
|
||||
|
||||
var n = p_target # normal
|
||||
var t = n.cross(current_gn).normalized()
|
||||
|
||||
var x = n.dot(p_facing)
|
||||
var y = t.dot(p_facing)
|
||||
|
||||
var ang = atan2(y,x)
|
||||
|
||||
if (abs(ang)<0.001): # too small
|
||||
return p_facing
|
||||
|
||||
var s = sign(ang)
|
||||
ang = ang * s
|
||||
var turn = ang * p_adjust_rate * p_step
|
||||
var a
|
||||
if (ang<turn):
|
||||
a=ang
|
||||
else:
|
||||
a=turn
|
||||
ang = (ang - a) * s
|
||||
|
||||
return ((n * cos(ang)) + (t * sin(ang))) * p_facing.length()
|
||||
|
||||
|
||||
|
||||
func _integrate_forces( state ):
|
||||
|
||||
var lv = state.get_linear_velocity() # linear velocity
|
||||
var g = state.get_total_gravity()
|
||||
var delta = state.get_step()
|
||||
var d = 1.0 - delta*state.get_total_density()
|
||||
if (d<0):
|
||||
d=0
|
||||
lv += g * delta #apply gravity
|
||||
|
||||
var anim = ANIM_FLOOR
|
||||
|
||||
var up = -g.normalized() # (up is against gravity)
|
||||
var vv = up.dot(lv) # vertical velocity
|
||||
var hv = lv - (up*vv) # horizontal velocity
|
||||
|
||||
|
||||
|
||||
var hdir = hv.normalized() # horizontal direction
|
||||
var hspeed = hv.length() #horizontal speed
|
||||
|
||||
var floor_velocity
|
||||
var onfloor = false
|
||||
|
||||
if (state.get_contact_count() == 0):
|
||||
floor_velocity = last_floor_velocity
|
||||
else:
|
||||
for i in range(state.get_contact_count()):
|
||||
if (state.get_contact_local_shape(i) != 1):
|
||||
continue
|
||||
|
||||
onfloor = true
|
||||
floor_velocity = state.get_contact_collider_velocity_at_pos(i)
|
||||
break
|
||||
|
||||
|
||||
var dir = Vector3() #where does the player intend to walk to
|
||||
var cam_xform = get_node("target/camera").get_global_transform()
|
||||
|
||||
if (Input.is_action_pressed("move_forward")):
|
||||
dir+=-cam_xform.basis[2]
|
||||
if (Input.is_action_pressed("move_backwards")):
|
||||
dir+=cam_xform.basis[2]
|
||||
if (Input.is_action_pressed("move_left")):
|
||||
dir+=-cam_xform.basis[0]
|
||||
if (Input.is_action_pressed("move_right")):
|
||||
dir+=cam_xform.basis[0]
|
||||
|
||||
var jump_attempt = Input.is_action_pressed("jump")
|
||||
var shoot_attempt = Input.is_action_pressed("shoot")
|
||||
|
||||
var target_dir = (dir - up*dir.dot(up)).normalized()
|
||||
|
||||
if (onfloor):
|
||||
|
||||
var sharp_turn = hspeed > 0.1 and rad2deg(acos(target_dir.dot(hdir))) > sharp_turn_threshhold
|
||||
|
||||
if (dir.length()>0.1 and !sharp_turn) :
|
||||
if (hspeed > 0.001) :
|
||||
|
||||
#linear_dir = linear_h_velocity/linear_vel
|
||||
#if (linear_vel > brake_velocity_limit and linear_dir.dot(ctarget_dir)<-cos(Math::deg2rad(brake_angular_limit)))
|
||||
# brake=true
|
||||
#else
|
||||
hdir = adjust_facing(hdir,target_dir,delta,1.0/hspeed*turn_speed,up)
|
||||
facing_dir = hdir
|
||||
else:
|
||||
|
||||
hdir = target_dir
|
||||
|
||||
if (hspeed<max_speed):
|
||||
hspeed+=accel*delta
|
||||
|
||||
else:
|
||||
hspeed-=deaccel*delta
|
||||
if (hspeed<0):
|
||||
hspeed=0
|
||||
|
||||
hv = hdir*hspeed
|
||||
|
||||
var mesh_xform = get_node("Armature").get_transform()
|
||||
var facing_mesh=-mesh_xform.basis[0].normalized()
|
||||
facing_mesh = (facing_mesh - up*facing_mesh.dot(up)).normalized()
|
||||
facing_mesh = adjust_facing(facing_mesh,target_dir,delta,1.0/hspeed*turn_speed,up)
|
||||
var m3 = Matrix3(-facing_mesh,up,-facing_mesh.cross(up).normalized()).scaled( CHAR_SCALE )
|
||||
|
||||
get_node("Armature").set_transform(Transform(m3,mesh_xform.origin))
|
||||
|
||||
if (not jumping and jump_attempt):
|
||||
vv = 7.0
|
||||
jumping = true
|
||||
get_node("sfx").play("jump")
|
||||
else:
|
||||
|
||||
if (vv>0):
|
||||
anim=ANIM_AIR_UP
|
||||
else:
|
||||
anim=ANIM_AIR_DOWN
|
||||
|
||||
var hs
|
||||
if (dir.length()>0.1):
|
||||
|
||||
hv += target_dir * (accel * 0.2) * delta
|
||||
if (hv.length() > max_speed):
|
||||
hv = hv.normalized() * max_speed
|
||||
|
||||
else:
|
||||
|
||||
if (air_idle_deaccel):
|
||||
hspeed = hspeed - (deaccel * 0.2) * delta
|
||||
if (hspeed<0):
|
||||
hspeed=0
|
||||
|
||||
hv = hdir*hspeed
|
||||
|
||||
|
||||
if (jumping and vv < 0):
|
||||
jumping=false
|
||||
|
||||
lv = hv+up*vv
|
||||
|
||||
|
||||
|
||||
if (onfloor):
|
||||
|
||||
movement_dir = lv
|
||||
#lv += floor_velocity
|
||||
last_floor_velocity = floor_velocity
|
||||
else:
|
||||
|
||||
if (on_floor) :
|
||||
|
||||
#if (keep_jump_inertia):
|
||||
# lv += last_floor_velocity
|
||||
pass
|
||||
|
||||
last_floor_velocity = Vector3()
|
||||
movement_dir = lv
|
||||
|
||||
on_floor = onfloor
|
||||
|
||||
state.set_linear_velocity(lv)
|
||||
|
||||
if (shoot_blend>0):
|
||||
shoot_blend -= delta * SHOOT_SCALE
|
||||
if (shoot_blend<0):
|
||||
shoot_blend=0
|
||||
|
||||
if (shoot_attempt and not prev_shoot):
|
||||
shoot_blend = SHOOT_TIME
|
||||
var bullet = preload("res://bullet.scn").instance()
|
||||
bullet.set_transform( get_node("Armature/bullet").get_global_transform().orthonormalized() )
|
||||
get_parent().add_child( bullet )
|
||||
bullet.set_linear_velocity( get_node("Armature/bullet").get_global_transform().basis[2].normalized() * 20 )
|
||||
PS.body_add_collision_exception( bullet.get_rid(), get_rid() ) #add it to bullet
|
||||
get_node("sfx").play("shoot")
|
||||
|
||||
prev_shoot = shoot_attempt
|
||||
|
||||
if (onfloor):
|
||||
get_node("AnimationTreePlayer").blend2_node_set_amount("walk",hspeed / max_speed)
|
||||
|
||||
get_node("AnimationTreePlayer").transition_node_set_current("state",anim)
|
||||
get_node("AnimationTreePlayer").blend2_node_set_amount("gun",min(shoot_blend,1.0))
|
||||
# state.set_angular_velocity(Vector3())
|
||||
|
||||
|
||||
|
||||
|
||||
func _ready():
|
||||
|
||||
|
||||
# Initalization here
|
||||
get_node("AnimationTreePlayer").set_active(true)
|
||||
pass
|
||||
|
||||
|
||||
samples/GDScript/pong.gd (new file, 73 additions)
@@ -0,0 +1,73 @@
|
||||
|
||||
extends Node2D
|
||||
|
||||
# member variables here, example:
|
||||
# var a=2
|
||||
# var b="textvar"
|
||||
const INITIAL_BALL_SPEED = 80
|
||||
var ball_speed = INITIAL_BALL_SPEED
|
||||
var screen_size = Vector2(640,400)
|
||||
#default ball direction
|
||||
var direction = Vector2(-1,0)
|
||||
var pad_size = Vector2(8,32)
|
||||
const PAD_SPEED = 150
|
||||
|
||||
|
||||
func _process(delta):
|
||||
|
||||
|
||||
# get ball positio and pad rectangles
|
||||
var ball_pos = get_node("ball").get_pos()
|
||||
var left_rect = Rect2( get_node("left").get_pos() - pad_size*0.5, pad_size )
|
||||
var right_rect = Rect2( get_node("right").get_pos() - pad_size*0.5, pad_size )
|
||||
|
||||
#integrate new ball postion
|
||||
ball_pos+=direction*ball_speed*delta
|
||||
|
||||
#flip when touching roof or floor
|
||||
if ( (ball_pos.y<0 and direction.y <0) or (ball_pos.y>screen_size.y and direction.y>0)):
|
||||
direction.y = -direction.y
|
||||
|
||||
#flip, change direction and increase speed when touching pads
|
||||
if ( (left_rect.has_point(ball_pos) and direction.x < 0) or (right_rect.has_point(ball_pos) and direction.x > 0)):
|
||||
direction.x=-direction.x
|
||||
ball_speed*=1.1
|
||||
direction.y=randf()*2.0-1
|
||||
direction = direction.normalized()
|
||||
|
||||
#check gameover
|
||||
if (ball_pos.x<0 or ball_pos.x>screen_size.x):
|
||||
ball_pos=screen_size*0.5
|
||||
ball_speed=INITIAL_BALL_SPEED
|
||||
direction=Vector2(-1,0)
|
||||
|
||||
|
||||
get_node("ball").set_pos(ball_pos)
|
||||
|
||||
#move left pad
|
||||
var left_pos = get_node("left").get_pos()
|
||||
|
||||
if (left_pos.y > 0 and Input.is_action_pressed("left_move_up")):
|
||||
left_pos.y+=-PAD_SPEED*delta
|
||||
if (left_pos.y < screen_size.y and Input.is_action_pressed("left_move_down")):
|
||||
left_pos.y+=PAD_SPEED*delta
|
||||
|
||||
get_node("left").set_pos(left_pos)
|
||||
|
||||
#move right pad
|
||||
var right_pos = get_node("right").get_pos()
|
||||
|
||||
if (right_pos.y > 0 and Input.is_action_pressed("right_move_up")):
|
||||
right_pos.y+=-PAD_SPEED*delta
|
||||
if (right_pos.y < screen_size.y and Input.is_action_pressed("right_move_down")):
|
||||
right_pos.y+=PAD_SPEED*delta
|
||||
|
||||
get_node("right").set_pos(right_pos)
|
||||
|
||||
|
||||
|
||||
func _ready():
|
||||
screen_size = get_viewport_rect().size # get actual size
|
||||
pad_size = get_node("left").get_texture().get_size()
|
||||
set_process(true)
|
||||
|
||||
21
samples/Graph Modeling Language/sample.gml
Normal file
@@ -0,0 +1,21 @@

graph
[
  directed 0
  node
  [
    id 0
    label "Node 1"
    value 100
  ]
  node
  [
    id 1
    label "Node 2"
    value 200
  ]
  edge
  [
    source 1
    target 0
  ]
]

13
samples/Groff/sample.4
Normal file
@@ -0,0 +1,13 @@

.TH FOO 1
.SH NAME
foo \- bar
.SH SYNOPSIS
.B foo
.I bar
.SH DESCRIPTION
Foo bar
.BR baz
quux.
.PP
.B Foo
bar baz.

12
samples/JavaScript/jsbuild.jsb
Normal file
@@ -0,0 +1,12 @@

jsb.library('mylibrary', jsb.STATIC_LIBRARY, function(libObject) {
  libObject.outputName = 'mylibrary';
  libObject.cflags = [ '-Wall' ];
  libObject.ldflags = [ '-pthread' ];
  libObject.includePaths = [ 'src/include' ];
  libObject.sources = [
    'src/main.cpp',
    'src/app.cpp'
  ];
});

jsb.build();

74
samples/LSL/LSL.lsl
Normal file
@@ -0,0 +1,74 @@

/*
Testing syntax highlighting
for the Linden Scripting Language
*/

integer someIntNormal = 3672;
integer someIntHex = 0x00000000;
integer someIntMath = PI_BY_TWO;

integer event = 5673;// 'event' is invalid.illegal

key someKeyTexture = TEXTURE_DEFAULT;
string someStringSpecial = EOF;

some_user_defined_function_without_return_type(string inputAsString)
{
    llSay(PUBLIC_CHANNEL, inputAsString);
}

string user_defined_function_returning_a_string(key inputAsKey)
{
    return (string)inputAsKey;
}

default
{
    state_entry()
    {
        key someKey = NULL_KEY;
        someKey = llGetOwner();

        string someString = user_defined_function_returning_a_string(someKey);

        some_user_defined_function_without_return_type(someString);
    }

    touch_start(integer num_detected)
    {
        list agentsInRegion = llGetAgentList(AGENT_LIST_REGION, []);
        integer numOfAgents = llGetListLength(agentsInRegion);

        integer index; // defaults to 0
        for (; index <= numOfAgents - 1; index++) // for each agent in region
        {
            llRegionSayTo(llList2Key(agentsInRegion, index), PUBLIC_CHANNEL, "Hello, Avatar!");
        }
    }

    touch_end(integer num_detected)
    {
        someIntNormal = 3672;
        someIntHex = 0x00000000;
        someIntMath = PI_BY_TWO;

        event = 5673;// 'event' is invalid.illegal

        someKeyTexture = TEXTURE_DEFAULT;
        someStringSpecial = EOF;

        llSetInventoryPermMask("some item", MASK_NEXT, PERM_ALL);// 'llSetInventoryPermMask' is reserved.godmode

        llWhisper(PUBLIC_CHANNEL, "Leaving \"default\" now...");
        state other;
    }
}

state other
{
    state_entry()
    {
        llWhisper(PUBLIC_CHANNEL, "Entered \"state other\", returning to \"default\" again...");
        state default;
    }
}

28
samples/Lua/wsapi.fcgi
Executable file
@@ -0,0 +1,28 @@

#!/usr/bin/lua

-- Generic WSAPI FastCGI launcher, extracts application to launch
-- from SCRIPT_FILENAME/PATH_TRANSLATED, each application (defined
-- by its script entry point) gets an isolated Lua VM; sequential
-- requests to the same application go to the same VM

pcall(require,"luarocks.require")

local common = require "wsapi.common"
local fastcgi = require "wsapi.fastcgi"

local ONE_HOUR = 60 * 60
local ONE_DAY = 24 * ONE_HOUR

local wsapi_loader = common.make_loader{
  isolated = true,         -- isolate each script in its own Lua state
  filename = nil,          -- if you want to force the launch of a single script
  launcher = "wsapi.fcgi", -- the name of this script
  reload = false,          -- if you want to reload the application on every request
  period = ONE_HOUR,       -- frequency of Lua state staleness checks
  ttl = ONE_DAY,           -- time-to-live for Lua states
  vars =                   -- order of checking for the path of the script
    { "SCRIPT_FILENAME",
      "PATH_TRANSLATED" }
}

fastcgi.run(wsapi_loader)

3
samples/PHP/prefix.fcgi
Executable file
@@ -0,0 +1,3 @@

<?php
echo $_SERVER[$_GET["var"]];
?>

193
samples/Pascal/custforms.pp
Normal file
@@ -0,0 +1,193 @@

unit custforms;

{$mode objfpc}{$H+}

interface

uses
  Classes, SysUtils, Forms;

Type

  { TCustomFormDescr }

  TCustomFormDescr = Class
  private
    FAuthor: String;
    FCaption: String;
    FCategory: String;
    FDescription: String;
    FFormClass: TFormClass;
    FLazPackage: String;
    FUnitName: String;
  public
    Constructor Create(AFormClass : TFormClass; const APackage: string);
    Constructor Create(AFormClass : TFormClass; Const ACaption,ADescription,AUnit,APackage : String);
    Property FormClass : TFormClass Read FFormClass Write FFormClass;
    Property Caption : String Read FCaption Write FCaption;
    Property Description : String Read FDescription Write FDescription;
    Property UnitName : String Read FUnitName Write FUnitName;
    Property Category : String Read FCategory Write FCategory;
    Property Author : String Read FAuthor Write FAuthor;
    Property LazPackage : String Read FLazPackage Write FLazPackage;
  end;

Procedure RegisterCustomForm(Descr : TCustomFormDescr);
Procedure RegisterCustomForm(AFormClass : TFormClass; const APackage: string);
Procedure RegisterCustomForm(AFormClass : TFormClass; Const AUnitName, APackage : String);

Procedure Register;

implementation

uses ProjectIntf,NewItemIntf,contnrs;

Const
  SAppFrameWork = 'Custom forms';
  SInstanceOf = 'Create a new instance of %s';

{ TCustomFormDescr }

constructor TCustomFormDescr.Create(AFormClass: TFormClass;
  const APackage: string);

Var
  N,U : String;

begin
  N:=AFormClass.ClassName;
  U:=N;
  If (Upcase(U[1])='T') then
    Delete(U,1,1);
  Create(AFormClass,N,Format(SInstanceOf,[N]),U,APackage);
end;

constructor TCustomFormDescr.Create(AFormClass: TFormClass;
  const ACaption, ADescription, AUnit, APackage: String);
begin
  FFormClass:=AFormClass;
  FCaption:=ACaption;
  FDescription:=ADescription;
  FUnitName:=AUnit;
  FCategory:=SAppFrameWork;
  FLazPackage:=APackage;
end;

// Registration code.

Type
  { TCustomFormFileDescriptor }
  TCustomFormFileDescriptor = Class(TFileDescPascalUnitWithResource)
  private
    FFormDescr: TCustomFormDescr;
  Public
    Constructor Create(ADescr : TCustomFormDescr);
    Property FormDescr : TCustomFormDescr Read FFormDescr;
    Function GetLocalizedName : String; override;
    Function GetLocalizedDescription : String; override;
    Function GetInterfaceUsesSection : String; override;
  end;

{ TCustomFormFileDescriptor }

constructor TCustomFormFileDescriptor.Create(ADescr: TCustomFormDescr);
begin
  Inherited Create;
  FFormDescr:=ADescr;
  ResourceClass:=FFormDescr.FFormClass;
  Name:=FFormDescr.Caption;
  RequiredPackages:=ADescr.LazPackage;
  //Writeln('TCustomFormFileDescriptor.Create RequiredPackages=',RequiredPackages);
end;

function TCustomFormFileDescriptor.GetLocalizedName: String;
begin
  Result:=FFormDescr.Caption;
end;

function TCustomFormFileDescriptor.GetLocalizedDescription: String;
begin
  Result:=FFormDescr.Description;
  If (FFormDescr.Author<>'') then
    Result:=Result+LineEnding+'By '+FFormDescr.Author;
end;

function TCustomFormFileDescriptor.GetInterfaceUsesSection: String;
begin
  Result:=inherited GetInterfaceUsesSection;
  Result:=Result+',Forms,'+FFormDescr.UnitName;
end;

Var
  CustomFormList : TObjectList;

Procedure RegisterCustomForm(Descr : TCustomFormDescr);

begin
  CustomFormList.Add(Descr);
end;

Procedure RegisterCustomForm(AFormClass : TFormClass; const APackage: string);

begin
  RegisterCustomForm(TCustomFormDescr.Create(AFormClass,APackage));
end;

Procedure RegisterCustomForm(AFormClass : TFormClass; Const AUnitName, APackage : String);

Var
  D : TCustomFormDescr;

begin
  D:=TCustomFormDescr.Create(AFormClass,APackage);
  D.UnitName:=AUnitName;
  RegisterCustomForm(D);
end;


Procedure Register;

Var
  L : TStringList;
  I : Integer;
  D : TCustomFormDescr;

begin
  L:=TStringList.Create;
  Try
    L.Sorted:=True;
    L.Duplicates:=dupIgnore;
    For I:=0 to CustomFormList.Count-1 do
      L.Add(TCustomFormDescr(CustomFormList[i]).Category);
    For I:=0 to L.Count-1 do
      begin
      RegisterNewItemCategory(TNewIDEItemCategory.Create(L[i]));
      end;
  Finally
    L.Free;
  end;
  For I:=0 to CustomFormList.Count-1 do
    begin
    D:=TCustomFormDescr(CustomFormList[i]);
    RegisterProjectFileDescriptor(TCustomFormFileDescriptor.Create(D),D.Category);
    end;
end;

Procedure InitCustomForms;

begin
  CustomFormList:=TObjectList.Create;
end;

Procedure DoneCustomForms;

begin
  FreeAndNil(CustomFormList);
end;

Initialization
  InitCustomForms;
Finalization
  DoneCustomForms;
end.

51
samples/Pascal/gtkextra.pp
Normal file
@@ -0,0 +1,51 @@

{ $Id$ }
{
 ---------------------------------------------------------------------------
 gtkextra.pp - GTK(2) widgetset - additional gdk/gtk functions
 ---------------------------------------------------------------------------

 This unit contains missing gdk/gtk functions and defines for certain
 versions of gtk or fpc.

 ---------------------------------------------------------------------------

 @created(Sun Jan 28th WET 2006)
 @lastmod($Date$)
 @author(Marc Weustink <marc@@dommelstein.nl>)

 *****************************************************************************
  This file is part of the Lazarus Component Library (LCL)

  See the file COPYING.modifiedLGPL.txt, included in this distribution,
  for details about the license.
 *****************************************************************************
}

unit GtkExtra;

{$mode objfpc}{$H+}

interface

{$I gtkdefines.inc}

{$ifdef gtk1}
{$I gtk1extrah.inc}
{$endif}

{$ifdef gtk2}
{$I gtk2extrah.inc}
{$endif}


implementation

{$ifdef gtk1}
{$I gtk1extra.inc}
{$endif}

{$ifdef gtk2}
{$I gtk2extra.inc}
{$endif}

end.

1051
samples/Prolog/admin.pl
Executable file
File diff suppressed because it is too large
11
samples/Prolog/dleak-report.script!
Normal file
@@ -0,0 +1,11 @@

#!/usr/bin/env swipl

:- set_prolog_flag(verbose, silent).
:- use_module(dleak).

:- initialization
    main, halt.

main :-
    current_prolog_flag(argv, [File]),
    dleak(File).

5
samples/Prolog/ex6.pl
Normal file
@@ -0,0 +1,5 @@

%6.8
subset(Set, Subset) :-
    append(L1, Subset, Set).
powerset(Set, Subset) :-
    bagof(Subset, subset(Set, Subset), Subset).

68
samples/Prolog/logic-problem.pro
Normal file
@@ -0,0 +1,68 @@

/**
 * Question 1.1
 * combiner(+Buddies, -Pairs)
 */
combiner([], []).
combiner([First|Buddies], Pairs):-
    make_pairs(First, Buddies, Pairs1),
    combiner(Buddies, Pairs2),
    concat(Pairs1, Pairs2, Pairs).

/**
 * make_pairs(+Buddy, +Buddies, -Pairs)
 */
make_pairs(Buddy, [], []).
make_pairs(Buddy, [First|Buddies], [(Buddy, First)|Pairs]):-
    make_pairs(Buddy, Buddies, Pairs).

/**
 * concat(+X, +Y, ?T)
 */
concat([], Y, Y).
concat([P|R], Y, [P|T]):-
    concat(R, Y, T).


/**
 * Question 1.2
 * extraire(+AllPossiblePairs, +NbPairs, -Tp, -RemainingPairs)
 */
extraire(AllPossiblePairs, 0, [], AllPossiblePairs).
extraire([PossiblePair|AllPossiblePairs], NbPairs, [PossiblePair|Tp], NewRemainingPairs):-
    NbPairs > 0,
    NewNbPairs is NbPairs - 1,
    extraire(AllPossiblePairs, NewNbPairs, Tp, RemainingPairs),
    not(pair_in_array(PossiblePair, Tp)),
    delete_pair(RemainingPairs, PossiblePair, NewRemainingPairs).
extraire([PossiblePair|AllPossiblePairs], NbPairs, Tp, [PossiblePair|RemainingPairs]):-
    NbPairs > 0,
    extraire(AllPossiblePairs, NbPairs, Tp, RemainingPairs),
    pair_in_array(PossiblePair, Tp).

/**
 * delete_pair(+Pairs, +Pair, -PairsWithoutPair)
 */
delete_pair([], _, []).
delete_pair([Pair|Pairs], Pair, Pairs):-!.
delete_pair([FirstPair|Pairs], Pair, [FirstPair|PairsWithoutPair]):-
    delete_pair(Pairs, Pair, PairsWithoutPair).

/**
 * pair_in_array(+Pair, +Pairs)
 */
pair_in_array((A, B), [(C, D)|Pairs]):-
    (A == C ; B == D ; A == D ; B == C),
    !.
pair_in_array(Pair, [FirstPair|Pairs]):-
    pair_in_array(Pair, Pairs).


/**
 * Question 1.3
 * les_tps(+Buddies, -Tps)
 */
les_tps(Buddies, Tps):-
    combiner(Buddies, PossiblePairs),
    length(Buddies, NbBuddies),
    NbPairs is integer(NbBuddies / 2),
    findall(Tp, extraire(PossiblePairs, NbPairs, Tp, _), Tps).

26
samples/Puppet/expiringhost.pp
Normal file
@@ -0,0 +1,26 @@

define example::expiringhost($ip, $timestamp) {

  # Calculate the age of this resource by comparing 'now' against $timestamp
  $age = inline_template("<%= require 'time'; Time.now - Time.parse(timestamp) %>")

  # Max age, in seconds.
  $maxage = 60

  if $age > $maxage {
    $expired = true
    notice("Expiring resource $class[$name] due to age > $maxage (actual: $age)")
  } else {
    $expired = false
    notice("Found recently-active $class[$name] (age: $age)")
  }

  # I set target to a /tmp path so you can run this example as non-root.
  # In production, you probabyl won't set target as it defaults to /etc/hosts
  # (or wherever puppet thinks your platform wants it)
  host {
    $name:
      ip => $ip,
      target => "/tmp/expiring-hosts-example-output",
      ensure => $expired ? { true => absent, false => present };
  }
}

26
samples/Puppet/stages-example.pp
Normal file
@@ -0,0 +1,26 @@

class foo {
  notify {
    "foo": ;
  }
}

class bar {
  notify {
    "bar": ;
  }
}


node default {
  stage {
    "one": ;
    "two": ;
  }

  class {
    "foo": stage => "one";
    "bar": stage => "two";
  }

  Stage["one"] -> Stage["two"]
}

22
samples/Puppet/unmanaged-notify-puppet25.pp
Normal file
@@ -0,0 +1,22 @@

# Manually manage /tmp/original
# Each puppet run will copy it to /tmp/flag if there's a change and notify
# the exec when it changes.
#
# The idea here is you might need (in some case) to manually manage a file outside
# of puppet (in this case, "/tmp/original"). Using this example, you can make puppet
# signal other parts of your catalog based on changes to that file.

file {
  # This will, when different, copy /tmp/original to /tmp/flag and notify our
  # exec.
  "/tmp/flag":
    source => "file:///tmp/original",
    notify => Exec["hello world"];
}

exec {
  "hello world":
    command => "/bin/echo hello world",
    refreshonly => true;
}

82
samples/Python/action.cgi
Normal file
@@ -0,0 +1,82 @@

#!/usr/bin/python

from model import Feed
import session
import datetime
import sys

argv = session.argv()

feed = Feed.get(guid=argv[1])
action = argv[2]

if action == 'done':
    when = feed.notify_interval * feed.notify_unit
elif action == 'snooze':
    if len(argv) > 3:
        when = int(argv[3])
    else:
        when = 3600
else:
    print '''Status: 400 Bad request
Content-type: text/html

Unknown action %s''' % action
    sys.exit(1)

feed.notify_next = datetime.datetime.utcnow() + datetime.timedelta(seconds=when)
feed.save()

response = '''Content-type: text/html

<html><head><title>Alarm reset</title>
<link rel="stylesheet" href="{base_url}/style.css">
</head>
<body>

<div class="container">
<h1>Alarm reset</h1>
<div>
<p id="reset">Alarm "<span class="name">{name}</span>" has been reset. You won't be notified for another <span class="duration">{duration}</span>.</p>

<p>Actions:</p>
<ul>
<li><a href="{edit_url}?feed={guid}">Edit this reminder</a></li>
<li><a href="{edit_url}">Create another reminder</a></li>
<li><a href="{base_url}">Visit the Reminder Me site</a></li>
</ul>
</div>
</div>

<p class="back"><a href=".">Reminder Me</a></p>

</body></html>'''

when_left = when
duration_list = []
for (label,period) in [('month',86400*365/12),
                       ('week',86400*7),
                       ('day',86400),
                       ('hour',3600),
                       ('minute',60),
                       ('second',1)]:
    if when == period:
        duration_list = [label]
        break

    val = when_left/period
    if val:
        duration_list.append("%d %s%s" % (
            val,
            label,
            val > 1 and 's' or ''))
        when_left -= val*period

basedir=session.request_script_dir()

print response.format(guid=feed.guid,
                      name=feed.name,
                      edit_url="%s/edit.cgi" % basedir,
                      base_url=basedir,
                      duration=', '.join(duration_list))

samples/Python/backstage.fcgi
Executable file
120
samples/Python/backstage.fcgi
Executable file
@@ -0,0 +1,120 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
import sqlite
|
||||
import urllib2
|
||||
import csv
|
||||
import cgi
|
||||
import simplejson
|
||||
import jsontemplate
|
||||
import time
|
||||
|
||||
log = open('log.txt', 'a')
|
||||
|
||||
def urldecode(query):
|
||||
d = {}
|
||||
a = query.split('&')
|
||||
for s in a:
|
||||
if s.find('='):
|
||||
k,v = map(urllib2.unquote, s.split('='))
|
||||
try:
|
||||
d[k].append(v)
|
||||
except KeyError:
|
||||
d[k] = [v]
|
||||
|
||||
return d
|
||||
|
||||
def load_table(uri, cur):
|
||||
table = uri.split('/')[-1]
|
||||
table = table.split('.')[0]
|
||||
|
||||
contents = urllib2.urlopen(uri)
|
||||
fields = ""
|
||||
for field in contents.readline().strip().split(','):
|
||||
fields += field
|
||||
fields += ","
|
||||
fields = fields.rstrip(',')
|
||||
|
||||
cur.execute("SELECT name FROM sqlite_master WHERE type='table' \
|
||||
AND name='%s';" % (table))
|
||||
if cur.fetchone() == None:
|
||||
# cur.execute("DROP TABLE %s;" % (table))
|
||||
cur.execute("CREATE TABLE %s (%s);" % (table, fields))
|
||||
for line in contents:
|
||||
values = line.strip()
|
||||
values = "','".join([val.strip() for val in values.split(",")])
|
||||
values = "'" + values + "'"
|
||||
sql = "INSERT INTO %s (%s) VALUES (%s);" % (table, fields, values)
|
||||
cur.execute(sql)
|
||||
return table
|
||||
|
||||
def build_structure(headings, allresults):
|
||||
results = []
|
||||
for result in allresults:
|
||||
results.append(dict(zip(headings, result)))
|
||||
results = { "query" : results }
|
||||
return results
|
||||
|
||||
def build_json(headings, allresults, callback):
|
||||
results = build_structure(headings, allresults)
|
||||
return_str = simplejson.dumps(results)
|
||||
if callback != None:
|
||||
return_str = callback + "(" + return_str + ");";
|
||||
return return_str
|
||||
|
||||
def load_template(templatefile):
|
||||
return "".join(urllib2.urlopen(templatefile).readlines())
|
||||
|
||||
def build_template(headings, allresults, template_str):
|
||||
results = build_structure(headings, allresults)
|
||||
return jsontemplate.expand(template_str, results)
|
||||
return ""
|
||||
|
||||
def myapp(environ, start_response):
|
||||
args = cgi.parse_qs(environ['QUERY_STRING'])
|
||||
|
||||
query = args['query'][0]
|
||||
uri = args['uri'][0]
|
||||
callback = None
|
||||
if 'callback' in args:
|
||||
callback = args['callback'][0]
|
||||
label = "no label"
|
||||
if 'label' in args:
|
||||
label = args['label'][0]
|
||||
templatefile = None
|
||||
if 'templatefile' in args:
|
||||
templatefile = args['templatefile'][0]
|
||||
|
||||
con = sqlite.connect('mydatabase.db')
|
||||
cur = con.cursor()
|
||||
table_uris = uri.split(',')
|
||||
tables = [load_table(uri, cur) for uri in table_uris]
|
||||
con.commit()
|
||||
before = time.time()
|
||||
cur.execute(query)
|
||||
allresults = cur.fetchall()
|
||||
after = time.time()
|
||||
log.write("%s: query time %f\n" % (label, after - before))
|
||||
|
||||
headings = [name[0] for name in cur.description]
|
||||
return_str = ""
|
||||
if templatefile != None:
|
||||
start_response('200 OK', [('Content-Type', 'text/html')])
|
||||
before = time.time()
|
||||
template_str = load_template(templatefile)
|
||||
after = time.time()
|
||||
log.write("%s: template loading time %f\n" % (label, after - before))
|
||||
before = time.time()
|
||||
return_str = build_template(headings, allresults, template_str)
|
||||
after = time.time()
|
||||
log.write("%s: template rendering time %f\n" % (label, after - before))
|
||||
else:
|
||||
start_response('200 OK', [('Content-Type', 'text/plain')])
|
||||
before = time.time()
|
||||
return_str = build_json(headings, allresults, callback)
|
||||
after = time.time()
|
||||
log.write("%s: json-making time %f\n" % (label, after - before))
|
||||
return return_str
|
||||
|
||||
if __name__ == '__main__':
|
||||
from fcgi import WSGIServer
|
||||
WSGIServer(myapp).run()
|
||||
68
samples/Ruby/mdata_server.fcgi
Executable file
@@ -0,0 +1,68 @@

#!/usr/bin/env ruby
require "xmlrpc/server"

# NOTE: force the usage of the pure-ruby version of fcgi.
# - this is required by the workaround to get fcgi+xmlrpc working together
FCGI_PURE_RUBY=true
require 'fcgi'

require File.join(File.dirname(__FILE__), '../bt_cast/mdata_echo_server/bt_cast_mdata_server_t.rb')

################################################################################
################################################################################
# CGI handling for xmlrpc
################################################################################
################################################################################
# - for basic xmlrpc via CGI example
# - see http://www.ntecs.de/projects/xmlrpc4r/server.html#label-19

# create the directory needed for Neoip::Cast_mdata_server_t
Neoip::Cast_mdata_server_t.create_dir_ifneeded();

# init the cgi_server
cgi_server = XMLRPC::CGIServer.new
# register all the xmlrpc function
cgi_server.add_handler("set_cast_mdata_pull") do |web2srv_str, cast_name, cast_privtext, cast_id,
                                                  port_lview, port_pview, uri_pathquery|
  Neoip::Cast_mdata_server_t.set_cast_mdata_pull(web2srv_str, cast_name, cast_privtext, cast_id,
                                                  port_lview, port_pview, uri_pathquery, ENV['REMOTE_ADDR']);
end
cgi_server.add_handler("set_cast_mdata_push") do |web2srv_str, cast_name, cast_privtext, cast_mdata|
  Neoip::Cast_mdata_server_t.set_cast_mdata_push(web2srv_str, cast_name, cast_privtext, cast_mdata);
end
cgi_server.add_handler("get_cast_mdata") do |cast_name, cast_privhash|
  Neoip::Cast_mdata_server_t.get_cast_mdata(cast_name, cast_privhash);
end
cgi_server.add_handler("del_cast_mdata") do |cast_name, cast_privtext|
  Neoip::Cast_mdata_server_t.del_cast_mdata(cast_name, cast_privtext);
end

# handle the unknown/bad formered calls
cgi_server.set_default_handler do |name, *args|
  raise XMLRPC::FaultException.new(-99, "Method #{name} missing" +
                                        " or wrong number of parameters!")
end

# server the cgi_server
#cgi_server.serve
#exit

# experiment at using fast-cgi
FCGI.each_request do |request|
  # XMLRPC::CGIServer expect some value in ENV[] but FCGI doesnt provides them
  # - so working around by copying them by hand... dirty
  ENV['REMOTE_ADDR'] = request.env['REMOTE_ADDR'];
  ENV['REQUEST_METHOD'] = request.env['REQUEST_METHOD'];
  ENV['CONTENT_TYPE'] = "text/xml";
  ENV['CONTENT_LENGTH'] = "#{request.in.length}";

  # copy the request in/out into the stdin/stdout to act as a CGI
  $stdin = request.in
  $stdout = request.out

  # process the cgi itself
  cgi_server.serve

  # mark the request as finished
  request.finish
end

16
samples/Shell/php.fcgi
Executable file
@@ -0,0 +1,16 @@

#!/bin/sh
# you can change the PHP version here.
version="RB_PHP_VERSION_X_Y_Z"

# php.ini file location
PHPRC=/usr/local/php/phpfarm/inst/php-${version}/lib/php.ini
export PHPRC

PHP_FCGI_CHILDREN=3
export PHP_FCGI_CHILDREN

PHP_FCGI_MAX_REQUESTS=5000
export PHP_FCGI_MAX_REQUESTS

# which php-cgi binary to execute
exec /usr/local/php/inst/php-${version}/bin/php-cgi

27
samples/Shell/settime.cgi
Normal file
@@ -0,0 +1,27 @@

#!/bin/bash
echo "Content-type: text/html"
day=`echo "$QUERY_STRING" | sed -n 's/^.*day=\([^&]*\).*$/\1/p' | sed "s/%20/ /g"`
month=`echo "$QUERY_STRING" | sed -n 's/^.*month=\([^&]*\).*$/\1/p' | sed "s/%20/ /g"`
year=`echo "$QUERY_STRING" | sed -n 's/^.*year=\([^&]*\).*$/\1/p' | sed "s/%20/ /g"`
hour=`echo "$QUERY_STRING" | sed -n 's/^.*hour=\([^&]*\).*$/\1/p' | sed "s/%20/ /g"`
minute=`echo "$QUERY_STRING" | sed -n 's/^.*minute=\([^&]*\).*$/\1/p' | sed "s/%20/ /g"`
second=`echo "$QUERY_STRING" | sed -n 's/^.*second=\([^&]*\).*$/\1/p' | sed "s/%20/ /g"`
echo ""
echo "<html><body>"

echo "<pre> $(killall ems) </pre>"


echo "<pre> $(date $month$day$hour$minute$year.$second) </pre>"

echo "<pre> $(/sbin/hwclock -w>/dev/null & /sbin/reboot) </pre>"

echo "<pre> $(/sbin/reboot) </pre>"



echo "</body></html>"

63
samples/Tcl/filenames/owh
Executable file
@@ -0,0 +1,63 @@

#!/usr/bin/env tclsh
# http://wiki.tcl.tk/906

if {[llength $argv] < 1} {
    puts "usage: owh ?init? body ?exit?
    performs body (in Tcl) for each line (\$0) from stdin
    owh: Ousterhout - Welch - Hobbs, to name a few"
    exit -1
}

proc awksplit {text {split default}} {
    set no 0
    if {$split eq "default"} {
        set t {}
        foreach string [split $text] {
            if {$string ne {}} {
                lappend t $string
            }
        }
    } else {
        set t [list $text $split]
    }
    uplevel 1 [list set NF [llength $t]]
    foreach i $t {uplevel 1 [list set [incr no] $i]}
    uplevel 1 {set 0 {};trace variable 0 ru 0}
}
proc 0 {_name index op} {
    switch $op {
        r {
            uplevel {
                set 0 {}
                for {set i 1} {$i <= $NF} {incr i} {lappend 0 [set $i]}
                set 0 [join $0 $OFS]
            }
        }
        u {rename 0 {} ;# leave no traces of the trace..}
    }
}

proc print s {if {[catch {puts $s}]} exit} ;# good for broken pipe

set FS default
set OFS { }

if {[llength $argv] > 1} {
    eval [lindex $argv 0]
    set _body [lindex $argv 1] ;# strip outer braces
    set _exit [lindex $argv 2]
} else {
    set _body [lindex $argv 0] ;# strip outer braces
    set _exit {}
}

set NR 1
while 1 {
    gets stdin line
    if {[eof stdin]} break
    awksplit $line $FS
    eval $_body
    incr NR
}
set res [eval $_exit]
if {[string length $res]} {puts $res}

28
samples/Tcl/filenames/starfield
Executable file
@@ -0,0 +1,28 @@

#!/usr/bin/env wish
# http://wiki.tcl.tk/14140

proc stars'go {c factor} {
    set w [winfo width $c]
    set h [winfo height $c]
    $c scale all [expr {$w/2}] [expr {$h/2}] $factor $factor
    foreach item [$c find all] {
        if {[llength [$c bbox $item]] == 0} {$c delete $item; continue} ;# (1)
        foreach {x0 y0 x1 y1} [$c bbox $item] break
        if {$x1<0 || $x0>$w || $y1<0 || $y0>$h} {$c delete $item}
    }
    time {
        set x [expr {rand()*$w}]
        set y [expr {rand()*$h}]
        set col [lpick {white yellow beige bisque cyan}]
        $c create oval $x $y [expr {$x+1}] [expr {$y+1}] -fill $col \
            -outline $col
    } 10
    after $::ms [info level 0]
}
proc lpick list {lindex $list [expr {int(rand()*[llength $list])}]}
#-- Let's go!
pack [canvas .c -bg black] -fill both -expand 1
set ms 40
bind . <Up> {incr ms -5}
bind . <Down> {incr ms 5}
stars'go .c 1.05

8
samples/VimL/filenames/_vimrc
Normal file
@@ -0,0 +1,8 @@

set nocompatible
set ignorecase
set incsearch
set smartcase
set showmatch
set showcmd

syntax on

17
script/cibuild
Executable file
@@ -0,0 +1,17 @@

#!/bin/sh
if [ -d /usr/share/rbenv/shims ]; then
  export PATH=/usr/share/rbenv/shims:$PATH
  export RBENV_VERSION=2.1.2-github
  export RUBY_VERSION=2.1.2-github
fi

set -x
git log -n 1 HEAD | cat
ruby -v
bundle -v
set +x

mkdir -p ./vendor/gems

bundle install --local --path ./vendor/gems
bundle exec rake

@@ -262,6 +262,10 @@ class TestBlob < Test::Unit::TestCase

assert Linguist::Generated.generated?("node_modules/grunt/lib/grunt.js", nil)

# Godep saved dependencies
assert blob("Godeps/Godeps.json").generated?
assert blob("Godeps/_workspace/src/github.com/kr/s3/sign.go").generated?
end

def test_vendored
@@ -279,6 +283,10 @@ class TestBlob < Test::Unit::TestCase
assert blob("app/bower_components/custom/custom.js").vendored?
assert blob("vendor/assets/bower_components/custom/custom.js").vendored?

# Go dependencies
assert !blob("Godeps/Godeps.json").vendored?
assert blob("Godeps/_workspace/src/github.com/kr/s3/sign.go").vendored?

# Rails vendor/
assert blob("vendor/plugins/will_paginate/lib/will_paginate.rb").vendored?

@@ -437,6 +445,12 @@ class TestBlob < Test::Unit::TestCase
assert blob("octicons.css").vendored?
assert blob("public/octicons.min.css").vendored?
assert blob("public/octicons/sprockets-octicons.scss").vendored?

# Typesafe Activator
assert blob("activator").vendored?
assert blob("activator.bat").vendored?
assert blob("subproject/activator").vendored?
assert blob("subproject/activator.bat").vendored?
end

def test_language

@@ -44,12 +44,12 @@ class TestClassifier < Test::Unit::TestCase
end

def test_instance_classify_empty
results = Classifier.classify(Samples::DATA, "")
results = Classifier.classify(Samples.cache, "")
assert results.first[1] < 0.5, results.first.inspect
end

def test_instance_classify_nil
assert_equal [], Classifier.classify(Samples::DATA, nil)
assert_equal [], Classifier.classify(Samples.cache, nil)
end

def test_classify_ambiguous_languages
@@ -58,7 +58,7 @@ class TestClassifier < Test::Unit::TestCase
languages = Language.find_by_filename(sample[:path]).map(&:name)
next unless languages.length > 1

results = Classifier.classify(Samples::DATA, File.read(sample[:path]), languages)
results = Classifier.classify(Samples.cache, File.read(sample[:path]), languages)
assert_equal language.name, results.first[0], "#{sample[:path]}\n#{results.inspect}"
end
end

@@ -74,6 +74,18 @@ class TestHeuristcs < Test::Unit::TestCase
assert_equal Language["ECL"], results.first
end

def test_pro_prolog_by_heuristics
languages = ["IDL", "Prolog"]
results = Heuristics.disambiguate_pro(fixture("Prolog/logic-problem.pro"), languages)
assert_equal Language["Prolog"], results.first
end

def test_pro_idl_by_heuristics
languages = ["IDL", "Prolog"]
results = Heuristics.disambiguate_pro(fixture("IDL/mg_acosh.pro"), languages)
assert_equal Language["IDL"], results.first
end

def test_ts_typescript_by_heuristics
languages = ["TypeScript", "XML"]
results = Heuristics.disambiguate_ts(fixture("TypeScript/classes.ts"), languages)

@@ -33,6 +33,7 @@ class TestLanguage < Test::Unit::TestCase
assert_equal Lexer['Java'], Language['ChucK'].lexer
assert_equal Lexer['Java'], Language['Java'].lexer
assert_equal Lexer['JavaScript'], Language['JavaScript'].lexer
assert_equal Lexer['LSL'], Language['LSL'].lexer
assert_equal Lexer['MOOCode'], Language['Moocode'].lexer
assert_equal Lexer['MuPAD'], Language['mupad'].lexer
assert_equal Lexer['NASM'], Language['Assembly'].lexer
@@ -186,6 +187,7 @@ class TestLanguage < Test::Unit::TestCase

def test_programming
assert_equal :programming, Language['JavaScript'].type
assert_equal :programming, Language['LSL'].type
assert_equal :programming, Language['Perl'].type
assert_equal :programming, Language['PowerShell'].type
assert_equal :programming, Language['Python'].type
@@ -326,6 +328,7 @@ class TestLanguage < Test::Unit::TestCase
assert_equal '#3581ba', Language['Python'].color
assert_equal '#f1e05a', Language['JavaScript'].color
assert_equal '#31859c', Language['TypeScript'].color
assert_equal '#3d9970', Language['LSL'].color
end

def test_colors
@@ -338,6 +341,7 @@ class TestLanguage < Test::Unit::TestCase
assert_equal 'coffee', Language['CoffeeScript'].ace_mode
assert_equal 'csharp', Language['C#'].ace_mode
assert_equal 'css', Language['CSS'].ace_mode
assert_equal 'lsl', Language['LSL'].ace_mode
assert_equal 'javascript', Language['JavaScript'].ace_mode
end

@@ -352,6 +356,7 @@ class TestLanguage < Test::Unit::TestCase
end

def test_extensions
assert Language['LSL'].extensions.include?('.lsl')
assert Language['Perl'].extensions.include?('.pl')
assert Language['Python'].extensions.include?('.py')
assert Language['Ruby'].extensions.include?('.rb')

@@ -1,5 +1,5 @@
require 'linguist/repository'

require 'linguist/lazy_blob'
require 'test/unit'

class TestRepository < Test::Unit::TestCase
@@ -47,4 +47,58 @@ class TestRepository < Test::Unit::TestCase

assert_equal linguist_repo.cache, new_repo.cache
end

def test_repo_git_attributes
# See https://github.com/github/linguist/blob/351c1cc8fd57340839bdb400d7812332af80e9bd/.gitattributes
#
# It looks like this:
# Gemfile linguist-vendored=true
# lib/linguist.rb linguist-language=Java
# test/*.rb linguist-language=Java
# Rakefile linguist-generated
# test/fixtures/* linguist-vendored=false

attr_commit = '351c1cc8fd57340839bdb400d7812332af80e9bd'
repo = linguist_repo(attr_commit)

assert repo.breakdown_by_file.has_key?("Java")
assert repo.breakdown_by_file["Java"].include?("lib/linguist.rb")

assert repo.breakdown_by_file.has_key?("Ruby")
assert !repo.breakdown_by_file["Ruby"].empty?
end

def test_commit_with_git_attributes_data
# Before we had any .gitattributes data
old_commit = '4a017d9033f91b2776eb85275463f9613cc371ef'
old_repo = linguist_repo(old_commit)

# With some .gitattributes data
attr_commit = '7ee006cbcb2d7261f9e648510a684ee9ac64126b'
# It's incremental but should bust the cache
new_repo = Linguist::Repository.incremental(rugged_repository, attr_commit, old_commit, old_repo.cache)

assert new_repo.breakdown_by_file["Java"].include?("lib/linguist.rb")
end

def test_linguist_override_vendored?
attr_commit = '351c1cc8fd57340839bdb400d7812332af80e9bd'
repo = linguist_repo(attr_commit).read_index

override_vendored = Linguist::LazyBlob.new(rugged_repository, attr_commit, 'Gemfile')

# overridden .gitattributes
assert override_vendored.vendored?
end

def test_linguist_override_unvendored?
attr_commit = '351c1cc8fd57340839bdb400d7812332af80e9bd'
repo = linguist_repo(attr_commit).read_index

# lib/linguist/vendor.yml defines this as vendored.
override_unvendored = Linguist::LazyBlob.new(rugged_repository, attr_commit, 'test/fixtures/foo.rb')

# overridden .gitattributes
assert !override_unvendored.vendored?
end
end

@@ -8,7 +8,7 @@ class TestSamples < Test::Unit::TestCase
include Linguist

def test_up_to_date
assert serialized = Samples::DATA
assert serialized = Samples.cache
assert latest = Samples.data

# Just warn, it shouldn't scare people off by breaking the build.
@@ -29,7 +29,7 @@ class TestSamples < Test::Unit::TestCase
end

def test_verify
assert data = Samples::DATA
assert data = Samples.cache

assert_equal data['languages_total'], data['languages'].inject(0) { |n, (_, c)| n += c }
assert_equal data['tokens_total'], data['language_tokens'].inject(0) { |n, (_, c)| n += c }
@@ -38,7 +38,7 @@ class TestSamples < Test::Unit::TestCase

# Check that there aren't samples with extensions that aren't explicitly defined in languages.yml
def test_parity
extensions = Samples::DATA['extnames']
extensions = Samples.cache['extnames']
languages_yml = File.expand_path("../../lib/linguist/languages.yml", __FILE__)
languages = YAML.load_file(languages_yml)

BIN vendor/cache/charlock_holmes-0.7.3.gem (vendored, new file; binary file not shown)
BIN vendor/cache/coderay-1.1.0.gem (vendored, new file; binary file not shown)
BIN vendor/cache/escape_utils-1.0.1.gem (vendored, new file; binary file not shown)
BIN vendor/cache/json-1.8.1.gem (vendored, new file; binary file not shown)
BIN vendor/cache/metaclass-0.0.4.gem (vendored, new file; binary file not shown)
BIN vendor/cache/method_source-0.8.2.gem (vendored, new file; binary file not shown)
BIN vendor/cache/mime-types-1.25.1.gem (vendored, new file; binary file not shown)
BIN vendor/cache/mocha-1.1.0.gem (vendored, new file; binary file not shown)
BIN vendor/cache/posix-spawn-0.3.9.gem (vendored, new file; binary file not shown)
BIN vendor/cache/pry-0.10.1.gem (vendored, new file; binary file not shown)
BIN vendor/cache/pygments.rb-0.6.0.gem (vendored, new file; binary file not shown)
BIN vendor/cache/rake-10.3.2.gem (vendored, new file; binary file not shown)
BIN vendor/cache/rugged-0.21.1b2.gem (vendored, new file; binary file not shown)
BIN vendor/cache/slop-3.6.0.gem (vendored, new file; binary file not shown)
BIN vendor/cache/yajl-ruby-1.1.0.gem (vendored, new file; binary file not shown)