New Grammars Compiler (#3915)

* grammars: Update several grammars with compat issues

* [WIP] Add new grammar conversion tools

* Wrap in a Docker script

* Proper Dockerfile support

* Add Javadoc grammar

* Remove NPM package.json

* Remove superfluous test

This is now always checked by the grammars compiler

* Update JSyntax grammar to new submodule

* Approve Javadoc license

* grammars: Remove checked-in dependencies

* grammars: Add regex checks to the compiler

* grammars: Point Oz to its actual submodule

* grammars: Refactor compiler to group errors by repo

* grammars: Cleanups to error reporting
Commit e335d48625 by Vicent Martí, committed via GitHub on 2017-11-30 16:15:48 +01:00 (parent 4f46155c05)
37 changed files with 1445 additions and 416 deletions

.gitmodules vendored

@@ -439,9 +439,6 @@
[submodule "vendor/grammars/sublime-golo"]
path = vendor/grammars/sublime-golo
url = https://github.com/TypeUnsafe/sublime-golo
[submodule "vendor/grammars/JSyntax"]
path = vendor/grammars/JSyntax
url = https://github.com/bcj/JSyntax
[submodule "vendor/grammars/TXL"]
path = vendor/grammars/TXL
url = https://github.com/MikeHoffert/Sublime-Text-TXL-syntax
@@ -892,3 +889,9 @@
[submodule "vendor/grammars/Syntax-highlighting-for-PostCSS"]
path = vendor/grammars/Syntax-highlighting-for-PostCSS
url = https://github.com/hudochenkov/Syntax-highlighting-for-PostCSS
[submodule "vendor/grammars/javadoc.tmbundle"]
path = vendor/grammars/javadoc.tmbundle
url = https://github.com/textmate/javadoc.tmbundle
[submodule "vendor/grammars/JSyntax"]
path = vendor/grammars/JSyntax
url = https://github.com/tikkanz/JSyntax

@@ -1,4 +1,3 @@
---
https://bitbucket.org/Clams/sublimesystemverilog/get/default.tar.gz:
- source.systemverilog
- source.ucfconstraints
@@ -344,6 +343,8 @@ vendor/grammars/java.tmbundle:
- source.java-properties
- text.html.jsp
- text.junit-test-report
vendor/grammars/javadoc.tmbundle:
- text.html.javadoc
vendor/grammars/javascript-objective-j.tmbundle:
- source.js.objj
vendor/grammars/jflex.tmbundle:
@@ -574,7 +575,7 @@ vendor/grammars/opa.tmbundle:
- source.opa
vendor/grammars/openscad.tmbundle:
- source.scad
-vendor/grammars/oz-tmbundle/Syntaxes/Oz.tmLanguage:
+vendor/grammars/oz-tmbundle:
- source.oz
vendor/grammars/parrot:
- source.parrot.pir

@@ -1,7 +0,0 @@
{
"repository": "https://github.com/github/linguist",
"dependencies": {
"season": "~>5.4"
},
"license": "MIT"
}

@@ -84,13 +84,13 @@ if repo_old
log "Deregistering: #{repo_old}"
`git submodule deinit #{repo_old}`
`git rm -rf #{repo_old}`
-`script/convert-grammars`
+`script/grammar-compiler -update`
end
log "Registering new submodule: #{repo_new}"
`git submodule add -f #{https} #{repo_new}`
exit 1 if $?.exitstatus > 0
-`script/convert-grammars --add #{repo_new}`
+`script/grammar-compiler -add #{repo_new}`
log "Confirming license"
if repo_old

@@ -1,319 +0,0 @@
#!/usr/bin/env ruby
require 'bundler/setup'
require 'json'
require 'net/http'
require 'optparse'
require 'plist'
require 'set'
require 'thread'
require 'tmpdir'
require 'uri'
require 'yaml'
ROOT = File.expand_path("../..", __FILE__)
GRAMMARS_PATH = File.join(ROOT, "grammars")
SOURCES_FILE = File.join(ROOT, "grammars.yml")
CSONC = File.join(ROOT, "node_modules", ".bin", "csonc")
$options = {
:add => false,
:install => true,
:output => SOURCES_FILE,
:remote => true,
}
class SingleFile
def initialize(path)
@path = path
end
def url
@path
end
def fetch(tmp_dir)
[@path]
end
end
class DirectoryPackage
def self.fetch(dir)
Dir["#{dir}/**/*"].select do |path|
case File.extname(path.downcase)
when '.plist'
path.split('/')[-2] == 'Syntaxes'
when '.tmlanguage', '.yaml-tmlanguage'
true
when '.cson', '.json'
path.split('/')[-2] == 'grammars'
else
false
end
end
end
def initialize(directory)
@directory = directory
end
def url
@directory
end
def fetch(tmp_dir)
self.class.fetch(File.join(ROOT, @directory))
end
end
class TarballPackage
def self.fetch(tmp_dir, url)
`curl --silent --location --max-time 30 --output "#{tmp_dir}/archive" "#{url}"`
raise "Failed to fetch GH package: #{url} #{$?.to_s}" unless $?.success?
output = File.join(tmp_dir, 'extracted')
Dir.mkdir(output)
`tar -C "#{output}" -xf "#{tmp_dir}/archive"`
raise "Failed to uncompress tarball: #{tmp_dir}/archive (from #{url}) #{$?.to_s}" unless $?.success?
DirectoryPackage.fetch(output)
end
attr_reader :url
def initialize(url)
@url = url
end
def fetch(tmp_dir)
self.class.fetch(tmp_dir, url)
end
end
class SingleGrammar
attr_reader :url
def initialize(url)
@url = url
end
def fetch(tmp_dir)
filename = File.join(tmp_dir, File.basename(url))
`curl --silent --location --max-time 10 --output "#{filename}" "#{url}"`
raise "Failed to fetch grammar: #{url}: #{$?.to_s}" unless $?.success?
[filename]
end
end
class SVNPackage
attr_reader :url
def initialize(url)
@url = url
end
def fetch(tmp_dir)
`svn export -q "#{url}/Syntaxes" "#{tmp_dir}/Syntaxes"`
raise "Failed to export SVN repository: #{url}: #{$?.to_s}" unless $?.success?
Dir["#{tmp_dir}/Syntaxes/*.{plist,tmLanguage,tmlanguage,YAML-tmLanguage}"]
end
end
class GitHubPackage
def self.parse_url(url)
url, ref = url.split("@", 2)
path = URI.parse(url).path.split('/')
[path[1], path[2].chomp('.git'), ref || "master"]
end
attr_reader :user
attr_reader :repo
attr_reader :ref
def initialize(url)
@user, @repo, @ref = self.class.parse_url(url)
end
def url
suffix = "@#{ref}" unless ref == "master"
"https://github.com/#{user}/#{repo}#{suffix}"
end
def fetch(tmp_dir)
url = "https://github.com/#{user}/#{repo}/archive/#{ref}.tar.gz"
TarballPackage.fetch(tmp_dir, url)
end
end
def load_grammar(path)
case File.extname(path.downcase)
when '.plist', '.tmlanguage'
Plist::parse_xml(path)
when '.yaml-tmlanguage'
content = File.read(path)
# Attempt to parse YAML file even if it has a YAML 1.2 header
if content.lines[0] =~ /^%YAML[ :]1\.2/
content = content.lines[1..-1].join
end
begin
YAML.load(content)
rescue Psych::SyntaxError => e
$stderr.puts "Failed to parse YAML grammar '#{path}'"
end
when '.cson'
cson = `"#{CSONC}" "#{path}"`
raise "Failed to convert CSON grammar '#{path}': #{$?.to_s}" unless $?.success?
JSON.parse(cson)
when '.json'
JSON.parse(File.read(path))
else
raise "Invalid document type #{path}"
end
end
def load_grammars(tmp_dir, source, all_scopes)
is_url = source.start_with?("http:", "https:")
return [] if is_url && !$options[:remote]
return [] if !is_url && !File.exist?(source)
p = if !is_url
if File.directory?(source)
DirectoryPackage.new(source)
else
SingleFile.new(source)
end
elsif source.end_with?('.tmLanguage', '.plist', '.YAML-tmLanguage')
SingleGrammar.new(source)
elsif source.start_with?('https://github.com')
GitHubPackage.new(source)
elsif source.start_with?('http://svn.textmate.org')
SVNPackage.new(source)
elsif source.end_with?('.tar.gz')
TarballPackage.new(source)
else
nil
end
raise "Unsupported source: #{source}" unless p
p.fetch(tmp_dir).map do |path|
grammar = load_grammar(path)
scope = grammar['scopeName'] || grammar['scope']
if all_scopes.key?(scope)
unless all_scopes[scope] == p.url
$stderr.puts "WARN: Duplicated scope #{scope}\n" +
" Current package: #{p.url}\n" +
" Previous package: #{all_scopes[scope]}"
end
next
end
all_scopes[scope] = p.url
grammar
end.compact
end
def install_grammars(grammars, path)
installed = []
grammars.each do |grammar|
scope = grammar['scopeName'] || grammar['scope']
File.write(File.join(GRAMMARS_PATH, "#{scope}.json"), JSON.pretty_generate(grammar))
installed << scope
end
$stderr.puts("OK #{path} (#{installed.join(', ')})")
end
def run_thread(queue, all_scopes)
Dir.mktmpdir do |tmpdir|
loop do
source, index = begin
queue.pop(true)
rescue ThreadError
# The queue is empty.
break
end
dir = "#{tmpdir}/#{index}"
Dir.mkdir(dir)
grammars = load_grammars(dir, source, all_scopes)
install_grammars(grammars, source) if $options[:install]
end
end
end
def generate_yaml(all_scopes, base)
yaml = all_scopes.each_with_object(base) do |(key,value),out|
out[value] ||= []
out[value] << key
end
yaml = Hash[yaml.sort]
yaml.each { |k, v| v.sort! }
yaml
end
def main(sources)
begin
Dir.mkdir(GRAMMARS_PATH)
rescue Errno::EEXIST
end
`npm install`
all_scopes = {}
if source = $options[:add]
Dir.mktmpdir do |tmpdir|
grammars = load_grammars(tmpdir, source, all_scopes)
install_grammars(grammars, source) if $options[:install]
end
generate_yaml(all_scopes, sources)
else
queue = Queue.new
sources.each do |url, scopes|
queue.push([url, queue.length])
end
threads = 8.times.map do
Thread.new { run_thread(queue, all_scopes) }
end
threads.each(&:join)
generate_yaml(all_scopes, {})
end
end
OptionParser.new do |opts|
opts.banner = "Usage: #{$0} [options]"
opts.on("--add GRAMMAR", "Add a new grammar. GRAMMAR may be a file path or URL.") do |a|
$options[:add] = a
end
opts.on("--[no-]install", "Install grammars into grammars/ directory.") do |i|
$options[:install] = i
end
opts.on("--output FILE", "Write output to FILE. Use - for stdout.") do |o|
$options[:output] = o == "-" ? $stdout : o
end
opts.on("--[no-]remote", "Download remote grammars.") do |r|
$options[:remote] = r
end
end.parse!
sources = File.open(SOURCES_FILE) do |file|
YAML.load(file)
end
yaml = main(sources)
if $options[:output].is_a?(IO)
$options[:output].write(YAML.dump(yaml))
else
File.write($options[:output], YAML.dump(yaml))
end

script/grammar-compiler Executable file

@@ -0,0 +1,12 @@
#!/bin/sh
set -e
cd "$(dirname "$0")/.."
image="linguist/grammar-compiler:latest"
mkdir -p grammars
exec docker run --rm \
-u $(id -u $USER):$(id -g $USER) \
-v $PWD:/src/linguist \
-w /src/linguist -ti $image "$@"

@@ -1,60 +0,0 @@
#!/usr/bin/env ruby
require "bundler/setup"
require "json"
require "linguist"
require "set"
require "yaml"
ROOT = File.expand_path("../../", __FILE__)
def find_includes(json)
case json
when Hash
result = []
if inc = json["include"]
result << inc.split("#", 2).first unless inc.start_with?("#", "$")
end
result + json.values.flat_map { |v| find_includes(v) }
when Array
json.flat_map { |v| find_includes(v) }
else
[]
end
end
def transitive_includes(scope, includes)
scopes = Set.new
queue = includes[scope] || []
while s = queue.shift
next if scopes.include?(s)
scopes << s
queue += includes[s] || []
end
scopes
end
includes = {}
Dir[File.join(ROOT, "grammars/*.json")].each do |path|
scope = File.basename(path).sub(/\.json/, '')
json = JSON.load(File.read(path))
incs = find_includes(json)
next if incs.empty?
includes[scope] ||= []
includes[scope] += incs
end
yaml = YAML.load(File.read(File.join(ROOT, "grammars.yml")))
language_scopes = Linguist::Language.all.map(&:tm_scope).to_set
# The set of used scopes is the scopes for each language, plus all the scopes
# they include, transitively.
used_scopes = language_scopes + language_scopes.flat_map { |s| transitive_includes(s, includes).to_a }.to_set
unused = yaml.reject { |repo, scopes| scopes.any? { |scope| used_scopes.include?(scope) } }
puts "Unused grammar repos"
puts unused.map { |repo, scopes| sprintf("%-100s %s", repo, scopes.join(", ")) }.sort.join("\n")
yaml.delete_if { |k| unused.key?(k) }
File.write(File.join(ROOT, "grammars.yml"), YAML.dump(yaml))

@@ -94,19 +94,6 @@ class TestGrammars < Minitest::Test
assert nonexistent_submodules.empty? && unlisted_submodules.empty?, message.sub(/\.\Z/, "")
end
def test_local_scopes_are_in_sync
actual = YAML.load(`"#{File.join(ROOT, "script", "convert-grammars")}" --output - --no-install --no-remote`)
assert $?.success?, "script/convert-grammars failed"
# We're not checking remote grammars. That can take a long time and make CI
# flaky if network conditions are poor.
@grammars.delete_if { |k, v| k.start_with?("http:", "https:") }
@grammars.each do |k, v|
assert_equal v, actual[k], "The scopes listed for #{k} in grammars.yml don't match the scopes found in that repository"
end
end
def test_readme_file_is_in_sync
current_data = File.read("#{ROOT}/vendor/README.md").to_s.sub(/\A.+?<!--.+?-->\n/ms, "")
updated_data = `script/list-grammars --print`

tools/grammars/.gitignore vendored Normal file

@@ -0,0 +1 @@
/vendor

tools/grammars/Dockerfile Normal file

@@ -0,0 +1,35 @@
FROM golang:1.9.2
RUN apt-get update
RUN apt-get upgrade -y
RUN apt-get install -y curl gnupg
RUN curl -sL https://deb.nodesource.com/setup_6.x | bash -
RUN apt-get install -y nodejs
RUN npm install -g season
RUN apt-get install -y cmake
RUN cd /tmp && git clone https://github.com/vmg/pcre
RUN mkdir -p /tmp/pcre/build && cd /tmp/pcre/build && \
cmake .. \
-DPCRE_SUPPORT_JIT=ON \
-DPCRE_SUPPORT_UTF=ON \
-DPCRE_SUPPORT_UNICODE_PROPERTIES=ON \
-DBUILD_SHARED_LIBS=OFF \
-DCMAKE_C_FLAGS="-fPIC $(EXTRA_PCRE_CFLAGS)" \
-DCMAKE_BUILD_TYPE=RelWithDebInfo \
-DPCRE_BUILD_PCRECPP=OFF \
-DPCRE_BUILD_PCREGREP=OFF \
-DPCRE_BUILD_TESTS=OFF \
-G "Unix Makefiles" && \
make && make install
RUN rm -rf /tmp/pcre
RUN go get -u github.com/golang/dep/cmd/dep
WORKDIR /go/src/github.com/github/linguist/tools/grammars
COPY . .
RUN dep ensure
RUN go install ./cmd/grammar-compiler
ENTRYPOINT ["grammar-compiler"]

tools/grammars/Gopkg.lock generated Normal file

@@ -0,0 +1,45 @@
# This file is autogenerated, do not edit; changes may be undone by the next 'dep ensure'.
[[projects]]
branch = "master"
name = "github.com/golang/protobuf"
packages = ["proto"]
revision = "1e59b77b52bf8e4b449a57e6f79f21226d571845"
[[projects]]
branch = "master"
name = "github.com/groob/plist"
packages = ["."]
revision = "7b367e0aa692e62a223e823f3288c0c00f519a36"
[[projects]]
name = "github.com/mattn/go-runewidth"
packages = ["."]
revision = "9e777a8366cce605130a531d2cd6363d07ad7317"
version = "v0.0.2"
[[projects]]
branch = "master"
name = "github.com/mitchellh/mapstructure"
packages = ["."]
revision = "06020f85339e21b2478f756a78e295255ffa4d6a"
[[projects]]
name = "gopkg.in/cheggaaa/pb.v1"
packages = ["."]
revision = "657164d0228d6bebe316fdf725c69f131a50fb10"
version = "v1.0.18"
[[projects]]
branch = "v2"
name = "gopkg.in/yaml.v2"
packages = ["."]
revision = "287cf08546ab5e7e37d55a84f7ed3fd1db036de5"
[solve-meta]
analyzer-name = "dep"
analyzer-version = 1
inputs-digest = "eb10157687c05a542025c119a5280abe429e29141bde70dd437d48668f181861"
solver-name = "gps-cdcl"
solver-version = 1

tools/grammars/Gopkg.toml Normal file

@@ -0,0 +1,19 @@
[[constraint]]
branch = "v2"
name = "gopkg.in/yaml.v2"
[[constraint]]
branch = "master"
name = "github.com/groob/plist"
[[constraint]]
branch = "master"
name = "github.com/golang/protobuf"
[[constraint]]
branch = "master"
name = "github.com/mitchellh/mapstructure"
[[constraint]]
name = "gopkg.in/cheggaaa/pb.v1"
version = "1.0.18"

@@ -0,0 +1,80 @@
package main
import (
"flag"
"fmt"
"os"
"os/exec"
"github.com/github/linguist/tools/grammars/compiler"
)
var linguistRoot = flag.String("linguist", "", "path to Linguist installation")
var protoOut = flag.String("proto", "", "dump Protobuf library")
var jsonOut = flag.String("json", "", "dump JSON output")
var addGrammar = flag.String("add", "", "add a new grammar source")
var updateList = flag.Bool("update", false, "update grammars.yml instead of verifying its contents")
var report = flag.String("report", "", "write report to file")
func fatal(err error) {
fmt.Fprintf(os.Stderr, "FATAL: %s\n", err)
os.Exit(1)
}
func main() {
flag.Parse()
if _, err := exec.LookPath("csonc"); err != nil {
fatal(err)
}
if *linguistRoot == "" {
cwd, err := os.Getwd()
if err != nil {
fatal(err)
}
*linguistRoot = cwd
}
conv, err := compiler.NewConverter(*linguistRoot)
if err != nil {
fatal(err)
}
if *addGrammar != "" {
if err := conv.AddGrammar(*addGrammar); err != nil {
fatal(err)
}
}
if err := conv.ConvertGrammars(*updateList); err != nil {
fatal(err)
}
if err := conv.WriteGrammarList(); err != nil {
fatal(err)
}
if *protoOut != "" {
if err := conv.WriteProto(*protoOut); err != nil {
fatal(err)
}
}
if *jsonOut != "" {
if err := conv.WriteJSON(*jsonOut); err != nil {
fatal(err)
}
}
if *report == "" {
conv.Report(os.Stderr)
} else {
f, err := os.Create(*report)
if err != nil {
fatal(err)
}
conv.Report(f)
f.Close()
}
}
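The command-line surface above is plain `flag` parsing. The same shape can be sketched on a private `FlagSet` so the parsing is testable without touching `os.Args` (the `options` type and `parseArgs` name here are illustrative, not the tool's real internals):

```go
package main

import (
	"flag"
	"fmt"
)

// options mirrors a subset of the grammar-compiler's flags.
type options struct {
	linguist string
	update   bool
	add      string
}

// parseArgs parses args on a dedicated FlagSet, so it can be driven
// programmatically instead of reading the process arguments.
func parseArgs(args []string) (options, error) {
	var o options
	fs := flag.NewFlagSet("grammar-compiler", flag.ContinueOnError)
	fs.StringVar(&o.linguist, "linguist", "", "path to Linguist installation")
	fs.BoolVar(&o.update, "update", false, "update grammars.yml instead of verifying its contents")
	fs.StringVar(&o.add, "add", "", "add a new grammar source")
	err := fs.Parse(args)
	return o, err
}

func main() {
	o, err := parseArgs([]string{"-update", "-add", "https://github.com/textmate/javadoc.tmbundle"})
	if err != nil {
		panic(err)
	}
	fmt.Println(o.update, o.add)
}
```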

@@ -0,0 +1,227 @@
package compiler
import (
"encoding/json"
"fmt"
"io"
"io/ioutil"
"os"
"path"
"runtime"
"sort"
"strings"
"sync"
grammar "github.com/github/linguist/tools/grammars/proto"
"github.com/golang/protobuf/proto"
pb "gopkg.in/cheggaaa/pb.v1"
yaml "gopkg.in/yaml.v2"
)
type Converter struct {
root string
modified bool
grammars map[string][]string
Loaded map[string]*Repository
progress *pb.ProgressBar
wg sync.WaitGroup
queue chan string
mu sync.Mutex
}
func (conv *Converter) Load(src string) *Repository {
if strings.HasPrefix(src, "http://") || strings.HasPrefix(src, "https://") {
return LoadFromURL(src)
}
return LoadFromFilesystem(conv.root, src)
}
func (conv *Converter) work() {
for source := range conv.queue {
repo := conv.Load(source)
conv.mu.Lock()
conv.Loaded[source] = repo
conv.mu.Unlock()
conv.progress.Increment()
}
conv.wg.Done()
}
func (conv *Converter) AddGrammar(source string) error {
repo := conv.Load(source)
if len(repo.Files) == 0 {
return fmt.Errorf("source '%s' contains no grammar files", source)
}
conv.grammars[source] = repo.Scopes()
conv.modified = true
fmt.Printf("OK! added grammar source '%s'\n", source)
for scope := range repo.Files {
fmt.Printf("\tnew scope: %s\n", scope)
}
return nil
}
func (conv *Converter) ScopeMap() map[string]*Repository {
allScopes := make(map[string]*Repository)
for _, repo := range conv.Loaded {
for scope := range repo.Files {
if original := allScopes[scope]; original != nil {
repo.Fail(&DuplicateScopeError{original, scope})
} else {
allScopes[scope] = repo
}
}
}
return allScopes
}
func (conv *Converter) ConvertGrammars(update bool) error {
conv.Loaded = make(map[string]*Repository)
conv.queue = make(chan string, 128)
conv.progress = pb.New(len(conv.grammars))
conv.progress.Start()
for i := 0; i < runtime.NumCPU(); i++ {
conv.wg.Add(1)
go conv.work()
}
for src := range conv.grammars {
conv.queue <- src
}
close(conv.queue)
conv.wg.Wait()
done := fmt.Sprintf("done! processed %d grammars\n", len(conv.Loaded))
conv.progress.FinishPrint(done)
if update {
conv.grammars = make(map[string][]string)
conv.modified = true
}
knownScopes := conv.ScopeMap()
for source, repo := range conv.Loaded {
repo.FixRules(knownScopes)
if update {
conv.grammars[source] = repo.Scopes()
} else {
expected := conv.grammars[source]
repo.CompareScopes(expected)
}
}
return nil
}
func (conv *Converter) WriteProto(path string) error {
library := grammar.Library{
Grammars: make(map[string]*grammar.Rule),
}
for _, repo := range conv.Loaded {
for scope, file := range repo.Files {
library.Grammars[scope] = file.Rule
}
}
pb, err := proto.Marshal(&library)
if err != nil {
return err
}
return ioutil.WriteFile(path, pb, 0666)
}
func (conv *Converter) writeJSONFile(path string, rule *grammar.Rule) error {
j, err := os.Create(path)
if err != nil {
return err
}
defer j.Close()
enc := json.NewEncoder(j)
enc.SetIndent("", " ")
return enc.Encode(rule)
}
func (conv *Converter) WriteJSON(rulePath string) error {
if err := os.MkdirAll(rulePath, os.ModePerm); err != nil {
return err
}
for _, repo := range conv.Loaded {
for scope, file := range repo.Files {
p := path.Join(rulePath, scope+".json")
if err := conv.writeJSONFile(p, file.Rule); err != nil {
return err
}
}
}
return nil
}
func (conv *Converter) WriteGrammarList() error {
if !conv.modified {
return nil
}
outyml, err := yaml.Marshal(conv.grammars)
if err != nil {
return err
}
ymlpath := path.Join(conv.root, "grammars.yml")
return ioutil.WriteFile(ymlpath, outyml, 0666)
}
func (conv *Converter) Report(w io.Writer) {
var failed []*Repository
for _, repo := range conv.Loaded {
if len(repo.Errors) > 0 {
failed = append(failed, repo)
}
}
sort.Slice(failed, func(i, j int) bool {
return failed[i].Source < failed[j].Source
})
for _, repo := range failed {
fmt.Fprintf(w, "- [ ] %s (%d errors)\n", repo, len(repo.Errors))
for _, err := range repo.Errors {
fmt.Fprintf(w, " - [ ] %s\n", err)
}
fmt.Fprintf(w, "\n")
}
}
func NewConverter(root string) (*Converter, error) {
yml, err := ioutil.ReadFile(path.Join(root, "grammars.yml"))
if err != nil {
return nil, err
}
conv := &Converter{root: root}
if err := yaml.Unmarshal(yml, &conv.grammars); err != nil {
return nil, err
}
return conv, nil
}

@@ -0,0 +1,21 @@
package compiler
import (
"bytes"
"os/exec"
)
func ConvertCSON(data []byte) ([]byte, error) {
stdin := bytes.NewBuffer(data)
stdout := &bytes.Buffer{}
cmd := exec.Command("csonc")
cmd.Stdin = stdin
cmd.Stdout = stdout
if err := cmd.Run(); err != nil {
return nil, err
}
return stdout.Bytes(), nil
}
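`ConvertCSON` drives an external converter through stdin and stdout. The same piping pattern is sketched below with `cat` standing in for `csonc`, so the example runs without the npm package installed:

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
)

// pipeThrough feeds data to a command's stdin and returns its stdout,
// mirroring how ConvertCSON drives the csonc converter.
func pipeThrough(name string, data []byte) ([]byte, error) {
	stdin := bytes.NewBuffer(data)
	stdout := &bytes.Buffer{}
	cmd := exec.Command(name)
	cmd.Stdin = stdin
	cmd.Stdout = stdout
	if err := cmd.Run(); err != nil {
		return nil, err
	}
	return stdout.Bytes(), nil
}

func main() {
	// "cat" simply echoes its input, standing in for csonc here.
	out, err := pipeThrough("cat", []byte(`{"scopeName": "source.demo"}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```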

@@ -0,0 +1,29 @@
package compiler
var GrammarAliases = map[string]string{
"source.erb": "text.html.erb",
"source.cpp": "source.c++",
"source.less": "source.css.less",
"text.html.markdown": "source.gfm",
"text.md": "source.gfm",
"source.php": "text.html.php",
"text.plain": "",
"source.asciidoc": "text.html.asciidoc",
"source.perl6": "source.perl6fe",
"source.css.scss": "source.scss",
}
var KnownFields = map[string]bool{
"comment": true,
"uuid": true,
"author": true,
"comments": true,
"macros": true,
"fileTypes": true,
"firstLineMatch": true,
"keyEquivalent": true,
"foldingStopMarker": true,
"foldingStartMarker": true,
"foldingEndMarker": true,
"limitLineLength": true,
}

@@ -0,0 +1,85 @@
package compiler
import "fmt"
import "strings"
type ConversionError struct {
Path string
Err error
}
func (err *ConversionError) Error() string {
return fmt.Sprintf(
"Grammar conversion failed. File `%s` failed to parse: %s",
err.Path, err.Err)
}
type DuplicateScopeError struct {
Original *Repository
Duplicate string
}
func (err *DuplicateScopeError) Error() string {
return fmt.Sprintf(
"Duplicate scope in repository: scope `%s` was already defined in %s",
err.Duplicate, err.Original)
}
type MissingScopeError struct {
Scope string
}
func (err *MissingScopeError) Error() string {
return fmt.Sprintf(
"Missing scope in repository: `%s` is listed in grammars.yml but cannot be found",
err.Scope)
}
type UnexpectedScopeError struct {
File *LoadedFile
Scope string
}
func (err *UnexpectedScopeError) Error() string {
return fmt.Sprintf(
"Unexpected scope in repository: `%s` declared in %s was not listed in grammars.yml",
err.Scope, err.File)
}
type MissingIncludeError struct {
File *LoadedFile
Include string
}
func (err *MissingIncludeError) Error() string {
return fmt.Sprintf(
"Missing include in grammar: %s attempts to include `%s` but the scope cannot be found",
err.File, err.Include)
}
type UnknownKeysError struct {
File *LoadedFile
Keys []string
}
func (err *UnknownKeysError) Error() string {
var keys []string
for _, k := range err.Keys {
keys = append(keys, fmt.Sprintf("`%s`", k))
}
return fmt.Sprintf(
"Unknown keys in grammar: %s contains invalid keys (%s)",
err.File, strings.Join(keys, ", "))
}
type InvalidRegexError struct {
File *LoadedFile
Err error
}
func (err *InvalidRegexError) Error() string {
return fmt.Sprintf(
"Invalid regex in grammar: %s contains a malformed regex (%s)",
err.File, err.Err)
}

@@ -0,0 +1,124 @@
package compiler
import (
"fmt"
"os"
"path/filepath"
"sort"
"strings"
grammar "github.com/github/linguist/tools/grammars/proto"
)
type LoadedFile struct {
Path string
Rule *grammar.Rule
}
func (f *LoadedFile) String() string {
return fmt.Sprintf("`%s` (in `%s`)", f.Rule.ScopeName, f.Path)
}
type Repository struct {
Source string
Upstream string
Files map[string]*LoadedFile
Errors []error
}
func newRepository(src string) *Repository {
return &Repository{
Source: src,
Files: make(map[string]*LoadedFile),
}
}
func (repo *Repository) String() string {
str := fmt.Sprintf("repository `%s`", repo.Source)
if repo.Upstream != "" {
str = str + fmt.Sprintf(" (from %s)", repo.Upstream)
}
return str
}
func (repo *Repository) Fail(err error) {
repo.Errors = append(repo.Errors, err)
}
func (repo *Repository) AddFile(path string, rule *grammar.Rule, uk []string) {
file := &LoadedFile{
Path: path,
Rule: rule,
}
repo.Files[rule.ScopeName] = file
if len(uk) > 0 {
repo.Fail(&UnknownKeysError{file, uk})
}
}
func toMap(slice []string) map[string]bool {
m := make(map[string]bool)
for _, s := range slice {
m[s] = true
}
return m
}
func (repo *Repository) CompareScopes(scopes []string) {
expected := toMap(scopes)
for scope, file := range repo.Files {
if !expected[scope] {
repo.Fail(&UnexpectedScopeError{file, scope})
}
}
for scope := range expected {
if _, ok := repo.Files[scope]; !ok {
repo.Fail(&MissingScopeError{scope})
}
}
}
func (repo *Repository) FixRules(knownScopes map[string]*Repository) {
for _, file := range repo.Files {
w := walker{
File: file,
Known: knownScopes,
Missing: make(map[string]bool),
}
w.walk(file.Rule)
repo.Errors = append(repo.Errors, w.Errors...)
}
}
func (repo *Repository) Scopes() (scopes []string) {
for s := range repo.Files {
scopes = append(scopes, s)
}
sort.Strings(scopes)
return
}
func isValidGrammar(path string, info os.FileInfo) bool {
if info.IsDir() {
return false
}
dir := filepath.Dir(path)
ext := filepath.Ext(path)
switch strings.ToLower(ext) {
case ".plist":
return strings.HasSuffix(dir, "/Syntaxes")
case ".tmlanguage", ".yaml-tmlanguage":
return true
case ".cson", ".json":
return strings.HasSuffix(dir, "/grammars")
default:
return false
}
}
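`CompareScopes` above is a two-way set difference between the scopes found on disk and the scopes listed in grammars.yml. In isolation the check is roughly the following (the `diffScopes` name is illustrative):

```go
package main

import (
	"fmt"
	"sort"
)

// diffScopes returns the scopes present in found but not listed in expected
// (unexpected), and the scopes listed but never found (missing).
func diffScopes(found, expected []string) (unexpected, missing []string) {
	exp := make(map[string]bool)
	for _, s := range expected {
		exp[s] = true
	}
	fnd := make(map[string]bool)
	for _, s := range found {
		fnd[s] = true
		if !exp[s] {
			unexpected = append(unexpected, s)
		}
	}
	for s := range exp {
		if !fnd[s] {
			missing = append(missing, s)
		}
	}
	sort.Strings(unexpected)
	sort.Strings(missing)
	return
}

func main() {
	u, m := diffScopes(
		[]string{"source.oz", "source.extra"},
		[]string{"source.oz", "source.gone"},
	)
	fmt.Println(u, m) // [source.extra] [source.gone]
}
```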

@@ -0,0 +1,80 @@
package compiler
import (
"io/ioutil"
"os"
"os/exec"
"path"
"path/filepath"
"strings"
)
type fsLoader struct {
*Repository
abspath string
}
func (l *fsLoader) findGrammars() (files []string, err error) {
err = filepath.Walk(l.abspath,
func(path string, info os.FileInfo, err error) error {
if err == nil && isValidGrammar(path, info) {
files = append(files, path)
}
return nil
})
return
}
func (l *fsLoader) load() {
grammars, err := l.findGrammars()
if err != nil {
l.Fail(err)
return
}
for _, path := range grammars {
data, err := ioutil.ReadFile(path)
if err != nil {
l.Fail(err)
continue
}
if rel, err := filepath.Rel(l.abspath, path); err == nil {
path = rel
}
rule, unknown, err := ConvertProto(filepath.Ext(path), data)
if err != nil {
l.Fail(&ConversionError{path, err})
continue
}
if _, ok := l.Files[rule.ScopeName]; ok {
continue
}
l.AddFile(path, rule, unknown)
}
}
func gitRemoteName(path string) (string, error) {
remote, err := exec.Command("git", "-C", path, "remote", "get-url", "origin").Output()
if err != nil {
return "", err
}
return strings.TrimSpace(string(remote)), nil
}
func LoadFromFilesystem(root, src string) *Repository {
loader := fsLoader{
Repository: newRepository(src),
abspath: path.Join(root, src),
}
loader.load()
if ups, err := gitRemoteName(loader.abspath); err == nil {
loader.Repository.Upstream = ups
}
return loader.Repository
}

@@ -0,0 +1,93 @@
package compiler
import (
"archive/tar"
"compress/gzip"
"io"
"io/ioutil"
"net/http"
"path/filepath"
"strings"
)
type urlLoader struct {
*Repository
}
func (l *urlLoader) loadTarball(r io.Reader) {
gzf, err := gzip.NewReader(r)
if err != nil {
l.Fail(err)
return
}
defer gzf.Close()
tarReader := tar.NewReader(gzf)
for true {
header, err := tarReader.Next()
if err != nil {
if err != io.EOF {
l.Fail(err)
}
return
}
if isValidGrammar(header.Name, header.FileInfo()) {
data, err := ioutil.ReadAll(tarReader)
if err != nil {
l.Fail(err)
return
}
ext := filepath.Ext(header.Name)
rule, unknown, err := ConvertProto(ext, data)
if err != nil {
l.Fail(&ConversionError{header.Name, err})
continue
}
if _, ok := l.Files[rule.ScopeName]; ok {
continue
}
l.AddFile(header.Name, rule, unknown)
}
}
}
func (l *urlLoader) load() {
res, err := http.Get(l.Source)
if err != nil {
l.Fail(err)
return
}
defer res.Body.Close()
if strings.HasSuffix(l.Source, ".tar.gz") {
l.loadTarball(res.Body)
return
}
data, err := ioutil.ReadAll(res.Body)
if err != nil {
l.Fail(err)
return
}
ext := filepath.Ext(l.Source)
filename := filepath.Base(l.Source)
rule, unknown, err := ConvertProto(ext, data)
if err != nil {
l.Fail(&ConversionError{filename, err})
return
}
l.AddFile(filename, rule, unknown)
}
func LoadFromURL(src string) *Repository {
loader := urlLoader{newRepository(src)}
loader.load()
return loader.Repository
}
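`loadTarball` walks a gzip'd tarball entry by entry without ever unpacking it to disk. A self-contained sketch of that loop, which builds a tiny in-memory archive first so it can run anywhere (error checks on the writer side are elided for brevity):

```go
package main

import (
	"archive/tar"
	"bytes"
	"compress/gzip"
	"fmt"
	"io"
	"strings"
)

// makeTarGz builds a one-file .tar.gz in memory so the reader below
// has something to iterate over.
func makeTarGz(name, content string) []byte {
	var buf bytes.Buffer
	gz := gzip.NewWriter(&buf)
	tw := tar.NewWriter(gz)
	tw.WriteHeader(&tar.Header{Name: name, Mode: 0644, Size: int64(len(content))})
	io.Copy(tw, strings.NewReader(content))
	tw.Close()
	gz.Close()
	return buf.Bytes()
}

// readTarGz mirrors the loader's loop: walk entries until io.EOF,
// collecting each file's contents by name.
func readTarGz(data []byte) (map[string]string, error) {
	gzf, err := gzip.NewReader(bytes.NewReader(data))
	if err != nil {
		return nil, err
	}
	defer gzf.Close()
	files := make(map[string]string)
	tr := tar.NewReader(gzf)
	for {
		hdr, err := tr.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			return nil, err
		}
		b, err := io.ReadAll(tr)
		if err != nil {
			return nil, err
		}
		files[hdr.Name] = string(b)
	}
	return files, nil
}

func main() {
	archive := makeTarGz("grammars/demo.json", `{"scopeName":"source.demo"}`)
	files, err := readTarGz(archive)
	if err != nil {
		panic(err)
	}
	fmt.Println(files["grammars/demo.json"])
}
```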

@@ -0,0 +1,68 @@
package compiler
import (
"fmt"
"github.com/github/linguist/tools/grammars/pcre"
)
type replacement struct {
pos int
len int
val string
}
func fixRegex(re string) (string, bool) {
var (
replace []replacement
escape = false
hasBackRefs = false
)
for i, ch := range re {
if escape {
if ch == 'h' {
replace = append(replace, replacement{i - 1, 2, "[[:xdigit:]]"})
}
if '0' <= ch && ch <= '9' {
hasBackRefs = true
}
}
escape = !escape && ch == '\\'
}
if len(replace) > 0 {
reb := []byte(re)
offset := 0
for _, repl := range replace {
reb = append(
reb[:offset+repl.pos],
append([]byte(repl.val), reb[offset+repl.pos+repl.len:]...)...)
offset += len(repl.val) - repl.len
}
return string(reb), hasBackRefs
}
return re, hasBackRefs
}
func CheckPCRE(re string) (string, error) {
hasBackRefs := false
if re == "" {
return "", nil
}
if len(re) > 32*1024 {
return "", fmt.Errorf(
"regex %s: definition too long (%d bytes)",
pcre.RegexPP(re), len(re))
}
re, hasBackRefs = fixRegex(re)
if !hasBackRefs {
if err := pcre.CheckRegexp(re, pcre.DefaultFlags); err != nil {
return "", err
}
}
return re, nil
}

@@ -0,0 +1,27 @@
package compiler
import (
"testing"
)
func Test_fixRegex(t *testing.T) {
tests := []struct {
re string
want string
}{
{"foobar", "foobar"},
{`testing\h`, "testing[[:xdigit:]]"},
{`\htest`, `[[:xdigit:]]test`},
{`abc\hdef`, `abc[[:xdigit:]]def`},
{`\\\htest`, `\\[[:xdigit:]]test`},
{`\\htest`, `\\htest`},
{`\h\h\h\h`, `[[:xdigit:]][[:xdigit:]][[:xdigit:]][[:xdigit:]]`},
{`abc\hdef\hghi\h`, `abc[[:xdigit:]]def[[:xdigit:]]ghi[[:xdigit:]]`},
}
for _, tt := range tests {
got, _ := fixRegex(tt.re)
if got != tt.want {
t.Errorf("fixRegex() got = %v, want %v", got, tt.want)
}
}
}

@@ -0,0 +1,96 @@
package compiler
import (
"encoding/json"
"fmt"
"reflect"
"strings"
grammar "github.com/github/linguist/tools/grammars/proto"
"github.com/groob/plist"
"github.com/mitchellh/mapstructure"
yaml "gopkg.in/yaml.v2"
)
func looseDecoder(f reflect.Kind, t reflect.Kind, data interface{}) (interface{}, error) {
dataVal := reflect.ValueOf(data)
switch t {
case reflect.Bool:
switch f {
case reflect.Bool:
return dataVal.Bool(), nil
case reflect.Float32, reflect.Float64:
return (int(dataVal.Float()) != 0), nil
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
return (dataVal.Int() != 0), nil
case reflect.Uint, reflect.Uint8, reflect.Uint16, reflect.Uint32, reflect.Uint64:
return (dataVal.Uint() != 0), nil
case reflect.String:
switch dataVal.String() {
case "1":
return true, nil
case "0":
return false, nil
}
}
}
return data, nil
}
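Plist, YAML, and CSON grammar sources spell booleans inconsistently (`true`, `1`, `1.0`, `"1"`), which is what the `looseDecoder` hook above papers over. A standalone, stdlib-only sketch of the same coercion (the `coerceBool` name is illustrative):

```go
package main

import (
	"fmt"
	"reflect"
)

// coerceBool mirrors the decode-hook logic above: when the target field
// is a bool, accept numeric and "0"/"1" string spellings as well; any
// other value passes through unchanged.
func coerceBool(data interface{}) interface{} {
	v := reflect.ValueOf(data)
	switch v.Kind() {
	case reflect.Bool:
		return v.Bool()
	case reflect.Float32, reflect.Float64:
		return int(v.Float()) != 0
	case reflect.Int, reflect.Int64:
		return v.Int() != 0
	case reflect.String:
		switch v.String() {
		case "1":
			return true
		case "0":
			return false
		}
	}
	return data
}

func main() {
	fmt.Println(coerceBool(1.0), coerceBool("1"), coerceBool("0"), coerceBool(true))
	// → true true false true
}
```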
func filterUnusedKeys(keys []string) (out []string) {
for _, k := range keys {
parts := strings.Split(k, ".")
field := parts[len(parts)-1]
if !KnownFields[field] {
out = append(out, k)
}
}
return
}
func ConvertProto(ext string, data []byte) (*grammar.Rule, []string, error) {
var (
raw map[string]interface{}
out grammar.Rule
err error
md mapstructure.Metadata
)
switch strings.ToLower(ext) {
case ".plist", ".tmlanguage":
err = plist.Unmarshal(data, &raw)
case ".yaml-tmlanguage":
err = yaml.Unmarshal(data, &raw)
case ".cson":
data, err = ConvertCSON(data)
if err == nil {
err = json.Unmarshal(data, &raw)
}
case ".json":
err = json.Unmarshal(data, &raw)
default:
err = fmt.Errorf("grammars: unsupported extension '%s'", ext)
}
if err != nil {
return nil, nil, err
}
config := mapstructure.DecoderConfig{
Result: &out,
Metadata: &md,
DecodeHook: looseDecoder,
}
decoder, err := mapstructure.NewDecoder(&config)
if err != nil {
return nil, nil, err
}
if err := decoder.Decode(raw); err != nil {
return nil, nil, err
}
return &out, filterUnusedKeys(md.Unused), nil
}


@@ -0,0 +1,79 @@
package compiler
import (
"strings"
grammar "github.com/github/linguist/tools/grammars/proto"
)
func (w *walker) checkInclude(rule *grammar.Rule) {
include := rule.Include
if include == "" || include[0] == '#' || include[0] == '$' {
return
}
if alias, ok := GrammarAliases[include]; ok {
rule.Include = alias
return
}
include = strings.Split(include, "#")[0]
_, ok := w.Known[include]
if !ok {
if !w.Missing[include] {
w.Missing[include] = true
w.Errors = append(w.Errors, &MissingIncludeError{w.File, include})
}
rule.Include = ""
}
}
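The include handling above distinguishes local references (`"#rule"`, `"$self"`, `"$base"`) from cross-grammar ones such as `"source.js#expression"`, which name another grammar plus an optional rule fragment; only the base scope is looked up in the known set. A small standalone sketch of that split (helper names are illustrative, not from the commit):

```go
package main

import (
	"fmt"
	"strings"
)

// isLocal covers the include forms resolved within the current grammar,
// matching the early return in checkInclude above.
func isLocal(include string) bool {
	return include == "" || include[0] == '#' || include[0] == '$'
}

// baseScope strips an optional "#fragment" so the include can be
// resolved against the owning grammar's scope name.
func baseScope(include string) string {
	return strings.Split(include, "#")[0]
}

func main() {
	fmt.Println(isLocal("#string"), isLocal("$self"), isLocal("source.js"))
	fmt.Println(baseScope("source.js#expression"))
	// → true true false
	// → source.js
}
```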
func (w *walker) checkRegexps(rule *grammar.Rule) {
check := func(re string) string {
re2, err := CheckPCRE(re)
if err != nil {
w.Errors = append(w.Errors, &InvalidRegexError{w.File, err})
}
return re2
}
rule.Match = check(rule.Match)
rule.Begin = check(rule.Begin)
rule.While = check(rule.While)
rule.End = check(rule.End)
}
func (w *walker) walk(rule *grammar.Rule) {
w.checkInclude(rule)
w.checkRegexps(rule)
for _, rule := range rule.Patterns {
w.walk(rule)
}
for _, rule := range rule.Captures {
w.walk(rule)
}
for _, rule := range rule.BeginCaptures {
w.walk(rule)
}
for _, rule := range rule.WhileCaptures {
w.walk(rule)
}
for _, rule := range rule.EndCaptures {
w.walk(rule)
}
for _, rule := range rule.Repository {
w.walk(rule)
}
for _, rule := range rule.Injections {
w.walk(rule)
}
}
type walker struct {
File *LoadedFile
Known map[string]*Repository
Missing map[string]bool
Errors []error
}

tools/grammars/docker/build Executable file

@@ -0,0 +1,11 @@
#!/bin/sh
set -ex
cd "$(dirname "$0")/.."
image=linguist/grammar-compiler
docker build -t "$image" .
if [ "$1" = "--push" ]; then
docker push "$image"
fi


@@ -0,0 +1,53 @@
package pcre
/*
#cgo LDFLAGS: -lpcre
#include <stdlib.h>
#include <pcre.h>
*/
import "C"
import (
"fmt"
"strings"
"unsafe"
)
func RegexPP(re string) string {
if len(re) > 32 {
re = fmt.Sprintf("\"`%s`...\"", re[:32])
} else {
re = fmt.Sprintf("\"`%s`\"", re)
}
return strings.Replace(re, "\n", "", -1)
}
type CompileError struct {
Pattern string
Message string
Offset int
}
func (e *CompileError) Error() string {
return fmt.Sprintf("regex %s: %s (at offset %d)",
RegexPP(e.Pattern), e.Message, e.Offset)
}
const DefaultFlags = int(C.PCRE_DUPNAMES | C.PCRE_UTF8 | C.PCRE_NEWLINE_ANYCRLF)
func CheckRegexp(pattern string, flags int) error {
pattern1 := C.CString(pattern)
defer C.free(unsafe.Pointer(pattern1))
var errptr *C.char
var erroffset C.int
ptr := C.pcre_compile(pattern1, C.int(flags), &errptr, &erroffset, nil)
if ptr == nil {
return &CompileError{
Pattern: pattern,
Message: C.GoString(errptr),
Offset: int(erroffset),
}
}
C.free(unsafe.Pointer(ptr))
return nil
}
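`CheckRegexp` needs libpcre via cgo to run. As a rough stdlib-only stand-in for experimentation, `regexp.Compile` surfaces compile errors the same way; note this is an approximation, not equivalent: Go's `regexp` implements RE2, so valid PCRE constructs such as lookbehind or backreferences would be rejected here even though the real compiler accepts them.

```go
package main

import (
	"fmt"
	"regexp"
)

// checkRE2 is a stdlib stand-in for CheckRegexp above, with RE2 rather
// than PCRE semantics: it only demonstrates the error-surfacing shape.
func checkRE2(pattern string) error {
	_, err := regexp.Compile(pattern)
	return err
}

func main() {
	fmt.Println(checkRE2(`[a-z]+`))           // <nil>
	fmt.Println(checkRE2(`[unclosed`) != nil) // true
}
```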


@@ -0,0 +1,239 @@
// Code generated by protoc-gen-go. DO NOT EDIT.
// source: proto/grammar.proto
/*
Package grammar is a generated protocol buffer package.
It is generated from these files:
proto/grammar.proto
It has these top-level messages:
Rule
Library
*/
package grammar
import proto "github.com/golang/protobuf/proto"
import fmt "fmt"
import math "math"
// Reference imports to suppress errors if they are not otherwise used.
var _ = proto.Marshal
var _ = fmt.Errorf
var _ = math.Inf
// This is a compile-time assertion to ensure that this generated file
// is compatible with the proto package it is being compiled against.
// A compilation error at this line likely means your copy of the
// proto package needs to be updated.
const _ = proto.ProtoPackageIsVersion2 // please upgrade the proto package
type Rule struct {
Name string `protobuf:"bytes,1,opt,name=name" json:"name,omitempty"`
ScopeName string `protobuf:"bytes,2,opt,name=scopeName" json:"scopeName,omitempty"`
ContentName string `protobuf:"bytes,3,opt,name=contentName" json:"contentName,omitempty"`
Match string `protobuf:"bytes,4,opt,name=match" json:"match,omitempty"`
Begin string `protobuf:"bytes,5,opt,name=begin" json:"begin,omitempty"`
While string `protobuf:"bytes,6,opt,name=while" json:"while,omitempty"`
End string `protobuf:"bytes,7,opt,name=end" json:"end,omitempty"`
Include string `protobuf:"bytes,8,opt,name=include" json:"include,omitempty"`
Patterns []*Rule `protobuf:"bytes,9,rep,name=patterns" json:"patterns,omitempty"`
Captures map[string]*Rule `protobuf:"bytes,10,rep,name=captures" json:"captures,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
BeginCaptures map[string]*Rule `protobuf:"bytes,11,rep,name=beginCaptures" json:"beginCaptures,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
WhileCaptures map[string]*Rule `protobuf:"bytes,12,rep,name=whileCaptures" json:"whileCaptures,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
EndCaptures map[string]*Rule `protobuf:"bytes,13,rep,name=endCaptures" json:"endCaptures,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
Repository map[string]*Rule `protobuf:"bytes,14,rep,name=repository" json:"repository,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
Injections map[string]*Rule `protobuf:"bytes,15,rep,name=injections" json:"injections,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
Disabled bool `protobuf:"varint,16,opt,name=disabled" json:"disabled,omitempty"`
ApplyEndPatternLast bool `protobuf:"varint,17,opt,name=applyEndPatternLast" json:"applyEndPatternLast,omitempty"`
IncludeResetBase bool `protobuf:"varint,18,opt,name=includeResetBase" json:"includeResetBase,omitempty"`
}
func (m *Rule) Reset() { *m = Rule{} }
func (m *Rule) String() string { return proto.CompactTextString(m) }
func (*Rule) ProtoMessage() {}
func (*Rule) Descriptor() ([]byte, []int) { return fileDescriptor0, []int{0} }
func (m *Rule) GetName() string {
if m != nil {
return m.Name
}
return ""
}
func (m *Rule) GetScopeName() string {
if m != nil {
return m.ScopeName
}
return ""
}
func (m *Rule) GetContentName() string {
if m != nil {
return m.ContentName
}
return ""
}
func (m *Rule) GetMatch() string {
if m != nil {
return m.Match
}
return ""
}
func (m *Rule) GetBegin() string {
if m != nil {
return m.Begin
}
return ""
}
func (m *Rule) GetWhile() string {
if m != nil {
return m.While
}
return ""
}
func (m *Rule) GetEnd() string {
if m != nil {
return m.End
}
return ""
}
func (m *Rule) GetInclude() string {
if m != nil {
return m.Include
}
return ""
}
func (m *Rule) GetPatterns() []*Rule {
if m != nil {
return m.Patterns
}
return nil
}
func (m *Rule) GetCaptures() map[string]*Rule {
if m != nil {
return m.Captures
}
return nil
}
func (m *Rule) GetBeginCaptures() map[string]*Rule {
if m != nil {
return m.BeginCaptures
}
return nil
}
func (m *Rule) GetWhileCaptures() map[string]*Rule {
if m != nil {
return m.WhileCaptures
}
return nil
}
func (m *Rule) GetEndCaptures() map[string]*Rule {
if m != nil {
return m.EndCaptures
}
return nil
}
func (m *Rule) GetRepository() map[string]*Rule {
if m != nil {
return m.Repository
}
return nil
}
func (m *Rule) GetInjections() map[string]*Rule {
if m != nil {
return m.Injections
}
return nil
}
func (m *Rule) GetDisabled() bool {
if m != nil {
return m.Disabled
}
return false
}
func (m *Rule) GetApplyEndPatternLast() bool {
if m != nil {
return m.ApplyEndPatternLast
}
return false
}
func (m *Rule) GetIncludeResetBase() bool {
if m != nil {
return m.IncludeResetBase
}
return false
}
type Library struct {
Grammars map[string]*Rule `protobuf:"bytes,1,rep,name=grammars" json:"grammars,omitempty" protobuf_key:"bytes,1,opt,name=key" protobuf_val:"bytes,2,opt,name=value"`
}
func (m *Library) Reset() { *m = Library{} }
func (m *Library) String() string { return proto.CompactTextString(m) }
func (*Library) ProtoMessage() {}
func (*Library) Descriptor() ([]byte, []int) { return fileDescriptor0, []int{1} }
func (m *Library) GetGrammars() map[string]*Rule {
if m != nil {
return m.Grammars
}
return nil
}
func init() {
proto.RegisterType((*Rule)(nil), "grammar.Rule")
proto.RegisterType((*Library)(nil), "grammar.Library")
}
func init() { proto.RegisterFile("proto/grammar.proto", fileDescriptor0) }
var fileDescriptor0 = []byte{
// 486 bytes of a gzipped FileDescriptorProto
0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xa4, 0x54, 0xcb, 0x8e, 0xd3, 0x30,
0x14, 0x55, 0x66, 0xda, 0x69, 0x7a, 0x4b, 0x99, 0x72, 0x87, 0x85, 0x55, 0x1e, 0x8a, 0x86, 0x4d,
0x61, 0x51, 0x10, 0x2c, 0x40, 0x23, 0x21, 0xa1, 0x41, 0x05, 0x81, 0xca, 0x43, 0xd9, 0xb0, 0x76,
0x13, 0x6b, 0x26, 0x90, 0x3a, 0x91, 0xed, 0x82, 0xf2, 0x19, 0x7c, 0x19, 0xbf, 0x84, 0x7c, 0xed,
0xa6, 0x49, 0xdb, 0x5d, 0x76, 0xbe, 0xe7, 0x25, 0x3b, 0x3e, 0x0e, 0x5c, 0x94, 0xaa, 0x30, 0xc5,
0xf3, 0x1b, 0xc5, 0xd7, 0x6b, 0xae, 0xe6, 0x34, 0xe1, 0xc0, 0x8f, 0x97, 0xff, 0x86, 0xd0, 0x8b,
0x37, 0xb9, 0x40, 0x84, 0x9e, 0xe4, 0x6b, 0xc1, 0x82, 0x28, 0x98, 0x0d, 0x63, 0x5a, 0xe3, 0x43,
0x18, 0xea, 0xa4, 0x28, 0xc5, 0x57, 0x4b, 0x9c, 0x10, 0xb1, 0x03, 0x30, 0x82, 0x51, 0x52, 0x48,
0x23, 0xa4, 0x21, 0xfe, 0x94, 0xf8, 0x26, 0x84, 0xf7, 0xa1, 0xbf, 0xe6, 0x26, 0xb9, 0x65, 0x3d,
0xe2, 0xdc, 0x60, 0xd1, 0x95, 0xb8, 0xc9, 0x24, 0xeb, 0x3b, 0x94, 0x06, 0x8b, 0xfe, 0xb9, 0xcd,
0x72, 0xc1, 0xce, 0x1c, 0x4a, 0x03, 0x4e, 0xe0, 0x54, 0xc8, 0x94, 0x0d, 0x08, 0xb3, 0x4b, 0x64,
0x30, 0xc8, 0x64, 0x92, 0x6f, 0x52, 0xc1, 0x42, 0x42, 0xb7, 0x23, 0x3e, 0x85, 0xb0, 0xe4, 0xc6,
0x08, 0x25, 0x35, 0x1b, 0x46, 0xa7, 0xb3, 0xd1, 0xcb, 0xf1, 0x7c, 0x7b, 0x6a, 0x7b, 0xc4, 0xb8,
0xa6, 0xf1, 0x35, 0x84, 0x09, 0x2f, 0xcd, 0x46, 0x09, 0xcd, 0x80, 0xa4, 0x0f, 0x5a, 0xd2, 0xf9,
0x7b, 0xcf, 0x2e, 0xa4, 0x51, 0x55, 0x5c, 0x8b, 0xf1, 0x03, 0x8c, 0x69, 0xbb, 0x5b, 0x9e, 0x8d,
0xc8, 0x1d, 0xb5, 0xdd, 0xd7, 0x4d, 0x89, 0x8b, 0x68, 0xdb, 0x6c, 0x0e, 0x1d, 0xb0, 0xce, 0xb9,
0x73, 0x2c, 0xe7, 0x47, 0x53, 0xe2, 0x73, 0x5a, 0x36, 0x7c, 0x07, 0x23, 0x21, 0xd3, 0x3a, 0x65,
0x4c, 0x29, 0x8f, 0xdb, 0x29, 0x8b, 0x9d, 0xc0, 0x65, 0x34, 0x2d, 0xf8, 0x16, 0x40, 0x89, 0xb2,
0xd0, 0x99, 0x29, 0x54, 0xc5, 0xee, 0x52, 0xc0, 0xa3, 0x76, 0x40, 0x5c, 0xf3, 0xce, 0xdf, 0x30,
0x58, 0x7b, 0x26, 0x7f, 0x8a, 0xc4, 0x64, 0x85, 0xd4, 0xec, 0xfc, 0x98, 0xfd, 0x53, 0xcd, 0x7b,
0xfb, 0xce, 0x80, 0x53, 0x08, 0xd3, 0x4c, 0xf3, 0x55, 0x2e, 0x52, 0x36, 0x89, 0x82, 0x59, 0x18,
0xd7, 0x33, 0xbe, 0x80, 0x0b, 0x5e, 0x96, 0x79, 0xb5, 0x90, 0xe9, 0x77, 0x77, 0x71, 0x4b, 0xae,
0x0d, 0xbb, 0x47, 0xb2, 0x63, 0x14, 0x3e, 0x83, 0x89, 0x2f, 0x43, 0x2c, 0xb4, 0x30, 0xd7, 0x5c,
0x0b, 0x86, 0x24, 0x3f, 0xc0, 0xa7, 0x9f, 0x61, 0xdc, 0xfa, 0x2a, 0xb6, 0x6a, 0xbf, 0x44, 0xe5,
0xfb, 0x6f, 0x97, 0xf8, 0x04, 0xfa, 0xbf, 0x79, 0xbe, 0x71, 0xd5, 0x3f, 0x68, 0x93, 0xe3, 0xae,
0x4e, 0xde, 0x04, 0xd3, 0x6f, 0x80, 0x87, 0x57, 0xde, 0x31, 0xf0, 0xf0, 0xee, 0xbb, 0x04, 0x7e,
0x81, 0xc9, 0x7e, 0x0d, 0xba, 0xc4, 0x2d, 0xe1, 0x7c, 0xaf, 0x14, 0x1d, 0xd3, 0xf6, 0x3a, 0xd2,
0x21, 0xed, 0xf2, 0x6f, 0x00, 0x83, 0x65, 0xb6, 0x52, 0x5c, 0x55, 0x78, 0x05, 0xa1, 0x97, 0x69,
0x16, 0xec, 0xbd, 0x0d, 0xaf, 0x99, 0x7f, 0xf4, 0x02, 0xff, 0xd4, 0xb7, 0x7a, 0x5b, 0x90, 0x16,
0xd5, 0x61, 0x4f, 0xab, 0x33, 0xfa, 0xeb, 0xbe, 0xfa, 0x1f, 0x00, 0x00, 0xff, 0xff, 0x2b, 0x2e,
0xec, 0x55, 0x8c, 0x05, 0x00, 0x00,
}

vendor/README.md vendored

@@ -169,7 +169,7 @@ This is a list of grammars that Linguist selects to provide syntax highlighting
 - **Ioke:** [vic/ioke-outdated](https://github.com/vic/ioke-outdated)
 - **Isabelle:** [lsf37/Isabelle.tmbundle](https://github.com/lsf37/Isabelle.tmbundle)
 - **Isabelle ROOT:** [lsf37/Isabelle.tmbundle](https://github.com/lsf37/Isabelle.tmbundle)
-- **J:** [bcj/JSyntax](https://github.com/bcj/JSyntax)
+- **J:** [tikkanz/JSyntax](https://github.com/tikkanz/JSyntax)
 - **Jasmin:** [atmarksharp/jasmin-sublime](https://github.com/atmarksharp/jasmin-sublime)
 - **Java:** [textmate/java.tmbundle](https://github.com/textmate/java.tmbundle)
 - **Java Server Pages:** [textmate/java.tmbundle](https://github.com/textmate/java.tmbundle)


@@ -1,10 +1,9 @@
 ---
 type: grammar
-name: ruby.tmbundle
+name: javadoc.tmbundle
 license: permissive
 curated: true
 ---
 If not otherwise specified (see below), files in this repository fall under the following license:
 Permission to copy, use, modify, sell and distribute this