
Named Arguments #625

Merged · 32 commits · Sep 17, 2023
Changes from 1 commit

Commits (32)
8ac5b1d
Initial commit for semantic tokenizer.
gdotdesign Jun 12, 2023
60f1bb5
Simplify semantic tokenizer a bit.
gdotdesign Jun 13, 2023
0e865c6
Working language server implementation.
gdotdesign Jun 13, 2023
1c14a5a
Cleanup semantic tokenizer class.
gdotdesign Jun 13, 2023
ca01ad6
Save keywords automatically instead of manually.
gdotdesign Jun 13, 2023
6926d57
Use an array derived from the actual token types.
gdotdesign Jun 13, 2023
29fb79b
Implement suggestions from code review.
gdotdesign Jun 14, 2023
6b25fc8
Update src/ls/semantic_tokens.cr
gdotdesign Jun 14, 2023
0eea575
Implement HTML highlighting.
gdotdesign Jun 14, 2023
1cd96da
Implement highlight directive.
gdotdesign Jun 14, 2023
529117d
Avoid unnecessary iterations.
gdotdesign Jun 14, 2023
79c3f11
Implement suggestions from code review.
gdotdesign Jun 14, 2023
6faec26
Use the ast from the workspace semantic tokens.
gdotdesign Jun 25, 2023
37db9b6
Implementation of localization language structures.
gdotdesign Jun 26, 2023
6e51ae0
Update operation.cr
gdotdesign Jul 18, 2023
673b361
Merge branch 'master' into locales
gdotdesign Jul 18, 2023
92833ca
Update test.
gdotdesign Jul 21, 2023
5e9eef7
Merge branch 'master' into locales
gdotdesign Aug 8, 2023
44ab595
Revert change to the operation formatting.
gdotdesign Aug 8, 2023
a8ecd78
Update Locale.md
gdotdesign Aug 8, 2023
cb99c78
Implement labelled calls.
gdotdesign Jul 21, 2023
dd254f8
Don't reorder arguments in the formatter.
gdotdesign Jul 22, 2023
630c080
Minor fixes.
gdotdesign Jul 26, 2023
d28c7a3
Merge branch 'master' into locales
gdotdesign Sep 6, 2023
095ab66
Merge branch 'locales' into labelled-calls
gdotdesign Sep 6, 2023
d171155
Merge branch 'master' into locales
gdotdesign Sep 7, 2023
88bd4cc
Merge branch 'locales' into labelled-calls
gdotdesign Sep 7, 2023
36c9d98
Apply suggestions from code review
gdotdesign Sep 7, 2023
a2a0c04
Update src/compilers/call.cr
gdotdesign Sep 7, 2023
687550d
Finish renaming an error from code review.
gdotdesign Sep 7, 2023
99a1bac
Merge branch 'master' into labelled-calls
gdotdesign Sep 17, 2023
99acd9f
Merge branch 'master' into labelled-calls
gdotdesign Sep 17, 2023
Implement HTML highlighting.
gdotdesign committed Jun 14, 2023
commit 0eea575a31a2740bbedfb7d8e807764ca95c9d5e
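
For context, this commit moves the highlighting logic out of the CLI command and into a reusable SemanticTokenizer.highlight class method, adding an HTML output flag along the way. A minimal sketch of how the result is meant to be used, assuming Admiral's usual --flag syntax for the boolean flag defined below; the file path is hypothetical:

    # CLI usage (flag name per the `define_flag html` in the diff below):
    #   mint highlight source/Main.mint          # ANSI-colored terminal output
    #   mint highlight source/Main.mint --html   # HTML output
    #
    # Or calling the new class method directly from Crystal:
    puts Mint::SemanticTokenizer.highlight("source/Main.mint")        # terminal colors
    puts Mint::SemanticTokenizer.highlight("source/Main.mint", true)  # HTML spans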
67 changes: 7 additions & 60 deletions src/commands/highlight.cr
@@ -3,70 +3,17 @@ module Mint
   class Highlight < Admiral::Command
     include Command

-    define_help description: "Returns the syntax highlighted version of the given file as HTML"
+    define_help description: "Returns the syntax highlighted version of the given file"

-    define_argument path,
-      description: "The path to the file"
+    define_argument path, description: "The path to the file"
+
+    define_flag html : Bool,
+      description: "If specified, print the highlighted code as HTML",
+      default: false

     def run
       return unless path = arguments.path

-      ast =
-        Parser.parse(path)
-
-      tokenizer = SemanticTokenizer.new
-      tokenizer.tokenize(ast)
-
-      parts = [] of String | Tuple(String, SemanticTokenizer::TokenType)
-      contents = File.read(path)
-      position = 0
-
-      tokenizer.tokens.sort_by(&.from).each do |token|
-        if token.from > position
-          parts << contents[position, token.from - position]
-        end
-
-        parts << {contents[token.from, token.to - token.from], token.type}
-        position = token.to
-      end
-
-      if position < contents.size
-        parts << contents[position, contents.size]
-      end
-
-      result = parts.reduce("") do |memo, item|
-        memo + case item
-               in String
-                 item
-               in Tuple(String, SemanticTokenizer::TokenType)
-                 case item[1]
-                 in .type?
-                   item[0].colorize(:yellow)
-                 in .type_parameter?
-                   item[0].colorize(:light_yellow)
-                 in .variable?
-                   item[0].colorize(:dark_gray)
-                 in .namespace?
-                   item[0].colorize(:light_blue)
-                 in .keyword?
-                   item[0].colorize(:magenta)
-                 in .property?
-                   item[0].colorize(:dark_gray).mode(:underline)
-                 in .comment?
-                   item[0].colorize(:light_gray)
-                 in .string?
-                   item[0].colorize(:green)
-                 in .number?
-                   item[0].colorize(:red)
-                 in .regexp?
-                   item[0].colorize.fore(:white).back(:red)
-                 in .operator?
-                   item[0].colorize(:light_magenta)
-                 end.to_s
-               end
-      end
-
-      print result
+      print SemanticTokenizer.highlight(path, flags.html)
     end
   end
 end
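
The body deleted above reappears almost verbatim as SemanticTokenizer.highlight in the next file. Its core loop interleaves highlighted tokens with the untouched source text between them, tracking a running position. A small worked example of that pass; the input string and token offsets are invented for illustration:

    # Hypothetical source and tokens ({from, to, type}, sorted by `from`):
    #
    #   contents = "let x = 1"
    #   tokens   = [{0, 3, Keyword}, {4, 5, Variable}, {8, 9, Number}]
    #
    # The loop then emits, in order:
    #
    #   {"let", Keyword}   # token at 0...3; no gap before it, position -> 3
    #   " "                # gap: contents[3, 1]
    #   {"x", Variable}    # token at 4...5, position -> 5
    #   " = "              # gap: contents[5, 3]
    #   {"1", Number}      # token at 8...9, position -> 9 == contents.size, done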
66 changes: 66 additions & 0 deletions src/semantic_tokenizer.cr
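
One behavioral detail to watch for in the diff below: the moved method no longer re-reads the file from disk but takes the source text from the parsed AST. A hedged note on the swap (the Ast internals are exactly as they appear in the diff):

    # Before (in highlight.cr):        After (in semantic_tokenizer.cr):
    #   contents = File.read(path)       contents = ast.nodes.first.input.input
    #
    # Using the parser's own copy of the input keeps token offsets consistent
    # with the text that was actually tokenized.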
@@ -25,6 +25,7 @@ module Mint
       Ast::Variable      => TokenType::Variable,
       Ast::Comment       => TokenType::Comment,
       Ast::StringLiteral => TokenType::String,
+      Ast::RegexpLiteral => TokenType::Regexp,
       Ast::NumberLiteral => TokenType::Number,
       Ast::TypeId        => TokenType::Type,
     }
@@ -42,6 +43,71 @@ module Mint
     # This is where the resulting tokens are stored.
     getter tokens : Array(Token) = [] of Token

+    def self.highlight(path : String, html : Bool = false)
+      ast =
+        Parser.parse(path)
+
+      tokenizer = self.new
+      tokenizer.tokenize(ast)
+
+      parts = [] of String | Tuple(String, SemanticTokenizer::TokenType)
+      contents = ast.nodes.first.input.input
+      position = 0
+
+      tokenizer.tokens.sort_by(&.from).each do |token|
+        if token.from > position
+          parts << contents[position, token.from - position]
+        end
+
+        parts << {contents[token.from, token.to - token.from], token.type}
+        position = token.to
+      end
+
+      if position < contents.size
+        parts << contents[position, contents.size]
+      end
+
+      parts.reduce("") do |memo, item|
+        memo + case item
+               in String
+                 if html
+                   HTML.escape(item)
+                 else
+                   item
+                 end
+               in Tuple(String, SemanticTokenizer::TokenType)
+                 if html
+                   "<span class=\"#{item[1].to_s.underscore}\">#{HTML.escape(item[0])}</span>"
+                 else
+                   case item[1]
+                   in .type?
+                     item[0].colorize(:yellow)
+                   in .type_parameter?
+                     item[0].colorize(:light_yellow)
+                   in .variable?
+                     item[0].colorize(:dark_gray)
+                   in .namespace?
+                     item[0].colorize(:light_blue)
+                   in .keyword?
+                     item[0].colorize(:magenta)
+                   in .property?
+                     item[0].colorize(:dark_gray).mode(:underline)
+                   in .comment?
+                     item[0].colorize(:light_gray)
+                   in .string?
+                     item[0].colorize(:green)
+                   in .number?
+                     item[0].colorize(:red)
+                   in .regexp?
+                     item[0].colorize(:light_red)
+                   in .operator?
+                     item[0].colorize(:light_magenta)
+                   end.to_s
+                 end
+               end
+      end
+    end
+
     def tokenize(ast : Ast)
       # We add the operators and keywords directly from the AST
       ast.operators.each { |(from, to)| add(from, to, TokenType::Operator) }
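
A note on consuming the new HTML mode: each token is wrapped in a <span> whose class attribute is the underscored name of its TokenType member (via item[1].to_s.underscore). A sketch of what that yields; the CSS rules are illustrative and not part of this PR:

    # Crystal's String#underscore turns enum member names into CSS-friendly classes:
    #
    #   TokenType::Keyword.to_s.underscore        # => "keyword"
    #   TokenType::TypeParameter.to_s.underscore  # => "type_parameter"
    #
    # so the HTML output looks like:
    #
    #   <span class="keyword">component</span>
    #
    # and can be styled with matching rules such as:
    #
    #   .keyword        { color: magenta; }
    #   .type_parameter { color: goldenrod; }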