
Named Arguments #625

Merged
merged 32 commits into from Sep 17, 2023
Changes from 1 commit
32 commits
8ac5b1d
Initial commit for semantic tokenizer.
gdotdesign Jun 12, 2023
60f1bb5
Simplify semantic tokenizer a bit.
gdotdesign Jun 13, 2023
0e865c6
Working language server implementation.
gdotdesign Jun 13, 2023
1c14a5a
Cleanup semantic tokenizer class.
gdotdesign Jun 13, 2023
ca01ad6
Save keywords automatically instead of manually.
gdotdesign Jun 13, 2023
6926d57
Use an array derived from the actual token types.
gdotdesign Jun 13, 2023
29fb79b
Implement suggestions from code review.
gdotdesign Jun 14, 2023
6b25fc8
Update src/ls/semantic_tokens.cr
gdotdesign Jun 14, 2023
0eea575
Implement HTML highlighting.
gdotdesign Jun 14, 2023
1cd96da
Implement highlight directive.
gdotdesign Jun 14, 2023
529117d
Avoid unnecessary iterations.
gdotdesign Jun 14, 2023
79c3f11
Implement suggestions from code review.
gdotdesign Jun 14, 2023
6faec26
Use the ast from the workspace semantic tokens.
gdotdesign Jun 25, 2023
37db9b6
Implementation of localization language structures.
gdotdesign Jun 26, 2023
6e51ae0
Update operation.cr
gdotdesign Jul 18, 2023
673b361
Merge branch 'master' into locales
gdotdesign Jul 18, 2023
92833ca
Update test.
gdotdesign Jul 21, 2023
5e9eef7
Merge branch 'master' into locales
gdotdesign Aug 8, 2023
44ab595
Revert change to the operation formatting.
gdotdesign Aug 8, 2023
a8ecd78
Update Locale.md
gdotdesign Aug 8, 2023
cb99c78
Implement labelled calls.
gdotdesign Jul 21, 2023
dd254f8
Don't reorder arguments in the formatter.
gdotdesign Jul 22, 2023
630c080
Minor fixes.
gdotdesign Jul 26, 2023
d28c7a3
Merge branch 'master' into locales
gdotdesign Sep 6, 2023
095ab66
Merge branch 'locales' into labelled-calls
gdotdesign Sep 6, 2023
d171155
Merge branch 'master' into locales
gdotdesign Sep 7, 2023
88bd4cc
Merge branch 'locales' into labelled-calls
gdotdesign Sep 7, 2023
36c9d98
Apply suggestions from code review
gdotdesign Sep 7, 2023
a2a0c04
Update src/compilers/call.cr
gdotdesign Sep 7, 2023
687550d
Finish renaming an error from code review.
gdotdesign Sep 7, 2023
99a1bac
Merge branch 'master' into labelled-calls
gdotdesign Sep 17, 2023
99acd9f
Merge branch 'master' into labelled-calls
gdotdesign Sep 17, 2023
Cleanup semantic tokenizer class.
gdotdesign committed Jun 13, 2023
commit 1c14a5ae2fdca2764298be576db51f656d2a0126
4 changes: 3 additions & 1 deletion src/ast.cr
@@ -32,8 +32,10 @@ module Mint

getter components, modules, records, stores, routes, providers
getter suites, enums, comments, nodes, unified_modules, keywords
getter operators

def initialize(@keywords = [] of Tuple(Int32, Int32),
def initialize(@operators = [] of Tuple(Int32, Int32),
@keywords = [] of Tuple(Int32, Int32),
@records = [] of RecordDefinition,
@unified_modules = [] of Module,
@components = [] of Component,
10 changes: 0 additions & 10 deletions src/commands/highlight.cr
@@ -45,24 +45,14 @@ module Mint
item[0].colorize(:light_yellow)
in SemanticTokenizer::TokenType::Variable
item[0].colorize(:dark_gray)
in SemanticTokenizer::TokenType::Class
item[0].colorize(:blue)
in SemanticTokenizer::TokenType::Struct
item[0].colorize.fore(:white).back(:red)
in SemanticTokenizer::TokenType::Namespace
item[0].colorize(:light_blue)
in SemanticTokenizer::TokenType::Function
item[0].colorize.fore(:white).back(:red)
in SemanticTokenizer::TokenType::Keyword
item[0].colorize(:magenta)
in SemanticTokenizer::TokenType::Property
item[0].colorize(:dark_gray).mode(:underline)
in SemanticTokenizer::TokenType::Comment
item[0].colorize(:light_gray)
in SemanticTokenizer::TokenType::Enum
item[0].colorize.fore(:white).back(:red)
in SemanticTokenizer::TokenType::EnumMember
item[0].colorize.fore(:white).back(:red)
in SemanticTokenizer::TokenType::String
item[0].colorize(:green)
in SemanticTokenizer::TokenType::Number
6 changes: 3 additions & 3 deletions src/ls/semantic_tokens.cr
@@ -11,14 +11,14 @@ module Mint
ast =
Parser.parse(uri.path.to_s)

# This is used later on to compute the line/column of each token
input =
ast.nodes.first.input
tokenizer = SemanticTokenizer.new
tokenizer.tokenize(ast)

data =
tokenizer.tokens.sort_by(&.from).compact_map do |token|
input =
ast.nodes.first.input

location =
Ast::Node.compute_location(input, token.from, token.to)

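The hunk above hoists `input` out of the per-token loop, so the source is looked up once per file instead of once per token before being handed to `Ast::Node.compute_location`. That helper is not shown in this diff; below is a minimal Python sketch of what such an offset-to-position conversion typically does (the function name and the 0-based line/column convention are assumptions, not the Crystal implementation):

```python
def compute_location(source: str, offset: int) -> tuple[int, int]:
    """Return a 0-based (line, column) pair for a character offset."""
    # The line is the number of newlines before the offset; the column
    # is the distance from the last newline (or the start of the source).
    line = source.count("\n", 0, offset)
    last_newline = source.rfind("\n", 0, offset)
    column = offset - (last_newline + 1)
    return line, column
```

Because each call scans the source from the start, reusing one `input` per file (as the hunk does) keeps the cost proportional to the file, not to the token count.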
16 changes: 16 additions & 0 deletions src/lsp/protocol/semantic_tokens_legend.cr
@@ -0,0 +1,16 @@
module LSP
struct SemanticTokensLegend
include JSON::Serializable

# The token types a server uses.
@[JSON::Field(key: "tokenTypes")]
property token_types : Array(String)

# The token modifiers a server uses.
@[JSON::Field(key: "tokenModifiers")]
property token_modifiers : Array(String)

def initialize(@token_types, @token_modifiers)
end
end
end
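The legend tells clients how to decode the numeric token-type indices the server sends. Per the LSP specification, the token data itself travels as a flat array of five integers per token, with positions delta-encoded against the previous token. A Python sketch of that wire format (this is the LSP encoding in general, not code from this PR; modifiers are left at 0 for simplicity):

```python
def encode_semantic_tokens(tokens):
    """Delta-encode (line, col, length, type_index) tuples into the flat
    [deltaLine, deltaStartChar, length, tokenType, tokenModifiers] array
    used by textDocument/semanticTokens responses."""
    data = []
    prev_line = prev_col = 0
    for line, col, length, type_index in sorted(tokens):
        delta_line = line - prev_line
        # deltaStartChar is relative to the previous token only when both
        # tokens share a line; otherwise it is absolute on the new line.
        delta_col = col - prev_col if delta_line == 0 else col
        data.extend([delta_line, delta_col, length, type_index, 0])
        prev_line, prev_col = line, col
    return data
```

The index in each 5-tuple's fourth slot is what `token_types` in the legend resolves back to a name on the client side.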
15 changes: 0 additions & 15 deletions src/lsp/protocol/semantic_tokens_options.cr
@@ -1,19 +1,4 @@
module LSP
struct SemanticTokensLegend
include JSON::Serializable

# The token types a server uses.
@[JSON::Field(key: "tokenTypes")]
property token_types : Array(String)

# The token modifiers a server uses.
@[JSON::Field(key: "tokenModifiers")]
property token_modifiers : Array(String)

def initialize(@token_types, @token_modifiers)
end
end

struct SemanticTokensOptions
include JSON::Serializable

2 changes: 2 additions & 0 deletions src/parsers/operation.cr
@@ -30,9 +30,11 @@ module Mint
def operator : String?
start do
whitespace
saved_position = position
operator = OPERATORS.keys.find { |item| keyword item }
next unless operator
next unless whitespace?
ast.operators << {saved_position, saved_position + operator.size}
whitespace
operator
end
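The parser change records each matched operator's span (`saved_position` to `saved_position + operator.size`) into `ast.operators`, so the tokenizer can highlight operators without re-scanning the source. A hypothetical Python sketch of that record-while-matching idea (the operator list is illustrative, and the real parser additionally requires trailing whitespace before committing to the match):

```python
OPERATORS = ["==", "!=", "<=", ">=", "&&", "||", "+", "-", "*", "/", "<", ">", "="]

def scan_operator(source: str, pos: int, spans: list):
    """Match an operator at `pos`, recording its span for later highlighting."""
    # Try longer operators first so "==" is never matched as "=".
    for op in sorted(OPERATORS, key=len, reverse=True):
        if source.startswith(op, pos):
            spans.append((pos, pos + len(op)))
            return op
    return None
```

Collecting spans at parse time means highlighting stays a lookup over recorded positions rather than a second tokenization pass.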
91 changes: 43 additions & 48 deletions src/semantic_tokenizer.cr
@@ -1,55 +1,79 @@
module Mint
class SemanticTokenizer
# This is a subset of the LSP's SemanticTokenTypes enum.
enum TokenType
Type
TypeParameter
Type

Variable

Class # Component
Struct # Record
Namespace # HTML Tags
Function
Keyword
Namespace
Property
Keyword
Comment

Enum
EnumMember

Variable
Operator
String
Number
Regexp
Operator
end

# This represents which token types are used for which node.
TOKEN_MAP = {
Ast::TypeVariable => TokenType::TypeParameter,
Ast::Variable => TokenType::Variable,
Ast::BoolLiteral => TokenType::Keyword,
Ast::Comment => TokenType::Comment,
Ast::StringLiteral => TokenType::String,
Ast::NumberLiteral => TokenType::Number,
Ast::TypeId => TokenType::Type,
}

# Represents a semantic token using the positions of the token instead
# of line/column (for the LSP it is converted to line/column).
record Token,
type : TokenType,
from : Int32,
to : Int32

# We keep a cache of all tokenized nodes to avoid duplications
getter cache : Set(Ast::Node) = Set(Ast::Node).new

# This is where the resulting tokens are stored.
getter tokens : Array(Token) = [] of Token

def tokenize(ast : Ast)
ast.keywords.each do |(from, to)|
add(from, to, TokenType::Keyword)
end
# We add the operators and keywords directly from the AST
ast.operators.each { |(from, to)| add(from, to, TokenType::Operator) }
ast.keywords.each { |(from, to)| add(from, to, TokenType::Keyword) }

tokenize(ast.nodes)
end

def tokenize(nodes : Array(Ast::Node))
nodes.each { |node| tokenize(node) }
end

def tokenize(node : Ast::Node?)
if type = TOKEN_MAP[node.class]?
add(node, type)
end
end

def tokenize(node : Ast::CssDefinition)
add(node.from, node.from + node.name.size, TokenType::Property)
end

def tokenize(node : Ast::ArrayAccess)
# TODO: Remove this when the index is parsed as a number literal.
case index = node.index
when Int64
add(node.from + 1, node.from + 1 + index.to_s.size, TokenType::Number)
end
end

def tokenize(node : Ast::HtmlElement)
# The closing tag itself is not saved, only its position.
node.closing_tag_position.try do |position|
add(position, position + node.tag.value.size, TokenType::Namespace)
end
@@ -63,46 +87,17 @@ module Mint
end
end

def tokenize(node : Ast::StringLiteral)
add(node, TokenType::String)
end

def tokenize(node : Ast::BoolLiteral)
add(node, TokenType::Keyword)
end

def tokenize(node : Ast::NumberLiteral)
add(node, TokenType::Number)
end

def tokenize(node : Ast::Comment)
add(node, TokenType::Comment)
end

def tokenize(node : Ast::Variable)
add(node, TokenType::Variable)
end

def tokenize(node : Ast::TypeId)
add(node, TokenType::Type)
end

def add(from : Int32, to : Int32, type : TokenType)
tokens << Token.new(
type: type,
from: from,
to: to)
end

def add(node : Ast::Node | Nil, type : TokenType)
add(node.from, node.to, type) if node
end

def tokenize(nodes : Array(Ast::Node))
nodes.each { |node| tokenize(node) }
end

def tokenize(node : Ast::Node?)
def add(node : Ast::Node, type : TokenType)
return if cache.includes?(node)
add(node.from, node.to, type)
cache.add(node)
end
end
end
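The reworked `add` overload at the end of the diff dedupes through the node cache, so a node reached twice during traversal (e.g. via several `tokenize` overloads) yields a single token. A compressed Python sketch of that pattern (class and field names are illustrative, not the Crystal API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Token:
    type: str
    from_: int
    to: int

class Node:
    def __init__(self, from_: int, to: int):
        self.from_, self.to = from_, to

class SemanticTokenizer:
    def __init__(self):
        self.cache = set()   # identities of nodes already tokenized
        self.tokens = []

    def add(self, node: Node, type_: str):
        # A node can be visited more than once; emit exactly one token.
        if id(node) in self.cache:
            return
        self.cache.add(id(node))
        self.tokens.append(Token(type_, node.from_, node.to))
```

Checking the cache before appending (rather than deduplicating afterwards) keeps `tokens` free of duplicates without a final sort-and-filter pass.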