
package comments and readme update #2182

Merged: 1 commit merged into master on Oct 9, 2022

Conversation

gqcn
Member

@gqcn gqcn commented Oct 9, 2022

No description provided.

@codecov-commenter

codecov-commenter commented Oct 9, 2022

Codecov Report

Base: 76.44% // Head: 76.45% // Increases project coverage by +0.01% 🎉

Coverage data is based on head (9a9f34b) compared to base (6cb9102).
Patch has no changes to coverable lines.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2182      +/-   ##
==========================================
+ Coverage   76.44%   76.45%   +0.01%     
==========================================
  Files         581      581              
  Lines       48260    48260              
==========================================
+ Hits        36890    36895       +5     
+ Misses       9340     9335       -5     
  Partials     2030     2030              
Flag            Coverage Δ
go-1.15-386     ?
go-1.15-amd64   76.44% <ø> (-0.02%) ⬇️
go-1.16-386     ?
go-1.16-amd64   76.46% <ø> (+<0.01%) ⬆️
go-1.17-386     76.44% <ø> (-0.03%) ⬇️
go-1.17-amd64   76.49% <ø> (+0.02%) ⬆️
go-1.18-386     76.36% <ø> (ø)
go-1.18-amd64   76.39% <ø> (-0.02%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                              Coverage Δ
internal/httputil/httputils.go              65.11% <ø> (ø)
util/guid/guid.go                           88.23% <ø> (ø)
os/gcache/gcache_adapter_memory_lru.go      89.13% <0.00%> (-8.70%) ⬇️
os/gfsnotify/gfsnotify_watcher_loop.go      85.71% <0.00%> (+7.56%) ⬆️



@gqcn gqcn merged commit 182a393 into master Oct 9, 2022
@gqcn gqcn deleted the enhance/readme branch October 9, 2022 13:23
houseme added a commit to houseme/gf that referenced this pull request Oct 13, 2022
* master: (29 commits)
  fix issue gogf#1946 (gogf#2194)
  fix issue of OmitEmptyWhere in Builder for package gdb (gogf#2195)
  fix: modify `Polaris` config readme.md (gogf#2186)
  fix info content when listens on port :0 for ghttp.Server (gogf#2191)
  fix: pgsql driver check local type error (gogf#2192)
  new version v2.2.0 (gogf#2185)
  feat: temporarily disable the unit testing of the Polaris configuration center (gogf#2183)
  package comments and readme update (gogf#2182)
  feat: create polaris config (gogf#2170)
  add function `ZipPathContent` for package `gcompress` (gogf#2179)
  feat: improve glog for polaris  register (gogf#2178)
  improve port listening for ghttp.Server (gogf#2175)
  add WithUUID for package gtrace (gogf#2176)
  fix issue gogf#1965 (gogf#2177)
  fix issue gogf#1965 (gogf#2174)
  fix issue gogf#2172 (gogf#2173)
  add `gcfg.Adapter` implements using apollo service (gogf#2165)
  add watch feature for package kubecm (gogf#2164)
  fix configuration management for package gdb (gogf#2163)
  add local db configuration support for package gdb (gogf#2161)
  ...

# Conflicts:
#	contrib/config/apollo/README.MD
#	contrib/config/apollo/apollo.go
#	contrib/config/kubecm/README.MD
#	contrib/config/kubecm/kubecm.go
#	contrib/drivers/README.MD
#	contrib/registry/polaris/go.mod
#	contrib/registry/polaris/go.sum
#	database/gdb/gdb_driver_wrapper_db.go
#	example/go.mod
#	example/go.sum
#	version.go
houseme pushed a commit that referenced this pull request Oct 22, 2022
3 participants