Spelling (qdrant#2658)
* spelling: accumulating
* spelling: and
* spelling: back
* spelling: batching
* spelling: been
* spelling: benchmark
* spelling: collections
* spelling: confusion
* spelling: consensus
* spelling: decrease
* spelling: equal
* spelling: github
* spelling: minimal
* spelling: nonexistent
* spelling: oversampling
* spelling: paths
* spelling: points
* spelling: prevent
* spelling: protobuf
* spelling: proxied
* spelling: randomness
* spelling: recover

---------

Signed-off-by: Josh Soref <2119212+jsoref@users.noreply.github.com>
jsoref authored and generall committed Oct 6, 2023
1 parent ed88971 commit a18573b
Showing 23 changed files with 39 additions and 39 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/rust-lint.yml
@@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-latest

steps:
- - name: Install miniml nightly (only for fmt)
+ - name: Install minimal nightly (only for fmt)
uses: dtolnay/rust-toolchain@nightly
with:
profile: minimal
8 changes: 4 additions & 4 deletions CONTRIBUTING.md
@@ -6,11 +6,11 @@ We love your input! We want to make contributing to this project as easy and tra
- Submitting a fix
- Proposing new features

- ## We Develop with Github
+ ## We Develop with GitHub
We use github to host code, to track issues and feature requests, as well as accept pull requests.

- ## We Use [Github Flow](https://guides.github.com/introduction/flow/index.html), So All Code Changes Happen Through Pull Requests
- Pull requests are the best way to propose changes to the codebase (we use [Github Flow](https://docs.github.com/en/get-started/quickstart/github-flow)). We actively welcome your pull requests:
+ ## We Use [GitHub Flow](https://guides.github.com/introduction/flow/index.html), So All Code Changes Happen Through Pull Requests
+ Pull requests are the best way to propose changes to the codebase (we use [GitHub Flow](https://docs.github.com/en/get-started/quickstart/github-flow)). We actively welcome your pull requests:

1. Fork the repo and create your branch from `dev`.
2. If you've added code that should be tested, add tests.
@@ -22,7 +22,7 @@ Pull requests are the best way to propose changes to the codebase (we use [Githu
## Any contributions you make will be under the Apache License 2.0
In short, when you submit code changes, your submissions are understood to be under the same [Apache License 2.0](https://choosealicense.com/licenses/apache-2.0/) that covers the project. Feel free to contact the maintainers if that's a concern.

- ## Report bugs using Github's [issues](https://github.com/qdrant/qdrant/issues)
+ ## Report bugs using GitHub's [issues](https://github.com/qdrant/qdrant/issues)
We use GitHub issues to track public bugs. Report a bug by [opening a new issue](); it's that easy!

## Write bug reports with detail, background, and sample code
2 changes: 1 addition & 1 deletion QUICK_START_GRPC.md
@@ -25,7 +25,7 @@ Execute the following command
```bash
grpcurl -plaintext -import-path ./lib/api/src/grpc/proto/ -proto qdrant.proto -d '{}' 0.0.0.0:6334 qdrant.Qdrant/HealthCheck
```
- Here and below the ```./lib/api/src/grpc/proto/``` should be a path to the folder with a probuf schemas.
+ Here and below the ```./lib/api/src/grpc/proto/``` should be a path to the folder with a protobuf schemas.
Expected response:
```json
{
4 changes: 2 additions & 2 deletions docs/DEVELOPMENT.md
@@ -89,7 +89,7 @@ So the expected approach to benchmarking is to run only ones which might be affe
To run benchmark, use the following command inside a related sub-crate:

```bash
- cargo bench --bench name_of_banchmark
+ cargo bench --bench name_of_benchmark
```

In this case you will see the execution timings and, if you launched this bench earlier, the difference in execution time.
@@ -119,7 +119,7 @@ Found 1 outliers among 100 measurements (1.00%)
To run benchmarks with profiler to generate FlameGraph - use the following command:

```bash
- cargo bench --bench name_of_banchmark -- --profile-time=60
+ cargo bench --bench name_of_benchmark -- --profile-time=60
```

This command will run each benchmark iterator for `60` seconds and generate FlameGraph svg along with profiling records files.
2 changes: 1 addition & 1 deletion docs/grpc/docs.md
@@ -2937,7 +2937,7 @@ For example, if `oversampling` is 2.4 and `limit` is 100, then 240 vectors will
| Search | [SearchPoints](#qdrant-SearchPoints) | [SearchResponse](#qdrant-SearchResponse) | Retrieve closest points based on vector similarity and given filtering conditions |
| SearchBatch | [SearchBatchPoints](#qdrant-SearchBatchPoints) | [SearchBatchResponse](#qdrant-SearchBatchResponse) | Retrieve closest points based on vector similarity and given filtering conditions |
| SearchGroups | [SearchPointGroups](#qdrant-SearchPointGroups) | [SearchGroupsResponse](#qdrant-SearchGroupsResponse) | Retrieve closest points based on vector similarity and given filtering conditions, grouped by a given field |
- | Scroll | [ScrollPoints](#qdrant-ScrollPoints) | [ScrollResponse](#qdrant-ScrollResponse) | Iterate over all or filtered points points |
+ | Scroll | [ScrollPoints](#qdrant-ScrollPoints) | [ScrollResponse](#qdrant-ScrollResponse) | Iterate over all or filtered points |
| Recommend | [RecommendPoints](#qdrant-RecommendPoints) | [RecommendResponse](#qdrant-RecommendResponse) | Look for the points which are closer to stored positive examples and at the same time further to negative examples. |
| RecommendBatch | [RecommendBatchPoints](#qdrant-RecommendBatchPoints) | [RecommendBatchResponse](#qdrant-RecommendBatchResponse) | Look for the points which are closer to stored positive examples and at the same time further to negative examples. |
| RecommendGroups | [RecommendPointGroups](#qdrant-RecommendPointGroups) | [RecommendGroupsResponse](#qdrant-RecommendGroupsResponse) | Look for the points which are closer to stored positive examples and at the same time further to negative examples, grouped by a given field |
2 changes: 1 addition & 1 deletion lib/api/src/grpc/proto/points_service.proto
@@ -64,7 +64,7 @@ service Points {
*/
rpc SearchGroups (SearchPointGroups) returns (SearchGroupsResponse) {}
/*
- Iterate over all or filtered points points
+ Iterate over all or filtered points
*/
rpc Scroll (ScrollPoints) returns (ScrollResponse) {}
/*
4 changes: 2 additions & 2 deletions lib/api/src/grpc/qdrant.rs
@@ -4509,7 +4509,7 @@ pub mod points_client {
self.inner.unary(req, path, codec).await
}
///
- /// Iterate over all or filtered points points
+ /// Iterate over all or filtered points
pub async fn scroll(
&mut self,
request: impl tonic::IntoRequest<super::ScrollPoints>,
@@ -4784,7 +4784,7 @@ pub mod points_server {
tonic::Status,
>;
///
- /// Iterate over all or filtered points points
+ /// Iterate over all or filtered points
async fn scroll(
&self,
request: tonic::Request<super::ScrollPoints>,
10 changes: 5 additions & 5 deletions lib/collection/src/collection_manager/holders/proxy_segment.rs
@@ -1282,13 +1282,13 @@ mod tests {
assert_eq!(segment_info.num_points, 5);
assert_eq!(segment_info.num_vectors, 5);

- // Delete non-existent point, counts should remain the same
+ // Delete nonexistent point, counts should remain the same
proxy_segment.delete_point(101, 99999.into()).unwrap();
let segment_info = proxy_segment.info();
assert_eq!(segment_info.num_points, 5);
assert_eq!(segment_info.num_vectors, 5);

- // Delete point 1, counts should derease by 1
+ // Delete point 1, counts should decrease by 1
proxy_segment.delete_point(102, 4.into()).unwrap();
let segment_info = proxy_segment.info();
assert_eq!(segment_info.num_points, 4);
@@ -1308,7 +1308,7 @@
use segment::segment_constructor::build_segment;
use segment::types::{Distance, Indexes, VectorDataConfig, VectorStorageType};

- // Create proxyied multivec segment
+ // Create proxied multivec segment
let dir = Builder::new().prefix("segment_dir").tempdir().unwrap();
let dim = 1;
let config = SegmentConfig {
@@ -1391,13 +1391,13 @@
assert_eq!(segment_info.num_points, 4);
assert_eq!(segment_info.num_vectors, 6);

- // Delete non-existent point, counts should remain the same
+ // Delete nonexistent point, counts should remain the same
proxy_segment.delete_point(104, 1.into()).unwrap();
let segment_info = proxy_segment.info();
assert_eq!(segment_info.num_points, 4);
assert_eq!(segment_info.num_vectors, 6);

- // Delete point 4, counts should derease by 1
+ // Delete point 4, counts should decrease by 1
proxy_segment.delete_point(105, 4.into()).unwrap();
let segment_info = proxy_segment.info();
assert_eq!(segment_info.num_points, 3);
@@ -263,7 +263,7 @@ pub trait SegmentOptimizer {
///
/// # Result
///
- /// Rolls back back optimization state.
+ /// Rolls back optimization state.
/// All processed changes will still be there, but the collection should be returned into state
/// before optimization.
fn handle_cancellation(
4 changes: 2 additions & 2 deletions lib/collection/src/common/file_utils.rs
@@ -7,7 +7,7 @@ use crate::operations::types::{CollectionError, CollectionResult};
/// Move directory from one location to another.
/// Handles the case when the source and destination are on different filesystems.
pub async fn move_dir(from: impl Into<PathBuf>, to: impl Into<PathBuf>) -> CollectionResult<()> {
- // Try to rename first and fallback to copy to prevert TOCTOU
+ // Try to rename first and fallback to copy to prevent TOCTOU
let from = from.into();
let to = to.into();

@@ -31,7 +31,7 @@ pub async fn move_dir(from: impl Into<PathBuf>, to: impl Into<PathBuf>) -> Colle
}

pub async fn move_file(from: impl AsRef<Path>, to: impl AsRef<Path>) -> CollectionResult<()> {
- // Try to rename first and fallback to copy to prevert TOCTOU
+ // Try to rename first and fallback to copy to prevent TOCTOU
let from = from.as_ref();
let to = to.as_ref();

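The rename-then-copy fallback described in the two comments above is a common pattern. A minimal sketch of the idea (a hypothetical helper, not qdrant's actual implementation):

```rust
use std::io;
use std::path::Path;

// Sketch: try an atomic rename first; if it fails (e.g. source and
// destination are on different filesystems), fall back to copy + delete.
fn move_file_sketch(from: &Path, to: &Path) -> io::Result<()> {
    match std::fs::rename(from, to) {
        Ok(()) => Ok(()),
        Err(_) => {
            // rename() cannot cross filesystem boundaries.
            std::fs::copy(from, to)?;
            std::fs::remove_file(from)
        }
    }
}
```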
2 changes: 1 addition & 1 deletion lib/collection/tests/integration/lookup_test.rs
@@ -170,7 +170,7 @@ fn non_parsable_pseudo_id_to_point_id(#[case] value: impl Into<PseudoId>) {
#[case::uuid(Uuid::new_v4().to_string())]
#[case::int(1001u64)]
#[tokio::test(flavor = "multi_thread")]
- async fn inexisting_lookup_ids_are_ignored(#[case] value: impl Into<PseudoId>) {
+ async fn nonexistent_lookup_ids_are_ignored(#[case] value: impl Into<PseudoId>) {
let value = value.into();

let Resources {
4 changes: 2 additions & 2 deletions lib/segment/src/common/rocksdb_wrapper.rs
@@ -53,10 +53,10 @@ pub fn db_options() -> Options {

pub fn open_db<T: AsRef<str>>(
path: &Path,
- vector_pathes: &[T],
+ vector_paths: &[T],
) -> Result<Arc<RwLock<DB>>, rocksdb::Error> {
let mut column_families = vec![DB_PAYLOAD_CF, DB_MAPPING_CF, DB_VERSIONS_CF];
- for vector_path in vector_pathes {
+ for vector_path in vector_paths {
column_families.push(vector_path.as_ref());
}
let db = DB::open_cf(&db_options(), path, column_families)?;
2 changes: 1 addition & 1 deletion lib/segment/src/id_tracker/id_tracker_base.rs
@@ -6,7 +6,7 @@ use crate::common::Flusher;
use crate::entry::entry_point::OperationResult;
use crate::types::{PointIdType, PointOffsetType, SeqNumberType};

- /// Sampling randomess seed
+ /// Sampling randomness seed
///
/// Using seeded randomness so search results don't show randomness or 'inconsistencies' which
/// would otherwise be introduced by HNSW/ID tracker point sampling.
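A minimal sketch of the seeded-sampling idea this comment describes (assumes the `rand` crate; hypothetical code, not the actual ID tracker):

```rust
use rand::rngs::StdRng;
use rand::seq::SliceRandom;
use rand::SeedableRng;

// With a fixed seed, every call samples the same subset, so repeated
// searches see a stable point order instead of shifting results.
fn sample_points(ids: &[u32], n: usize, seed: u64) -> Vec<u32> {
    let mut rng = StdRng::seed_from_u64(seed);
    ids.choose_multiple(&mut rng, n).copied().collect()
}
```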
2 changes: 1 addition & 1 deletion lib/segment/src/index/hnsw_index/graph_links.rs
@@ -680,7 +680,7 @@ mod tests {
);
assert_eq!(links, cmp_links);

- // 4 levels with random unexists links
+ // 4 levels with random nonexistent links
let links: Vec<Vec<Vec<PointOffsetType>>> = vec![
vec![vec![1, 2, 5, 6]],
vec![vec![0, 2, 7, 8], vec![], vec![34, 45, 10]],
8 changes: 4 additions & 4 deletions lib/segment/src/segment.rs
@@ -1974,13 +1974,13 @@ mod tests {
assert_eq!(segment_info.num_points, 2);
assert_eq!(segment_info.num_vectors, 2);

- // Delete non-existent point, counts should remain the same
+ // Delete nonexistent point, counts should remain the same
segment.delete_point(102, 1.into()).unwrap();
let segment_info = segment.info();
assert_eq!(segment_info.num_points, 2);
assert_eq!(segment_info.num_vectors, 2);

- // Delete point 4, counts should derease by 1
+ // Delete point 4, counts should decrease by 1
segment.delete_point(103, 4.into()).unwrap();
let segment_info = segment.info();
assert_eq!(segment_info.num_points, 1);
@@ -2055,13 +2055,13 @@
assert_eq!(segment_info.num_points, 4);
assert_eq!(segment_info.num_vectors, 6);

- // Delete non-existent point, counts should remain the same
+ // Delete nonexistent point, counts should remain the same
segment.delete_point(104, 1.into()).unwrap();
let segment_info = segment.info();
assert_eq!(segment_info.num_points, 4);
assert_eq!(segment_info.num_vectors, 6);

- // Delete point 4, counts should derease by 1
+ // Delete point 4, counts should decrease by 1
segment.delete_point(105, 4.into()).unwrap();
let segment_info = segment.info();
assert_eq!(segment_info.num_points, 3);
2 changes: 1 addition & 1 deletion lib/segment/tests/integration/fail_recovery_test.rs
@@ -51,7 +51,7 @@ fn test_insert_fail_recovery() {
assert!(ok_res.is_ok());
assert!(segment.error_status.is_some());

- // Perform operation anf recover the error - operation is fixed now
+ // Perform operation and recover the error - operation is fixed now
let recover_res = segment.set_payload(
2,
1.into(),
6 changes: 3 additions & 3 deletions lib/segment/tests/integration/hnsw_quantized_search_test.rs
@@ -124,15 +124,15 @@ fn hnsw_quantized_search_test(

// check oversampling
for _i in 0..attempts {
- let ef_oversamling = ef / 8;
+ let ef_oversampling = ef / 8;
let oversampling_query = random_vector(&mut rnd, dim).into();

let oversampling_1_result = hnsw_index.search(
&[&oversampling_query],
None,
top,
Some(&SearchParams {
- hnsw_ef: Some(ef_oversamling),
+ hnsw_ef: Some(ef_oversampling),
quantization: Some(QuantizationSearchParams {
rescore: true,
..Default::default()
Expand All @@ -149,7 +149,7 @@ fn hnsw_quantized_search_test(
None,
top,
Some(&SearchParams {
- hnsw_ef: Some(ef_oversamling),
+ hnsw_ef: Some(ef_oversampling),
quantization: Some(QuantizationSearchParams {
oversampling: Some(4.0),
rescore: true,
2 changes: 1 addition & 1 deletion lib/storage/src/content_manager/toc.rs
@@ -1133,7 +1133,7 @@ impl TableOfContent {
.map_err(|err| err.into())
}

- /// Recommend points in a batchig fashion using positive and negative example from the request
+ /// Recommend points in a batching fashion using positive and negative example from the request
///
/// # Arguments
///
@@ -93,7 +93,7 @@ def test_points_retrieve():


def test_retrieve_invalid_vector():
- # Retrieve non-existent vector name
+ # Retrieve nonexistent vector name
response = request_with_validation(
api='/collections/{collection_name}/points',
method="POST",
@@ -126,7 +126,7 @@ def create_multi_from_collection(collection_name, source_collection_name, vector

@pytest.mark.parametrize("ok,source,size,distance", [
(True, source_collection_name, 4, 'Dot'), # ok
(False, "i-do-not-exist", 4, 'Dot'), # fail: non existing source collection
(False, "i-do-not-exist", 4, 'Dot'), # fail: nonexistent source collection
(False, source_collection_name, 8, 'Dot'), # fail: bad size
(False, source_collection_name, 4, 'Cosine'), # fail: bad distance
])
2 changes: 1 addition & 1 deletion src/consensus.rs
@@ -1042,7 +1042,7 @@ impl RaftMessageSender {

// Should we ignore the error? Seems like it will only produce noise.
//
- // - `send_message` is only called by the sub-task spawned by the consnsus thread.
+ // - `send_message` is only called by the sub-task spawned by the consensus thread.
// - `report_snapshot` sends a message back to the consensus thread.
// - It can only fail, if the "receiver" end of the channel is closed.
// - Which means consensus thread either resolved successfully, or failed.
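The reasoning in this comment rests on a standard channel property: once the receiving end is gone, sending fails with an error rather than panicking. A tiny illustration using std channels (not the actual consensus code):

```rust
use std::sync::mpsc;

fn main() {
    let (tx, rx) = mpsc::channel::<u32>();
    // Simulate the consensus thread having exited: its receiver is dropped.
    drop(rx);
    // send() now returns Err instead of panicking, so the error can be
    // safely ignored as noise.
    assert!(tx.send(42).is_err());
}
```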
2 changes: 1 addition & 1 deletion src/startup.rs
@@ -40,7 +40,7 @@ pub fn setup_panic_hook(reporting_enabled: bool, reporting_id: String) {
}

/// Creates a file that indicates that the server has been started.
- /// This file is used to check if the server has been been successfully started before potential kill.
+ /// This file is used to check if the server has been successfully started before potential kill.
pub fn touch_started_file_indicator() {
if let Err(err) = std::fs::write(get_init_file_path(), "") {
log::warn!("Failed to create init file indicator: {}", err);
2 changes: 1 addition & 1 deletion tests/shard-snapshot-api.sh
@@ -210,7 +210,7 @@ function recover-local-concurrent {
check-recovered - "$SNAPSHOT_POINTS" "$@"
}

- function recocover-local-concurrent-priority-snapshot {
+ function recover-local-concurrent-priority-snapshot {
recover-local-concurrent snapshot
}

