Compare commits

...

80 Commits

Author SHA1 Message Date
5b0157c6c6 Merge #3955
3955: Update mini-dashboard to version 0.2.11 r=curquiza a=bidoubiwa

# Pull Request

## What does this PR do?
- Updates the mini-dashboard to version [0.2.11](https://github.com/meilisearch/mini-dashboard/releases/tag/v0.2.11)

## PR checklist
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Charlotte Vermandel <charlottevermandel@gmail.com>
2023-07-27 11:59:55 +00:00
3b9a87c790 Update mini-dashboard to version 0.2.11 2023-07-27 13:16:32 +02:00
3a3414270d Merge #3952
3952: Use the new safe `read-txn-no-tls` heed feature r=ManyTheFish a=Kerollmops

[We recently found out](https://github.com/meilisearch/heed/issues/191#issuecomment-1650280513) that the `sync-read-txn` heed feature was invalid and must be removed from this crate. We were declaring it in milli/meilisearch but, fortunately, not sharing the `RoTxn`s across threads 😮‍💨

[I recently introduced the `read-txn-no-tls` heed feature](https://github.com/meilisearch/heed/pull/194), which implements `RoTxn: Send` and allows multiple read transactions on a single thread (which we use).

This PR removes the `sync-read-txn` heed feature from the _Cargo.toml_ file. I will fix this in heed v0.20.0 and will file a RustSec advisory in the meantime.
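
To illustrate the soundness concern, here is a minimal, self-contained sketch; `FakeRoTxn` below is a hypothetical stand-in for heed's `RoTxn`, not the real type. Without the `read-txn-no-tls` feature, LMDB binds read transactions to thread-local storage, so the handle must not be `Send`:

```rust
use std::marker::PhantomData;

// Hypothetical stand-in for heed's `RoTxn`: the raw-pointer marker makes
// the type `!Send`, modelling a TLS-bound LMDB read transaction.
struct FakeRoTxn {
    _not_send: PhantomData<*const ()>,
}

fn requires_send<T: Send>(_: T) {}

fn main() {
    let txn = FakeRoTxn { _not_send: PhantomData };
    // requires_send(txn); // does not compile: `*const ()` is not `Send`
    let _ = txn;
    requires_send(42); // plain values are `Send`, of course
}
```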

Co-authored-by: Clément Renault <clement@meilisearch.com>
2023-07-26 16:40:58 +00:00
d06e0905db Merge #3953
3953: Update UTM campaign r=curquiza a=macraig

# Pull Request

## What does this PR do?
Redirect CTAs to Cloud landing page



Co-authored-by: María <maria@Marias-MacBook-Pro.local>
2023-07-26 15:20:40 +00:00
939b2fc6fd Merge #3949
3949: Fix score details casing r=Kerollmops a=ManyTheFish

# Pull Request

Fixes #3941


Co-authored-by: ManyTheFish <many@meilisearch.com>
2023-07-26 14:14:59 +00:00
fae61372be Redirect CTAs to Cloud landing page 2023-07-26 15:54:43 +02:00
d8b47b689e Use the new read-txn-no-tls heed feature 2023-07-26 15:45:15 +02:00
be72be7c0d Merge #3942
3942: Normalize for the search the facets values r=ManyTheFish a=Kerollmops

This PR improves and fixes the search for facet values feature. Searching for _bre_ wasn't returning facet values like _brévent_ or _brô_.

The issue was related to the fact that facets are normalized but not in the same way as the `searchableAttributes` are. We decided to normalize them further and add another intermediate database where the key is the normalized facet value, and the value is a set of the non-normalized facets. We then use these non-normalized ones to get the correct counts by fetching the associated databases.
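
A rough sketch of the intermediate lookup described above, with a toy normalization for illustration (these names are not milli's actual API):

```rust
use std::collections::{BTreeMap, BTreeSet};

// Toy stand-in for milli's normalization: lowercase plus a minimal
// diacritic fold, just enough to make the example self-contained.
fn normalize(s: &str) -> String {
    s.to_lowercase().replace('é', "e").replace('ô', "o")
}

fn main() {
    // Intermediate database: normalized facet value -> original values.
    let mut index: BTreeMap<String, BTreeSet<String>> = BTreeMap::new();
    for facet in ["Brévent", "Brô", "Brie"] {
        index.entry(normalize(facet)).or_default().insert(facet.to_string());
    }

    // A prefix search on the normalized keys now finds "Brévent" for "bre".
    let prefix = normalize("bre");
    for (normalized, originals) in index.range(prefix.clone()..) {
        if !normalized.starts_with(&prefix) {
            break;
        }
        println!("{normalized} -> {originals:?}");
    }
}
```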

### What's missing in this PR?
 - [x] Apply the change to the whole set of `SearchForFacetValue::execute` conditions.
 - [x] Factorize the code that does an intermediate normalized value fetch in a function.
 - [x] Add or modify the search for facet value test.

Co-authored-by: Clément Renault <clement@meilisearch.com>
Co-authored-by: Kerollmops <clement@meilisearch.com>
2023-07-25 14:37:17 +00:00
88559a2d54 Fix score details casing 2023-07-25 15:49:33 +02:00
59201a7852 Use snapshot instead of asserts
Co-authored-by: Many the fish <many@meilisearch.com>
2023-07-25 15:34:05 +02:00
9e3e69373e Merge #3948
3948: Fix hnsw internal panic by using another library r=ManyTheFish a=Kerollmops

This pull request fixes #3923. The issue concerns the `hnsw` crate panicking due to a wrong call to the `[T]::copy_from_slice` function.

I decided to switch the library to `instant-distance`, which is maintained [by someone trustworthy](https://lib.rs/~djc) who maintains a lot of very important crates.

- [x] Make Clippy happy with the first commit.
- [x] Reproduce the #3923 bug without this patch
- [x] Check if the bug disappeared with this PR.
- [x] Test with [the Algolia e-commerce dataset](https://www.notion.so/meilisearch/Algolia-Ecommerce-c5fa3b5f23a7485295df7e87306d5859).
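
For context, a rough usage sketch of `instant-distance` (based on its v0.6 `Builder`/`Search` API; treat the exact signatures as an assumption, and note this is not milli's actual integration):

```rust
use instant_distance::{Builder, Point, Search};

#[derive(Clone, Copy)]
struct Vec2([f32; 2]);

impl Point for Vec2 {
    // Euclidean distance between two points.
    fn distance(&self, other: &Self) -> f32 {
        let dx = self.0[0] - other.0[0];
        let dy = self.0[1] - other.0[1];
        (dx * dx + dy * dy).sqrt()
    }
}

fn main() {
    let points = vec![Vec2([0.0, 0.0]), Vec2([1.0, 1.0])];
    let values = vec!["origin", "far corner"];
    let map = Builder::default().build(points, values);

    let mut search = Search::default();
    // The iterator yields neighbours ordered by increasing distance.
    let nearest = map.search(&Vec2([0.9, 0.9]), &mut search).next().unwrap();
    println!("nearest: {}", nearest.value);
}
```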

Co-authored-by: Kerollmops <clement@meilisearch.com>
2023-07-25 13:28:25 +00:00
29ab54b259 Replace the hnsw crate by the instant-distance one 2023-07-25 12:37:35 +02:00
86d8bb3a3e Make clippy happy (again) 2023-07-25 10:30:50 +02:00
0e2a5951b4 Add more advanced tests 2023-07-24 18:04:58 +02:00
691a536893 Implement the facet search with the normalized index 2023-07-24 17:56:17 +02:00
df528b41d8 Normalize for the search the facets values 2023-07-20 17:57:07 +02:00
2452ec55b4 Merge #3940
3940: Update mini dashboard v0.2.9 r=gillian-meilisearch a=bidoubiwa

# Pull Request


## What does this PR do?
- Updates the mini-dashboard to version [0.2.9](https://github.com/meilisearch/mini-dashboard/releases/tag/v0.2.9)

## PR checklist
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Charlotte Vermandel <charlottevermandel@gmail.com>
2023-07-20 15:08:59 +00:00
54ae1b5a67 Update mini-dashboard to version 0.2.9 2023-07-20 14:11:17 +02:00
3070a20580 Merge #3937
3937: Update Charabia to the latest version r=Kerollmops a=ManyTheFish

# Pull Request

## Related issue
Fixes #3924

## What does this PR do?
- Update Charabia


Co-authored-by: ManyTheFish <many@meilisearch.com>
2023-07-19 14:57:38 +00:00
0497f93494 Update Charabia to the latest version 2023-07-19 15:19:32 +02:00
d5ab750627 Merge #3935
3935: Update mini-dashboard to version 0.2.8 r=Kerollmops a=bidoubiwa

# Pull Request


## What does this PR do?
- Updates the mini-dashboard to version [0.2.8](https://github.com/meilisearch/mini-dashboard/releases/tag/v0.2.8)

## PR checklist
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Charlotte Vermandel <charlottevermandel@gmail.com>
2023-07-18 12:59:29 +00:00
2afd10f96d Update mini-dashboard to version 0.2.8 2023-07-18 14:49:36 +02:00
2d2619bd90 Merge #3933
3933: Stop computing the update files size r=ManyTheFish a=Kerollmops

This PR, related to #3934, removes the part that computes the total size of the `data.ms/update_files` folder, which can take a lot of time when many updates must be processed.

It is not breaking on the API side, but it does change the result we show to the user: the `databaseSize` field returned by the `/stats` endpoint will be smaller.

Co-authored-by: Kerollmops <clement@meilisearch.com>
2023-07-18 12:02:08 +00:00
516d2df862 Stop computing the update files size 2023-07-18 11:51:30 +02:00
c76b488ab1 Merge #3929
3929: Fix a panic when sorting geo fields represented by strings r=Kerollmops a=Kerollmops

This PR fixes #3927 by retrieving the original string values and parsing them into f64s. I also added a test to ensure we don't break it in a future version.
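
A hedged sketch of the shape of the fix (an illustrative helper, not milli's exact code): accept both JSON numbers and numeric strings when reading `_geo` coordinates.

```rust
use serde_json::{json, Value};

// Illustrative helper: coordinates may arrive as numbers or as strings,
// so parse strings into f64 instead of panicking on them.
fn extract_coord(value: &Value) -> Option<f64> {
    match value {
        Value::Number(n) => n.as_f64(),
        Value::String(s) => s.trim().parse::<f64>().ok(),
        _ => None,
    }
}

fn main() {
    assert_eq!(extract_coord(&json!("45.47")), Some(45.47));
    assert_eq!(extract_coord(&json!(6.93)), Some(6.93));
    assert_eq!(extract_coord(&json!(null)), None);
}
```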

Co-authored-by: Kerollmops <clement@meilisearch.com>
2023-07-18 09:13:22 +00:00
d383afc82b Fix the geo sort when lat and lng are strings 2023-07-17 18:28:04 +02:00
f9d94c5845 Test geo sort with string lat/lng 2023-07-17 18:28:03 +02:00
7745cc9d3c Merge #3921
3921: Deactivate camel case segmentation r=dureuill a=ManyTheFish

# Pull Request
This PR deactivates camel-case segmentation to restore typo tolerance on camel-cased words

## Related issue
Fixes #3869
Fixes #3818

## What does this PR do?
- deactivates camelcase segmentation

related to #3919



Co-authored-by: ManyTheFish <many@meilisearch.com>
2023-07-13 11:00:14 +00:00
657f24ec5f Merge #3907
3907: Add telemetry for define field to search on at query time r=dureuill a=ManyTheFish

Add "attributes_to_search_on" telemetry usage counter:
```json
"attributes_to_search_on": {
   "total_number_of_use": 12,
},
```

This counts the number of search queries in which the user set the `attributesToSearchOn` field.

related to https://github.com/meilisearch/specifications/pull/251

## Reviewers:

- `@macraig` for validating the telemetry's name
- `@dureuill` for validating the code

Co-authored-by: ManyTheFish <many@meilisearch.com>
2023-07-13 10:14:00 +00:00
c106906f8f deactivate camelCase segmentation 2023-07-13 12:06:27 +02:00
9c0691156f Add tests 2023-07-13 11:53:13 +02:00
359b90288d Use saturating add 2023-07-13 11:38:28 +02:00
13e3f8faae Fix typo 2023-07-13 11:34:50 +02:00
fd7c66fd62 Merge #3915
3915: `attributesToSearchOn` supports wildcards r=ManyTheFish a=dureuill

# Pull Request

## Related issue

Fixes #3912  and #3911 

## What does this PR do?
- Adding `*` in the list of `attributesToSearchOn` allows searching on all the `searchableAttributes`.
- If `searchableAttributes` contains `"*"`, then any attribute is accepted in the `attributesToSearchOn` list.
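
A rough sketch of that resolution logic (a hypothetical function, not Meilisearch's internals):

```rust
// Hypothetical helper mirroring the rules above: `*` in the request means
// "all searchable attributes", and a `*` in `searchableAttributes` makes
// any requested attribute acceptable.
fn resolve_attributes_to_search_on(
    requested: &[&str],
    searchable: &[&str],
) -> Result<Vec<String>, String> {
    if requested.contains(&"*") {
        return Ok(searchable.iter().map(|s| s.to_string()).collect());
    }
    for attr in requested {
        if !searchable.iter().any(|s| *s == "*" || s == attr) {
            return Err(format!("attribute `{attr}` is not searchable"));
        }
    }
    Ok(requested.iter().map(|s| s.to_string()).collect())
}
```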


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-13 09:33:10 +00:00
183f23f40d More relevant test
Co-authored-by: Many the fish <many@meilisearch.com>
2023-07-12 16:06:15 +02:00
16c8437b28 Update tests 2023-07-12 11:21:19 +02:00
4310928803 Fixes #3912 2023-07-12 10:08:56 +02:00
74315b4ea8 Fixes #3911 2023-07-12 10:08:29 +02:00
177e6e27f9 Merge #3901
3901: Fix experimental analytics r=curquiza a=dureuill

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/specifications/pull/250#discussion_r1253191583

## What does this PR do?
- `snake_case` instead of `camelCase` for feature fields


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-10 16:22:59 +00:00
50afe724ae Merge #3909
3909: Effectively send the `vector.max_vector_size` telemetry r=curquiza a=Kerollmops

This PR effectively aggregates and sends the `vector.max_vector_size` analytics value.

Co-authored-by: Kerollmops <clement@meilisearch.com>
2023-07-10 15:44:30 +00:00
012c960fad Send the vector.max_vector_size telemetry 2023-07-10 16:50:37 +02:00
76f6d3357e Merge #3908
3908: Allow a comma-separated value to the `vector` argument in GET search r=Kerollmops a=dureuill

# Pull Request

For request:

```
 curl \
  -X GET 'http://localhost:7700/indexes/movies/search?vector=0.123,1.124,244'
```

Before PR: 

```
{"message":"Invalid value type for parameter `vector`: expected a string, but found a string: `0,1,2`","code":"invalid_search_vector","type":"invalid_request","link":"https://docs.meilisearch.com/errors#invalid_search_vector"}%
```

After PR:

```
{"hits":[],"query":"","vector":[0.123,1.124,244.0],"processingTimeMs":0,"limit":20,"offset":0,"estimatedTotalHits":1000}%
```
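
The comma-separated form presumably just needs splitting and float parsing; a minimal sketch of such a parser (a hypothetical helper, not the actual deserr-based code):

```rust
// Hypothetical parser for the GET-style `vector` parameter.
fn parse_vector(raw: &str) -> Result<Vec<f32>, std::num::ParseFloatError> {
    raw.split(',').map(|part| part.trim().parse::<f32>()).collect()
}

fn main() {
    assert_eq!(parse_vector("0.123,1.124,244").unwrap(), vec![0.123, 1.124, 244.0]);
}
```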

cc `@gmourier` `@bidoubiwa` 


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-10 14:25:44 +00:00
d59e969c16 Allow a comma-separated value to the vector argument in GET search 2023-07-10 16:16:34 +02:00
eb7a1aa7af Merge #3904
3904: Sort by lexicographic order after normalization r=dureuill a=dureuill

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/3893

## What does this PR do?
- Re-sort stop words after normalization so they're not sent out-of-order to the FST
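
A minimal sketch of why the re-sort matters, assuming the `fst` crate's `SetBuilder` (which requires keys in lexicographic order and fails on out-of-order inserts):

```rust
use fst::SetBuilder;

// Normalization can change a word's sort position, so the list must be
// re-sorted (and deduplicated) before feeding it to the FST builder.
fn build_stop_words(words: &[&str]) -> Result<fst::Set<Vec<u8>>, fst::Error> {
    let mut normalized: Vec<String> = words.iter().map(|w| w.to_lowercase()).collect();
    normalized.sort();
    normalized.dedup();

    let mut builder = SetBuilder::memory();
    for word in &normalized {
        builder.insert(word)?;
    }
    Ok(builder.into_set())
}
```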


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-10 12:12:05 +00:00
c30a14cb97 Add telemetry 2023-07-10 13:12:12 +02:00
a3ca8412ce Merge #3906
3906: Add "scoring.*" analytics to multi search route r=Kerollmops a=dureuill

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/specifications/pull/252#discussion_r1254375746 by implementing (3): multi search now returns the "score.show_ranking_rule" and "score.show_ranking_rule_details" analytics.


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-10 09:51:30 +00:00
106f98aa72 Add "scoring.*" analytics to multi search route 2023-07-10 11:45:43 +02:00
40fa59d64c Sort by lexicographic order after normalization 2023-07-10 09:26:59 +02:00
bb40ce6e35 Experimental features analytics match the spec 2023-07-10 08:57:53 +02:00
0c8dbf6fa6 Merge #3897
3897: Add automated tests for `/experimental-features` route r=Kerollmops a=dureuill

# Pull Request

## What does this PR do?
- Make `RuntimeTogglableFeatures` `Eq`
- Add various tests for the `/experimental-features` route
  - Integration tests for the route itself
  - Integration tests for the effect of enabling `scoreDetails` and `vectorStore` through this route.
  - Dump integration tests


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-06 13:37:56 +00:00
dd6519b64f Dump tests 2023-07-06 14:22:29 +02:00
da02a9cf32 Make RuntimeTogglableFeatures Eq 2023-07-06 14:20:58 +02:00
ff192bb480 Merge #3889
3889: Display the total number of tasks matching a filter/query r=dureuill a=Kerollmops

This PR adds a new field to the `/tasks` routes. The `total` field exposes the total number of tasks that match the given filter/query. It is useful for displaying information in a user interface and can help to understand when progress is made in processing tasks; for example, the total number of tasks on `/tasks?statuses=succeeded` will increase over time.

Fixes #3888.

- [ ] Update the specs for the `/tasks` route.

## How have I implemented it?

I found it much easier to run the task filtering system twice: once with the original `from` and `limit` parameters, and a second time without them. The second call returns the total number of tasks that match the query, not only the number of tasks on the current page.

So far, there doesn't seem to be any performance issue. I tried different filters with something like 250k tasks. Note that there is a limit of 1M tasks in the queue.
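
In outline, the approach looks like this (a simplified sketch with hypothetical types; the real code treats `from` as a task-id cursor rather than an offset):

```rust
#[derive(Clone, Default)]
struct Query {
    from: Option<usize>,
    limit: Option<usize>,
}

// First filtering pass, honouring pagination.
fn get_task_ids(query: &Query, all_matching: &[u32]) -> Vec<u32> {
    all_matching
        .iter()
        .copied()
        .skip(query.from.unwrap_or(0))
        .take(query.limit.unwrap_or(usize::MAX))
        .collect()
}

// Second pass without `from`/`limit` yields the total for the filter.
fn tasks_with_total(query: &Query, all_matching: &[u32]) -> (Vec<u32>, u64) {
    let total = get_task_ids(&Query::default(), all_matching).len() as u64;
    (get_task_ids(query, all_matching), total)
}
```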

Co-authored-by: Clément Renault <clement@meilisearch.com>
2023-07-06 10:23:09 +00:00
22762808ab Fix the tests 2023-07-06 12:13:29 +02:00
86b834c9e4 Display the total number of tasks in the tasks route 2023-07-06 10:05:18 +02:00
2d3cec11a7 Search integration test to check score details and vector store 2023-07-06 09:02:02 +02:00
76e1ee9988 integration test on "/experimental-features" route 2023-07-06 09:01:28 +02:00
222615d3df Allow to get/set features in integration test server 2023-07-06 09:01:05 +02:00
11d024c613 Authentication tests 2023-07-06 09:00:51 +02:00
886c8bb647 Merge #3891
3891: Fix the way we compute the 99th percentile r=dureuill a=Kerollmops

This PR fixes how we compute the 99th percentile: we avoid floats and do the multiplication and division in the correct order, which keeps the index inside the buffer of timings. You can see the issue on [this rust playground](https://play.rust-lang.org/?version=stable&mode=debug&edition=2021).

When there is a very small number of successful requests, the 99th-percentile calculation sometimes yields an index outside the buffer. In this example, the `1`/`1.0` represents the number of timings collected (one). As you can see, the float computation gives us the index `1.0`, which is out of bounds for a vector of only one value. This makes the engine generate a `null` value.

```rust
1 * 99 / 100 = 0 // with integers
0.99_f64 * (1.0 - 1.0) + 1.0 = 1.0 // with floats
```
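
A minimal sketch of the integer version (assuming a sorted `timings` buffer; the index formula mirrors the one above):

```rust
// With integer arithmetic, multiplying before dividing keeps the index
// inside the buffer even when only one timing was collected.
fn percentile_99(sorted_timings: &[u64]) -> Option<u64> {
    if sorted_timings.is_empty() {
        return None;
    }
    let index = sorted_timings.len() * 99 / 100; // 1 * 99 / 100 == 0
    sorted_timings.get(index).copied()
}

fn main() {
    assert_eq!(percentile_99(&[42]), Some(42));
    assert_eq!(percentile_99(&[]), None);
}
```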

Co-authored-by: Clément Renault <clement@meilisearch.com>
2023-07-06 06:04:08 +00:00
b422e5fdc3 Merge #3890
3890: Fix the analytics of the sort facet values by count feature r=dureuill a=Kerollmops

This PR ensures we return the right analytics from the settings route.

Co-authored-by: Clément Renault <clement@meilisearch.com>
2023-07-06 05:24:40 +00:00
d727ebee05 Fix the way we compute the 99th percentile 2023-07-05 17:53:09 +02:00
da39a7b29e Return the right analytics 2023-07-05 17:27:51 +02:00
377fe33aac Merge #3885
3885: Exactness missing field r=dureuill a=dureuill

# Pull Request

Adds fields to the score details that were [specified](c25d758264/text/0195-ranking-score.md (322-ranking-rule-specific-fields)) but missing from the implementation:

- `exactness.matchingWords`
- `exactness.maxMatchingWords` 


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-04 15:14:53 +00:00
55cd7738b9 Update snapshots 2023-07-04 16:31:01 +02:00
48409c9183 Add missing exactness.matchingWords, exactness.maxMatchingWords 2023-07-04 16:31:01 +02:00
82650eaae1 Merge #3877
3877: update the total_received properties of multiple events r=dureuill a=dureuill

# Pull Request

## Related issue
Fixes #3814 

## What does this PR do?
- fix name of `total_received` for several events


Co-authored-by: Tamo <tamo@meilisearch.com>
2023-07-03 19:49:53 +00:00
b8ca09c13f Merge #3878
3878: Remove unsafe `atty` dependency r=dureuill a=Kerollmops

This PR replaces the `atty` dependency with the `is-terminal` one. We do that to fix GHSA-g98v-hv3f-hcfr.
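
The replacement is roughly as follows (a sketch assuming the `is-terminal` crate's `IsTerminal` trait):

```rust
use is_terminal::IsTerminal as _;

fn main() {
    // Previously: atty::is(atty::Stream::Stdout)
    if std::io::stdout().is_terminal() {
        println!("stdout is a terminal");
    }
}
```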

Co-authored-by: Kerollmops <clement@meilisearch.com>
2023-07-03 19:07:03 +00:00
a442af6a7c Update the features of the either dependency to compile milli successfully 2023-07-03 18:51:43 +02:00
e7f8daaf86 Update criterion to 0.5.1 to remove the atty dependency 2023-07-03 18:51:42 +02:00
d1ff631df8 Replace the atty dependency with the is-terminal one 2023-07-03 18:51:42 +02:00
202183adf8 update the total_received properties of multiple events 2023-07-03 15:57:09 +02:00
aae099e330 Merge #3851
3851: Expose lastUpdate and isIndexing in /stats endpoint r=dureuill a=gentcys

# Pull Request

## Related issue
Fixes #3843

## What does this PR do?
- expose `lastUpdate` in the `/stats` endpoint
- expose `isIndexing` in the `/stats` endpoint
- add a method `is_task_processing` in index-scheduler/src/lib.rs.

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Cong Chen <cong.chen@ocrlabs.com>
Co-authored-by: ManyTheFish <many@meilisearch.com>
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-07-03 13:41:04 +00:00
5387cf1718 Don't unwrap in case of error/missing last_update field 2023-07-03 15:32:11 +02:00
71500a4e15 Update tests 2023-07-03 11:20:43 +02:00
9859e65d2f fix tests 2023-07-01 09:32:50 +08:00
3bdf01bc1c Fix failed test 2023-06-30 17:39:23 +08:00
a5a31667b0 fix converse result of is_task_processing() 2023-06-30 11:28:18 +08:00
e3fc7112bc use RoaringBitmap::is_empty instead 2023-06-29 11:46:47 +08:00
6d4981ec25 Expose lastUpdate and isIndexing in /stats endpoint 2023-06-23 07:24:25 +08:00
163 changed files with 2242 additions and 1072 deletions

Cargo.lock generated
View File

@ -8,7 +8,7 @@ version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "617a8268e3537fe1d8c9ead925fca49ef6400927ee7bc26750e90ecee14ce4b8"
dependencies = [
"bitflags",
"bitflags 1.3.2",
"bytes",
"futures-core",
"futures-sink",
@ -47,7 +47,7 @@ dependencies = [
"actix-utils",
"ahash 0.8.3",
"base64 0.21.2",
"bitflags",
"bitflags 1.3.2",
"brotli",
"bytes",
"bytestring",
@ -405,7 +405,7 @@ checksum = "16e62a023e7c117e27523144c5d2459f4397fcc3cab0085af8e2224f643a0193"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -416,7 +416,7 @@ checksum = "b9ccdd8f2a161be9bd5c023df56f1b2a0bd1d83872ae53b71a84a12c9bf6e842"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -428,17 +428,6 @@ dependencies = [
"critical-section",
]
[[package]]
name = "atty"
version = "0.2.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d9b39be18770d11421cdb1b9947a45dd3f37e93092cbf377614828a319d5fee8"
dependencies = [
"hermit-abi 0.1.19",
"libc",
"winapi",
]
[[package]]
name = "autocfg"
version = "1.1.0"
@ -527,6 +516,12 @@ version = "1.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
[[package]]
name = "bitflags"
version = "2.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "630be753d4e58660abd17930c71b647fe46c27ea6b63cc59e1e3851406972e42"
[[package]]
name = "block-buffer"
version = "0.10.4"
@ -608,7 +603,7 @@ checksum = "fdde5c9cd29ebd706ce1b35600920a33550e402fc998a2e53ad3b42c3c47a192"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -705,16 +700,15 @@ dependencies = [
[[package]]
name = "charabia"
version = "0.8.1"
version = "0.8.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bb49850f555eb71aa6fc6d4d79420e81f4d89fa56e0e9c0f6d19aace2f56c554"
checksum = "57aa1b4a8dda126c03ebf2f7e31d16cfc8781c2fe80dedd1a33459efc3e07578"
dependencies = [
"aho-corasick",
"cow-utils",
"csv",
"deunicode",
"either",
"finl_unicode",
"fst",
"irg-kvariants",
"jieba-rs",
@ -767,18 +761,6 @@ dependencies = [
"inout",
]
[[package]]
name = "clap"
version = "3.2.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4ea181bf566f71cb9a5d17a59e1871af638180a18fb0035c92ae62b705207123"
dependencies = [
"bitflags",
"clap_lex 0.2.4",
"indexmap",
"textwrap",
]
[[package]]
name = "clap"
version = "4.3.0"
@ -798,8 +780,8 @@ checksum = "4f423e341edefb78c9caba2d9c7f7687d0e72e89df3ce3394554754393ac3990"
dependencies = [
"anstream",
"anstyle",
"bitflags",
"clap_lex 0.5.0",
"bitflags 1.3.2",
"clap_lex",
"strsim",
]
@ -812,16 +794,7 @@ dependencies = [
"heck",
"proc-macro2",
"quote",
"syn 2.0.18",
]
[[package]]
name = "clap_lex"
version = "0.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2850f2f5a82cbf437dd5af4d49848fbdfc27c157c3d010345776f952765261c5"
dependencies = [
"os_str_bytes",
"syn 2.0.26",
]
[[package]]
@ -929,19 +902,19 @@ dependencies = [
[[package]]
name = "criterion"
version = "0.4.0"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e7c76e09c1aae2bc52b3d2f29e13c6572553b30c4aa1b8a49fd70de6412654cb"
checksum = "f2b12d017a929603d80db1831cd3a24082f8137ce19c69e6447f54f5fc8d692f"
dependencies = [
"anes",
"atty",
"cast",
"ciborium",
"clap 3.2.25",
"clap",
"criterion-plot",
"is-terminal",
"itertools",
"lazy_static",
"num-traits",
"once_cell",
"oorandom",
"plotters",
"rayon",
@ -1048,9 +1021,9 @@ dependencies = [
[[package]]
name = "csv"
version = "1.2.1"
version = "1.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b015497079b9a9d69c02ad25de6c0a6edef051ea6360a327d0bd05802ef64ad"
checksum = "626ae34994d3d8d668f4269922248239db4ae42d538b14c398b74a52208e8086"
dependencies = [
"csv-core",
"itoa",
@ -1224,12 +1197,6 @@ dependencies = [
"winapi",
]
[[package]]
name = "doc-comment"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fea41bba32d969b513997752735605054bc0dfa92b4c56bf1189f2e174be7a10"
[[package]]
name = "dump"
version = "1.3.0"
@ -1369,7 +1336,7 @@ checksum = "eecf8589574ce9b895052fa12d69af7a233f99e6107f5cb8dd1044f2a17bfdcb"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -1385,6 +1352,12 @@ dependencies = [
"termcolor",
]
[[package]]
name = "equivalent"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5443807d6dff69373d433ab9ef5378ad8df50ca6298caf15de6e52e24aaf54d5"
[[package]]
name = "errno"
version = "0.3.1"
@ -1469,12 +1442,6 @@ dependencies = [
"nom_locate",
]
[[package]]
name = "finl_unicode"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8fcfdc7a0362c9f4444381a9e697c79d435fe65b52a37466fc2c1184cee9edc6"
[[package]]
name = "flate2"
version = "1.0.26"
@ -1570,7 +1537,7 @@ checksum = "89ca545a94061b6365f2c7355b4b32bd20df3ff95f02da9329b34ccc3bd6ee72"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -1608,7 +1575,7 @@ name = "fuzzers"
version = "1.3.0"
dependencies = [
"arbitrary",
"clap 4.3.0",
"clap",
"fastrand",
"milli",
"serde",
@ -1676,7 +1643,7 @@ version = "0.16.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ccf7f68c2995f392c49fffb4f95ae2c873297830eb25c6bc4c114ce8f4562acc"
dependencies = [
"bitflags",
"bitflags 1.3.2",
"libc",
"libgit2-sys",
"log",
@ -1712,7 +1679,7 @@ dependencies = [
"futures-sink",
"futures-util",
"http",
"indexmap",
"indexmap 1.9.3",
"slab",
"tokio",
"tokio-util",
@ -1734,15 +1701,6 @@ dependencies = [
"byteorder",
]
[[package]]
name = "hashbrown"
version = "0.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab5ef0d4909ef3724cc8cce6ccc8572c5c817592e9285f5464f8e86f8bd3726e"
dependencies = [
"ahash 0.7.6",
]
[[package]]
name = "hashbrown"
version = "0.12.3"
@ -1752,6 +1710,12 @@ dependencies = [
"ahash 0.7.6",
]
[[package]]
name = "hashbrown"
version = "0.14.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2c6201b9ff9fd90a5a3bac2e56a830d0caa509576f0e503818ee82c181b3437a"
[[package]]
name = "heapless"
version = "0.7.16"
@ -1773,8 +1737,8 @@ checksum = "95505c38b4572b2d910cecb0281560f54b440a19336cbbcb27bf6ce6adc6f5a8"
[[package]]
name = "heed"
version = "0.12.5"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.6#8c5b94225fc949c02bb7b900cc50ffaf6b584b1e"
version = "0.12.7"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.7#061a5276b1f336f5f3302bee291e336041d88632"
dependencies = [
"byteorder",
"heed-traits",
@ -1791,12 +1755,12 @@ dependencies = [
[[package]]
name = "heed-traits"
version = "0.7.0"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.6#8c5b94225fc949c02bb7b900cc50ffaf6b584b1e"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.7#061a5276b1f336f5f3302bee291e336041d88632"
[[package]]
name = "heed-types"
version = "0.7.2"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.6#8c5b94225fc949c02bb7b900cc50ffaf6b584b1e"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.7#061a5276b1f336f5f3302bee291e336041d88632"
dependencies = [
"bincode",
"heed-traits",
@ -1805,15 +1769,6 @@ dependencies = [
"zerocopy",
]
[[package]]
name = "hermit-abi"
version = "0.1.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62b467343b94ba476dcb2500d242dadbb39557df889310ac77c5d99100aaac33"
dependencies = [
"libc",
]
[[package]]
name = "hermit-abi"
version = "0.2.6"
@ -1844,22 +1799,6 @@ dependencies = [
"digest",
]
[[package]]
name = "hnsw"
version = "0.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b9740ebf8769ec4ad6762cc951ba18f39bba6dfbc2fbbe46285f7539af79752"
dependencies = [
"ahash 0.7.6",
"hashbrown 0.11.2",
"libm",
"num-traits",
"rand_core",
"serde",
"smallvec",
"space",
]
[[package]]
name = "http"
version = "0.2.9"
@ -1994,6 +1933,16 @@ dependencies = [
"serde",
]
[[package]]
name = "indexmap"
version = "2.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5477fe2230a79769d8dc68e0eabf5437907c0457a5614a9e8dddb67f65eb65d"
dependencies = [
"equivalent",
"hashbrown 0.14.0",
]
[[package]]
name = "inout"
version = "0.1.3"
@ -2028,6 +1977,21 @@ dependencies = [
"cfg-if",
]
[[package]]
name = "instant-distance"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c619cdaa30bb84088963968bee12a45ea5fbbf355f2c021bcd15589f5ca494a"
dependencies = [
"num_cpus",
"ordered-float",
"parking_lot",
"rand",
"rayon",
"serde",
"serde-big-array",
]
[[package]]
name = "io-lifetimes"
version = "1.0.11"
@ -2058,13 +2022,12 @@ dependencies = [
[[package]]
name = "is-terminal"
version = "0.4.7"
version = "0.4.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "adcf93614601c8129ddf72e2d5633df827ba6551541c6d8c59520a371475be1f"
checksum = "24fddda5af7e54bf7da53067d6e802dbcc381d0a8eef629df528e3ebf68755cb"
dependencies = [
"hermit-abi 0.3.1",
"io-lifetimes",
"rustix 0.37.19",
"rustix 0.38.2",
"windows-sys 0.48.0",
]
@ -2161,9 +2124,9 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.144"
version = "0.2.147"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b00cc1c228a6782d0f076e7b232802e0c5689d41bb5df366f2a6b6621cfdfe1"
checksum = "b4668fb0ea861c1df094127ac5f1da3409a82116a4ba74fca2e58ef927159bb3"
[[package]]
name = "libgit2-sys"
@ -2207,9 +2170,9 @@ dependencies = [
[[package]]
name = "lindera-cc-cedict-builder"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c6bf79b29a90bcd22036e494d6cc9ac3abe9ab604b21f3258ba6dc1ce501801"
checksum = "2d2e8f2ca97ddf952fe340642511b9c14b373cb2eef711d526bb8ef2ca0969b8"
dependencies = [
"anyhow",
"bincode",
@ -2226,9 +2189,9 @@ dependencies = [
[[package]]
name = "lindera-compress"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8f2e99e67736352bbb6ed1c273643975822505067ca32194b0981040bc50527a"
checksum = "f72b460559bcbe8a9cee85ea4a5056133ed3abf373031191589236e656d65b59"
dependencies = [
"anyhow",
"flate2",
@ -2237,9 +2200,9 @@ dependencies = [
[[package]]
name = "lindera-core"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c3935e966409156f22cb4b334b21b0dce84b7aa1cad62214b466489d249c8e5"
checksum = "f586eb8a9393c32d5525e0e9336a3727bd1329674740097126f3b0bff8a1a1ea"
dependencies = [
"anyhow",
"bincode",
@ -2254,9 +2217,9 @@ dependencies = [
[[package]]
name = "lindera-decompress"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7476406abb63c49d7f59c88b9b868ee8d2981495ea7e2c3ad129902f9916b3c6"
checksum = "1fb1facd8da698072fcc7338bd757730db53d59f313f44dd583fa03681dcc0e1"
dependencies = [
"anyhow",
"flate2",
@ -2265,9 +2228,9 @@ dependencies = [
[[package]]
name = "lindera-dictionary"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "808b7d2b3cabc25a4022526d484a4cfd1d5924dc76a26e0379707698841acef2"
checksum = "ec7be7410b1da7017a8948986b87af67082f605e9a716f0989790d795d677f0c"
dependencies = [
"anyhow",
"bincode",
@ -2285,9 +2248,9 @@ dependencies = [
[[package]]
name = "lindera-ipadic-builder"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "31f373a280958c930e5ee4a1e4db3a0ee0542afaf02d3b5cacb8cab4e298648e"
checksum = "705d07f8a45d04fd95149f7ad41a26d1f9e56c9c00402be6f9dd05e3d88b99c6"
dependencies = [
"anyhow",
"bincode",
@ -2306,9 +2269,9 @@ dependencies = [
[[package]]
name = "lindera-ipadic-neologd-builder"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "92eff98e9ed1a7a412b91709c2343457a04ef02fa0c27c27e3a5892f5591eae9"
checksum = "633a93983ba13fba42328311a501091bd4a7aff0c94ae9eaa9d4733dd2b0468a"
dependencies = [
"anyhow",
"bincode",
@ -2327,9 +2290,9 @@ dependencies = [
[[package]]
name = "lindera-ko-dic"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "74c6d5bf7d8092bd6d10de7a5d74b70ea7cf234586235b0d6cdb903b05a6c9e2"
checksum = "a428e0d316b6c86f51bd919479692bc41ad840dba266ebc044663970f431ea18"
dependencies = [
"bincode",
"byteorder",
@ -2344,9 +2307,9 @@ dependencies = [
[[package]]
name = "lindera-ko-dic-builder"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f0a4add6d3c1e41ec9e2690d33e287d0223fb59a30ccee4980c23f31368cae1e"
checksum = "2a5288704c6b8a069c0a1705c38758e836497698b50453373ab3d56c6f9a7ef8"
dependencies = [
"anyhow",
"bincode",
@ -2364,9 +2327,9 @@ dependencies = [
[[package]]
name = "lindera-tokenizer"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cb6a8acbd068019d1cdac7316f0dcb87f8e33ede2b13aa237f45114f9750afb8"
checksum = "106ba439b2e87529d9bbedbb88d69f635baba1195c26502b308f55a85885fc81"
dependencies = [
"bincode",
"byteorder",
@ -2379,9 +2342,9 @@ dependencies = [
[[package]]
name = "lindera-unidic"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "14abf0613d350b30d3b0406a33b1de8fa8d829f26516909421702174785991c8"
checksum = "3399b6dcfe1701333451d184ff3c677f433b320153427b146360c9e4bd8cb816"
dependencies = [
"bincode",
"byteorder",
@ -2396,9 +2359,9 @@ dependencies = [
[[package]]
name = "lindera-unidic-builder"
version = "0.25.0"
version = "0.27.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e204ed53d9bd63227d1e6a6c1f122ca039e00a8634ac32e7fb0281eeec8615c4"
checksum = "b698227fdaeac32289173ab389b990d4eb00a40cbc9912020f69a0c491dabf55"
dependencies = [
"anyhow",
"bincode",
@ -2432,6 +2395,12 @@ version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ef53942eb7bf7ff43a617b3e2c1c4a5ecf5944a7c1bc12d7ee39bbb15e5c1519"
[[package]]
name = "linux-raw-sys"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09fc20d2ca12cb9f044c93e3bd6d32d523e6e2ec3db4f7b2939cd99026ecd3f0"
[[package]]
name = "lmdb-rkv-sys"
version = "0.15.1"
@ -2472,9 +2441,9 @@ dependencies = [
[[package]]
name = "log"
version = "0.4.18"
version = "0.4.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "518ef76f2f87365916b142844c16d8fefd85039bc5699050210a7778ee1cd1de"
checksum = "b06a4cde4c0f271a446782e3eff8de789548ce57dbc8eca9292c27f4a42004b4"
[[package]]
name = "logging_timer"
@ -2507,7 +2476,7 @@ dependencies = [
"once_cell",
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -2545,13 +2514,12 @@ dependencies = [
"assert-json-diff",
"async-stream",
"async-trait",
"atty",
"brotli",
"bstr",
"byte-unit",
"bytes",
"cargo_toml",
"clap 4.3.0",
"clap",
"crossbeam-channel",
"deserr",
"dump",
@ -2565,8 +2533,9 @@ dependencies = [
"hex",
"http",
"index-scheduler",
"indexmap",
"indexmap 1.9.3",
"insta",
"is-terminal",
"itertools",
"jsonwebtoken",
"lazy_static",
@ -2716,9 +2685,9 @@ dependencies = [
"geoutils",
"grenad",
"heed",
"hnsw",
"indexmap",
"indexmap 1.9.3",
"insta",
"instant-distance",
"itertools",
"json-depth-checker",
"levenshtein_automata",
@ -2742,7 +2711,6 @@ dependencies = [
"smallstr",
"smallvec",
"smartstring",
"space",
"tempfile",
"thiserror",
"time",
@ -2903,9 +2871,9 @@ checksum = "f69e48cd7c8e5bb52a1da1287fdbfd877c32673176583ce664cd63b201aba385"
[[package]]
name = "once_cell"
version = "1.17.1"
version = "1.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b7e5500299e16ebb147ae15a00a942af264cf3688f47923b8fc2cd5858f23ad3"
checksum = "dd8b5dd2ae5ed71462c540258bedcb51965123ad7e7ccf4b9a8cafaa4a63576d"
[[package]]
name = "oorandom"
@ -2922,12 +2890,6 @@ dependencies = [
"num-traits",
]
[[package]]
name = "os_str_bytes"
version = "6.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ceedf44fb00f2d1984b0bc98102627ce622e083e49a5bacdb3e514fa4238e267"
[[package]]
name = "page_size"
version = "0.4.2"
@ -3068,7 +3030,7 @@ dependencies = [
"pest_meta",
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -3213,9 +3175,9 @@ dependencies = [
[[package]]
name = "proc-macro2"
version = "1.0.59"
version = "1.0.66"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6aeca18b86b413c660b781aa319e4e2648a3e6f9eadc9b47e9038e6fe9f3451b"
checksum = "18fb31db3f9bddb2ea821cde30a9f70117e3f119938b5ee630b7403aa6e2ead9"
dependencies = [
"unicode-ident",
]
@ -3226,7 +3188,7 @@ version = "0.14.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b1de8dacb0873f77e6aefc6d71e044761fcc68060290f5b1089fcdf84626bb69"
dependencies = [
"bitflags",
"bitflags 1.3.2",
"byteorder",
"hex",
"lazy_static",
@ -3258,9 +3220,9 @@ checksum = "106dd99e98437432fed6519dedecfade6a06a73bb7b2a1e019fdd2bee5778d94"
[[package]]
name = "quote"
version = "1.0.28"
version = "1.0.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b9ab9c7eadfd8df19006f1cf1a4aed13540ed5cbc047010ece5826e10825488"
checksum = "5fe8a65d69dd0808184ebb5f836ab526bb259db23c657efa38711b1072ee47f0"
dependencies = [
"proc-macro2",
]
@ -3333,7 +3295,7 @@ version = "0.2.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fb5a58c1855b4b6819d59012155603f0b22ad30cad752600aadfcb695265519a"
dependencies = [
"bitflags",
"bitflags 1.3.2",
]
[[package]]
@ -3342,7 +3304,7 @@ version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "567664f262709473930a4bf9e51bf2ebf3348f2e748ccc50dea20646858f8f29"
dependencies = [
"bitflags",
"bitflags 1.3.2",
]
[[package]]
@ -3484,7 +3446,7 @@ version = "0.36.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "14e4d67015953998ad0eb82887a0eb0129e18a7e2f3b7b0f6c422fddcd503d62"
dependencies = [
"bitflags",
"bitflags 1.3.2",
"errno",
"io-lifetimes",
"libc",
@ -3498,7 +3460,7 @@ version = "0.37.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "acf8729d8542766f1b2cf77eb034d52f40d375bb8b615d0b147089946e16613d"
dependencies = [
"bitflags",
"bitflags 1.3.2",
"errno",
"io-lifetimes",
"libc",
@ -3506,6 +3468,19 @@ dependencies = [
"windows-sys 0.48.0",
]
[[package]]
name = "rustix"
version = "0.38.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aabcb0461ebd01d6b79945797c27f8529082226cb630a9865a71870ff63532a4"
dependencies = [
"bitflags 2.3.3",
"errno",
"libc",
"linux-raw-sys 0.4.3",
"windows-sys 0.48.0",
]
[[package]]
name = "rustls"
version = "0.20.8"
@ -3608,13 +3583,22 @@ checksum = "bebd363326d05ec3e2f532ab7660680f3b02130d780c299bca73469d521bc0ed"
[[package]]
name = "serde"
version = "1.0.163"
version = "1.0.171"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2113ab51b87a539ae008b5c6c02dc020ffa39afd2d83cffcb3f4eb2722cebec2"
checksum = "30e27d1e4fd7659406c492fd6cfaf2066ba8773de45ca75e855590f856dc34a9"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde-big-array"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "11fc7cc2c76d73e0f27ee52abbd64eec84d46f370c88371120433196934e4b7f"
dependencies = [
"serde",
]
[[package]]
name = "serde-cs"
version = "0.2.4"
@ -3626,22 +3610,22 @@ dependencies = [
[[package]]
name = "serde_derive"
version = "1.0.163"
version = "1.0.171"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c805777e3930c8883389c602315a24224bcc738b63905ef87cd1420353ea93e"
checksum = "389894603bd18c46fa56231694f8d827779c0951a667087194cf9de94ed24682"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
name = "serde_json"
version = "1.0.96"
version = "1.0.103"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "057d394a50403bcac12672b2b18fb387ab6d289d957dab67dd201875391e52f1"
checksum = "d03b412469450d4404fe8499a268edd7f8b79fecb074b0d812ad64ca21f4031b"
dependencies = [
"indexmap",
"indexmap 2.0.0",
"itoa",
"ryu",
"serde",
@ -3764,9 +3748,6 @@ name = "smallvec"
version = "1.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a507befe795404456341dfab10cef66ead4c041f62b8b11bbb92bffe5d0953e0"
dependencies = [
"serde",
]
[[package]]
name = "smartstring"
@ -3789,16 +3770,6 @@ dependencies = [
"winapi",
]
[[package]]
name = "space"
version = "0.17.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c5ab9701ae895386d13db622abf411989deff7109b13b46b6173bb4ce5c1d123"
dependencies = [
"doc-comment",
"num-traits",
]
[[package]]
name = "spin"
version = "0.5.2"
@ -3862,9 +3833,9 @@ dependencies = [
[[package]]
name = "syn"
version = "2.0.18"
version = "2.0.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32d41677bcbe24c20c52e7c70b0d8db04134c5d1066bf98662e2871ad200ea3e"
checksum = "45c3457aacde3c65315de5031ec191ce46604304d2446e803d71ade03308d970"
dependencies = [
"proc-macro2",
"quote",
@ -3949,30 +3920,24 @@ dependencies = [
"winapi-util",
]
[[package]]
name = "textwrap"
version = "0.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "222a222a5bfe1bba4a77b45ec488a741b3cb8872e5e499451fd7d0129c9c7c3d"
[[package]]
name = "thiserror"
version = "1.0.40"
version = "1.0.43"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "978c9a314bd8dc99be594bc3c175faaa9794be04a5a5e153caba6915336cebac"
checksum = "a35fc5b8971143ca348fa6df4f024d4d55264f3468c71ad1c2f365b0a4d58c42"
dependencies = [
"thiserror-impl",
]
[[package]]
name = "thiserror-impl"
version = "1.0.40"
version = "1.0.43"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f9456a42c5b0d803c8cd86e73dd7cc9edd429499f37a3550d286d5e86720569f"
checksum = "463fe12d7993d3b327787537ce8dd4dfa058de32fc2b195ef3cde03dc4771e8f"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -4054,7 +4019,7 @@ checksum = "630bdcf245f78637c13ec01ffae6187cca34625e8c63150d424b59e55af2675e"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
]
[[package]]
@ -4130,7 +4095,7 @@ version = "0.19.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2380d56e8670370eee6566b0bfd4265f65b3f432e8c6d85623f728d4fa31f739"
dependencies = [
"indexmap",
"indexmap 1.9.3",
"serde",
"serde_spanned",
"toml_datetime",
@ -4379,7 +4344,7 @@ dependencies = [
"once_cell",
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
"wasm-bindgen-shared",
]
@ -4413,7 +4378,7 @@ checksum = "e128beba882dd1eb6200e1dc92ae6c5dbaa4311aa7bb211ca035779e5efc39f8"
dependencies = [
"proc-macro2",
"quote",
"syn 2.0.18",
"syn 2.0.26",
"wasm-bindgen-backend",
"wasm-bindgen-shared",
]

View File

@ -61,7 +61,7 @@ You may also want to check out [Meilisearch 101](https://www.meilisearch.com/doc
## ⚡ Supercharge your Meilisearch experience
Say goodbye to server deployment and manual updates with [Meilisearch Cloud](https://www.meilisearch.com/pricing?utm_campaign=oss&utm_source=engine&utm_medium=meilisearch). No credit card required.
Say goodbye to server deployment and manual updates with [Meilisearch Cloud](https://www.meilisearch.com/cloud?utm_campaign=oss&utm_source=github&utm_medium=meilisearch). No credit card required.
## 🧰 SDKs & integration tools

View File

@ -18,7 +18,7 @@ mimalloc = { version = "0.1.36", default-features = false }
serde_json = { version = "1.0.95", features = ["preserve_order"] }
[dev-dependencies]
criterion = { version = "0.4.0", features = ["html_reports"] }
criterion = { version = "0.5.1", features = ["html_reports"] }
rand = "0.8.5"
rand_chacha = "0.3.1"
roaring = "0.10.1"

View File

@ -210,6 +210,7 @@ pub(crate) mod test {
use big_s::S;
use maplit::{btreemap, btreeset};
use meilisearch_types::facet_values_sort::FacetValuesSort;
use meilisearch_types::features::RuntimeTogglableFeatures;
use meilisearch_types::index_uid_pattern::IndexUidPattern;
use meilisearch_types::keys::{Action, Key};
use meilisearch_types::milli;
@ -418,7 +419,10 @@ pub(crate) mod test {
}
keys.flush().unwrap();
// ========== TODO: create features here
// ========== experimental features
let features = create_test_features();
dump.create_experimental_features(features).unwrap();
// create the dump
let mut file = tempfile::tempfile().unwrap();
@ -428,6 +432,10 @@ pub(crate) mod test {
file
}
fn create_test_features() -> RuntimeTogglableFeatures {
RuntimeTogglableFeatures { vector_store: true, ..Default::default() }
}
#[test]
fn test_creating_and_read_dump() {
let mut file = create_test_dump();
@ -472,5 +480,9 @@ pub(crate) mod test {
for (key, expected) in dump.keys().unwrap().zip(create_test_api_keys()) {
assert_eq!(key.unwrap(), expected);
}
// ==== checking the features
let expected = create_test_features();
assert_eq!(dump.features().unwrap().unwrap(), expected);
}
}

View File

@ -195,8 +195,53 @@ pub(crate) mod test {
use meili_snap::insta;
use super::*;
use crate::reader::v6::RuntimeTogglableFeatures;
// TODO: add `features` to tests
#[test]
fn import_dump_v6_experimental() {
let dump = File::open("tests/assets/v6-with-experimental.dump").unwrap();
let mut dump = DumpReader::open(dump).unwrap();
// top level infos
insta::assert_display_snapshot!(dump.date().unwrap(), @"2023-07-06 7:10:27.21958 +00:00:00");
insta::assert_debug_snapshot!(dump.instance_uid().unwrap(), @"None");
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"d45cd8571703e58ae53c7bd7ce3f5c22");
assert_eq!(update_files.len(), 2);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_none()); // the processed document addition
// keys
let keys = dump.keys().unwrap().collect::<Result<Vec<_>>>().unwrap();
meili_snap::snapshot_hash!(meili_snap::json_string!(keys), @"13c2da155e9729c2344688cab29af71d");
// indexes
let mut indexes = dump.indexes().unwrap().collect::<Result<Vec<_>>>().unwrap();
// the indexes are not ordered in any way by default
indexes.sort_by_key(|index| index.metadata().uid.to_string());
let mut test = indexes.pop().unwrap();
assert!(indexes.is_empty());
insta::assert_json_snapshot!(test.metadata(), @r###"
{
"uid": "test",
"primaryKey": "id",
"createdAt": "2023-07-06T07:07:41.364694Z",
"updatedAt": "2023-07-06T07:07:41.396114Z"
}
"###);
assert_eq!(test.documents().unwrap().count(), 1);
assert_eq!(
dump.features().unwrap().unwrap(),
RuntimeTogglableFeatures { vector_store: true, ..Default::default() }
);
}
#[test]
fn import_dump_v5() {
@ -274,6 +319,8 @@ pub(crate) mod test {
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
assert_eq!(dump.features().unwrap(), None);
}
#[test]

View File

@ -292,6 +292,7 @@ pub(crate) mod test {
│ ├---- update_files/
│ │ └---- 1.jsonl
│ └---- queue.jsonl
├---- experimental-features.json
├---- instance_uid.uuid
├---- keys.jsonl
└---- metadata.json

Binary file not shown.

View File

@ -472,6 +472,77 @@ pub fn parse_filter(input: Span) -> IResult<FilterCondition> {
terminated(|input| parse_expression(input, 0), eof)(input)
}
impl<'a> std::fmt::Display for FilterCondition<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
FilterCondition::Not(filter) => {
write!(f, "NOT ({filter})")
}
FilterCondition::Condition { fid, op } => {
write!(f, "{fid} {op}")
}
FilterCondition::In { fid, els } => {
write!(f, "{fid} IN[")?;
for el in els {
write!(f, "{el}, ")?;
}
write!(f, "]")
}
FilterCondition::Or(els) => {
write!(f, "OR[")?;
for el in els {
write!(f, "{el}, ")?;
}
write!(f, "]")
}
FilterCondition::And(els) => {
write!(f, "AND[")?;
for el in els {
write!(f, "{el}, ")?;
}
write!(f, "]")
}
FilterCondition::GeoLowerThan { point, radius } => {
write!(f, "_geoRadius({}, {}, {})", point[0], point[1], radius)
}
FilterCondition::GeoBoundingBox {
top_right_point: top_left_point,
bottom_left_point: bottom_right_point,
} => {
write!(
f,
"_geoBoundingBox([{}, {}], [{}, {}])",
top_left_point[0],
top_left_point[1],
bottom_right_point[0],
bottom_right_point[1]
)
}
}
}
}
impl<'a> std::fmt::Display for Condition<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Condition::GreaterThan(token) => write!(f, "> {token}"),
Condition::GreaterThanOrEqual(token) => write!(f, ">= {token}"),
Condition::Equal(token) => write!(f, "= {token}"),
Condition::NotEqual(token) => write!(f, "!= {token}"),
Condition::Null => write!(f, "IS NULL"),
Condition::Empty => write!(f, "IS EMPTY"),
Condition::Exists => write!(f, "EXISTS"),
Condition::LowerThan(token) => write!(f, "< {token}"),
Condition::LowerThanOrEqual(token) => write!(f, "<= {token}"),
Condition::Between { from, to } => write!(f, "{from} TO {to}"),
}
}
}
impl<'a> std::fmt::Display for Token<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{{{}}}", self.value())
}
}
#[cfg(test)]
pub mod tests {
use super::*;
@ -852,74 +923,3 @@ pub mod tests {
assert_eq!(token.value(), s);
}
}
impl<'a> std::fmt::Display for FilterCondition<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
FilterCondition::Not(filter) => {
write!(f, "NOT ({filter})")
}
FilterCondition::Condition { fid, op } => {
write!(f, "{fid} {op}")
}
FilterCondition::In { fid, els } => {
write!(f, "{fid} IN[")?;
for el in els {
write!(f, "{el}, ")?;
}
write!(f, "]")
}
FilterCondition::Or(els) => {
write!(f, "OR[")?;
for el in els {
write!(f, "{el}, ")?;
}
write!(f, "]")
}
FilterCondition::And(els) => {
write!(f, "AND[")?;
for el in els {
write!(f, "{el}, ")?;
}
write!(f, "]")
}
FilterCondition::GeoLowerThan { point, radius } => {
write!(f, "_geoRadius({}, {}, {})", point[0], point[1], radius)
}
FilterCondition::GeoBoundingBox {
top_right_point: top_left_point,
bottom_left_point: bottom_right_point,
} => {
write!(
f,
"_geoBoundingBox([{}, {}], [{}, {}])",
top_left_point[0],
top_left_point[1],
bottom_right_point[0],
bottom_right_point[1]
)
}
}
}
}
impl<'a> std::fmt::Display for Condition<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Condition::GreaterThan(token) => write!(f, "> {token}"),
Condition::GreaterThanOrEqual(token) => write!(f, ">= {token}"),
Condition::Equal(token) => write!(f, "= {token}"),
Condition::NotEqual(token) => write!(f, "!= {token}"),
Condition::Null => write!(f, "IS NULL"),
Condition::Empty => write!(f, "IS EMPTY"),
Condition::Exists => write!(f, "EXISTS"),
Condition::LowerThan(token) => write!(f, "< {token}"),
Condition::LowerThanOrEqual(token) => write!(f, "<= {token}"),
Condition::Between { from, to } => write!(f, "{from} TO {to}"),
}
}
}
impl<'a> std::fmt::Display for Token<'a> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{{{}}}", self.value())
}
}

View File

@ -16,7 +16,7 @@ license.workspace = true
serde_json = "1.0"
[dev-dependencies]
criterion = { version = "0.4.0", features = ["html_reports"] }
criterion = { version = "0.5.1", features = ["html_reports"] }
[[bench]]
name = "benchmarks"

View File

@ -138,6 +138,12 @@ impl Query {
index_vec.push(index_uid);
Self { index_uids: Some(index_vec), ..self }
}
// Removes the `from` and `limit` restrictions from the query.
// Useful to get the total number of tasks matching a filter.
pub fn without_limits(self) -> Self {
Query { limit: None, from: None, ..self }
}
}
#[derive(Debug, Clone)]
@ -807,6 +813,11 @@ impl IndexScheduler {
Ok(res)
}
// Return true if there is at least one task that is processing.
pub fn is_task_processing(&self) -> Result<bool> {
Ok(!self.processing_tasks.read().unwrap().processing.is_empty())
}
/// Return true iff there is at least one task associated with this index
/// that is processing.
pub fn is_index_processing(&self, index: &str) -> Result<bool> {
@ -817,7 +828,8 @@ impl IndexScheduler {
Ok(nbr_index_processing_tasks > 0)
}
/// Return the task ids matching the query from the user's point of view.
/// Return the task ids matching the query along with the total number of tasks
/// by ignoring the from and limit parameters from the user's point of view.
///
/// There are two differences between an internal query and a query executed by
/// the user.
@ -830,7 +842,13 @@ impl IndexScheduler {
rtxn: &RoTxn,
query: &Query,
filters: &meilisearch_auth::AuthFilter,
) -> Result<RoaringBitmap> {
) -> Result<(RoaringBitmap, u64)> {
// compute all tasks matching the filter by ignoring the limits, to find the number of tasks matching
// the filter.
// As this causes us to compute the filter twice it is slightly inefficient, but doing it this way spares
// us from modifying the underlying implementation, and the performance remains sufficient.
// Should this change, we would modify `get_task_ids` to directly return the number of matching tasks.
let total_tasks = self.get_task_ids(rtxn, &query.clone().without_limits())?;
let mut tasks = self.get_task_ids(rtxn, query)?;
// If the query contains a list of index uid or there is a finite list of authorized indexes,
@ -853,10 +871,11 @@ impl IndexScheduler {
}
}
Ok(tasks)
Ok((tasks, total_tasks.len()))
}
/// Return the tasks matching the query from the user's point of view.
/// Return the tasks matching the query from the user's point of view along
/// with the total number of tasks matching the query, ignoring from and limit.
///
/// There are two differences between an internal query and a query executed by
/// the user.
@ -868,11 +887,10 @@ impl IndexScheduler {
&self,
query: Query,
filters: &meilisearch_auth::AuthFilter,
) -> Result<Vec<Task>> {
) -> Result<(Vec<Task>, u64)> {
let rtxn = self.env.read_txn()?;
let tasks = self.get_task_ids_from_authorized_indexes(&rtxn, &query, filters)?;
let (tasks, total) = self.get_task_ids_from_authorized_indexes(&rtxn, &query, filters)?;
let tasks = self.get_existing_tasks(
&rtxn,
tasks.into_iter().rev().take(query.limit.unwrap_or(u32::MAX) as usize),
@ -883,16 +901,19 @@ impl IndexScheduler {
let ret = tasks.into_iter();
if processing.is_empty() {
Ok(ret.collect())
Ok((ret.collect(), total))
} else {
Ok(ret
.map(|task| match processing.contains(task.uid) {
true => {
Ok((
ret.map(|task| {
if processing.contains(task.uid) {
Task { status: Status::Processing, started_at: Some(started_at), ..task }
} else {
task
}
false => task,
})
.collect())
.collect(),
total,
))
}
}
@ -1833,6 +1854,17 @@ mod tests {
snapshot!(snapshot_index_scheduler(&index_scheduler), name: "registered_the_third_task");
}
#[test]
fn test_task_is_processing() {
let (index_scheduler, mut handle) = IndexScheduler::test(true, vec![]);
index_scheduler.register(index_creation_task("index_a", "id")).unwrap();
snapshot!(snapshot_index_scheduler(&index_scheduler), name: "registered_a_task");
handle.advance_till([Start, BatchCreated]);
assert!(index_scheduler.is_task_processing().unwrap());
}
/// We send a lot of tasks but notify the tasks scheduler only once as
/// we send them very fast, we must make sure that they are all processed.
#[test]
@ -2765,43 +2797,43 @@ mod tests {
let rtxn = index_scheduler.env.read_txn().unwrap();
let query = Query { limit: Some(0), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[]");
let query = Query { limit: Some(1), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[2,]");
let query = Query { limit: Some(2), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[1,2,]");
let query = Query { from: Some(1), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[0,1,]");
let query = Query { from: Some(2), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[0,1,2,]");
let query = Query { from: Some(1), limit: Some(1), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[1,]");
let query = Query { from: Some(1), limit: Some(2), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[0,1,]");
@ -2828,13 +2860,13 @@ mod tests {
let rtxn = index_scheduler.env.read_txn().unwrap();
let query = Query { statuses: Some(vec![Status::Processing]), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[0,]"); // only the processing tasks in the first tick
let query = Query { statuses: Some(vec![Status::Enqueued]), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[1,2,]"); // only the enqueued tasks in the first tick
@ -2843,7 +2875,7 @@ mod tests {
statuses: Some(vec![Status::Enqueued, Status::Processing]),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
snapshot!(snapshot_bitmap(&tasks), @"[0,1,2,]"); // both enqueued and processing tasks in the first tick
@ -2853,7 +2885,7 @@ mod tests {
after_started_at: Some(start_time),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// both enqueued and processing tasks in the first tick, but limited to those with a started_at
@ -2865,7 +2897,7 @@ mod tests {
before_started_at: Some(start_time),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// both enqueued and processing tasks in the first tick, but limited to those with a started_at
@ -2878,7 +2910,7 @@ mod tests {
before_started_at: Some(start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// both enqueued and processing tasks in the first tick, but limited to those with a started_at
@ -2905,7 +2937,7 @@ mod tests {
before_started_at: Some(start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// both succeeded and processing tasks in the first tick, but limited to those with a started_at
@ -2918,7 +2950,7 @@ mod tests {
before_started_at: Some(start_time),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// both succeeded and processing tasks in the first tick, but limited to those with a started_at
@ -2931,7 +2963,7 @@ mod tests {
before_started_at: Some(second_start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// both succeeded and processing tasks in the first tick, but limited to those with a started_at
@ -2951,7 +2983,7 @@ mod tests {
let rtxn = index_scheduler.env.read_txn().unwrap();
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// we run the same query to verify that, and indeed find that the last task is matched
@ -2963,7 +2995,7 @@ mod tests {
before_started_at: Some(second_start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// enqueued, succeeded, or processing tasks started after the second part of the test, should
@ -2975,7 +3007,7 @@ mod tests {
// now the last task should have failed
snapshot!(snapshot_index_scheduler(&index_scheduler), name: "end");
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// so running the last query should return nothing
@ -2987,7 +3019,7 @@ mod tests {
before_started_at: Some(second_start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// but the same query on failed tasks should return the last task
@ -2999,7 +3031,7 @@ mod tests {
before_started_at: Some(second_start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// but the same query on failed tasks should return the last task
@ -3012,7 +3044,7 @@ mod tests {
before_started_at: Some(second_start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// same query but with an invalid uid
@ -3025,7 +3057,7 @@ mod tests {
before_started_at: Some(second_start_time + Duration::minutes(1)),
..Default::default()
};
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// same query but with a valid uid
@ -3057,14 +3089,14 @@ mod tests {
let rtxn = index_scheduler.env.read_txn().unwrap();
let query = Query { index_uids: Some(vec!["catto".to_owned()]), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// only the first task associated with catto is returned, the indexSwap tasks are excluded!
snapshot!(snapshot_bitmap(&tasks), @"[0,]");
let query = Query { index_uids: Some(vec!["catto".to_owned()]), ..Default::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(
&rtxn,
&query,
@ -3078,7 +3110,7 @@ mod tests {
snapshot!(snapshot_bitmap(&tasks), @"[]");
let query = Query::default();
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(
&rtxn,
&query,
@ -3092,7 +3124,7 @@ mod tests {
snapshot!(snapshot_bitmap(&tasks), @"[1,]");
let query = Query::default();
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(
&rtxn,
&query,
@ -3111,7 +3143,7 @@ mod tests {
snapshot!(snapshot_bitmap(&tasks), @"[0,1,]");
let query = Query::default();
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// we asked for all the tasks with all index authorized -> all tasks returned
@ -3144,7 +3176,7 @@ mod tests {
let rtxn = index_scheduler.read_txn().unwrap();
let query = Query { canceled_by: Some(vec![task_cancelation.uid]), ..Query::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(&rtxn, &query, &AuthFilter::default())
.unwrap();
// 0 is not returned because it was not canceled, 3 is not returned because it is the uid of the
@ -3152,7 +3184,7 @@ mod tests {
snapshot!(snapshot_bitmap(&tasks), @"[1,2,]");
let query = Query { canceled_by: Some(vec![task_cancelation.uid]), ..Query::default() };
let tasks = index_scheduler
let (tasks, _) = index_scheduler
.get_task_ids_from_authorized_indexes(
&rtxn,
&query,

View File

@ -0,0 +1,36 @@
---
source: index-scheduler/src/lib.rs
---
### Autobatching Enabled = true
### Processing Tasks:
[]
----------------------------------------------------------------------
### All Tasks:
0 {uid: 0, status: enqueued, details: { primary_key: Some("id") }, kind: IndexCreation { index_uid: "index_a", primary_key: Some("id") }}
----------------------------------------------------------------------
### Status:
enqueued [0,]
----------------------------------------------------------------------
### Kind:
"indexCreation" [0,]
----------------------------------------------------------------------
### Index Tasks:
index_a [0,]
----------------------------------------------------------------------
### Index Mapper:
----------------------------------------------------------------------
### Canceled By:
----------------------------------------------------------------------
### Enqueued At:
[timestamp] [0,]
----------------------------------------------------------------------
### Started At:
----------------------------------------------------------------------
### Finished At:
----------------------------------------------------------------------
### File Store:
----------------------------------------------------------------------

View File

@ -15,7 +15,7 @@ license.workspace = true
serde_json = "1.0"
[dev-dependencies]
criterion = "0.4.0"
criterion = "0.5.1"
[[bench]]
name = "depth"

View File

@ -199,6 +199,30 @@ macro_rules! snapshot {
};
}
/// Create a string from the value by serializing it as Json, optionally
/// redacting some parts of it.
///
/// The second argument to the macro can be an object expression for redaction.
/// It's in the form { selector => replacement }. For more information about redactions,
/// refer to the redactions feature in the `insta` guide.
#[macro_export]
macro_rules! json_string {
($value:expr, {$($k:expr => $v:expr),*$(,)?}) => {
{
let (_, snap) = meili_snap::insta::_prepare_snapshot_for_redaction!($value, {$($k => $v),*}, Json, File);
snap
}
};
($value:expr) => {{
let value = meili_snap::insta::_macro_support::serialize_value(
&$value,
meili_snap::insta::_macro_support::SerializationFormat::Json,
meili_snap::insta::_macro_support::SnapshotLocation::File
);
value
}};
}
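
For readers meeting the macro here, a minimal usage sketch; the task value and the `.enqueuedAt` selector are made-up examples, and the selector syntax is insta's redaction syntax:

#[test]
fn json_string_redaction_example() {
    use serde_json::json;

    let task = json!({ "uid": 0, "enqueuedAt": "2023-07-27T11:59:55Z" });
    // Plain JSON serialization of the value.
    let raw = meili_snap::json_string!(task);
    assert!(raw.contains("enqueuedAt"));
    // Same value, with the timestamp replaced by a stable placeholder
    // via the { selector => replacement } form described above.
    let redacted = meili_snap::json_string!(task, { ".enqueuedAt" => "[date]" });
    assert!(redacted.contains("[date]"));
}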
#[cfg(test)]
mod tests {
use crate as meili_snap;
@ -250,27 +274,3 @@ mod tests {
}
}
}
/// Create a string from the value by serializing it as Json, optionally
/// redacting some parts of it.
///
/// The second argument to the macro can be an object expression for redaction.
/// It's in the form { selector => replacement }. For more information about redactions
/// refer to the redactions feature in the `insta` guide.
#[macro_export]
macro_rules! json_string {
($value:expr, {$($k:expr => $v:expr),*$(,)?}) => {
{
let (_, snap) = meili_snap::insta::_prepare_snapshot_for_redaction!($value, {$($k => $v),*}, Json, File);
snap
}
};
($value:expr) => {{
let value = meili_snap::insta::_macro_support::serialize_value(
&$value,
meili_snap::insta::_macro_support::SerializationFormat::Json,
meili_snap::insta::_macro_support::SnapshotLocation::File
);
value
}};
}

View File

@ -1,6 +1,6 @@
use serde::{Deserialize, Serialize};
#[derive(Serialize, Deserialize, Debug, Clone, Copy, Default)]
#[derive(Serialize, Deserialize, Debug, Clone, Copy, Default, PartialEq, Eq)]
#[serde(rename_all = "camelCase", default)]
pub struct RuntimeTogglableFeatures {
pub score_details: bool,

View File

@ -19,6 +19,7 @@ actix-http = { version = "3.3.1", default-features = false, features = [
"compress-gzip",
"rustls",
] }
actix-utils = "3.0.1"
actix-web = { version = "4.3.1", default-features = false, features = [
"macros",
"compress-brotli",
@ -50,6 +51,7 @@ futures-util = "0.3.28"
http = "0.2.9"
index-scheduler = { path = "../index-scheduler" }
indexmap = { version = "1.9.3", features = ["serde-1"] }
is-terminal = "0.4.8"
itertools = "0.10.5"
jsonwebtoken = "8.3.0"
lazy_static = "1.4.0"
@ -100,8 +102,6 @@ uuid = { version = "1.3.1", features = ["serde", "v4"] }
walkdir = "2.3.3"
yaup = "0.2.1"
serde_urlencoded = "0.7.1"
actix-utils = "3.0.1"
atty = "0.2.14"
termcolor = "1.2.0"
[dev-dependencies]
@ -133,17 +133,7 @@ zip = { version = "0.6.4", optional = true }
[features]
default = ["analytics", "meilisearch-types/all-tokenizations", "mini-dashboard"]
analytics = ["segment"]
mini-dashboard = [
"actix-web-static-files",
"static-files",
"anyhow",
"cargo_toml",
"hex",
"reqwest",
"sha-1",
"tempfile",
"zip",
]
mini-dashboard = ["actix-web-static-files", "static-files", "anyhow", "cargo_toml", "hex", "reqwest", "sha-1", "tempfile", "zip"]
chinese = ["meilisearch-types/chinese"]
hebrew = ["meilisearch-types/hebrew"]
japanese = ["meilisearch-types/japanese"]
@ -151,5 +141,5 @@ thai = ["meilisearch-types/thai"]
greek = ["meilisearch-types/greek"]
[package.metadata.mini-dashboard]
assets-url = "https://github.com/meilisearch/mini-dashboard/releases/download/v0.2.7/build.zip"
sha1 = "28b45bf772c84f9a6e16bc1689b393bfce8da7d6"
assets-url = "https://github.com/meilisearch/mini-dashboard/releases/download/v0.2.11/build.zip"
sha1 = "83cd44ed1e5f97ecb581dc9f958a63f4ccc982d9"

View File

@ -574,6 +574,10 @@ pub struct SearchAggregator {
filter_total_number_of_criteria: usize,
used_syntax: HashMap<String, usize>,
// attributes_to_search_on
// incremented every time a search is done using attributes_to_search_on
attributes_to_search_on_total_number_of_uses: usize,
// q
// The maximum number of terms in a q request
max_terms_number: usize,
@ -647,6 +651,11 @@ impl SearchAggregator {
ret.filter_sum_of_criteria_terms = RE.split(&stringified_filters).count();
}
// attributes_to_search_on
if query.attributes_to_search_on.is_some() {
ret.attributes_to_search_on_total_number_of_uses = 1;
}
if let Some(ref q) = query.q {
ret.max_terms_number = q.split_whitespace().count();
}
@ -720,9 +729,18 @@ impl SearchAggregator {
let used_syntax = self.used_syntax.entry(key).or_insert(0);
*used_syntax = used_syntax.saturating_add(value);
}
// attributes_to_search_on
self.attributes_to_search_on_total_number_of_uses = self
.attributes_to_search_on_total_number_of_uses
.saturating_add(other.attributes_to_search_on_total_number_of_uses);
// q
self.max_terms_number = self.max_terms_number.max(other.max_terms_number);
// vector
self.max_vector_size = self.max_vector_size.max(other.max_vector_size);
// pagination
self.max_limit = self.max_limit.max(other.max_limit);
self.max_offset = self.max_offset.max(other.max_offset);
@ -761,17 +779,17 @@ impl SearchAggregator {
if self.total_received == 0 {
None
} else {
// the index of the 99th percentile value
let percentile_99th = 0.99 * (self.total_succeeded as f64 - 1.) + 1.;
// we get all the values in a sorted manner
let time_spent = self.time_spent.into_sorted_vec();
// the index of the 99th percentile value
let percentile_99th = time_spent.len() * 99 / 100;
// We are only interested in the slowest value of the 99% fastest results
let time_spent = time_spent.get(percentile_99th as usize);
let time_spent = time_spent.get(percentile_99th);
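
For intuition about the corrected index math above, an illustrative sketch (not the aggregator's code): with timings sorted ascending, `len * 99 / 100` selects the slowest value among the 99% fastest responses, and `get` returns `None` on an empty set.

// Illustrative only: 200 sorted samples -> index 200 * 99 / 100 = 198,
// i.e. the slowest of the 99% fastest responses; an empty slice yields None.
fn response_time_99th(sorted_ascending: &[f64]) -> Option<f64> {
    sorted_ascending.get(sorted_ascending.len() * 99 / 100).copied()
}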
let properties = json!({
"user-agent": self.user_agents,
"requests": {
"99th_response_time": time_spent.map(|t| format!("{:.2}", t)),
"99th_response_time": time_spent.map(|t| format!("{:.2}", t)),
"total_succeeded": self.total_succeeded,
"total_failed": self.total_received.saturating_sub(self.total_succeeded), // just to be sure we never panics
"total_received": self.total_received,
@ -786,9 +804,15 @@ impl SearchAggregator {
"avg_criteria_number": format!("{:.2}", self.filter_sum_of_criteria_terms as f64 / self.filter_total_number_of_criteria as f64),
"most_used_syntax": self.used_syntax.iter().max_by_key(|(_, v)| *v).map(|(k, _)| json!(k)).unwrap_or_else(|| json!(null)),
},
"attributes_to_search_on": {
"total_number_of_uses": self.attributes_to_search_on_total_number_of_uses,
},
"q": {
"max_terms_number": self.max_terms_number,
},
"vector": {
"max_vector_size": self.max_vector_size,
},
"pagination": {
"max_limit": self.max_limit,
"max_offset": self.max_offset,
@ -843,6 +867,10 @@ pub struct MultiSearchAggregator {
// sum of the number of search queries in the requests, use with total_received to compute an average
total_search_count: usize,
// scoring
show_ranking_score: bool,
show_ranking_score_details: bool,
// context
user_agents: HashSet<String>,
}
@ -856,6 +884,9 @@ impl MultiSearchAggregator {
let distinct_indexes: HashSet<_> =
query.iter().map(|query| query.index_uid.as_str()).collect();
let show_ranking_score = query.iter().any(|query| query.show_ranking_score);
let show_ranking_score_details = query.iter().any(|query| query.show_ranking_score_details);
Self {
timestamp,
total_received: 1,
@ -863,6 +894,8 @@ impl MultiSearchAggregator {
total_distinct_index_count: distinct_indexes.len(),
total_single_index: if distinct_indexes.len() == 1 { 1 } else { 0 },
total_search_count: query.len(),
show_ranking_score,
show_ranking_score_details,
user_agents,
}
}
@ -884,6 +917,9 @@ impl MultiSearchAggregator {
this.total_distinct_index_count.saturating_add(other.total_distinct_index_count);
let total_single_index = this.total_single_index.saturating_add(other.total_single_index);
let total_search_count = this.total_search_count.saturating_add(other.total_search_count);
let show_ranking_score = this.show_ranking_score || other.show_ranking_score;
let show_ranking_score_details =
this.show_ranking_score_details || other.show_ranking_score_details;
let mut user_agents = this.user_agents;
for user_agent in other.user_agents.into_iter() {
@ -899,6 +935,8 @@ impl MultiSearchAggregator {
total_single_index,
total_search_count,
user_agents,
show_ranking_score,
show_ranking_score_details,
// do not add _ or ..Default::default() here
};
@ -925,6 +963,10 @@ impl MultiSearchAggregator {
"searches": {
"total_search_count": self.total_search_count,
"avg_search_count": (self.total_search_count as f64) / (self.total_received as f64),
},
"scoring": {
"show_ranking_score": self.show_ranking_score,
"show_ranking_score_details": self.show_ranking_score_details,
}
});
@ -1145,6 +1187,7 @@ pub struct DocumentsDeletionAggregator {
#[serde(rename = "user-agent")]
user_agents: HashSet<String>,
#[serde(rename = "requests.total_received")]
total_received: usize,
per_document_id: bool,
clear_all: bool,
@ -1295,6 +1338,7 @@ pub struct HealthAggregator {
#[serde(rename = "user-agent")]
user_agents: HashSet<String>,
#[serde(rename = "requests.total_received")]
total_received: usize,
}
@ -1345,7 +1389,7 @@ pub struct DocumentsFetchAggregator {
#[serde(rename = "user-agent")]
user_agents: HashSet<String>,
#[serde(rename = "requests.max_limit")]
#[serde(rename = "requests.total_received")]
total_received: usize,
// a call on ../documents/:doc_id

View File

@ -1,5 +1,5 @@
use std::env;
use std::io::Write;
use std::io::{stderr, Write};
use std::path::PathBuf;
use std::sync::Arc;
@ -7,6 +7,7 @@ use actix_web::http::KeepAlive;
use actix_web::web::Data;
use actix_web::HttpServer;
use index_scheduler::IndexScheduler;
use is_terminal::IsTerminal;
use meilisearch::analytics::Analytics;
use meilisearch::{analytics, create_app, prototype_name, setup_meilisearch, Opt};
use meilisearch_auth::{generate_master_key, AuthController, MASTER_KEY_MIN_SIZE};
@ -186,7 +187,7 @@ Anonymous telemetry:\t\"Enabled\""
}
eprintln!();
eprintln!("Check out Meilisearch Cloud!\thttps://cloud.meilisearch.com/login?utm_campaign=oss&utm_source=engine&utm_medium=cli");
eprintln!("Check out Meilisearch Cloud!\thttps://www.meilisearch.com/cloud?utm_campaign=oss&utm_source=engine&utm_medium=cli");
eprintln!("Documentation:\t\t\thttps://www.meilisearch.com/docs");
eprintln!("Source code:\t\t\thttps://github.com/meilisearch/meilisearch");
eprintln!("Discord:\t\t\thttps://discord.meilisearch.com");
@ -197,8 +198,7 @@ const WARNING_BG_COLOR: Option<Color> = Some(Color::Ansi256(178));
const WARNING_FG_COLOR: Option<Color> = Some(Color::Ansi256(0));
fn print_master_key_too_short_warning() {
let choice =
if atty::is(atty::Stream::Stderr) { ColorChoice::Auto } else { ColorChoice::Never };
let choice = if stderr().is_terminal() { ColorChoice::Auto } else { ColorChoice::Never };
let mut stderr = StandardStream::stderr(choice);
stderr
.set_color(
@ -223,8 +223,7 @@ fn print_master_key_too_short_warning() {
}
fn print_missing_master_key_warning() {
let choice =
if atty::is(atty::Stream::Stderr) { ColorChoice::Auto } else { ColorChoice::Never };
let choice = if stderr().is_terminal() { ColorChoice::Auto } else { ColorChoice::Never };
let mut stderr = StandardStream::stderr(choice);
stderr
.set_color(

View File

@ -50,4 +50,10 @@ lazy_static! {
&["kind", "value"]
)
.expect("Can't create a metric");
pub static ref MEILISEARCH_LAST_UPDATE: IntGauge =
register_int_gauge!(opts!("meilisearch_last_update", "Meilisearch Last Update"))
.expect("Can't create a metric");
pub static ref MEILISEARCH_IS_INDEXING: IntGauge =
register_int_gauge!(opts!("meilisearch_is_indexing", "Meilisearch Is Indexing"))
.expect("Can't create a metric");
}

View File

@ -64,7 +64,20 @@ async fn patch_features(
vector_store: new_features.0.vector_store.unwrap_or(old_features.vector_store),
};
analytics.publish("Experimental features Updated".to_string(), json!(new_features), Some(&req));
// explicitly destructure for analytics rather than using the `Serialize` implementation, because
// it renames fields to camelCase, which we don't want for analytics.
// **Do not** ignore fields with `..` or `_` here, because we want to add them in the future.
let meilisearch_types::features::RuntimeTogglableFeatures { score_details, vector_store } =
new_features;
analytics.publish(
"Experimental features Updated".to_string(),
json!({
"score_details": score_details,
"vector_store": vector_store,
}),
Some(&req),
);
index_scheduler.put_runtime_features(new_features)?;
Ok(HttpResponse::Ok().json(new_features))
}

View File

@ -35,7 +35,7 @@ pub struct SearchQueryGet {
#[deserr(default, error = DeserrQueryParamError<InvalidSearchQ>)]
q: Option<String>,
#[deserr(default, error = DeserrQueryParamError<InvalidSearchVector>)]
vector: Option<Vec<f32>>,
vector: Option<CS<f32>>,
#[deserr(default = Param(DEFAULT_SEARCH_OFFSET()), error = DeserrQueryParamError<InvalidSearchOffset>)]
offset: Param<usize>,
#[deserr(default = Param(DEFAULT_SEARCH_LIMIT()), error = DeserrQueryParamError<InvalidSearchLimit>)]
@ -88,7 +88,7 @@ impl From<SearchQueryGet> for SearchQuery {
Self {
q: other.q,
vector: other.vector,
vector: other.vector.map(CS::into_inner),
offset: other.offset.0,
limit: other.limit.0,
page: other.page.as_deref().copied(),
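
In practice this lets the GET search route accept a comma-separated vector; a hedged illustration (the route and values are made up):

// GET /indexes/movies/search?vector=0.1,0.2,0.3
// CS<f32> splits the comma-separated parameter during deserialization,
// so after into_inner(): query.vector == Some(vec![0.1, 0.2, 0.3])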

View File

@ -5,6 +5,7 @@ use index_scheduler::IndexScheduler;
use log::debug;
use meilisearch_types::deserr::DeserrJsonError;
use meilisearch_types::error::ResponseError;
use meilisearch_types::facet_values_sort::FacetValuesSort;
use meilisearch_types::index_uid::IndexUid;
use meilisearch_types::settings::{settings, RankingRuleView, Settings, Unchecked};
use meilisearch_types::tasks::KindWithContent;
@ -550,10 +551,16 @@ pub async fn update_all(
.as_ref()
.set()
.and_then(|s| s.max_values_per_facet.as_ref().set()),
"sort_facet_values_by": new_settings.faceting
"sort_facet_values_by_star_count": new_settings.faceting
.as_ref()
.set()
.and_then(|s| s.sort_facet_values_by.as_ref().set()),
.and_then(|s| {
s.sort_facet_values_by.as_ref().set().map(|s| s.iter().any(|(k, v)| k == "*" && v == &FacetValuesSort::Count))
}),
"sort_facet_values_by_total": new_settings.faceting
.as_ref()
.set()
.and_then(|s| s.sort_facet_values_by.as_ref().set().map(|s| s.len())),
},
"pagination": {
"max_total_hits": new_settings.pagination

View File

@ -49,6 +49,11 @@ pub async fn get_metrics(
}
}
if let Some(last_update) = response.last_update {
crate::metrics::MEILISEARCH_LAST_UPDATE.set(last_update.unix_timestamp());
}
crate::metrics::MEILISEARCH_IS_INDEXING.set(index_scheduler.is_task_processing()? as i64);
let encoder = TextEncoder::new();
let mut buffer = vec![];
encoder.encode(&prometheus::gather(), &mut buffer).expect("Failed to encode metrics");

View File

@ -284,9 +284,6 @@ pub fn create_all_stats(
used_database_size += index_scheduler.used_size()?;
database_size += auth_controller.size()?;
used_database_size += auth_controller.used_size()?;
let update_file_size = index_scheduler.compute_update_file_size()?;
database_size += update_file_size;
used_database_size += update_file_size;
let stats = Stats { database_size, used_database_size, last_update: last_task, indexes };
Ok(stats)

View File

@ -325,7 +325,7 @@ async fn cancel_tasks(
let query = params.into_query();
let tasks = index_scheduler.get_task_ids_from_authorized_indexes(
let (tasks, _) = index_scheduler.get_task_ids_from_authorized_indexes(
&index_scheduler.read_txn()?,
&query,
index_scheduler.filters(),
@ -370,7 +370,7 @@ async fn delete_tasks(
);
let query = params.into_query();
let tasks = index_scheduler.get_task_ids_from_authorized_indexes(
let (tasks, _) = index_scheduler.get_task_ids_from_authorized_indexes(
&index_scheduler.read_txn()?,
&query,
index_scheduler.filters(),
@ -387,6 +387,7 @@ async fn delete_tasks(
#[derive(Debug, Serialize)]
pub struct AllTasks {
results: Vec<TaskView>,
total: u64,
limit: u32,
from: Option<u32>,
next: Option<u32>,
@ -406,23 +407,17 @@ async fn get_tasks(
let limit = params.limit.0;
let query = params.into_query();
let mut tasks_results: Vec<TaskView> = index_scheduler
.get_tasks_from_authorized_indexes(query, index_scheduler.filters())?
.into_iter()
.map(|t| TaskView::from_task(&t))
.collect();
let filters = index_scheduler.filters();
let (tasks, total) = index_scheduler.get_tasks_from_authorized_indexes(query, filters)?;
let mut results: Vec<_> = tasks.iter().map(TaskView::from_task).collect();
// If we were able to fetch one more task than the limit we asked for,
// it means that there are more to come.
let next = if tasks_results.len() == limit as usize {
tasks_results.pop().map(|t| t.uid)
} else {
None
};
let next = if results.len() == limit as usize { results.pop().map(|t| t.uid) } else { None };
let from = tasks_results.first().map(|t| t.uid);
let from = results.first().map(|t| t.uid);
let tasks = AllTasks { results, limit: limit.saturating_sub(1), total, from, next };
let tasks = AllTasks { results: tasks_results, limit: limit.saturating_sub(1), from, next };
Ok(HttpResponse::Ok().json(tasks))
}
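
A minimal sketch of the limit-plus-one trick above (illustrative, with `u32` uids standing in for `TaskView`s): the query fetched up to `limit` items, where `limit` is the requested page size plus one, so receiving exactly `limit` items proves there is a next page.

/// Returns the visible page and the uid to report as the `next` cursor.
fn split_page(mut fetched: Vec<u32>, limit: usize) -> (Vec<u32>, Option<u32>) {
    // The extra sentinel item is popped off the page and only its uid is kept.
    let next = if fetched.len() == limit { fetched.pop() } else { None };
    (fetched, next)
}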
@ -444,10 +439,10 @@ async fn get_task(
analytics.publish("Tasks Seen".to_string(), json!({ "per_task_uid": true }), Some(&req));
let query = index_scheduler::Query { uids: Some(vec![task_uid]), ..Query::default() };
let filters = index_scheduler.filters();
let (tasks, _) = index_scheduler.get_tasks_from_authorized_indexes(query, filters)?;
if let Some(task) =
index_scheduler.get_tasks_from_authorized_indexes(query, index_scheduler.filters())?.first()
{
if let Some(task) = tasks.first() {
let task_view = TaskView::from_task(task);
Ok(HttpResponse::Ok().json(task_view))
} else {

View File

@ -61,6 +61,8 @@ pub static AUTHORIZATIONS: Lazy<HashMap<(&'static str, &'static str), HashSet<&'
("DELETE", "/keys/mykey/") => hashset!{"keys.delete", "*"},
("POST", "/keys") => hashset!{"keys.create", "*"},
("GET", "/keys") => hashset!{"keys.get", "*"},
("GET", "/experimental-features") => hashset!{"experimental.get", "*"},
("PATCH", "/experimental-features") => hashset!{"experimental.update", "*"},
};
authorizations

View File

@ -189,6 +189,14 @@ impl Server {
let url = format!("/tasks/{}", update_id);
self.service.get(url).await
}
pub async fn get_features(&self) -> (Value, StatusCode) {
self.service.get("/experimental-features").await
}
pub async fn set_features(&self, value: Value) -> (Value, StatusCode) {
self.service.patch("/experimental-features", value).await
}
}
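
A hedged sketch of how a test might drive these new helpers; `Server::new().await`, the `serde_json::json!` payload type, and the status codes are assumptions, while the `scoreDetails` key follows the camelCase rename on `RuntimeTogglableFeatures`:

// Hypothetical test body exercising the helpers added above.
let server = Server::new().await;
let (_features, code) = server.get_features().await;
assert_eq!(code, 200);
let (response, code) = server.set_features(serde_json::json!({ "scoreDetails": true })).await;
assert_eq!(code, 200);
assert_eq!(response["scoreDetails"], serde_json::json!(true));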
pub fn default_settings(dir: impl AsRef<Path>) -> Opt {

View File

@ -43,7 +43,7 @@ async fn import_dump_v1_movie_raw() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31968 }, "error": null, "duration": "PT9.317060500S", "enqueuedAt": "2021-09-08T09:08:45.153219Z", "startedAt": "2021-09-08T09:08:45.3961665Z", "finishedAt": "2021-09-08T09:08:54.713227Z" }], "limit": 20, "from": 0, "next": null })
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31968 }, "error": null, "duration": "PT9.317060500S", "enqueuedAt": "2021-09-08T09:08:45.153219Z", "startedAt": "2021-09-08T09:08:45.3961665Z", "finishedAt": "2021-09-08T09:08:54.713227Z" }], "total": 1, "limit": 20, "from": 0, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -135,7 +135,7 @@ async fn import_dump_v1_movie_with_settings() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["genres", "id", "overview", "poster", "release_date", "title"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT7.288826907S", "enqueuedAt": "2021-09-08T09:34:40.882977Z", "startedAt": "2021-09-08T09:34:40.883073093Z", "finishedAt": "2021-09-08T09:34:48.1719Z"}, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31968 }, "error": null, "duration": "PT9.090735774S", "enqueuedAt": "2021-09-08T09:34:16.036101Z", "startedAt": "2021-09-08T09:34:16.261191226Z", "finishedAt": "2021-09-08T09:34:25.351927Z" }], "limit": 20, "from": 1, "next": null })
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["genres", "id", "overview", "poster", "release_date", "title"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT7.288826907S", "enqueuedAt": "2021-09-08T09:34:40.882977Z", "startedAt": "2021-09-08T09:34:40.883073093Z", "finishedAt": "2021-09-08T09:34:48.1719Z"}, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31968 }, "error": null, "duration": "PT9.090735774S", "enqueuedAt": "2021-09-08T09:34:16.036101Z", "startedAt": "2021-09-08T09:34:16.261191226Z", "finishedAt": "2021-09-08T09:34:25.351927Z" }], "total": 2, "limit": 20, "from": 1, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -317,7 +317,7 @@ async fn import_dump_v2_movie_raw() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT41.751156S", "enqueuedAt": "2021-09-08T08:30:30.550282Z", "startedAt": "2021-09-08T08:30:30.553012Z", "finishedAt": "2021-09-08T08:31:12.304168Z" }], "limit": 20, "from": 0, "next": null })
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT41.751156S", "enqueuedAt": "2021-09-08T08:30:30.550282Z", "startedAt": "2021-09-08T08:30:30.553012Z", "finishedAt": "2021-09-08T08:31:12.304168Z" }], "total": 1, "limit": 20, "from": 0, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -409,7 +409,7 @@ async fn import_dump_v2_movie_with_settings() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT37.488777S", "enqueuedAt": "2021-09-08T08:24:02.323444Z", "startedAt": "2021-09-08T08:24:02.324145Z", "finishedAt": "2021-09-08T08:24:39.812922Z" }, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT39.941318S", "enqueuedAt": "2021-09-08T08:21:14.742672Z", "startedAt": "2021-09-08T08:21:14.750166Z", "finishedAt": "2021-09-08T08:21:54.691484Z" }], "limit": 20, "from": 1, "next": null })
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT37.488777S", "enqueuedAt": "2021-09-08T08:24:02.323444Z", "startedAt": "2021-09-08T08:24:02.324145Z", "finishedAt": "2021-09-08T08:24:39.812922Z" }, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT39.941318S", "enqueuedAt": "2021-09-08T08:21:14.742672Z", "startedAt": "2021-09-08T08:21:14.750166Z", "finishedAt": "2021-09-08T08:21:54.691484Z" }], "total": 2, "limit": 20, "from": 1, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -591,7 +591,7 @@ async fn import_dump_v3_movie_raw() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT41.751156S", "enqueuedAt": "2021-09-08T08:30:30.550282Z", "startedAt": "2021-09-08T08:30:30.553012Z", "finishedAt": "2021-09-08T08:31:12.304168Z" }], "limit": 20, "from": 0, "next": null })
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT41.751156S", "enqueuedAt": "2021-09-08T08:30:30.550282Z", "startedAt": "2021-09-08T08:30:30.553012Z", "finishedAt": "2021-09-08T08:31:12.304168Z" }], "total": 1, "limit": 20, "from": 0, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -683,7 +683,7 @@ async fn import_dump_v3_movie_with_settings() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT37.488777S", "enqueuedAt": "2021-09-08T08:24:02.323444Z", "startedAt": "2021-09-08T08:24:02.324145Z", "finishedAt": "2021-09-08T08:24:39.812922Z" }, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT39.941318S", "enqueuedAt": "2021-09-08T08:21:14.742672Z", "startedAt": "2021-09-08T08:21:14.750166Z", "finishedAt": "2021-09-08T08:21:54.691484Z" }], "limit": 20, "from": 1, "next": null })
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT37.488777S", "enqueuedAt": "2021-09-08T08:24:02.323444Z", "startedAt": "2021-09-08T08:24:02.324145Z", "finishedAt": "2021-09-08T08:24:39.812922Z" }, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT39.941318S", "enqueuedAt": "2021-09-08T08:21:14.742672Z", "startedAt": "2021-09-08T08:21:14.750166Z", "finishedAt": "2021-09-08T08:21:54.691484Z" }], "total": 2, "limit": 20, "from": 1, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -865,7 +865,7 @@ async fn import_dump_v4_movie_raw() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT41.751156S", "enqueuedAt": "2021-09-08T08:30:30.550282Z", "startedAt": "2021-09-08T08:30:30.553012Z", "finishedAt": "2021-09-08T08:31:12.304168Z" }], "limit" : 20, "from": 0, "next": null })
json!({ "results": [{"uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT41.751156S", "enqueuedAt": "2021-09-08T08:30:30.550282Z", "startedAt": "2021-09-08T08:30:30.553012Z", "finishedAt": "2021-09-08T08:31:12.304168Z" }], "total": 1, "limit" : 20, "from": 0, "next": null })
);
// finally we're just going to check that we can still get a few documents by id
@ -957,7 +957,7 @@ async fn import_dump_v4_movie_with_settings() {
assert_eq!(code, 200);
assert_eq!(
tasks,
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT37.488777S", "enqueuedAt": "2021-09-08T08:24:02.323444Z", "startedAt": "2021-09-08T08:24:02.324145Z", "finishedAt": "2021-09-08T08:24:39.812922Z" }, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT39.941318S", "enqueuedAt": "2021-09-08T08:21:14.742672Z", "startedAt": "2021-09-08T08:21:14.750166Z", "finishedAt": "2021-09-08T08:21:54.691484Z" }], "limit": 20, "from": 1, "next": null })
json!({ "results": [{ "uid": 1, "indexUid": "indexUID", "status": "succeeded", "type": "settingsUpdate", "canceledBy": null, "details": { "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "stopWords": ["of", "the"] }, "error": null, "duration": "PT37.488777S", "enqueuedAt": "2021-09-08T08:24:02.323444Z", "startedAt": "2021-09-08T08:24:02.324145Z", "finishedAt": "2021-09-08T08:24:39.812922Z" }, { "uid": 0, "indexUid": "indexUID", "status": "succeeded", "type": "documentAdditionOrUpdate", "canceledBy": null, "details": { "receivedDocuments": 0, "indexedDocuments": 31944 }, "error": null, "duration": "PT39.941318S", "enqueuedAt": "2021-09-08T08:21:14.742672Z", "startedAt": "2021-09-08T08:21:14.750166Z", "finishedAt": "2021-09-08T08:21:54.691484Z" }], "total": 2, "limit": 20, "from": 1, "next": null })
);
// finally we're just going to check that we can still get a few documents by id

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:08:54.713227Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:25.351927Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:25.351927Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -40,6 +40,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:48.1719Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -40,6 +40,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:48.1719Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -40,6 +40,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:48.1719Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -40,6 +40,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:48.1719Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -40,6 +40,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:34:48.1719Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -45,6 +45,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:26:57.319083Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:28:46.369971Z"
}
],
"total": 92,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:28:46.369971Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:28:46.369971Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:28:46.369971Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:28:46.369971Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T09:28:46.369971Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:21:54.691484Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:21:54.691484Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -41,6 +41,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:40:28.669652Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 92,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:21:54.691484Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:21:54.691484Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

View File

@ -41,6 +41,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:40:28.669652Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 92,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:51:53.095314Z"
}
],
"total": 93,
"limit": 1,
"from": 92,
"next": 91

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:31:12.304168Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:21:54.691484Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -20,6 +20,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:21:54.691484Z"
}
],
"total": 1,
"limit": 1,
"from": 0,
"next": null

View File

@ -36,6 +36,7 @@ source: meilisearch/tests/dumps/mod.rs
"finishedAt": "2021-09-08T08:24:39.812922Z"
}
],
"total": 2,
"limit": 1,
"from": 1,
"next": 0

Some files were not shown because too many files have changed in this diff.