Compare commits

...

196 Commits

Author SHA1 Message Date
Louis Dureuil
6678491212 Use actix-governor to perform rate-limiting 2023-01-03 16:25:30 +01:00
Louis Dureuil
a82f8aacde Add actix-governor 2023-01-03 16:25:30 +01:00
Louis Dureuil
5cf71c6014 Add rate-limiting options 2023-01-03 16:25:30 +01:00
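
The three commits above add rate limiting via the `actix-governor` middleware. A minimal sketch of how that middleware is typically wired into an actix-web application (the route and limits below are illustrative, not Meilisearch's actual configuration):

```rust
use actix_governor::{Governor, GovernorConfigBuilder};
use actix_web::{web, App, HttpResponse, HttpServer};

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // Allow bursts of 5 requests per client, refilling at 2 requests per second.
    let governor_conf = GovernorConfigBuilder::default()
        .per_second(2)
        .burst_size(5)
        .finish()
        .expect("invalid rate-limiting configuration");

    HttpServer::new(move || {
        App::new()
            // Every route registered after this middleware is rate-limited.
            .wrap(Governor::new(&governor_conf))
            .route("/health", web::get().to(|| async { HttpResponse::Ok().finish() }))
    })
    .bind(("127.0.0.1", 7700))?
    .run()
    .await
}
```
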
bors[bot]
ab655a85e8 Merge #3279
3279: Clarify error message when the db and engine versions are incompatible r=irevoire a=dureuill

# Pull Request

## Related issue

Related to https://github.com/meilisearch/meilisearch/issues/2752

## What does this PR do?
- Implements https://github.com/meilisearch/product/discussions/572#discussioncomment-4390616

## PR checklist
Please check if your PR fulfills the following requirements:
- [ ] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2023-01-02 17:18:11 +00:00
bors[bot]
6425e06cf2 Merge #3274
3274: Reject master keys that are less than 16 bytes and add `--generate-master-key` CLI option r=irevoire a=dureuill

# Pull Request

## Related issue
Fix #3272 
Fix #3287

## What does this PR do?

### User standpoint

---

- Adds a `--generate-master-key` CLI flag to generate a fresh Master Key and exit.

<img width="1351" alt="Capture d’écran 2022-12-22 à 14 18 58" src="https://user-images.githubusercontent.com/41078892/209142778-eab52eeb-eaa8-409b-897a-c0d5728c8aaa.png">

---

(relevant fragment of the `--help` message)

<img width="1351" alt="Capture d’écran 2022-12-22 à 14 19 40" src="https://user-images.githubusercontent.com/41078892/209142891-ebfa2ed6-f231-4f76-a3ae-b7542c7aef04.png">

---

- When `meilisearch` is started in the `development` environment and no Master Key has been provided, then the binary prints a warning before starting.

<img width="1351" alt="Capture d’écran 2022-12-22 à 14 14 49" src="https://user-images.githubusercontent.com/41078892/209142158-54eba3b7-bf71-4f3f-8840-0600b13a1a9f.png">

---

- When `meilisearch` is started in the `development` environment and the provided Master Key is shorter than 16 bytes, then the binary prints a warning before starting.

<img width="1351" alt="Capture d’écran 2022-12-22 à 14 15 58" src="https://user-images.githubusercontent.com/41078892/209142295-0209fe47-c03b-424f-a73f-cee9b633137a.png">

---

- When `meilisearch` is started in the `production` environment and no Master Key is provided, the error message is altered to include a freshly generated Master Key.

<img width="1351" alt="Capture d’écran 2022-12-22 à 17 29 02" src="https://user-images.githubusercontent.com/41078892/209180540-0def5798-15db-47f0-a6ec-8cfa081dea77.png">


---

- When `meilisearch` is started in the `production` environment, and the provided Master Key is shorter than 16 bytes, then the binary exits with an error.

<img width="1351" alt="Capture d’écran 2022-12-22 à 17 28 47" src="https://user-images.githubusercontent.com/41078892/209180567-fa54fe33-fbc4-4b9f-b281-7dfb7b33af85.png">


---

This implements the solution B described here: https://github.com/meilisearch/product/discussions/538#discussioncomment-4391346 
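
The behaviour described above boils down to a startup check on the environment and the key length. A hedged sketch with hypothetical types and messages (not the exact ones shipped):

```rust
enum Env {
    Development,
    Production,
}

// Warnings in development, hard errors in production, per the matrix above.
fn check_master_key(env: Env, key: Option<&str>) -> Result<(), String> {
    match (env, key) {
        (Env::Development, None) => {
            eprintln!("warning: no master key provided, the API is unprotected");
            Ok(())
        }
        (Env::Development, Some(key)) if key.len() < 16 => {
            eprintln!("warning: the provided master key is shorter than 16 bytes");
            Ok(())
        }
        (Env::Production, None) => {
            Err("a master key is mandatory in production; generate one with --generate-master-key".into())
        }
        (Env::Production, Some(key)) if key.len() < 16 => {
            Err("the provided master key is shorter than 16 bytes".into())
        }
        _ => Ok(()),
    }
}

fn main() {
    assert!(check_master_key(Env::Production, Some("too-short")).is_err());
    assert!(check_master_key(Env::Development, None).is_ok());
}
```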

### Implementation standpoint

- Add a new `meilisearch-auth::generate_master_key` function that uses a Cryptographic Random Number Generator (CRNG) to fill a vector of 32 bytes before encoding these bytes as base64
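
A minimal sketch of such a function, assuming the `rand` and `base64` (<= 0.21) crates; the actual implementation may differ:

```rust
use rand::rngs::OsRng; // OS-backed cryptographically secure RNG
use rand::RngCore;

/// Fill 32 bytes from a CSPRNG and encode them as base64.
pub fn generate_master_key() -> String {
    let mut bytes = [0u8; 32];
    OsRng.fill_bytes(&mut bytes);
    base64::encode(bytes)
}

fn main() {
    println!("{}", generate_master_key());
}
```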

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
Co-authored-by: Tamo <tamo@meilisearch.com>
2023-01-02 16:00:40 +00:00
Tamo
1692f58b83 slightly update the message associated with the cli parameter + accept an env variable 2023-01-02 16:49:35 +01:00
Tamo
9ba4d0f921 update the error messages according to the spec 2023-01-02 16:43:23 +01:00
Tamo
4b6ffe0cd1 Update meilisearch-auth/src/lib.rs 2023-01-02 16:33:02 +01:00
bors[bot]
336c77aa45 Merge #3245
3245: Enable create_raw_index(...) to specify time r=irevoire a=amab8901

# Pull Request

## Related issue
Partially fixes #2983 

## What does this PR do?
- Enables [`create_raw_index`](660be071b5/index-scheduler/src/lib.rs (L868)) to specify time
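
A hedged sketch of the idea with a hypothetical helper, assuming the `time` crate: when a dump carries a creation instant, use it, otherwise fall back to the current time.

```rust
use time::OffsetDateTime;

// Hypothetical helper: prefer the instant provided by the dump over `now()`.
fn creation_date(from_dump: Option<OffsetDateTime>) -> OffsetDateTime {
    from_dump.unwrap_or_else(OffsetDateTime::now_utc)
}

fn main() {
    let created_at = creation_date(None);
    println!("index created at {created_at:?}");
}
```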

## PR checklist
Please check if your PR fulfills the following requirements:
- [ X ] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [ X ] Have you read the contributing guidelines?
- [ X ] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: amab8901 <amab8901@protonmail.com>
2023-01-02 14:11:39 +00:00
bors[bot]
776acb5ed3 Merge #3276
3276: README: Replace Slack link with Discord r=dureuill a=shivaylamba

# Pull Request

## Related issue
Fixes #3275 

## What does this PR do?

Update Slack link with Discord link in the README

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Shivay Lamba <shivaylamba@gmail.com>
2022-12-29 10:32:42 +00:00
Louis Dureuil
3e9834abff Change error message when the db version is incompatible with engine version. 2022-12-26 17:34:36 +01:00
Louis Dureuil
3cba476a9f Add --generate-master-key CLI option 2022-12-26 10:36:45 +01:00
Louis Dureuil
57e851d8a9 Check for key length 2022-12-26 10:36:45 +01:00
Shivay Lamba
9c45850bd2 README: Replace Slack link with Discord 2022-12-26 00:19:13 +05:30
Louis Dureuil
66e18eae79 auth: add generate_master_key function 2022-12-22 11:55:27 +01:00
amab8901
9a39c4e40d Get date from IndexMetaData 2022-12-22 11:46:17 +01:00
amab8901
df176aaf01 Insert dump_reader.date() into create_raw_index(_) argument 2022-12-21 15:16:31 +01:00
amab8901
0893b175dc Merge branch 'main' into 2983-forward-date-to-milli 2022-12-21 14:31:19 +01:00
amab8901
d5978d11e1 Refactor 2022-12-21 14:28:00 +01:00
bors[bot]
9925309492 Merge #3263
3263: Handle most io error instead of tagging everything as an internal r=dureuill a=irevoire

Fix https://github.com/meilisearch/meilisearch/issues/2255
Fix https://github.com/meilisearch/meilisearch/issues/2785
Close https://github.com/meilisearch/milli/pull/580

- [x] Find a way to catch the `io::Error` contained in `serde_json::Error`: We can't: https://docs.rs/serde_json/latest/serde_json/struct.Error.html
- [x] Check the `grenad::Error` as well => the `grenad::Error::Io` errors are correctly converted to `milli::Error::Io` errors
- [x] Ensure the error codes mean the same thing under Windows
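
A minimal sketch of the approach with a hypothetical error enum (not milli's actual one): keep a dedicated `Io` variant so I/O failures are not collapsed into a generic internal error.

```rust
use std::io;

#[derive(Debug)]
enum DocumentFormatError {
    Io(io::Error),
    Internal(String),
}

impl From<io::Error> for DocumentFormatError {
    fn from(e: io::Error) -> Self {
        DocumentFormatError::Io(e)
    }
}

fn read_payload(path: &str) -> Result<Vec<u8>, DocumentFormatError> {
    // `?` converts the io::Error through the From impl above instead of
    // stringifying it into the `Internal` variant.
    Ok(std::fs::read(path)?)
}

fn main() {
    match read_payload("/does/not/exist") {
        Ok(bytes) => println!("read {} bytes", bytes.len()),
        Err(DocumentFormatError::Io(e)) => eprintln!("io error: {e}"),
        Err(other) => eprintln!("internal error: {other:?}"),
    }
}
```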

Co-authored-by: Tamo <tamo@meilisearch.com>
2022-12-20 17:15:53 +00:00
Tamo
9e0cce5ca4 Update dump/src/error.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-20 18:08:51 +01:00
Tamo
336ea57384 Update dump/src/error.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-20 18:08:44 +01:00
Tamo
c637bfba37 convert all the document format error due to io to io::Error 2022-12-20 17:49:38 +01:00
Tamo
3040172562 update the error message as well 2022-12-20 17:31:13 +01:00
Tamo
52aa34d984 remove an unused error handling file 2022-12-20 16:32:51 +01:00
bors[bot]
2c86d42a44 Merge #3264
3264: Remove macos-latest and windows-latest usages r=curquiza a=curquiza

Related to https://github.com/meilisearch/meilisearch/issues/3109#issuecomment-1359151297

Remove `macos-latest` and `windows-latest` and replace them with specific versions: this will avoid "surprises" in the future when GitHub changes what `latest` points to.
This will also allow us to let the documentation team know about the changes, since we will control the macOS/Windows versions we support

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-20 10:53:37 +00:00
curquiza
8ce3a34ffa Remove macos-latest and windows-latest usages 2022-12-20 11:10:09 +01:00
bors[bot]
259c04eb28 Merge #3261
3261: Use ubuntu-18.04 container instead of GitHub hosted actions r=curquiza a=curquiza

Related to (but does not fix totally) https://github.com/meilisearch/meilisearch/issues/3109 and https://github.com/meilisearch/product/discussions/547#discussioncomment-4109143

## For reviewers, what's the PR changes:
- Use ubuntu-latest where compiling with ubuntu-18.04 is not needed (`update-version-cargo-toml`, `fmt`, `clippy` jobs)
- Where ubuntu-18.04 is required
  - Use `ubuntu-latest` as runner
  - Use `ubuntu:18.04` as Docker container
  - Install the required dependencies (curl and cc)
  - Use `actions-rs/toolchain@v1` instead of `hecrj/setup-rust-action@master`. It's a more stable and more widely followed alternative. Plus, it was easy to make it work with our container, contrary to the old one. The change is applied in all our CIs to be more consistent
- Remove some useless spaces to increase readability.

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-20 09:28:09 +00:00
Tamo
d8fb506c92 handle most io error instead of tagging everything as an internal 2022-12-19 20:50:40 +01:00
amab8901
aa03e02fdc Apply Rustfmt 2022-12-19 19:24:56 +01:00
curquiza
7ef23addb6 Add comment to bring more context 2022-12-19 18:46:27 +01:00
curquiza
b3fce7c366 Remove useless continue-on-error 2022-12-19 18:39:35 +01:00
curquiza
5099a40484 Use ubuntu-18.04 container in publish CIs 2022-12-19 18:35:33 +01:00
bors[bot]
19ee9a828f Merge #3262
3262: Clippy fixes after updating Rust to v1.66 r=curquiza a=dureuill

Ran `cargo clippy --fix`

Fixes the CI.


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-19 14:05:59 +00:00
Louis Dureuil
869d331680 Clippy fixes after updating Rust to v1.66 2022-12-19 14:17:12 +01:00
curquiza
913eff5b2f Use ubuntu-18.04 container in rust tests 2022-12-19 10:46:29 +01:00
amab8901
b4a73f2d74 Remove redundant date-setting 2022-12-16 08:32:44 +01:00
amab8901
4e175ae882 Replace Index::new_with_creation_dates(...) with Index::new(...) 2022-12-16 08:20:13 +01:00
amab8901
5a0a0468df Combine created and added into date 2022-12-16 08:11:12 +01:00
bors[bot]
867279f2a4 Merge #3249
3249: Bring back changes from release-v0.30.3 to main r=curquiza a=curquiza

⚠️ ⚠️ I had to fix git conflicts, ensure I did not lose anything ⚠️ ⚠️ 

Co-authored-by: Kerollmops <clement@meilisearch.com>
Co-authored-by: Tamo <tamo@meilisearch.com>
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-15 14:13:30 +00:00
Louis Dureuil
ce84a59873 Re-apply some changes from #3132 2022-12-14 20:02:39 +01:00
Tamo
d66bb3a53f rename the two new functions 2022-12-14 17:27:43 +01:00
Tamo
6c0b8edab5 Fix typos
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-14 17:27:37 +01:00
Tamo
fbbc6eaeca Fix the import of dumps and snapshot.
Some flags were badly applied + the database wrongly deleted when they shouldn't
2022-12-14 17:27:28 +01:00
Kerollmops
60c3bac108 Bump milli to v0.37.3 2022-12-14 17:25:40 +01:00
bors[bot]
9491fe0704 Merge #3247
3247: Re-add push in docker CI r=curquiza a=curquiza

I made a mistake here https://github.com/meilisearch/meilisearch/pull/3229, `push` is not `true` by default, see https://github.com/docker/build-push-action#customizing

Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
2022-12-14 13:15:41 +00:00
Clémentine Urquizar - curqui
240c73d292 Re-add push 2022-12-14 14:05:25 +01:00
amab8901
d3eb8d2d5c Enable create_raw_index(...) to specify time 2022-12-14 10:44:25 +01:00
bors[bot]
660be071b5 Merge #3236
3236: Improves clarity of the code that receives payloads r=Kerollmops a=Kerollmops

This PR makes small changes to #3164. It improves the clarity and simplicity of some parts of the code.

Co-authored-by: Kerollmops <clement@meilisearch.com>
2022-12-13 18:20:24 +00:00
bors[bot]
89542d7d8b Merge #3241
3241: Remove core mention r=curquiza a=curquiza

No impact for the users or the team

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-13 17:35:50 +00:00
curquiza
f62e7a3501 Remove core mention 2022-12-13 17:34:43 +01:00
Kerollmops
a08cc82983 Revert "Simplify the code when array_each failed"
This reverts commit 271685cceb.
2022-12-13 16:29:49 +01:00
Kerollmops
7b2f2a4f9c Do only one conversion to u64 2022-12-13 15:31:55 +01:00
Kerollmops
5d5615ef45 Rename the ReceivePayload error variant 2022-12-13 15:07:35 +01:00
Kerollmops
526793b5b2 Handle empty arrays the same way we handle other arrays 2022-12-13 14:58:40 +01:00
Kerollmops
271685cceb Simplify the code when array_each failed 2022-12-13 14:58:05 +01:00
bors[bot]
1af590d3bc Merge #3234
3234: Update README.md r=curquiza a=tpayet

Change Slack link to Discord link

Co-authored-by: Thomas Payet <thomas@meilisearch.com>
2022-12-13 11:41:10 +00:00
bors[bot]
dab2634ca8 Merge #3164
3164: Improve the way we receive the documents payload r=Kerollmops a=jiangbo212

# Pull Request

## Related issue
Fixes #3037 

## What does this PR do?
- writing the payload to a temporary file via BufWriter
- deserialising the json temporary file to an array of Objects by means of a memory map
- deserialising the csv temporary file by means of a memory map
- Adapted some read_json tests
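
A minimal sketch of that flow, assuming the `tempfile`, `memmap2` and `serde_json` crates (the exact crates and steps used by the PR may differ):

```rust
use std::io::{BufWriter, Write};

use memmap2::Mmap;
use serde_json::Value;

// Buffer the raw payload into an anonymous temporary file, then deserialize it
// through a read-only memory map instead of keeping the whole body in memory.
fn read_json_payload(payload: &[u8]) -> serde_json::Result<Vec<Value>> {
    let file = tempfile::tempfile().expect("failed to create a temporary file");
    let mut writer = BufWriter::new(file);
    writer.write_all(payload).expect("failed to write the payload");
    let file = writer.into_inner().expect("failed to flush the payload");

    // SAFETY: the temporary file is private to this process and never mutated
    // while the map is alive.
    let mmap = unsafe { Mmap::map(&file).expect("failed to memory-map the payload") };
    serde_json::from_slice(&mmap)
}

fn main() {
    let documents = read_json_payload(br#"[{"id": 1}, {"id": 2}]"#).unwrap();
    println!("received {} documents", documents.len());
}
```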

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: jiangbo212 <peiyaoliukuan@gmail.com>
Co-authored-by: jiangbo212 <peiyaoliukuan@126.com>
2022-12-13 10:58:24 +00:00
Thomas Payet
8a7f90250c Update README.md
Change Slack link to Discord link
2022-12-13 10:46:05 +01:00
jiangbo212
23c1b223b3 Merge branch 'fix-3037' of github.com:jiangbo212/meilisearch into fix-3037 2022-12-13 10:41:50 +08:00
jiangbo212
87ae0032bf review change 2022-12-13 10:41:43 +08:00
jiangbo212
7c24fea9f2 Merge branch 'main' into fix-3037 2022-12-13 05:16:03 +08:00
jiangbo212
27d1bee0bb Merge branch 'main' into fix-3037-new 2022-12-12 22:16:22 +08:00
jiangbo212
b1c3174061 fix fmt 2022-12-12 22:06:24 +08:00
jiangbo212
fa46dfb7bb fmt fix 2022-12-12 22:02:56 +08:00
bors[bot]
40d9b73aaf Merge #3223
3223: Bring back release-v0.30.2 changes into main r=irevoire a=curquiza

Only bring back the necessary changes from `release-v0.30.2` to `main`, following v0.30.2 release

Co-authored-by: Tamo <tamo@meilisearch.com>
Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-12 13:49:01 +00:00
jiangbo212
169682d3ec Merge branch 'main' into fix-3037-new 2022-12-12 21:36:10 +08:00
bors[bot]
21b926cb00 Merge #3224
3224: Fix update-cargo-toml-version.yml r=curquiza a=mohitsaxenaknoldus

# Pull Request

## Related issue
Fixes #3219 

## What does this PR do?
- ...

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Mohit Saxena <76725454+mohitsaxenaknoldus@users.noreply.github.com>
2022-12-12 13:27:46 +00:00
bors[bot]
34a6f2598b Merge #3229
3229: Add a nightly CI: create every day a `nightly` Docker tag based on the latest commit on `main` r=Kerollmops a=curquiza

Also, fixes #3195

Easy to follow with the commits
- In the Docker CI:
  - create every day a `nightly` Docker tag based on the latest commit on `main`
  - check if the release is the latest one, before creating the `latest` Docker tag. A script has been added.
  - add the `workflow_dispatch` event to trigger the CI to build the `nightly` tag when we want (always on the latest commit on `main`)
- In multiple CIs: replace the `released` type by `published`, see [here](https://stackoverflow.com/questions/59319281/github-action-different-between-release-created-and-published) why. Will not impact anything, but will prevent our future automation from failing
- Remove a useless CI (code coverage, not used for 1 year)
- Remove useless lines (comments and CI logic) that don't have any impact

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-12 10:46:33 +00:00
curquiza
14824cee86 Remove obsolete comment line 2022-12-11 21:46:48 +01:00
curquiza
796e61ec7e Remove useless CI 2022-12-11 21:29:23 +01:00
curquiza
9a3f9577b8 Remove useless line in CI 2022-12-11 21:26:05 +01:00
curquiza
2c8eb92537 Check before publish latest 2022-12-11 21:24:52 +01:00
Mohit Saxena
1bf5c0edb9 Update update-cargo-toml-version.yml 2022-12-10 23:04:26 +05:30
curquiza
b1ffbe561e Add nightly for docker CI 2022-12-09 20:06:59 +01:00
curquiza
84204b8cd5 Replace the released type by published 2022-12-09 19:27:58 +01:00
Mohit Saxena
346fca5608 Update update-cargo-toml-version.yml 2022-12-09 00:20:51 +05:30
curquiza
4631f4d97f Bump milli to v0.37.2 2022-12-08 18:16:48 +01:00
Tamo
6f1c30b247 Fix the instance-uid in the data.ms
We were writing the instance-uid as bytes instead of as a string in the data.ms, and thus we were unable to parse it later.
It was also less practical for our users to retrieve it and send it to us.
2022-12-08 18:16:43 +01:00
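
A hedged sketch of the idea behind the fix, with a hypothetical file layout and the `uuid` crate (v4 feature): store the instance UID in its hyphenated string form so it can be parsed back later.

```rust
use std::fs;
use std::path::Path;

use uuid::Uuid;

fn write_instance_uid(db_path: &Path, uid: Uuid) -> std::io::Result<()> {
    // Writing `uid.as_bytes()` would persist the raw 16 bytes and make the file
    // unparseable as text; the hyphenated string stays readable and parseable.
    fs::write(db_path.join("instance-uid"), uid.to_string())
}

fn read_instance_uid(db_path: &Path) -> Option<Uuid> {
    let raw = fs::read_to_string(db_path.join("instance-uid")).ok()?;
    Uuid::parse_str(raw.trim()).ok()
}

fn main() {
    let dir = std::env::temp_dir();
    let uid = Uuid::new_v4();
    write_instance_uid(&dir, uid).unwrap();
    assert_eq!(read_instance_uid(&dir), Some(uid));
}
```
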
bors[bot]
abba54e913 Merge #3112
3112: Rename meilisearch-http r=Kerollmops a=colbsmcdolbs

# Pull Request

## Related issue
Fixes #3073 

## What does this PR do?
- Renames all references of `meilisearch-http` to `meilisearch`
- Might need to be rebased before the 1.0.0 release

## PR checklist
Please check if your PR fulfills the following requirements:
- [X] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [X] Have you read the contributing guidelines?
- [X] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Colby Allen <colbyjayallen@gmail.com>
2022-12-08 16:32:08 +00:00
bors[bot]
c426fa1478 Merge #3212
3212: Setup COMMIT_SHA and COMMIT_DATE build args in the Docker image r=curquiza a=brunoocasali

GitHub auto-closed my PR when I synced changes with my remote 🤷‍♂️  https://github.com/meilisearch/meilisearch/pull/2550
The last PR #3205 was closed to help `@curquiza` test the CI.

In any case, the summary of changes is quite similar:

- Fix `git` usage from my last attempt: when you use `actions/checkout`, you get the `git` command to use.
- Add the `build-args` definition from https://github.com/docker/build-push-action#inputs, which is supposed to work precisely like `docker build --build-arg`.

Fixes https://github.com/meilisearch/meilisearch/issues/2028

The result will be like this:

<img width="556" alt="image" src="https://user-images.githubusercontent.com/4116980/206019608-2713559a-1f58-4ff3-9fec-7720783993ac.png">

Co-authored-by: Bruno Casali <brunoocasali@gmail.com>
2022-12-08 16:07:50 +00:00
bors[bot]
574be942cd Merge #3221
3221: Update README to reference Meilisearch Cloud r=curquiza a=davelarkan

# Pull Request

## Related issue
Fixes #3220

## What does this PR do?
- Updates the README to link to the Pricing page where people can choose a Meilisearch Cloud plan

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Dave Larkan <davelarkan@gmail.com>
2022-12-08 15:43:43 +00:00
Colby Allen
2262766494 chore: run fmt nightly on project 2022-12-08 08:31:15 -07:00
Colby Allen
ad2b1467da Renames meilisearch-http to meilisearch 2022-12-08 08:22:53 -07:00
Dave Larkan
ee37d5e724 Update README to reference Meilisearch Cloud 2022-12-08 15:02:34 +00:00
bors[bot]
ded2a50d14 Merge #3216
3216: Update version for the next release (v1.0.0) in Cargo.toml files r=curquiza a=meili-bot

⚠️ This PR is automatically generated. Check the new version is the expected one before merging.

Co-authored-by: curquiza <curquiza@users.noreply.github.com>
2022-12-08 13:49:50 +00:00
Bruno Casali
58327979f1 Use correct env vars "VERGEN_*" on Dockerfile 2022-12-08 10:48:16 -03:00
Bruno Casali
50d9fe036e Setup COMMIT_SHA and COMMIT_DATE build args in the Docker image 2022-12-08 10:48:16 -03:00
curquiza
026cf223b3 Update version for the next release (v1.0.0) in Cargo.toml files 2022-12-08 12:20:17 +00:00
bors[bot]
af6f7f8462 Merge #3215
3215: Use nightly in cargo fmt r=curquiza a=curquiza

Discussed with `@Kerollmops`, needs this change

Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
2022-12-08 10:53:24 +00:00
Clémentine Urquizar - curqui
5023d36ee7 Use nightly in cargo fmt 2022-12-08 11:51:13 +01:00
bors[bot]
f4dc4c5d8d Merge #3210
3210: Fix `MDB_PAGE_FULL` by bumping LMDB r=Kerollmops a=Kerollmops

This PR fixes #3062 by upgrading LMDB to the latest version.

The changes were made in https://github.com/meilisearch/lmdb/pull/1 and https://github.com/meilisearch/lmdb-rs/pull/12. As heed directly depends on the latest main commit of https://github.com/meilisearch/lmdb-rs, we can bump the `lmdb-rkv-sys` dependency in the Meilisearch _Cargo.lock_ by doing a:

```
cargo update -p lmdb-rkv-sys
```

Co-authored-by: Kerollmops <clement@meilisearch.com>
2022-12-07 16:21:23 +00:00
jiangbo212
717dd36547 Merge branch 'fix-3037' of github.com:jiangbo212/meilisearch into fix-3037 2022-12-07 22:54:16 +08:00
jiangbo212
538030c2da change NameTempFile to tempfile() 2022-12-07 22:47:32 +08:00
Kerollmops
1d5294d11a Bump lmdb version 2022-12-07 15:29:56 +01:00
bors[bot]
34c3e5ec5e Merge #3208
3208: Stop snapshotting the version of meilisearch in the dump r=Kerollmops a=irevoire

It might change, and we don't want to update this test every time we make a new release.


Co-authored-by: Tamo <tamo@meilisearch.com>
2022-12-07 12:54:55 +00:00
Tamo
1c3a326199 stop snapshotting the version of meilisearch in the dump
It might change and we don't want to update this test every time we make a new release.
2022-12-07 13:26:02 +01:00
bors[bot]
34c0f11c26 Merge #3207
3207: Add release check when starting latest CI r=Kerollmops a=curquiza

Adding this to have the same kind of check before starting to move the latest tag

<img width="737" alt="Capture d’écran 2022-12-07 à 12 18 33" src="https://user-images.githubusercontent.com/20380692/206165868-18a2be7c-78ec-48c9-acb9-d7f60797c2e3.png">

Also, removing an unused script

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-07 11:27:47 +00:00
curquiza
be300138e4 Add release check when starting latest CI 2022-12-07 12:22:44 +01:00
bors[bot]
2ed6017603 Merge #3204
3204: Bring back v0.30.1 changes to `main` r=curquiza a=curquiza

I was not able to just import `release-v0.30.1` to `main`, see:
<img width="1371" alt="Capture d’écran 2022-12-06 à 20 03 50" src="https://user-images.githubusercontent.com/20380692/206000844-b39b3063-7da2-475f-b3e4-1791c39a7c2f.png">

So I cherry-picked the commits.

⚠️ ⚠️ ⚠️ I had a git conflict here

<img width="730" alt="Capture d’écran 2022-12-06 à 20 09 04" src="https://user-images.githubusercontent.com/20380692/206001007-f56bc28f-c0b1-46a0-bb60-cce4e73b9584.png">


⚠️ ⚠️ ⚠️ Check out carefully how I fixed it


Co-authored-by: curquiza <curquiza@users.noreply.github.com>
Co-authored-by: Kerollmops <clement@meilisearch.com>
2022-12-07 11:08:37 +00:00
Kerollmops
c1337f9e08 Update dump snap to new version 2022-12-07 11:48:29 +01:00
bors[bot]
9acac28574 Merge #3128
3128: Bumps cargo_toml version to most up to date r=curquiza a=colbsmcdolbs

# Pull Request

## Related issue
Fixes #3127

## What does this PR do?
- The README of this repository declares that one package is not up to date. In order to ensure Due Diligence, I have bumped the version number of the package. No test failures running on Windows.

## PR checklist
Please check if your PR fulfills the following requirements:
- [X] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [X] Have you read the contributing guidelines?
- [X] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Colby Allen <colbyjayallen@gmail.com>
2022-12-07 10:31:25 +00:00
jiangbo212
cb1d184904 fmt fix 2022-12-07 17:04:24 +08:00
jiangbo212
2841b09789 Merge branch 'meilisearch:main' into fix-3037 2022-12-07 16:30:21 +08:00
jiangbo212
35f3dd68b6 error change and tokio file use change 2022-12-07 16:20:36 +08:00
Kerollmops
f1de3aa75a Make the tests use MB to trigger page size issues 2022-12-06 20:10:10 +01:00
Kerollmops
e4e4370a3c Clamp the databases size to the page size 2022-12-06 20:09:49 +01:00
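
A minimal sketch of what "clamping to the page size" means here (illustrative numbers; the real check lives in the storage layer): LMDB expects the memory-map size to be a multiple of the OS page size, so a requested size is rounded down to one.

```rust
// Round a requested map size down to a whole number of OS pages.
fn clamp_to_page_size(requested: usize, page_size: usize) -> usize {
    requested - requested % page_size
}

fn main() {
    let page_size = 4096;
    // 10 MiB + 123 bytes is not page-aligned; the extra bytes are dropped.
    assert_eq!(clamp_to_page_size(10 * 1024 * 1024 + 123, page_size), 10 * 1024 * 1024);
}
```
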
Kerollmops
24c79b79f9 Bump milli to v0.37.1 2022-12-06 20:05:52 +01:00
curquiza
5db7c4057c Update version for the next release (v0.30.1) in Cargo.toml files 2022-12-06 20:05:46 +01:00
bors[bot]
2867d2e91a Merge #3190
3190: Fix the dump date-import of the dumpv4 r=irevoire a=irevoire

# Pull Request
After merging https://github.com/meilisearch/meilisearch/pull/3012 I realized that the tests on the date of the dump-v4 were still ignored, so I fixed them and then noticed #3012 wasn't working properly.

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/2987 a second time


`@funilrys` since you wrote most of the code you might be interested, but don't feel obligated to review this code. 
Someone from the team will double-check it works 😁 

Co-authored-by: Tamo <tamo@meilisearch.com>
2022-12-06 10:47:00 +00:00
bors[bot]
1458a12531 Merge #3197
3197: Revert "Upgrade alpine 3.16 to 3.17" r=irevoire a=curquiza

Reverts meilisearch/meilisearch#3189

Because `rust:alpine3.17` does not exist, and our scheduled CI failed: https://github.com/meilisearch/meilisearch/actions/runs/3626327181

`@ivanionut` for your information, I'm sorry, I should have checked better before accepting the PR, this is my bad


Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
2022-12-06 10:25:11 +00:00
Clémentine Urquizar - curqui
cbb8d0f97b Revert "Upgrade alpine 3.16 to 3.17" 2022-12-06 11:09:57 +01:00
Tamo
bef81065f9 return the same time in case we didn't find a created or updated at 2022-12-06 11:03:23 +01:00
Tamo
180511795b Update dump/src/reader/v4/mod.rs fix typo
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-06 10:53:43 +01:00
bors[bot]
3bef6e6690 Merge #3175
3175: Rename dump command from --dumps-dir to --dump-dir r=dureuill a=dureuill

# Pull Request

## Related issue
Fixes #3132 

## What does this PR do?
- Rename the dump commands, env variables and default config

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-12-06 09:49:42 +00:00
bors[bot]
9b23885e85 Merge #3188
3188: re-enable the dump test on the dates r=irevoire a=irevoire

I just noticed that we have the real date in the dump-v1, contrary to the dump-v2/3/4/5, thus we can ensure it doesn't change unexpectedly 👍 

Co-authored-by: Tamo <tamo@meilisearch.com>
2022-12-05 18:10:22 +00:00
bors[bot]
8b46093117 Merge #3189
3189: Upgrade alpine 3.16 to 3.17 r=curquiza a=ivanionut

Upgrade alpine 3.16 to 3.17

Co-authored-by: Ivan Ionut <ivan.ionut@gmail.com>
2022-12-05 17:39:10 +00:00
Tamo
9c89e3dadc uncomment more test for the dump v4 2022-12-05 18:15:29 +01:00
Tamo
b0cf431614 Fix the dump date-import of the dumpv4 2022-12-05 18:08:35 +01:00
Ivan Ionut
afe520a67e Upgrade alpine 3.16 to 3.17 2022-12-05 17:49:15 +01:00
Tamo
688911ed34 re-enable the dump test on the dates 2022-12-05 17:05:37 +01:00
bors[bot]
776af129bf Merge #3012
3012: Extract the dates out of the dumpv4. r=irevoire a=funilrys

Hi there, 

please review this PR that tries to fix #2987. I'm still learning Rust and I found that #2987 is an excellent way for me to read and learn what others do with Rust. So please excuse my semantics ...

Stay safe and healthy.

---

# Pull Request

This patch possibly fixes #2987.

This patch introduces a way to fill the IndexMetadata.created_at and IndexMetadata.updated_at keys from the tasks events. This is done by reading the creation date of the first event (created_at) and the creation date of the last event (updated_at).
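
A minimal sketch of that logic, assuming the `time` crate and a hypothetical task-event type; when no event exists, both dates fall back to one shared instant (see the "return the same time" commit above).

```rust
use time::OffsetDateTime;

struct TaskEvent {
    created_at: OffsetDateTime,
}

// The first event's creation date becomes `created_at`, the last one's becomes
// `updated_at`; with no events, both fall back to the same current instant.
fn index_dates(events: &[TaskEvent]) -> (OffsetDateTime, OffsetDateTime) {
    match (events.first(), events.last()) {
        (Some(first), Some(last)) => (first.created_at, last.created_at),
        _ => {
            let now = OffsetDateTime::now_utc();
            (now, now)
        }
    }
}

fn main() {
    let (created_at, updated_at) = index_dates(&[]);
    assert_eq!(created_at, updated_at);
}
```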


## Related issue
Fixes #2987

## What does this PR do?
- Extract the dates out of the dumpv4.

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: funilrys <contact@funilrys.com>
2022-12-05 15:57:07 +00:00
Louis Dureuil
492fd2829a use a consistent dump directory name in tests
changed from 'dump' to 'dumps' to be consistent with the default settings

Co-authored-by: Tamo <tamo@meilisearch.com>
2022-12-05 16:56:28 +01:00
bors[bot]
ffa6d1ed4e Merge #3186
3186: Update mini-dashboard to v0.2.4 r=curquiza a=mdubus



Co-authored-by: Morgane Dubus <30866152+mdubus@users.noreply.github.com>
2022-12-05 14:45:27 +00:00
Morgane Dubus
293efb7485 Update Cargo.toml 2022-12-05 14:54:01 +01:00
jiangbo212
6766712840 fmt fix 2022-12-04 23:05:34 +08:00
jiangbo212
980776b646 test fail fix 2022-12-04 22:31:23 +08:00
jiangbo212
6bdd37beb8 tokio file write update 2022-12-04 18:25:06 +08:00
funilrys
8b6eba4f0b Apply fmt. 2022-12-03 17:47:02 +01:00
funilrys
e510ace179 fixup! Re-open tasks queue. 2022-12-03 17:41:33 +01:00
funilrys
f056fc118f Re-open tasks queue.
Indeed, before this patch, I was (probably) breaking every usage
of the tasks BufReader. This patch solves the issue by reopening
the tasks file every time it's needed.
2022-12-03 17:29:41 +01:00
jiangbo212
5a770ffe47 test fail fix 2022-12-03 22:48:38 +08:00
jiangbo212
7b08d700f7 requested changes fix 2022-12-03 18:52:20 +08:00
jiangbo212
c63748723d Merge branch 'meilisearch:main' into fix-3037 2022-12-03 15:56:41 +08:00
bors[bot]
40339919ad Merge #3170 #3180 #3181 #3183
3170: Re-enable importing from dumps v1 r=irevoire a=dureuill

# Pull Request

## Related issue
Fixes #2985 

## What does this PR do?

### User standpoint

- Allows importing dumps version 1 (exported with Meilisearch <=v0.20) to modern-day Meilisearch
- Tasks of type "Customs" are skipped, with a warning
- Tasks of status "enqueued" are skipped, with a warning
- The "WordsPosition" ranking rule is skipped when encountered in the ranking rules, with a warning.

After an import from a v1 dump, it is recommended that a user checks each index and its settings.

### Implementation standpoint

- Add a dump v1 reader based on the one by `@irevoire` 
- Add a v1_to_v2 compatibility layer based on the v2_to_v3 one
  - as v2 requires UUIDs, the v1 indexes are mapped to UUIDs built from their position in the metadata file: the first index is given the all-zeroes UUID, the second one UUID `00000000-0000-0000-0000-000000000001`, and so on... This should have no bearing on the final indexes because v6 is not using UUIDs, but this allows us to correctly identify which tasks belong to which index.
- Modify the v2_to_v3 compatibility layer to account for the fact that the reader can actually be a v1_to_v2 compat layer
- Make some base dump types Clone
- impl Display for v2::settings::Criterion
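
Regarding the UUID mapping described above, a minimal sketch assuming the `uuid` crate: the index's position in the v1 metadata file becomes a deterministic UUID.

```rust
use uuid::Uuid;

// Index 0 maps to the all-zeroes UUID, index 1 to ...-000000000001, and so on.
fn uuid_from_position(position: u128) -> Uuid {
    Uuid::from_u128(position)
}

fn main() {
    assert_eq!(
        uuid_from_position(1).to_string(),
        "00000000-0000-0000-0000-000000000001"
    );
}
```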

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


3180: Bump mislav/bump-homebrew-formula-action from 1 to 2 r=curquiza a=dependabot[bot]

Bumps [mislav/bump-homebrew-formula-action](https://github.com/mislav/bump-homebrew-formula-action) from 1 to 2.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/mislav/bump-homebrew-formula-action/releases">mislav/bump-homebrew-formula-action's releases</a>.</em></p>
<blockquote>
<h2>bump-homebrew-formula 2.0</h2>
<h2>What's Changed</h2>
<ul>
<li>Use Node 16 by <a href="https://github.com/chenrui333"><code>`@​chenrui333</code></a>` in <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/36">mislav/bump-homebrew-formula-action#36</a></li>
<li>Bump minimist from 1.2.5 to 1.2.6 by <a href="https://github.com/dependabot"><code>`@​dependabot</code></a>` in <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/33">mislav/bump-homebrew-formula-action#33</a></li>
<li>Bump node-fetch from 2.6.6 to 2.6.7 by <a href="https://github.com/dependabot"><code>`@​dependabot</code></a>` in <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/34">mislav/bump-homebrew-formula-action#34</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/chenrui333"><code>`@​chenrui333</code></a>` made their first contribution in <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/36">mislav/bump-homebrew-formula-action#36</a></li>
</ul>
<h2>bump-homebrew-formula 1.16</h2>
<ul>
<li>Replaces broken v1.15 tag, thanks <a href="https://github.com/hendrikmaus"><code>`@​hendrikmaus</code></a>` <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/issues/32">mislav/bump-homebrew-formula-action#32</a></li>
<li>Add <code>push-to</code> option, thanks <a href="https://github.com/codefromthecrypt"><code>`@​codefromthecrypt</code></a>` <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/30">mislav/bump-homebrew-formula-action#30</a></li>
<li>Fix syntax error, thanks <a href="https://github.com/hendrikmaus"><code>`@​hendrikmaus</code></a>` <a href="https://github.com/wata727"><code>`@​wata727</code></a>` <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/27">mislav/bump-homebrew-formula-action#27</a></li>
<li>Ensure repeated placeholders in <code>commit-message</code> are expanded, thanks <a href="https://github.com/hendrikmaus"><code>`@​hendrikmaus</code></a>` <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/pull/29">mislav/bump-homebrew-formula-action#29</a></li>
</ul>
<h2>bump-homebrew-formula 1.14</h2>
<ul>
<li>Ignore HTTP 409 error when fast-forwading the main branch of <code>homebrew-tap</code> fork</li>
</ul>
<h2>bump-homebrew-formula 1.13</h2>
<ul>
<li>Add <code>create-pullrequest</code> input to control whether or not a PR is submitted to <code>homebrew-tap</code></li>
<li>Add <code>download-sha256</code> input to define the SHA256 checksum of the archive at <code>download-url</code></li>
<li>Fix creating a new branch in the forked repo failing with HTTP 404</li>
</ul>
<h2>bump-homebrew-formula 1.12</h2>
<ul>
<li>Fix Actions CJS loader halting on <code>foo?.bar</code> JS syntax</li>
</ul>
<h2>bump-homebrew-formula 1.11</h2>
<ul>
<li>New optional <code>formula-path</code> input accepts the filename of the formula file to edit (default <code>Formula/&lt;formula-name&gt;.rb</code>).</li>
<li>Remove <code>revision N</code> lines when bumping Homebrew formulae.</li>
</ul>
<h2>bump-homebrew-formula 1.10</h2>
<ul>
<li>The new optional <code>tag-name</code> input allows this action to be <a href="https://docs.github.com/en/actions/managing-workflow-runs/manually-running-a-workflow">manually triggered via <code>workflow_dispatch</code></a> instead of on git push to a tag.</li>
</ul>
<h2>bump-homebrew-formula 1.9</h2>
<ul>
<li>Fix following multiple HTTP redirects while calculating checksum for <code>download-url</code></li>
</ul>
<h2>bump-homebrew-formula 1.8</h2>
<ul>
<li>Enable JavaScript source maps for better failure debugging</li>
</ul>
<h2>bump-homebrew-formula 1.7</h2>
<ul>
<li>
<p>Allow <code>download-url</code> as input parameter</p>
</li>
<li>
<p>Add support for git-based <code>download-url</code></p>
</li>
</ul>
<h2>bump-homebrew-formula 1.6</h2>
<ul>
<li>Control the git commit message template being used for updating the formula file via the <code>commit-message</code> action input</li>
</ul>
<h2>bump-homebrew-formula 1.5</h2>
<ul>
<li>Support detection version from <code>https://github.com/OWNER/REPO/releases/download/TAG/FILE</code> download URLs</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="fcd7e28e54"><code>fcd7e28</code></a> lib</li>
<li><a href="33989a8502"><code>33989a8</code></a> Merge branch 'main' into v2</li>
<li><a href="5983bb6c59"><code>5983bb6</code></a> Improve extracting complex tag names from URLs</li>
<li><a href="9750a1166b"><code>9750a11</code></a> lib</li>
<li><a href="64410e9c96"><code>64410e9</code></a> v2</li>
<li><a href="677d7482a3"><code>677d748</code></a> Merge pull request <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/issues/36">#36</a> from chenrui333/node-16</li>
<li><a href="f364e76079"><code>f364e76</code></a> also update some build dependencies</li>
<li><a href="c08fd9bee5"><code>c08fd9b</code></a> deps: update to use nodev16</li>
<li><a href="280f532e9a"><code>280f532</code></a> Merge pull request <a href="https://github-redirect.dependabot.com/mislav/bump-homebrew-formula-action/issues/34">#34</a> from mislav/dependabot/npm_and_yarn/node-fetch-2.6.7</li>
<li><a href="5d94a66af3"><code>5d94a66</code></a> Bump node-fetch from 2.6.6 to 2.6.7</li>
<li>Additional commits viewable in <a href="https://github.com/mislav/bump-homebrew-formula-action/compare/v1...v2">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mislav/bump-homebrew-formula-action&package-manager=github_actions&previous-version=1&new-version=2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting ``@dependabot` rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- ``@dependabot` rebase` will rebase this PR
- ``@dependabot` recreate` will recreate this PR, overwriting any edits that have been made to it
- ``@dependabot` merge` will merge this PR after your CI passes on it
- ``@dependabot` squash and merge` will squash and merge this PR after your CI passes on it
- ``@dependabot` cancel merge` will cancel a previously requested merge and block automerging
- ``@dependabot` reopen` will reopen this PR if it is closed
- ``@dependabot` close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- ``@dependabot` ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- ``@dependabot` ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- ``@dependabot` ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>

3181: Bump Swatinem/rust-cache from 2.0.0 to 2.2.0 r=curquiza a=dependabot[bot]

Bumps [Swatinem/rust-cache](https://github.com/Swatinem/rust-cache) from 2.0.0 to 2.2.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/Swatinem/rust-cache/releases">Swatinem/rust-cache's releases</a>.</em></p>
<blockquote>
<h2>v2.2.0</h2>
<ul>
<li>Add new <code>save-if</code> option to always restore, but only conditionally save the cache.</li>
</ul>
<h2>v2.1.0</h2>
<ul>
<li>Only hash <code>Cargo.{lock,toml}</code> files in the configured workspace directories.</li>
</ul>
<h2>v2.0.2</h2>
<ul>
<li>Avoid calling cargo metadata on pre-cleanup.</li>
<li>Added <code>prefix-key</code>, <code>cache-directories</code> and <code>cache-targets</code> options.</li>
</ul>
<h2>v2.0.1</h2>
<ul>
<li>Primarily just updating dependencies to fix GitHub deprecation notices.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/Swatinem/rust-cache/blob/master/CHANGELOG.md">Swatinem/rust-cache's changelog</a>.</em></p>
<blockquote>
<h2>2.2.0</h2>
<ul>
<li>Add new <code>save-if</code> option to always restore, but only conditionally save the cache.</li>
</ul>
<h2>2.1.0</h2>
<ul>
<li>Only hash <code>Cargo.{lock,toml}</code> files in the configured workspace directories.</li>
</ul>
<h2>2.0.2</h2>
<ul>
<li>Avoid calling <code>cargo metadata</code> on pre-cleanup.</li>
<li>Added <code>prefix-key</code>, <code>cache-directories</code> and <code>cache-targets</code> options.</li>
</ul>
<h2>2.0.1</h2>
<ul>
<li>Primarily just updating dependencies to fix GitHub deprecation notices.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="359a70e43a"><code>359a70e</code></a> 2.2.0</li>
<li><a href="ecee04e7b3"><code>ecee04e</code></a> feat: add save-if option, closes <a href="https://github-redirect.dependabot.com/Swatinem/rust-cache/issues/66">#66</a> (<a href="https://github-redirect.dependabot.com/Swatinem/rust-cache/issues/91">#91</a>)</li>
<li><a href="b894d59a8d"><code>b894d59</code></a> 2.1.0</li>
<li><a href="e78327dd9e"><code>e78327d</code></a> small code style improvements, README and CHANGELOG updates</li>
<li><a href="ccdddcc049"><code>ccdddcc</code></a> only hash Cargo.toml/Cargo.lock that belong to a configured workspace (<a href="https://github-redirect.dependabot.com/Swatinem/rust-cache/issues/90">#90</a>)</li>
<li><a href="b5ec9edd91"><code>b5ec9ed</code></a> 2.0.2</li>
<li><a href="3f2513fdf4"><code>3f2513f</code></a> avoid calling cargo metadata on pre-cleanup</li>
<li><a href="19c46583c5"><code>19c4658</code></a> update dependencies</li>
<li><a href="b8e72aae83"><code>b8e72aa</code></a> Added <code>prefix-key</code> <code>cache-directories</code> and <code>cache-targets</code> options (<a href="https://github-redirect.dependabot.com/Swatinem/rust-cache/issues/85">#85</a>)</li>
<li><a href="22c9328bcb"><code>22c9328</code></a> 2.0.1</li>
<li>Additional commits viewable in <a href="https://github.com/Swatinem/rust-cache/compare/v2.0.0...v2.2.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=Swatinem/rust-cache&package-manager=github_actions&previous-version=2.0.0&new-version=2.2.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting ``@dependabot` rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- ``@dependabot` rebase` will rebase this PR
- ``@dependabot` recreate` will recreate this PR, overwriting any edits that have been made to it
- ``@dependabot` merge` will merge this PR after your CI passes on it
- ``@dependabot` squash and merge` will squash and merge this PR after your CI passes on it
- ``@dependabot` cancel merge` will cancel a previously requested merge and block automerging
- ``@dependabot` reopen` will reopen this PR if it is closed
- ``@dependabot` close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- ``@dependabot` ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- ``@dependabot` ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- ``@dependabot` ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>

3183: Use ubuntu-latest when not impacting r=Kerollmops a=curquiza

Minor changes
- Use `ubuntu-latest` for CI where there is no compilation
- rename one of the workflows (obsolete name)

Co-authored-by: Louis Dureuil <louis@meilisearch.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-01 17:33:59 +00:00
curquiza
7f3653ec31 Use ubuntu-latest when not impacting 2022-12-01 18:23:29 +01:00
bors[bot]
a1f5ec1e9e Merge #3179
3179: Bump svenstaro/upload-release-action from 1.pre.release to 2.3.0 r=curquiza a=dependabot[bot]

Bumps [svenstaro/upload-release-action](https://github.com/svenstaro/upload-release-action) from 1.pre.release to 2.3.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/svenstaro/upload-release-action/releases">svenstaro/upload-release-action's releases</a>.</em></p>
<blockquote>
<h2>2.3.0</h2>
<ul>
<li>Now defaults <code>repo_token</code> to <code>${{ github.token }}</code> and <code>tag</code> to <code>${{ github.ref }}</code> <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/69">#69</a> (thanks <a href="https://github.com/leighmcculloch"><code>`@​leighmcculloch</code></a>)**</li>`
</ul>
<h2>2.2.1</h2>
<ul>
<li>Added support for the GitHub pagination API for repositories with many releases <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/36">#36</a> (thanks <a href="https://github.com/djpohly"><code>`@​djpohly</code></a>)</li>`
</ul>
<h2>2.2.0</h2>
<ul>
<li>Add support for ceating a new release in a foreign repository <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/25">#25</a> (thanks <a href="https://github.com/kittaakos"><code>`@​kittaakos</code></a>)</li>`
<li>Upgrade all deps</li>
</ul>
<h2>2.1.1</h2>
<ul>
<li>Fix <code>release_name</code> option <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/27">#27</a> (thanks <a href="https://github.com/kittaakos"><code>`@​kittaakos</code></a>)</li>`
</ul>
<h2>2.1.0</h2>
<ul>
<li>Strip refs/heads/ from the input tag <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/23">#23</a> (thanks <a href="https://github.com/OmarEmaraDev"><code>`@​OmarEmaraDev</code></a>)</li>`
</ul>
<h2>2.0.0</h2>
<ul>
<li>Add <code>prerelease</code> input parameter. Setting this marks the created release as a pre-release.</li>
<li>Add <code>release_name</code> input parameter. Setting this explicitly sets the title of the release.</li>
<li>Add <code>body</code> input parameter. Setting this sets the text content of the created release.</li>
<li>Add <code>browser_download_url</code> output variable. This contains the publicly accessible download URL of the uploaded artifact.</li>
<li>Allow for leaving <code>asset_name</code> unset. This will cause the asset to use the filename.</li>
</ul>
<h2>1.1.0</h2>
<p>No release notes provided.</p>
<h2>1.0.1</h2>
<p>No release notes provided.</p>
<h2>1.0.0</h2>
<p>No release notes provided.</p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/svenstaro/upload-release-action/blob/master/CHANGELOG.md">svenstaro/upload-release-action's changelog</a>.</em></p>
<blockquote>
<h2>[2.3.0] - 2022-06-05</h2>
<ul>
<li>Now defaults <code>repo_token</code> to <code>${{ github.token }}</code> and <code>tag</code> to <code>${{ github.ref }}</code> <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/69">#69</a> (thanks <a href="https://github.com/leighmcculloch"><code>`@​leighmcculloch</code></a>)</li>`
</ul>
<h2>[2.2.1] - 2020-12-16</h2>
<ul>
<li>Added support for the GitHub pagination API for repositories with many releases <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/36">#36</a> (thanks <a href="https://github.com/djpohly"><code>`@​djpohly</code></a>)</li>`
</ul>
<h2>[2.2.0] - 2020-10-07</h2>
<ul>
<li>Add support for ceating a new release in a foreign repository <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/25">#25</a> (thanks <a href="https://github.com/kittaakos"><code>`@​kittaakos</code></a>)</li>`
<li>Upgrade all deps</li>
</ul>
<h2>[2.1.1] - 2020-09-25</h2>
<ul>
<li>Fix <code>release_name</code> option <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/27">#27</a> (thanks <a href="https://github.com/kittaakos"><code>`@​kittaakos</code></a>)</li>`
</ul>
<h2>[2.1.0] - 2020-08-10</h2>
<ul>
<li>Strip refs/heads/ from the input tag <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/pull/23">#23</a> (thanks <a href="https://github.com/OmarEmaraDev"><code>`@​OmarEmaraDev</code></a>)</li>`
</ul>
<h2>[2.0.0] - 2020-07-03</h2>
<ul>
<li>Add <code>prerelease</code> input parameter. Setting this marks the created release as a pre-release.</li>
<li>Add <code>release_name</code> input parameter. Setting this explicitly sets the title of the release.</li>
<li>Add <code>body</code> input parameter. Setting this sets the text content of the created release.</li>
<li>Add <code>browser_download_url</code> output variable. This contains the publicly accessible download URL of the uploaded artifact.</li>
<li>Allow for leaving <code>asset_name</code> unset. This will cause the asset to use the filename.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="133984371c"><code>1339843</code></a> 2.3.0</li>
<li><a href="c2b649c57e"><code>c2b649c</code></a> Merge pull request <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/issues/72">#72</a> from svenstaro/dependabot/npm_and_yarn/node-fetch-2.6.7</li>
<li><a href="eb625cd0ad"><code>eb625cd</code></a> Merge pull request <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/issues/71">#71</a> from svenstaro/dependabot/npm_and_yarn/ansi-regex-4.1.1</li>
<li><a href="99cbd251b2"><code>99cbd25</code></a> Bump node-fetch from 2.6.1 to 2.6.7</li>
<li><a href="a6824c9e54"><code>a6824c9</code></a> Bump ansi-regex from 4.1.0 to 4.1.1</li>
<li><a href="6eb74c809d"><code>6eb74c8</code></a> Merge pull request <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/issues/67">#67</a> from svenstaro/dependabot/npm_and_yarn/minimist-1.2.6</li>
<li><a href="8ec375d911"><code>8ec375d</code></a> Merge pull request <a href="https://github-redirect.dependabot.com/svenstaro/upload-release-action/issues/69">#69</a> from leighmcculloch/patch-1</li>
<li><a href="8d45355ac2"><code>8d45355</code></a> Update README.md</li>
<li><a href="7d269bd712"><code>7d269bd</code></a> Add defaults to commonly set fields to reduce setup</li>
<li><a href="c71fb95114"><code>c71fb95</code></a> Bump minimist from 1.2.5 to 1.2.6</li>
<li>Additional commits viewable in <a href="https://github.com/svenstaro/upload-release-action/compare/v1-release...2.3.0">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=svenstaro/upload-release-action&package-manager=github_actions&previous-version=1.pre.release&new-version=2.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting ``@dependabot` rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- ``@dependabot` rebase` will rebase this PR
- ``@dependabot` recreate` will recreate this PR, overwriting any edits that have been made to it
- ``@dependabot` merge` will merge this PR after your CI passes on it
- ``@dependabot` squash and merge` will squash and merge this PR after your CI passes on it
- ``@dependabot` cancel merge` will cancel a previously requested merge and block automerging
- ``@dependabot` reopen` will reopen this PR if it is closed
- ``@dependabot` close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- ``@dependabot` ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- ``@dependabot` ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- ``@dependabot` ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-01 17:11:54 +00:00
dependabot[bot]
dd6593f7b6 Bump Swatinem/rust-cache from 2.0.0 to 2.2.0
Bumps [Swatinem/rust-cache](https://github.com/Swatinem/rust-cache) from 2.0.0 to 2.2.0.
- [Release notes](https://github.com/Swatinem/rust-cache/releases)
- [Changelog](https://github.com/Swatinem/rust-cache/blob/master/CHANGELOG.md)
- [Commits](https://github.com/Swatinem/rust-cache/compare/v2.0.0...v2.2.0)

---
updated-dependencies:
- dependency-name: Swatinem/rust-cache
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-12-01 17:02:25 +00:00
dependabot[bot]
fc905f29e9 Bump mislav/bump-homebrew-formula-action from 1 to 2
Bumps [mislav/bump-homebrew-formula-action](https://github.com/mislav/bump-homebrew-formula-action) from 1 to 2.
- [Release notes](https://github.com/mislav/bump-homebrew-formula-action/releases)
- [Commits](https://github.com/mislav/bump-homebrew-formula-action/compare/v1...v2)

---
updated-dependencies:
- dependency-name: mislav/bump-homebrew-formula-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-12-01 17:02:18 +00:00
dependabot[bot]
f892d122de Bump svenstaro/upload-release-action from 1.pre.release to 2.3.0
Bumps [svenstaro/upload-release-action](https://github.com/svenstaro/upload-release-action) from 1.pre.release to 2.3.0.
- [Release notes](https://github.com/svenstaro/upload-release-action/releases)
- [Changelog](https://github.com/svenstaro/upload-release-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/svenstaro/upload-release-action/compare/v1-release...2.3.0)

---
updated-dependencies:
- dependency-name: svenstaro/upload-release-action
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-12-01 17:02:14 +00:00
bors[bot]
7e314664fc Merge #3169
3169: Improve download-latest.sh script: integrate apple-silicon binary r=curquiza a=curquiza

Fixes multiple issues: https://github.com/meilisearch/meilisearch/issues/3044, https://github.com/meilisearch/meilisearch/issues/3027, https://github.com/meilisearch/meilisearch/issues/2613, https://github.com/meilisearch/meilisearch/issues/3014


Improvement/Addition
- https://github.com/meilisearch/meilisearch/issues/3044
- https://github.com/meilisearch/meilisearch/issues/3027

Simplification
- With https://github.com/meilisearch/meilisearch/issues/3014, we removed the complex logic for finding the latest version of Meilisearch: we can now get it directly from the GitHub API

Bug fixes:
- Should fix the problem users encountered in this issue (https://github.com/meilisearch/meilisearch/issues/2613), since the related part has been removed from the script

Co-authored-by: curquiza <clementine@meilisearch.com>
Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
2022-12-01 16:18:38 +00:00
Louis Dureuil
a92d67b79f Also update analytics 2022-12-01 16:24:02 +01:00
Louis Dureuil
daa0f2e9aa Change dumps_dir to dump_dir in config.toml 2022-12-01 15:46:54 +01:00
Louis Dureuil
e35db5e59b Rename dumps-dir to dump-dir in CLI
Also rename the associated environment variable
2022-12-01 15:46:54 +01:00
Louis Dureuil
b727fe7179 Fix integration tests 2022-12-01 14:03:15 +01:00
Louis Dureuil
1260c32d18 Add reader mod test 2022-12-01 14:03:15 +01:00
Louis Dureuil
e03c216952 Add compat_v1_to_v2 test 2022-12-01 14:03:15 +01:00
Louis Dureuil
c7749127fa Use reader v1 and compat to v2 2022-12-01 14:03:15 +01:00
Louis Dureuil
c8841344e2 v1: RankingRule::from_str 2022-12-01 14:03:15 +01:00
Louis Dureuil
b8de369e33 Add v1 reader 2022-12-01 14:03:15 +01:00
bors[bot]
93b59d55e3 Merge #3172
3172: Add CI to push a latest git tag for every stable Meilisearch release r=curquiza a=curquiza

Partially fixes #3147.

- Add a CI to add/update a `latest` git tag ONLY when releasing a stable version of Meilisearch (not for pre-releases or for custom prototype tags)
- Update the Docker CI to avoid being triggered when the `latest` git tag is pushed: the CI is still triggered when a git tag is pushed, except for the `latest` tag.
Reminder: the `latest` Docker image is already created and pushed when releasing a stable version of Meilisearch. This step is already present in our current Docker CI.

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-12-01 12:37:09 +00:00
curquiza
b9a8533de1 Add CI to push a latest git tag for every stable Meilisearch release 2022-12-01 12:53:44 +01:00
Louis Dureuil
d44652209d impl Display for v2::settings::Criterion 2022-12-01 11:15:57 +01:00
Louis Dureuil
5d22c7bcce Make some dump types Clone 2022-12-01 11:15:57 +01:00
Clémentine Urquizar - curqui
da8044f91e Update download-latest.sh
Co-authored-by: Louis Dureuil <louis.dureuil@gmail.com>
2022-11-30 16:55:32 +01:00
Clémentine Urquizar - curqui
9e3b1eb7a8 Update download-latest.sh
Co-authored-by: Louis Dureuil <louis.dureuil@gmail.com>
2022-11-30 16:55:27 +01:00
curquiza
eab1156f8c Update script to be used with macOS apple silicon 2022-11-30 14:14:46 +01:00
curquiza
8d405fad12 Simplify download-latest.sh script 2022-11-30 13:53:12 +01:00
jiangbo212
bf96b6df93 clippy fix change 2022-11-30 17:59:06 +08:00
jiangbo212
9c28632498 Merge branch 'main' into fix-3037 2022-11-30 09:38:01 +08:00
jiangbo212
38982d13fe fix issue 3037 2022-11-30 00:03:22 +08:00
bors[bot]
6150aa73b0 Merge #3148
3148: Bring back `release-v0.30.0` into `main` r=Kerollmops a=curquiza

Following this message

https://github.com/meilisearch/meilisearch/pull/3145#issuecomment-1329296168

Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
Co-authored-by: Tamo <tamo@meilisearch.com>
Co-authored-by: bors[bot] <26634292+bors[bot]@users.noreply.github.com>
2022-11-29 09:23:44 +00:00
bors[bot]
14840a24c6 Merge #3149
3149: Fix the dump tests r=Kerollmops a=irevoire

You'll need to trust me on this one. But the tests in the release-v0.30.0 were deactivated for a long time, and I don't know what was wrong with them.
Anyway, I checked that these SHA did match the tasks view we're expecting, and it looks good to me.

Co-authored-by: Tamo <tamo@meilisearch.com>
2022-11-28 15:48:36 +00:00
Tamo
41eb986f65 Fix the dump tests 2022-11-28 16:39:22 +01:00
Clémentine Urquizar - curqui
457a473b72 Bring back release-v0.30.0 into release-v0.30.0-temp (final: into main) (#3145)
* Fix error code of the "duplicate index found" error

* Use the content of the ProcessingTasks in the tasks cancelation system

* Change the missing_filters error code into missing_task_filters

* WIP Introduce the invalid_task_uid error code

* Use more precise error codes/message for the task routes

+ Allow star operator in delete/cancel tasks
+ rename originalQuery to originalFilters
+ Display error/canceled_by in task view even when they are = null
+ Rename task filter fields by using their plural forms
+ Prepare an error code for canceledBy filter
+ Only return global tasks if the API key action `index.*` is there

* Add canceledBy task filter

* Update tests following task API changes

* Rename original_query to original_filters everywhere

* Update more insta-snap tests

* Make clippy happy

They're a happy clip now.

* Make rustfmt happy

>:-(

* Fix Index name parsing error message to fit the specification

* Bump milli version to 0.35.1

* Fix the new error messages

* fix the error messages and add tests

* rename the error codes for the sake of consistency

* refactor the way we send the CLI information + add the analytics for the config file and SSL usage

* Apply suggestions from code review

Co-authored-by: Clément Renault <clement@meilisearch.com>

* add a comment over the new infos structure

* reformat, sorry @kero

* Store analytics for the documents deletions

* Add analytics on all the settings

* Spawn threads with names

* Spawn rayon threads with names

* update the distinct attributes to the spec update

* update the analytics on the search route

* implements the analytics on the health and version routes

* Fix task details serialization

* Add the question mark to the task deletion query filter

* Add the question mark to the task cancelation query filter

* Fix tests

* add analytics on the task route

* Add all the missing fields of the new task query type
* Create a new analytics for the task deletion
* Create a new analytics for the task creation

* batch the tasks seen events

* Update the finite pagination analytics

* add the analytics of the swap-indexes route

* Stop removing the DB when failing to read it

* Rename originalFilters into originalFilters

* Rename matchedDocuments into providedIds

* Add `workflow_dispatch` to flaky.yml

* Bump grenad to 0.4.4

* Bump milli to version v0.37.0

* Don't multiply total memory returned by sysinfo anymore

sysinfo now returns bytes rather than KB

* Add a dispatch to the publish binaries workflow

* Fix publish release CI

* Don't use gold but the default linker

* Always display details for the indexDeletion task

* Fix the insta tests

* refactor the whole test suite
1. Make a call to assert_internally_consistent automatically when snapshotting the scheduler. There is no point in snapshotting something broken and expecting the dumb humans to notice.
2. Replace every possible call to assert_internally_consistent with a snapshot of the scheduler. It takes as many lines and ensures we never change something without noticing in any test ever.
3. Name every snapshot: it's easier to debug when something goes wrong and easier to review in general.
4. Stop skipping breakpoints, it's too easy to miss something. Now you must explicitly show which path the scheduler is supposed to take.
5. Add a timeout on the channel.recv; it eases the process of writing tests: now when something fails you get a failure instead of a deadlock (see the sketch below).
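
A minimal sketch of that last point, assuming a std::sync::mpsc breakpoint channel and a hypothetical helper name:

```rust
use std::sync::mpsc::Receiver;
use std::time::Duration;

// Hypothetical helper: wait for the scheduler to reach the next breakpoint,
// but fail the test instead of deadlocking if it never arrives.
fn expect_breakpoint<T>(rx: &Receiver<T>) -> T {
    rx.recv_timeout(Duration::from_secs(10))
        .expect("the scheduler never reached the expected breakpoint")
}
```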

* rebase on release-v0.30

* makes clippy happy

* update the snapshots after a rebase

* try to remove the flakiness of the failing test

* Add more analytics on the ranking rules positions

* Update the dump test to check for the dumpUid dumpCreation task details

* send the ranking rules as a string because amplitude is too dumb to process an array as a single value

* Display a null dumpUid until we computed the dump itself on disk

* Update tests

* Check if the master key is missing before returning an error

Co-authored-by: Loïc Lecrenier <loic.lecrenier@me.com>
Co-authored-by: bors[bot] <26634292+bors[bot]@users.noreply.github.com>
Co-authored-by: Kerollmops <clement@meilisearch.com>
Co-authored-by: ManyTheFish <many@meilisearch.com>
Co-authored-by: Tamo <tamo@meilisearch.com>
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2022-11-28 16:27:41 +01:00
bors[bot]
914f8b118c Merge #3119
3119: Dump tests r=Kerollmops a=irevoire

Reenable the dump tests

Co-authored-by: Tamo <tamo@meilisearch.com>
2022-11-24 10:17:38 +00:00
Colby Allen
6ecd486d1b Bumps cargo_toml version to most up to date 2022-11-23 16:27:54 -07:00
Tamo
7b8641a7af fix the dump tests
The issue was linked to the fact that the Debug implementation of PhantomData wasn't the same between Rust stable and Rust nightly.
This was causing an issue while snapshotting the settings, and this commit fixes it by representing the settings as JSON, which already ignores the PhantomData (see the sketch below).
2022-11-23 16:59:20 +01:00
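
A minimal sketch of that approach, assuming insta snapshots and a serde-serializable settings type (the field name here is illustrative, not the real one):

```rust
use serde::Serialize;

#[derive(Serialize)]
struct Settings {
    // Hypothetical field; the real settings type also carries a PhantomData marker.
    displayed_attributes: Option<Vec<String>>,
}

#[test]
fn settings_snapshot() {
    let settings = Settings { displayed_attributes: None };
    // Snapshot the JSON representation instead of the Debug output, so
    // toolchain-dependent details such as PhantomData never reach the snapshot.
    let json = serde_json::to_string_pretty(&settings).unwrap();
    insta::assert_snapshot!(json);
}
```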
bors[bot]
f509d81ec9 Merge #3100
3100: Add a dispatch to the publish binaries workflow r=Kerollmops a=curquiza

Add the `workflow_dispatch` event to the publish binaries workflow to allow triggering it manually

Co-authored-by: Kerollmops <clement@meilisearch.com>
2022-11-21 19:31:20 +00:00
Kerollmops
8d73ae80bb Add a dispatch to the publish binaries workflow 2022-11-21 18:50:57 +01:00
bors[bot]
00129e112a Merge #3041
3041: Add `workflow_dispatch` to flaky.yml r=irevoire a=curquiza

To be able to run the job manually instead of waiting for one week

Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
2022-11-17 13:21:11 +00:00
bors[bot]
46c275f0e4 Merge #3070
3070: Remove core and use engine r=Kerollmops a=curquiza

Following the new team name.
Not mandatory since GitHub handles the redirection, but more consistent.

Co-authored-by: curquiza <clementine@meilisearch.com>
2022-11-16 16:40:44 +00:00
curquiza
f4d4f313ea Use the right new link 2022-11-16 17:21:35 +01:00
bors[bot]
5e1fa53354 Merge #3055
3055: Remove Hacktoberfest sections r=curquiza a=meili-bot

_This PR is auto-generated._

Remove Hacktoberfest sections from README and CONTRIBUTING files.


Co-authored-by: meili-bot <74670311+meili-bot@users.noreply.github.com>
2022-11-15 14:29:27 +00:00
meili-bot
fe59a6f628 Update README.md 2022-11-15 15:27:18 +01:00
meili-bot
647b7a20e9 Update CONTRIBUTING.md 2022-11-15 15:27:17 +01:00
funilrys
e81b349658 Fix linting issue. 2022-11-14 18:51:34 +01:00
Clémentine Urquizar - curqui
a84fad5ce6 Add workflow_dispatch to flaky.yml 2022-11-14 10:10:00 +01:00
funilrys
0a102d601c Update Task.created_at
Indeed, before this patch we weren't considering the
TaskContent::SettingsUpdate while trying to find the creation date.
2022-11-13 10:14:20 +01:00
funilrys
8a14f6f545 Add Task.processed_at. 2022-11-13 10:13:10 +01:00
funilrys
079357ee1f Fix linting issues. 2022-11-12 20:57:27 +01:00
funilrys
06e7db7a1f fixup! Extract the dates out of the dumpv4. 2022-11-12 18:28:23 +01:00
bors[bot]
9e189f5041 Merge #3015
3015: Replace deprecated set-output in GitHub actions r=curquiza a=funilrys

# Pull Request

This patch fixes #3011.

This patch fixes the deprecation warning regarding the usage of `set-output`.
It does so by switching from the following format:

```
echo ::set-output name=[name]::[value]
```

into the following format:

```
echo "[name]=[value]" >> ${GITHUB_OUTPUT}
```


## Related issue
Fixes #3011

## What does this PR do?
- Fix CI/CD deprecation warnings.

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: funilrys <contact@funilrys.com>
Co-authored-by: Clémentine Urquizar - curqui <clementine@meilisearch.com>
2022-11-10 09:45:55 +00:00
Clémentine Urquizar - curqui
fe980f9e88 Update .github/workflows/publish-docker-images.yml 2022-11-10 10:40:21 +01:00
Clémentine Urquizar - curqui
32cd9e4852 Update .github/workflows/publish-docker-images.yml 2022-11-10 10:40:16 +01:00
Clémentine Urquizar - curqui
8d79f501f3 Update .github/workflows/publish-binaries.yml 2022-11-10 10:40:09 +01:00
Clémentine Urquizar - curqui
c05abc2b0d Update .github/workflows/publish-binaries.yml 2022-11-10 10:40:04 +01:00
funilrys
a441fe5ae5 Remove unnecessary line. 2022-11-08 21:18:24 +01:00
funilrys
7331da0410 Fix auto-formatter issue.
Indeed, my editor always fixes the format for me. That's why those
2 lines were changed.
2022-11-08 21:16:47 +01:00
funilrys
72c4db4553 Rewrite: ${GITHUB_OUTPUT} -> $GITHUB_OUTPUT. 2022-11-08 21:15:28 +01:00
funilrys
953b2ec438 fixup! Extract the dates out of the dumpv4. 2022-11-02 17:49:37 +01:00
funilrys
09e71fdeb6 Replace deprecated set-output in GitHub actions
This patch fixes #3011.

This patch fixes the deprecation warning regarding the usage of
`set-output`.
It does so by switching from the following format:

```
echo ::set-output name=[name]::[value]
```

into the following format:

```
echo "[name]=[value]" >> ${GITHUB_OUTPUT}
```
2022-10-31 22:28:01 +01:00
funilrys
ab3056cc66 Extract the dates out of the dumpv4.
This patch possibly fixes #2987.

This patch introduces a way to fill the IndexMetadata.created_at
and IndexMetadata.updated_at keys from the task events.
This is done by reading the creation date of the first event
(created_at) and the creation date of the last event (updated_at).
2022-10-31 18:58:14 +01:00
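
A minimal sketch of that idea, assuming the task events carry a creation timestamp and are stored in chronological order (the concrete types in the dump v4 reader differ):

```rust
use time::OffsetDateTime;

// Hypothetical event type standing in for the dump v4 task events.
struct TaskEvent {
    created_at: OffsetDateTime,
}

/// Derive (created_at, updated_at) for an index from its task events:
/// the creation date of the first event and of the last event.
fn index_dates(events: &[TaskEvent]) -> Option<(OffsetDateTime, OffsetDateTime)> {
    let first = events.first()?;
    let last = events.last()?;
    Some((first.created_at, last.created_at))
}
```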
bors[bot]
c7caadb54e Merge #3001
3001: Implement Uuid codec for heed r=Kerollmops a=elbertronnie

# Pull Request

## Related issue
Fixes #2984 

## What does this PR do?
- Created a new heed codec for uuid::Uuid named UuidCodec (see the sketch below)
- Replaced SerdeBincode\<Uuid\> with UuidCodec
- Removed the TODO in the code associated with this issue
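
A minimal sketch of such a codec, assuming the heed 0.12 `BytesEncode`/`BytesDecode` traits (file layout and details in the actual PR may differ):

```rust
use std::borrow::Cow;
use std::convert::TryInto;

use heed::{BytesDecode, BytesEncode};
use uuid::Uuid;

/// A heed codec that stores a Uuid as its raw 16 bytes.
pub struct UuidCodec;

impl<'a> BytesDecode<'a> for UuidCodec {
    type DItem = Uuid;

    fn bytes_decode(bytes: &'a [u8]) -> Option<Self::DItem> {
        bytes.try_into().ok().map(Uuid::from_bytes)
    }
}

impl BytesEncode<'_> for UuidCodec {
    type EItem = Uuid;

    fn bytes_encode(item: &Self::EItem) -> Option<Cow<[u8]>> {
        Some(Cow::Borrowed(item.as_bytes()))
    }
}
```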

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Elbert Ronnie <elbert.ronniep@gmail.com>
2022-10-31 16:13:20 +00:00
Elbert Ronnie
0219ef25fe Moved the struct UuidCodec to a new file 2022-10-31 12:25:19 +05:30
Elbert Ronnie
3911fd64b5 Implement Uuid codec for heed 2022-10-30 03:27:30 +05:30
340 changed files with 10130 additions and 3221 deletions

View File

@@ -1,127 +1,41 @@
#!/bin/sh
# Was used in our CIs to publish the latest docker image. Not used anymore, will be used again when v1 and v2 will be out and we will want to maintain multiple stable versions.
# Returns "true" or "false" (as a string) to be used in the `if` in GHA
# Used in our CIs to publish the latest Docker image.
# Checks if the current tag should be the latest (in terms of semver and not of release date).
# Ex: previous tag -> v2.1.1
# new tag -> v1.20.3
# The new tag (v1.20.3) should NOT be the latest
# So it returns "false", the `latest` tag should not be updated for the release v1.20.3 and still need to correspond to v2.1.1
# Checks if the current tag ($GITHUB_REF) corresponds to the latest release tag on GitHub
# Returns "true" or "false" (as a string).
# GLOBAL
GREP_SEMVER_REGEXP='v\([0-9]*\)[.]\([0-9]*\)[.]\([0-9]*\)$' # i.e. v[number].[number].[number]
GITHUB_API='https://api.github.com/repos/meilisearch/meilisearch/releases'
PNAME='meilisearch'
# FUNCTIONS
# semverParseInto and semverLT from https://github.com/cloudflare/semver_bash/blob/master/semver.sh
# usage: semverParseInto version major minor patch special
# version: the string version
# major, minor, patch, special: will be assigned by the function
semverParseInto() {
local RE='[^0-9]*\([0-9]*\)[.]\([0-9]*\)[.]\([0-9]*\)\([0-9A-Za-z-]*\)'
#MAJOR
eval $2=`echo $1 | sed -e "s#$RE#\1#"`
#MINOR
eval $3=`echo $1 | sed -e "s#$RE#\2#"`
#MINOR
eval $4=`echo $1 | sed -e "s#$RE#\3#"`
#SPECIAL
eval $5=`echo $1 | sed -e "s#$RE#\4#"`
}
# usage: semverLT version1 version2
semverLT() {
local MAJOR_A=0
local MINOR_A=0
local PATCH_A=0
local SPECIAL_A=0
local MAJOR_B=0
local MINOR_B=0
local PATCH_B=0
local SPECIAL_B=0
semverParseInto $1 MAJOR_A MINOR_A PATCH_A SPECIAL_A
semverParseInto $2 MAJOR_B MINOR_B PATCH_B SPECIAL_B
if [ $MAJOR_A -lt $MAJOR_B ]; then
return 0
fi
if [ $MAJOR_A -le $MAJOR_B ] && [ $MINOR_A -lt $MINOR_B ]; then
return 0
fi
if [ $MAJOR_A -le $MAJOR_B ] && [ $MINOR_A -le $MINOR_B ] && [ $PATCH_A -lt $PATCH_B ]; then
return 0
fi
if [ "_$SPECIAL_A" == "_" ] && [ "_$SPECIAL_B" == "_" ] ; then
return 1
fi
if [ "_$SPECIAL_A" == "_" ] && [ "_$SPECIAL_B" != "_" ] ; then
return 1
fi
if [ "_$SPECIAL_A" != "_" ] && [ "_$SPECIAL_B" == "_" ] ; then
return 0
fi
if [ "_$SPECIAL_A" < "_$SPECIAL_B" ]; then
return 0
fi
return 1
}
# Returns the tag of the latest stable release (in terms of semver and not of release date)
# Returns the version of the latest stable version of Meilisearch by setting the $latest variable.
get_latest() {
temp_file='temp_file' # temp_file needed because the grep would start before the download is over
curl -s 'https://api.github.com/repos/meilisearch/meilisearch/releases' > "$temp_file"
releases=$(cat "$temp_file" | \
grep -E "tag_name|draft|prerelease" \
| tr -d ',"' | cut -d ':' -f2 | tr -d ' ')
# Returns a list of [tag_name draft_boolean prerelease_boolean ...]
# Ex: v0.10.1 false false v0.9.1-rc.1 false true v0.9.0 false false...
# temp_file is needed because the grep would start before the download is over
temp_file=$(mktemp -q /tmp/$PNAME.XXXXXXXXX)
latest_release="$GITHUB_API/latest"
i=0
latest=""
current_tag=""
for release_info in $releases; do
if [ $i -eq 0 ]; then # Checking tag_name
if echo "$release_info" | grep -q "$GREP_SEMVER_REGEXP"; then # If it's not an alpha or beta release
current_tag=$release_info
else
current_tag=""
fi
i=1
elif [ $i -eq 1 ]; then # Checking draft boolean
if [ "$release_info" = "true" ]; then
current_tag=""
fi
i=2
elif [ $i -eq 2 ]; then # Checking prerelease boolean
if [ "$release_info" = "true" ]; then
current_tag=""
fi
i=0
if [ "$current_tag" != "" ]; then # If the current_tag is valid
if [ "$latest" = "" ]; then # If there is no latest yet
latest="$current_tag"
else
semverLT $current_tag $latest # Comparing latest and the current tag
if [ $? -eq 1 ]; then
latest="$current_tag"
fi
fi
fi
fi
done
if [ $? -ne 0 ]; then
echo "$0: Can't create temp file."
exit 1
fi
if [ -z "$GITHUB_PAT" ]; then
curl -s "$latest_release" > "$temp_file" || return 1
else
curl -H "Authorization: token $GITHUB_PAT" -s "$latest_release" > "$temp_file" || return 1
fi
latest="$(cat "$temp_file" | grep '"tag_name":' | cut -d ':' -f2 | tr -d '"' | tr -d ',' | tr -d ' ')"
rm -f "$temp_file"
echo $latest
return 0
}
# MAIN
current_tag="$(echo $GITHUB_REF | tr -d 'refs/tags/')"
latest="$(get_latest)"
get_latest
if [ "$current_tag" != "$latest" ]; then
# The current release tag is not the latest
@@ -130,3 +44,5 @@ else
# The current release tag is the latest
echo "true"
fi
exit 0

View File

@@ -1,33 +0,0 @@
---
on:
workflow_dispatch:
name: Execute code coverage
jobs:
nightly-coverage:
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v3
- uses: actions-rs/toolchain@v1
with:
toolchain: nightly
override: true
- uses: actions-rs/cargo@v1
with:
command: clean
- uses: actions-rs/cargo@v1
with:
command: test
args: --all-features --no-fail-fast
env:
CARGO_INCREMENTAL: "0"
RUSTFLAGS: "-Zprofile -Ccodegen-units=1 -Cinline-threshold=0 -Clink-dead-code -Coverflow-checks=off -Cpanic=unwind -Zpanic_abort_tests"
- uses: actions-rs/grcov@v0.1
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
with:
token: ${{ secrets.CODECOV_TOKEN }}
file: ${{ steps.coverage.outputs.report }}
yml: ./codecov.yml
fail_ci_if_error: true

View File

@@ -15,7 +15,7 @@ jobs:
github_token: ${{ secrets.MEILI_BOT_GH_PAT }}
title: Upgrade dependencies
body: |
We need to update the dependencies of the Meilisearch repository, and, if possible, the dependencies of all the core-team repositories that Meilisearch depends on (milli, charabia, heed...).
We need to update the dependencies of the Meilisearch repository, and, if possible, the dependencies of all the engine-team repositories that Meilisearch depends on (milli, charabia, heed...).
⚠️ This issue should only be done at the beginning of the sprint!
labels: |

View File

@@ -1,14 +1,25 @@
name: Look for flaky tests
on:
workflow_dispatch:
schedule:
- cron: "0 12 * * FRI" # every friday at 12:00PM
- cron: "0 12 * * FRI" # Every Friday at 12:00PM
jobs:
flaky:
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
container:
# Use ubuntu-18.04 to compile with glibc 2.27, which are the production expectations
image: ubuntu:18.04
steps:
- uses: actions/checkout@v3
- name: Install needed dependencies
run: |
apt-get update && apt-get install -y curl
apt-get install build-essential -y
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
override: true
- name: Install cargo-flaky
run: cargo install cargo-flaky
- name: Run cargo flaky 100 times

.github/workflows/latest-git-tag.yml (vendored, new file, 28 lines)
View File

@@ -0,0 +1,28 @@
# Create or update a latest git tag when releasing a stable version of Meilisearch
name: Update latest git tag
on:
workflow_dispatch:
release:
types: [published]
jobs:
check-version:
name: Check the version validity
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Check release validity
if: github.event_name == 'release'
run: bash .github/scripts/check-release.sh
update-latest-tag:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: rickstaa/action-create-tag@v1
with:
tag: "latest"
message: "Latest stable release of Meilisearch"
# Move the tag if `latest` already exists
force_push_tag: true
github_token: ${{ secrets.MEILI_BOT_GH_PAT }}

View File

@@ -3,8 +3,8 @@ name: Milestone's workflow
# /!\ No git flow are handled here
# For each Milestone created (not opened!), and if the release is NOT a patch release (only the patch changed)
# - the roadmap issue is created, see https://github.com/meilisearch/core-team/blob/main/issue-templates/roadmap-issue.md
# - the changelog issue is created, see https://github.com/meilisearch/core-team/blob/main/issue-templates/changelog-issue.md
# - the roadmap issue is created, see https://github.com/meilisearch/engine-team/blob/main/issue-templates/roadmap-issue.md
# - the changelog issue is created, see https://github.com/meilisearch/engine-team/blob/main/issue-templates/changelog-issue.md
# For each Milestone closed
# - the `release_version` label is created
@@ -31,8 +31,6 @@ jobs:
runs-on: ubuntu-latest
outputs:
is-patch: ${{ steps.check-patch.outputs.is-patch }}
env:
MILESTONE_VERSION: ${{ github.event.milestone.title }}
steps:
- uses: actions/checkout@v3
- name: Check if this release is a patch release only
@@ -41,10 +39,10 @@ jobs:
echo version: $MILESTONE_VERSION
if [[ $MILESTONE_VERSION =~ ^v[0-9]+\.[0-9]+\.0$ ]]; then
echo 'This is NOT a patch release'
echo ::set-output name=is-patch::false
echo "is-patch=false" >> $GITHUB_OUTPUT
elif [[ $MILESTONE_VERSION =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo 'This is a patch release'
echo ::set-output name=is-patch::true
echo "is-patch=true" >> $GITHUB_OUTPUT
else
echo "Not a valid format of release, check the Milestone's title."
echo 'Should be vX.Y.Z'
@@ -61,7 +59,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Download the issue template
run: curl -s https://raw.githubusercontent.com/meilisearch/core-team/main/issue-templates/roadmap-issue.md > $ISSUE_TEMPLATE
run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/roadmap-issue.md > $ISSUE_TEMPLATE
- name: Replace all empty occurrences in the templates
run: |
# Replace all <<version>> occurrences
@@ -94,7 +92,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- name: Download the issue template
run: curl -s https://raw.githubusercontent.com/meilisearch/core-team/main/issue-templates/changelog-issue.md > $ISSUE_TEMPLATE
run: curl -s https://raw.githubusercontent.com/meilisearch/engine-team/main/issue-templates/changelog-issue.md > $ISSUE_TEMPLATE
- name: Replace all empty occurrences in the templates
run: |
# Replace all <<version>> occurrences

View File

@@ -1,4 +1,5 @@
on:
workflow_dispatch:
schedule:
- cron: '0 2 * * *' # Every day at 2:00am
release:
@@ -17,69 +18,93 @@ jobs:
# If yes, it means we are publishing an official release.
# If no, we are releasing a RC, so no need to check the version.
- name: Check tag format
if: github.event_name != 'schedule'
if: github.event_name == 'release'
id: check-tag-format
run: |
escaped_tag=$(printf "%q" ${{ github.ref_name }})
if [[ $escaped_tag =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo ::set-output name=stable::true
echo "stable=true" >> $GITHUB_OUTPUT
else
echo ::set-output name=stable::false
echo "stable=false" >> $GITHUB_OUTPUT
fi
- name: Check release validity
if: github.event_name != 'schedule' && steps.check-tag-format.outputs.stable == 'true'
if: github.event_name == 'release' && steps.check-tag-format.outputs.stable == 'true'
run: bash .github/scripts/check-release.sh
publish:
publish-linux:
name: Publish binary for Linux
runs-on: ubuntu-latest
needs: check-version
container:
# Use ubuntu-18.04 to compile with glibc 2.27
image: ubuntu:18.04
steps:
- uses: actions/checkout@v3
- name: Install needed dependencies
run: |
apt-get update && apt-get install -y curl
apt-get install build-essential -y
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
override: true
- name: Build
run: cargo build --release --locked
# No need to upload binaries for dry run (cron)
- name: Upload binaries to release
if: github.event_name == 'release'
uses: svenstaro/upload-release-action@2.3.0
with:
repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}
file: target/release/meilisearch
asset_name: meilisearch-linux-amd64
tag: ${{ github.ref }}
publish-macos-windows:
name: Publish binary for ${{ matrix.os }}
runs-on: ${{ matrix.os }}
needs: check-version
strategy:
fail-fast: false
matrix:
os: [ubuntu-18.04, macos-latest, windows-latest]
os: [macos-12, windows-2022]
include:
- os: ubuntu-18.04
artifact_name: meilisearch
asset_name: meilisearch-linux-amd64
- os: macos-latest
- os: macos-12
artifact_name: meilisearch
asset_name: meilisearch-macos-amd64
- os: windows-latest
- os: windows-2022
artifact_name: meilisearch.exe
asset_name: meilisearch-windows-amd64.exe
steps:
- uses: hecrj/setup-rust-action@master
with:
rust-version: stable
- uses: actions/checkout@v3
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
override: true
- name: Build
run: cargo build --release --locked
# No need to upload binaries for dry run (cron)
- name: Upload binaries to release
if: github.event_name != 'schedule'
uses: svenstaro/upload-release-action@v1-release
if: github.event_name == 'release'
uses: svenstaro/upload-release-action@2.3.0
with:
repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}
file: target/release/${{ matrix.artifact_name }}
asset_name: ${{ matrix.asset_name }}
tag: ${{ github.ref }}
publish-macos-apple-silicon:
name: Publish binary for macOS silicon
runs-on: ${{ matrix.os }}
needs: check-version
continue-on-error: false
strategy:
fail-fast: false
matrix:
include:
- os: macos-latest
- os: macos-12
target: aarch64-apple-darwin
asset_name: meilisearch-macos-apple-silicon
steps:
- name: Checkout repository
uses: actions/checkout@v3
@@ -97,8 +122,8 @@ jobs:
args: --release --target ${{ matrix.target }}
- name: Upload the binary to release
# No need to upload binaries for dry run (cron)
if: github.event_name != 'schedule'
uses: svenstaro/upload-release-action@v1-release
if: github.event_name == 'release'
uses: svenstaro/upload-release-action@2.3.0
with:
repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}
file: target/${{ matrix.target }}/release/meilisearch
@@ -109,7 +134,6 @@ jobs:
name: Publish binary for aarch64
runs-on: ${{ matrix.os }}
needs: check-version
continue-on-error: false
strategy:
fail-fast: false
matrix:
@@ -120,11 +144,9 @@ jobs:
linker: gcc-aarch64-linux-gnu
use-cross: true
asset_name: meilisearch-linux-aarch64
steps:
- name: Checkout repository
uses: actions/checkout@v3
- name: Installing Rust toolchain
uses: actions-rs/toolchain@v1
with:
@@ -132,16 +154,13 @@ jobs:
profile: minimal
target: ${{ matrix.target }}
override: true
- name: APT update
run: |
sudo apt update
- name: Install target specific tools
if: matrix.use-cross
run: |
sudo apt-get install -y ${{ matrix.linker }}
- name: Configure target aarch64 GNU
if: matrix.target == 'aarch64-unknown-linux-gnu'
## Environment variable is not passed using env:
@@ -153,22 +172,18 @@ jobs:
echo '[target.aarch64-unknown-linux-gnu]' >> ~/.cargo/config
echo 'linker = "aarch64-linux-gnu-gcc"' >> ~/.cargo/config
echo 'JEMALLOC_SYS_WITH_LG_PAGE=16' >> $GITHUB_ENV
echo RUSTFLAGS="-Clink-arg=-fuse-ld=gold" >> $GITHUB_ENV
- name: Cargo build
uses: actions-rs/cargo@v1
with:
command: build
use-cross: ${{ matrix.use-cross }}
args: --release --target ${{ matrix.target }}
- name: List target output files
run: ls -lR ./target
- name: Upload the binary to release
# No need to upload binaries for dry run (cron)
if: github.event_name != 'schedule'
uses: svenstaro/upload-release-action@v1-release
if: github.event_name == 'release'
uses: svenstaro/upload-release-action@2.3.0
with:
repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}
file: target/${{ matrix.target }}/release/meilisearch

View File

@@ -1,8 +1,8 @@
name: Publish deb pkg to GitHub release & APT repository & Homebrew
name: Publish to APT repository & Homebrew
on:
release:
types: [released]
types: [published]
jobs:
check-version:
@@ -15,19 +15,27 @@ jobs:
debian:
name: Publish debian packagge
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
needs: check-version
container:
# Use ubuntu-18.04 to compile with glibc 2.27
image: ubuntu:18.04
steps:
- uses: hecrj/setup-rust-action@master
- name: Install needed dependencies
run: |
apt-get update && apt-get install -y curl
apt-get install build-essential -y
- uses: actions-rs/toolchain@v1
with:
rust-version: stable
toolchain: stable
override: true
- name: Install cargo-deb
run: cargo install cargo-deb
- uses: actions/checkout@v3
- name: Build deb package
run: cargo deb -p meilisearch-http -o target/debian/meilisearch.deb
run: cargo deb -p meilisearch -o target/debian/meilisearch.deb
- name: Upload debian pkg to release
uses: svenstaro/upload-release-action@v1-release
uses: svenstaro/upload-release-action@2.3.0
with:
repo_token: ${{ secrets.MEILI_BOT_GH_PAT }}
file: target/debian/meilisearch.deb
@@ -38,11 +46,11 @@ jobs:
homebrew:
name: Bump Homebrew formula
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
needs: check-version
steps:
- name: Create PR to Homebrew
uses: mislav/bump-homebrew-formula-action@v1
uses: mislav/bump-homebrew-formula-action@v2
with:
formula-name: meilisearch
env:

View File

@@ -1,10 +1,16 @@
---
on:
schedule:
- cron: '0 4 * * *' # Every day at 4:00am
push:
tags:
- '*'
# Will run for every tag pushed except `latest`
# When the `latest` git tag is created with this [CI](../latest-git-tag.yml)
# we don't need to create a Docker `latest` image again.
# The `latest` Docker image push is already done in this CI when releasing a stable version of Meilisearch.
tags-ignore:
- latest
# Both `schedule` and `workflow_dispatch` build the nightly tag
schedule:
- cron: '0 23 * * *' # Every day at 11:00pm
workflow_dispatch:
name: Publish tagged images to Docker Hub
@@ -14,27 +20,43 @@ jobs:
steps:
- uses: actions/checkout@v3
# Check if the tag has the v<nmumber>.<number>.<number> format. If yes, it means we are publishing an official release.
# If we are running a cron or manual job ('schedule' or 'workflow_dispatch' event), it means we are publishing the `nightly` tag, so not considered stable.
# If we have pushed a tag, and the tag has the v<nmumber>.<number>.<number> format, it means we are publishing an official release, so considered stable.
# In this situation, we need to set `output.stable` to create/update the following tags (additionally to the `vX.Y.Z` Docker tag):
# - a `vX.Y` (without patch version) Docker tag
# - a `latest` Docker tag
- name: Check tag format
if: github.event_name != 'schedule'
# For any other tag pushed, this is not considered stable.
- name: Define if stable and latest release
id: check-tag-format
env:
# To avoid request limit with the .github/scripts/is-latest-release.sh script
GITHUB_PATH: ${{ secrets.MEILI_BOT_GH_PAT }}
run: |
escaped_tag=$(printf "%q" ${{ github.ref_name }})
echo "latest=false" >> $GITHUB_OUTPUT
if [[ $escaped_tag =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo ::set-output name=stable::true
if [[ ${{ github.event_name }} != 'push' ]]; then
echo "stable=false" >> $GITHUB_OUTPUT
elif [[ $escaped_tag =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo "stable=true" >> $GITHUB_OUTPUT
echo "latest=$(sh .github/scripts/is-latest-release.sh)" >> $GITHUB_OUTPUT
else
echo ::set-output name=stable::false
echo "stable=false" >> $GITHUB_OUTPUT
fi
# Check only the validity of the tag for official releases (not for pre-releases or other tags)
# Check only the validity of the tag for stable releases (not for pre-releases or other tags)
- name: Check release validity
if: github.event_name != 'schedule' && steps.check-tag-format.outputs.stable == 'true'
if: steps.check-tag-format.outputs.stable == 'true'
run: bash .github/scripts/check-release.sh
- name: Set build-args for Docker buildx
id: build-metadata
run: |
# Extract commit date
commit_date=$(git show -s --format=%cd --date=iso-strict ${{ github.sha }})
echo "date=$commit_date" >> $GITHUB_OUTPUT
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
@@ -42,7 +64,6 @@ jobs:
uses: docker/setup-buildx-action@v2
- name: Login to Docker Hub
if: github.event_name != 'schedule'
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
@@ -53,25 +74,29 @@ jobs:
uses: docker/metadata-action@v4
with:
images: getmeili/meilisearch
# The latest and `vX.Y` tags are only pushed for the official Meilisearch releases
# See https://github.com/docker/metadata-action#latest-tag
# Prevent `latest` to be updated for each new tag pushed.
# We need latest and `vX.Y` tags to only be pushed for the stable Meilisearch releases.
flavor: latest=false
tags: |
type=ref,event=tag
type=raw,value=nightly,enable=${{ github.event_name != 'push' }}
type=semver,pattern=v{{major}}.{{minor}},enable=${{ steps.check-tag-format.outputs.stable == 'true' }}
type=raw,value=latest,enable=${{ steps.check-tag-format.outputs.stable == 'true' }}
type=raw,value=latest,enable=${{ steps.check-tag-format.outputs.stable == 'true' && steps.check-tag-format.outputs.latest == 'true' }}
- name: Build and push
uses: docker/build-push-action@v3
with:
# We do not push tags for the cron jobs, this is only for test purposes
push: ${{ github.event_name != 'schedule' }}
push: true
platforms: linux/amd64,linux/arm64
tags: ${{ steps.meta.outputs.tags }}
build-args: |
COMMIT_SHA=${{ github.sha }}
COMMIT_DATE=${{ steps.build-metadata.outputs.date }}
# /!\ Don't touch this without checking with Cloud team
- name: Send CI information to Cloud team
if: github.event_name != 'schedule'
# Do not send if nightly build (i.e. 'schedule' or 'workflow_dispatch' event)
if: github.event_name == 'push'
uses: peter-evans/repository-dispatch@v2
with:
token: ${{ secrets.MEILI_BOT_GH_PAT }}

View File

@@ -15,17 +15,46 @@ env:
RUSTFLAGS: "-D warnings"
jobs:
tests:
test-linux:
name: Tests on ubuntu-18.04
runs-on: ubuntu-latest
container:
# Use ubuntu-18.04 to compile with glibc 2.27, which are the production expectations
image: ubuntu:18.04
steps:
- uses: actions/checkout@v3
- name: Install needed dependencies
run: |
apt-get update && apt-get install -y curl
apt-get install build-essential -y
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
override: true
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.2.0
- name: Run cargo check without any default features
uses: actions-rs/cargo@v1
with:
command: build
args: --locked --release --no-default-features
- name: Run cargo test
uses: actions-rs/cargo@v1
with:
command: test
args: --locked --release
test-others:
name: Tests on ${{ matrix.os }}
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: [ubuntu-18.04, macos-latest, windows-latest]
os: [macos-12, windows-2022]
steps:
- uses: actions/checkout@v3
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.0.0
uses: Swatinem/rust-cache@v2.2.0
- name: Run cargo check without any default features
uses: actions-rs/cargo@v1
with:
@@ -40,16 +69,22 @@ jobs:
# We run tests in debug also, to make sure that the debug_assertions are hit
test-debug:
name: Run tests in debug
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
container:
# Use ubuntu-18.04 to compile with glibc 2.27, which are the production expectations
image: ubuntu:18.04
steps:
- uses: actions/checkout@v3
- name: Install needed dependencies
run: |
apt-get update && apt-get install -y curl
apt-get install build-essential -y
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.0.0
uses: Swatinem/rust-cache@v2.2.0
- name: Run tests in debug
uses: actions-rs/cargo@v1
with:
@@ -58,7 +93,7 @@ jobs:
clippy:
name: Run Clippy
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions-rs/toolchain@v1
@@ -68,7 +103,7 @@ jobs:
override: true
components: clippy
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.0.0
uses: Swatinem/rust-cache@v2.2.0
- name: Run cargo clippy
uses: actions-rs/cargo@v1
with:
@@ -77,16 +112,16 @@ jobs:
fmt:
name: Run Rustfmt
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
toolchain: nightly
override: true
components: rustfmt
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.0.0
uses: Swatinem/rust-cache@v2.2.0
- name: Run cargo fmt
run: cargo fmt --all -- --check

View File

@@ -16,7 +16,7 @@ jobs:
update-version-cargo-toml:
name: Update version in Cargo.toml files
runs-on: ubuntu-18.04
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions-rs/toolchain@v1
@@ -45,3 +45,4 @@ jobs:
--body '⚠️ This PR is automatically generated. Check the new version is the expected one before merging.' \
--label 'skip changelog' \
--milestone $NEW_VERSION
--base $GITHUB_REF_NAME

View File

@@ -10,24 +10,12 @@ If Meilisearch does not offer optimized support for your language, please consid
## Table of Contents
- [Hacktoberfest 2022](#hacktoberfest-2022)
- [Assumptions](#assumptions)
- [How to Contribute](#how-to-contribute)
- [Development Workflow](#development-workflow)
- [Git Guidelines](#git-guidelines)
- [Release Process (for internal team only)](#release-process-for-internal-team-only)
## Hacktoberfest 2022
It's [Hacktoberfest month](https://hacktoberfest.com)! 🥳
Thanks so much for participating with Meilisearch this year!
1. We will follow the quality standards set by the organizers of Hacktoberfest (see detail on their [website](https://hacktoberfest.com/participation/#spam)). Our reviewers will not consider any PR that doesn't match that standard.
2. PRs reviews will take place from Monday to Thursday, during usual working hours, CEST time. If you submit outside of these hours, there's no need to panic; we will get around to your contribution.
3. There will be no issue assignment as we don't want people to ask to be assigned specific issues and never return, discouraging the volunteer contributors from opening a PR to fix this issue. We take the liberty to choose the PR that best fixes the issue, so we encourage you to get to it as soon as possible and do your best!
You can check out the longer, more complete guideline documentation [here](https://github.com/meilisearch/.github/blob/main/Hacktoberfest_2022_contributors_guidelines.md).
## Assumptions
1. **You're familiar with [GitHub](https://github.com) and the [Pull Requests](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests)(PR) workflow.**
@@ -109,7 +97,7 @@ _[Read more about this](https://github.com/meilisearch/integration-guides/blob/m
### How to Publish a new Release
The full Meilisearch release process is described in [this guide](https://github.com/meilisearch/core-team/blob/main/resources/meilisearch-release.md). Please follow it carefully before doing any release.
The full Meilisearch release process is described in [this guide](https://github.com/meilisearch/engine-team/blob/main/resources/meilisearch-release.md). Please follow it carefully before doing any release.
### Release assets

Cargo.lock (generated, 210 lines changed)
View File

@@ -34,6 +34,18 @@ dependencies = [
"smallvec",
]
[[package]]
name = "actix-governor"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6fbf4afa1e2f7c28040febe2a7199ad0a5fed564dd645da06ab12642c7d22483"
dependencies = [
"actix-http",
"actix-web",
"futures",
"governor",
]
[[package]]
name = "actix-http"
version = "3.2.2"
@@ -617,9 +629,9 @@ dependencies = [
[[package]]
name = "cargo_toml"
version = "0.12.4"
version = "0.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a621d5d6d6c8d086dbaf1fe659981da41a1b63c6bdbba30b4dbb592c6d3bd49"
checksum = "aa0e3586af56b3bfa51fca452bd56e8dbbbd5d8d81cbf0b7e4e35b695b537eb8"
dependencies = [
"serde",
"toml",
@@ -1017,6 +1029,19 @@ dependencies = [
"syn 1.0.103",
]
[[package]]
name = "dashmap"
version = "5.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "907076dfda823b0b36d2a1bb5f90c96660a5bbcd7729e10727f07858f22c4edc"
dependencies = [
"cfg-if",
"hashbrown 0.12.3",
"lock_api",
"once_cell",
"parking_lot_core",
]
[[package]]
name = "derive_builder"
version = "0.11.2"
@@ -1101,7 +1126,7 @@ dependencies = [
[[package]]
name = "dump"
version = "0.30.0"
version = "1.0.0"
dependencies = [
"anyhow",
"big_s",
@@ -1310,7 +1335,7 @@ dependencies = [
[[package]]
name = "file-store"
version = "0.30.0"
version = "1.0.0"
dependencies = [
"faux",
"tempfile",
@@ -1332,8 +1357,8 @@ dependencies = [
[[package]]
name = "filter-parser"
version = "0.35.0"
source = "git+https://github.com/meilisearch/milli.git#2e539249cb16f5e88be9f21ab712f8b4266cad36"
version = "0.37.3"
source = "git+https://github.com/meilisearch/milli.git?tag=v0.37.3#2101e3c6d592f6ce6cc25b6e4585f3a8a6246457"
dependencies = [
"nom",
"nom_locate",
@@ -1351,8 +1376,8 @@ dependencies = [
[[package]]
name = "flatten-serde-json"
version = "0.35.0"
source = "git+https://github.com/meilisearch/milli.git#2e539249cb16f5e88be9f21ab712f8b4266cad36"
version = "0.37.3"
source = "git+https://github.com/meilisearch/milli.git?tag=v0.37.3#2101e3c6d592f6ce6cc25b6e4585f3a8a6246457"
dependencies = [
"serde_json",
]
@@ -1449,6 +1474,12 @@ version = "0.3.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2ffb393ac5d9a6eaa9d3fdf37ae2776656b706e200c8e16b1bdb227f5198e6ea"
[[package]]
name = "futures-timer"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e64b03909df88034c26dc1547e8970b91f98bdb65165d6a4e9110d94263dbb2c"
[[package]]
name = "futures-util"
version = "0.3.25"
@@ -1500,7 +1531,7 @@ checksum = "c05aeb6a22b8f62540c194aac980f2115af067bfe15a0734d7277a768d396b31"
dependencies = [
"cfg-if",
"libc",
"wasi",
"wasi 0.11.0+wasi-snapshot-preview1",
]
[[package]]
@@ -1541,10 +1572,27 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9b919933a397b79c37e33b77bb2aa3dc8eb6e165ad809e58ff75bc7db2e34574"
[[package]]
name = "grenad"
version = "0.4.3"
name = "governor"
version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5e46ef6273921c5c0cced57632b48c02a968a57f9af929ef78f980409c2e26f2"
checksum = "19775995ee20209163239355bc3ad2f33f83da35d9ef72dea26e5af753552c87"
dependencies = [
"dashmap",
"futures",
"futures-timer",
"no-std-compat",
"nonzero_ext",
"parking_lot",
"quanta",
"rand",
"smallvec",
]
[[package]]
name = "grenad"
version = "0.4.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5232b2d157b7bf63d7abe1b12177039e58db2f29e377517c0cdee1578cca4c93"
dependencies = [
"bytemuck",
"byteorder",
@@ -1616,8 +1664,8 @@ checksum = "2540771e65fc8cb83cd6e8a237f70c319bd5c29f78ed1084ba5d50eeac86f7f9"
[[package]]
name = "heed"
version = "0.12.2"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.3#076971765f4ce09591ed7e19e45ea817580a53e3"
version = "0.12.4"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.4#7a4542bc72dd60ef0f508c89900ea292218223fb"
dependencies = [
"byteorder",
"heed-traits",
@@ -1625,7 +1673,7 @@ dependencies = [
"libc",
"lmdb-rkv-sys",
"once_cell",
"page_size",
"page_size 0.4.2",
"synchronoise",
"url",
"zerocopy",
@@ -1634,12 +1682,12 @@ dependencies = [
[[package]]
name = "heed-traits"
version = "0.7.0"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.3#076971765f4ce09591ed7e19e45ea817580a53e3"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.4#7a4542bc72dd60ef0f508c89900ea292218223fb"
[[package]]
name = "heed-types"
version = "0.7.2"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.3#076971765f4ce09591ed7e19e45ea817580a53e3"
source = "git+https://github.com/meilisearch/heed?tag=v0.12.4#7a4542bc72dd60ef0f508c89900ea292218223fb"
dependencies = [
"bincode",
"heed-traits",
@@ -1767,7 +1815,7 @@ dependencies = [
[[package]]
name = "index-scheduler"
version = "0.30.0"
version = "1.0.0"
dependencies = [
"anyhow",
"big_s",
@@ -1783,6 +1831,7 @@ dependencies = [
"meili-snap",
"meilisearch-types",
"nelson",
"page_size 0.5.0",
"roaring",
"serde",
"serde_json",
@@ -1897,8 +1946,8 @@ dependencies = [
[[package]]
name = "json-depth-checker"
version = "0.35.0"
source = "git+https://github.com/meilisearch/milli.git#2e539249cb16f5e88be9f21ab712f8b4266cad36"
version = "0.37.3"
source = "git+https://github.com/meilisearch/milli.git?tag=v0.37.3#2101e3c6d592f6ce6cc25b6e4585f3a8a6246457"
dependencies = [
"serde_json",
]
@@ -2154,8 +2203,8 @@ checksum = "d4d2456c373231a208ad294c33dc5bff30051eafd954cd4caae83a712b12854d"
[[package]]
name = "lmdb-rkv-sys"
version = "0.15.0"
source = "git+https://github.com/meilisearch/lmdb-rs#8f0fe377a98d177cabbd056e777778f559df2bb6"
version = "0.15.1"
source = "git+https://github.com/meilisearch/lmdb-rs#0144fb2bac524cdc2897d7750681ed3fff2dc3ac"
dependencies = [
"cc",
"libc",
@@ -2231,6 +2280,15 @@ dependencies = [
"crc",
]
[[package]]
name = "mach"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b823e83b2affd8f40a9ee8c29dbc56404c1e34cd2710921f2801e2cf29527afa"
dependencies = [
"libc",
]
[[package]]
name = "manifest-dir-macros"
version = "0.1.16"
@@ -2257,7 +2315,7 @@ checksum = "490cc448043f947bae3cbee9c203358d62dbee0db12107a74be5c30ccfd09771"
[[package]]
name = "meili-snap"
version = "0.30.0"
version = "1.0.0"
dependencies = [
"insta",
"md5",
@@ -2265,27 +2323,11 @@ dependencies = [
]
[[package]]
name = "meilisearch-auth"
version = "0.30.0"
dependencies = [
"enum-iterator",
"hmac",
"meilisearch-types",
"rand",
"roaring",
"serde",
"serde_json",
"sha2",
"thiserror",
"time",
"uuid 1.2.1",
]
[[package]]
name = "meilisearch-http"
version = "0.30.0"
name = "meilisearch"
version = "0.30.1"
dependencies = [
"actix-cors",
"actix-governor",
"actix-http",
"actix-rt",
"actix-web",
@@ -2364,19 +2406,39 @@ dependencies = [
"zip",
]
[[package]]
name = "meilisearch-auth"
version = "1.0.0"
dependencies = [
"base64",
"enum-iterator",
"hmac",
"meilisearch-types",
"rand",
"roaring",
"serde",
"serde_json",
"sha2",
"thiserror",
"time",
"uuid 1.2.1",
]
[[package]]
name = "meilisearch-types"
version = "0.30.0"
version = "1.0.0"
dependencies = [
"actix-web",
"anyhow",
"csv",
"either",
"enum-iterator",
"file-store",
"flate2",
"fst",
"insta",
"meili-snap",
"memmap2",
"milli",
"proptest",
"proptest-derive",
@@ -2384,6 +2446,7 @@ dependencies = [
"serde",
"serde_json",
"tar",
"tempfile",
"thiserror",
"time",
"tokio",
@@ -2416,8 +2479,8 @@ dependencies = [
[[package]]
name = "milli"
version = "0.35.0"
source = "git+https://github.com/meilisearch/milli.git#2e539249cb16f5e88be9f21ab712f8b4266cad36"
version = "0.37.3"
source = "git+https://github.com/meilisearch/milli.git?tag=v0.37.3#2101e3c6d592f6ce6cc25b6e4585f3a8a6246457"
dependencies = [
"bimap",
"bincode",
@@ -2507,7 +2570,7 @@ checksum = "e5d732bc30207a6423068df043e3d02e0735b155ad7ce1a6f76fe2baa5b158de"
dependencies = [
"libc",
"log",
"wasi",
"wasi 0.11.0+wasi-snapshot-preview1",
"windows-sys 0.42.0",
]
@@ -2531,6 +2594,12 @@ name = "nelson"
version = "0.1.0"
source = "git+https://github.com/meilisearch/nelson.git?rev=675f13885548fb415ead8fbb447e9e6d9314000a#675f13885548fb415ead8fbb447e9e6d9314000a"
[[package]]
name = "no-std-compat"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b93853da6d84c2e3c7d730d6473e8817692dd89be387eb01b94d7f108ecb5b8c"
[[package]]
name = "nom"
version = "7.1.1"
@@ -2552,6 +2621,12 @@ dependencies = [
"nom",
]
[[package]]
name = "nonzero_ext"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38bf9645c8b145698bb0b18a4637dcacbc421ea49bef2317e4fd8065a387cf21"
[[package]]
name = "ntapi"
version = "0.4.0"
@@ -2663,6 +2738,16 @@ dependencies = [
"winapi",
]
[[package]]
name = "page_size"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b7663cbd190cfd818d08efa8497f6cd383076688c49a391ef7c0d03cd12b561"
dependencies = [
"libc",
"winapi",
]
[[package]]
name = "parking_lot"
version = "0.12.1"
@@ -2747,7 +2832,7 @@ checksum = "478c572c3d73181ff3c2539045f6eb99e5491218eae919370993b890cdbdd98e"
[[package]]
name = "permissive-json-pointer"
version = "0.30.0"
version = "1.0.0"
dependencies = [
"big_s",
"serde_json",
@@ -2977,6 +3062,22 @@ version = "2.28.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "106dd99e98437432fed6519dedecfade6a06a73bb7b2a1e019fdd2bee5778d94"
[[package]]
name = "quanta"
version = "0.9.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "20afe714292d5e879d8b12740aa223c6a88f118af41870e8b6196e39a02238a8"
dependencies = [
"crossbeam-utils",
"libc",
"mach",
"once_cell",
"raw-cpuid",
"wasi 0.10.2+wasi-snapshot-preview1",
"web-sys",
"winapi",
]
[[package]]
name = "quick-error"
version = "1.2.3"
@@ -3046,6 +3147,15 @@ dependencies = [
"rand_core",
]
[[package]]
name = "raw-cpuid"
version = "10.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a6823ea29436221176fe662da99998ad3b4db2c7f31e7b6f5fe43adccd6320bb"
dependencies = [
"bitflags",
]
[[package]]
name = "rayon"
version = "1.5.3"
@@ -4068,6 +4178,12 @@ dependencies = [
"try-lock",
]
[[package]]
name = "wasi"
version = "0.10.2+wasi-snapshot-preview1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd6fbd9a79829dd1ad0cc20627bf1ed606756a7f77edff7b66b7064f9cb327c6"
[[package]]
name = "wasi"
version = "0.11.0+wasi-snapshot-preview1"

View File

@@ -1,7 +1,7 @@
[workspace]
resolver = "2"
members = [
"meilisearch-http",
"meilisearch",
"meilisearch-types",
"meilisearch-auth",
"meili-snap",

View File

@@ -7,7 +7,7 @@ WORKDIR /meilisearch
ARG COMMIT_SHA
ARG COMMIT_DATE
ENV COMMIT_SHA=${COMMIT_SHA} COMMIT_DATE=${COMMIT_DATE}
ENV VERGEN_GIT_SHA=${COMMIT_SHA} VERGEN_GIT_COMMIT_TIMESTAMP=${COMMIT_DATE}
ENV RUSTFLAGS="-C target-feature=-crt-static"
COPY . .

View File

@@ -9,7 +9,7 @@
<a href="https://blog.meilisearch.com">Blog</a> |
<a href="https://docs.meilisearch.com">Documentation</a> |
<a href="https://docs.meilisearch.com/faq/">FAQ</a> |
<a href="https://slack.meilisearch.com">Slack</a>
<a href="https://discord.meilisearch.com">Discord</a>
</h4>
<p align="center">
@@ -34,14 +34,6 @@ Meilisearch helps you shape a delightful search experience in a snap, offering f
🔥 [**Try it!**](https://where2watch.meilisearch.com/) 🔥
## 🎃 Hacktoberfest
It's Hacktoberfest 2022 @Meilisearch
[Hacktoberfest](https://hacktoberfest.com/) is a celebration of the open-source community. This year, and for the third time in a row, Meilisearch is participating in this fantastic event.
You'd like to contribute? Don't hesitate to check out our [contributing guidelines](./CONTRIBUTING.md).
## ✨ Features
- **Search-as-you-type:** find search results in less than 50 milliseconds
@@ -69,7 +61,7 @@ You may also want to check out [Meilisearch 101](https://docs.meilisearch.com/le
## ☁️ Meilisearch cloud
Join the closed beta for Meilisearch cloud by filling out [this form](https://meilisearch.typeform.com/to/VI2cI2rv).
Let us manage your infrastructure so you can focus on integrating a great search experience. Try [Meilisearch Cloud](https://meilisearch.com/pricing) today.
## 🧰 SDKs & integration tools
@@ -105,7 +97,7 @@ Meilisearch is a search engine created by [Meili](https://www.welcometothejungle
- For feature requests, please visit our [product repository](https://github.com/meilisearch/product/discussions)
- Found a bug? Open an [issue](https://github.com/meilisearch/meilisearch/issues)!
- Want to be part of our Slack community? [Join us!](https://slack.meilisearch.com/)
- Want to be part of our Discord community? [Join us!](https://discord.gg/meilisearch)
- For everything else, please check [this page listing some of the other places where you can find us](https://docs.meilisearch.com/learn/what_is_meilisearch/contact.html)
Thank you for your support!

View File

@@ -1,7 +1,7 @@
status = [
'Tests on ubuntu-18.04',
'Tests on macos-latest',
'Tests on windows-latest',
'Tests on macos-12',
'Tests on windows-2022',
'Run Clippy',
'Run Rustfmt',
'Run tests in debug',

View File

@@ -56,7 +56,7 @@ disable_auto_batching = false
### DUMPS ###
#############
dumps_dir = "dumps/"
dump_dir = "dumps/"
# Sets the directory where Meilisearch will create dump files.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#dumps-destination
@@ -73,6 +73,69 @@ ignore_dump_if_db_exists = false
# https://docs.meilisearch.com/learn/configuration/instance_options.html#ignore-dump-if-db-exists
#####################
### RATE LIMITING ###
#####################
rate_limiting_disable_all = false
# Prevents a Meilisearch instance from performing any rate limiting.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-disable-all
rate_limiting_disable_global = false
# Prevents a Meilisearch instance from performing rate limiting global to all queries.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-disable-global
rate_limiting_global_pool = 100000
# The maximum pool of search requests that can be performed before they are rejected.
#
# The pool starts full at the provided value, then each search request diminishes the pool by 1.
# When the pool is empty the search request is rejected.
# The pool is replenished by 1 depending on the cooldown period.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-global-pool
rate_limiting_global_cooldown_ns = 50000
# The amount of time, in nanoseconds, before the pool of available search requests is replenished by 1 again.
#
# The maximum number of available search requests is given by `rate_limiting_global_pool`.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-global-cooldown-ns
rate_limiting_disable_ip = false
# Prevents a Meilisearch instance from performing rate limiting per IP address.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-disable-ip
rate_limiting_ip_pool = 200
# The maximum pool of search requests that can be performed from a specific IP before they are rejected.
#
# The pool starts full at the provided value, then each search request from the same IP address diminishes the pool by 1.
# When the pool is empty the search request is rejected.
# The pool is replenished by 1 depending on the cooldown period.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-ip-pool
rate_limiting_ip_cooldown_ns = 50000000
# The amount of time, in nanoseconds, before the pool of available search requests for a specific IP address is replenished by 1 again.
#
# The maximum number of available search requests for a specific IP address is given by `rate_limiting_ip_pool`.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-ip-cooldown-ns
rate_limiting_disable_api_key = false
# Prevents a Meilisearch instance from performing rate limiting per API key.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-disable-api-key
rate_limiting_api_key_pool = 10000
# The maximum pool of search requests that can be performed using a specific API key before further requests with that key are rejected.
#
# The pool starts full at the provided value, and each search request using the same API key decreases it by 1.
# When the pool is empty, incoming search requests using that API key are rejected.
# The pool is replenished by 1 after each cooldown period elapses.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-api-key-pool
rate_limiting_api_key_cooldown_ns = 500000
# The amount of time, in nanoseconds, before the pool of available search requests using a specific API key is replenished by 1 again.
#
# The maximum number of available search requests using a specific API key is given by `rate_limiting_api_key_pool`.
# https://docs.meilisearch.com/learn/configuration/instance_options.html#rate-limiting-api-key-cooldown-ns
#################
### SNAPSHOTS ###
#################

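The rate-limiting options in the configuration above describe a token-bucket scheme: each pool starts full, every search request spends one unit, and one unit comes back per cooldown period. A minimal Rust sketch of that logic, purely as an illustration (the type and field names are invented here; this is not Meilisearch's actual implementation):

```rust
use std::time::{Duration, Instant};

/// Illustration of the pool/cooldown scheme described in the configuration above.
struct RateLimiter {
    capacity: u64,
    available: u64,
    cooldown: Duration,
    last_refill: Instant,
}

impl RateLimiter {
    fn new(capacity: u64, cooldown: Duration) -> Self {
        Self { capacity, available: capacity, cooldown, last_refill: Instant::now() }
    }

    fn try_accept(&mut self) -> bool {
        // Replenish one unit for every full cooldown period elapsed since the last refill.
        let refills = (self.last_refill.elapsed().as_nanos() / self.cooldown.as_nanos()) as u64;
        if refills > 0 {
            self.available = (self.available + refills).min(self.capacity);
            self.last_refill += self.cooldown * refills as u32;
        }
        // Spend one unit if any is left; otherwise reject the request.
        if self.available > 0 {
            self.available -= 1;
            true
        } else {
            false
        }
    }
}

fn main() {
    // Mirrors `rate_limiting_global_pool = 100000` and
    // `rate_limiting_global_cooldown_ns = 50000` from the configuration above.
    let mut limiter = RateLimiter::new(100_000, Duration::from_nanos(50_000));
    assert!(limiter.try_accept());
}
```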
View File

@@ -1,5 +1,8 @@
#!/bin/sh
# This script can optionally use a GitHub token to increase your request limit (for example, if using this script in a CI).
# To use a GitHub token, pass it through the GITHUB_PAT environment variable.
# GLOBALS
# Colors
@@ -10,9 +13,6 @@ DEFAULT='\033[0m'
# Project name
PNAME='meilisearch'
# Version regexp i.e. v[number].[number].[number]
GREP_SEMVER_REGEXP='v\([0-9]*\)[.]\([0-9]*\)[.]\([0-9]*\)$'
# GitHub API address
GITHUB_API='https://api.github.com/repos/meilisearch/meilisearch/releases'
# GitHub Release address
@@ -20,126 +20,26 @@ GITHUB_REL='https://github.com/meilisearch/meilisearch/releases/download/'
# FUNCTIONS
# semverParseInto and semverLT from: https://github.com/cloudflare/semver_bash/blob/master/semver.sh
# usage: semverParseInto version major minor patch special
# version: the string version
# major, minor, patch, special: will be assigned by the function
semverParseInto() {
local RE='[^0-9]*\([0-9]*\)[.]\([0-9]*\)[.]\([0-9]*\)\([0-9A-Za-z-]*\)'
# MAJOR
eval $2=`echo $1 | sed -e "s#$RE#\1#"`
# MINOR
eval $3=`echo $1 | sed -e "s#$RE#\2#"`
# PATCH
eval $4=`echo $1 | sed -e "s#$RE#\3#"`
# SPECIAL
eval $5=`echo $1 | sed -e "s#$RE#\4#"`
}
# usage: semverLT version1 version2
semverLT() {
local MAJOR_A=0
local MINOR_A=0
local PATCH_A=0
local SPECIAL_A=0
local MAJOR_B=0
local MINOR_B=0
local PATCH_B=0
local SPECIAL_B=0
semverParseInto $1 MAJOR_A MINOR_A PATCH_A SPECIAL_A
semverParseInto $2 MAJOR_B MINOR_B PATCH_B SPECIAL_B
if [ $MAJOR_A -lt $MAJOR_B ]; then
return 0
fi
if [ $MAJOR_A -le $MAJOR_B ] && [ $MINOR_A -lt $MINOR_B ]; then
return 0
fi
if [ $MAJOR_A -le $MAJOR_B ] && [ $MINOR_A -le $MINOR_B ] && [ $PATCH_A -lt $PATCH_B ]; then
return 0
fi
if [ "_$SPECIAL_A" == '_' ] && [ "_$SPECIAL_B" == '_' ] ; then
return 1
fi
if [ "_$SPECIAL_A" == '_' ] && [ "_$SPECIAL_B" != '_' ] ; then
return 1
fi
if [ "_$SPECIAL_A" != '_' ] && [ "_$SPECIAL_B" == '_' ] ; then
return 0
fi
if [ "_$SPECIAL_A" < "_$SPECIAL_B" ]; then
return 0
fi
return 1
}
# Get a token from: https://github.com/settings/tokens to increase rate limit (from 60 to 5000),
# make sure the token scope is set to 'public_repo'.
# Create GITHUB_PAT environment variable once you acquired the token to start using it.
# Returns the tag of the latest stable release (in terms of semver and not of release date).
# Gets the latest stable version of Meilisearch by setting the $latest variable.
# Returns 0 in case of success, 1 otherwise.
get_latest() {
# temp_file is needed because the grep would start before the download is over
temp_file=$(mktemp -q /tmp/$PNAME.XXXXXXXXX)
latest_release="$GITHUB_API/latest"
if [ $? -ne 0 ]; then
echo "$0: Can't create temp file, bye bye.."
echo "$0: Can't create temp file."
fetch_release_failure_usage
exit 1
fi
if [ -z "$GITHUB_PAT" ]; then
curl -s $GITHUB_API > "$temp_file" || return 1
curl -s "$latest_release" > "$temp_file" || return 1
else
curl -H "Authorization: token $GITHUB_PAT" -s $GITHUB_API > "$temp_file" || return 1
curl -H "Authorization: token $GITHUB_PAT" -s "$latest_release" > "$temp_file" || return 1
fi
releases=$(cat "$temp_file" | \
grep -E '"tag_name":|"draft":|"prerelease":' \
| tr -d ',"' | cut -d ':' -f2 | tr -d ' ')
# Returns a list of [tag_name draft_boolean prerelease_boolean ...]
# Ex: v0.10.1 false false v0.9.1-rc.1 false true v0.9.0 false false...
i=0
latest=''
current_tag=''
for release_info in $releases; do
# Checking tag_name
if [ $i -eq 0 ]; then
# If it's not an alpha or beta release
if echo "$release_info" | grep -q "$GREP_SEMVER_REGEXP"; then
current_tag=$release_info
else
current_tag=''
fi
i=1
# Checking draft boolean
elif [ $i -eq 1 ]; then
if [ "$release_info" = 'true' ]; then
current_tag=''
fi
i=2
# Checking prerelease boolean
elif [ $i -eq 2 ]; then
if [ "$release_info" = 'true' ]; then
current_tag=''
fi
i=0
# If the current_tag is valid
if [ "$current_tag" != '' ]; then
# If there is no latest yet
if [ "$latest" = '' ]; then
latest="$current_tag"
else
# Comparing latest and the current tag
semverLT $current_tag $latest
if [ $? -eq 1 ]; then
latest="$current_tag"
fi
fi
fi
fi
done
latest="$(cat "$temp_file" | grep '"tag_name":' | cut -d ':' -f2 | tr -d '"' | tr -d ',' | tr -d ' ')"
rm -f "$temp_file"
return 0
@@ -174,9 +74,9 @@ get_archi() {
archi='amd64'
;;
'arm64')
# MacOS M1
# macOS M1/M2
if [ $os = 'macos' ]; then
archi='amd64'
archi='apple-silicon'
else
archi='aarch64'
fi
@@ -210,12 +110,13 @@ fetch_release_failure_usage() {
echo ''
printf "$RED%s\n$DEFAULT" 'ERROR: Impossible to get the latest stable version of Meilisearch.'
echo 'Please let us know about this issue: https://github.com/meilisearch/meilisearch/issues/new/choose'
echo ''
echo 'In the meantime, you can manually download the appropriate binary from the GitHub release assets here: https://github.com/meilisearch/meilisearch/releases/latest'
}
fill_release_variables() {
# Fill $latest variable.
if ! get_latest; then
# TO CHANGE.
fetch_release_failure_usage
exit 1
fi

View File

@@ -1,6 +1,6 @@
[package]
name = "dump"
version = "0.30.0"
version = "1.0.0"
edition = "2021"
[dependencies]

View File

@@ -3,8 +3,6 @@ use thiserror::Error;
#[derive(Debug, Error)]
pub enum Error {
#[error("The version 1 of the dumps is not supported anymore. You can re-export your dump from a version between 0.21 and 0.24, or start fresh from a version 0.25 onwards.")]
DumpV1Unsupported,
#[error("Bad index name.")]
BadIndexName,
#[error("Malformed task.")]
@@ -21,14 +19,14 @@ pub enum Error {
impl ErrorCode for Error {
fn error_code(&self) -> Code {
match self {
// Are these three really Internal errors?
// TODO look at that later.
Error::Io(_) => Code::Internal,
Error::Io(e) => e.error_code(),
// These errors either happen when creating a dump and don't need any error code,
// or come from an internal bad deserialization.
Error::Serde(_) => Code::Internal,
Error::Uuid(_) => Code::Internal,
// all these errors should never be raised when creating a dump, thus no error code should be associated.
Error::DumpV1Unsupported => Code::Internal,
Error::BadIndexName => Code::Internal,
Error::MalformedTask => Code::Internal,
}

View File

@@ -23,7 +23,7 @@ const CURRENT_DUMP_VERSION: Version = Version::V6;
type Result<T> = std::result::Result<T, Error>;
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Metadata {
pub dump_version: Version,
@@ -32,7 +32,7 @@ pub struct Metadata {
pub dump_date: OffsetDateTime,
}
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct IndexMetadata {
pub uid: String,
@@ -43,7 +43,7 @@ pub struct IndexMetadata {
pub updated_at: OffsetDateTime,
}
#[derive(Debug, PartialEq, Eq, Deserialize, Serialize)]
#[derive(Debug, Clone, Copy, PartialEq, Eq, Deserialize, Serialize)]
pub enum Version {
V1,
V2,
@@ -87,7 +87,7 @@ pub struct TaskDump {
pub finished_at: Option<OffsetDateTime>,
}
// A `Kind` specific version made for the dump. If modified you may break the dump.
// A `Kind` specific version made for the dump. If modified you may break the dump.
#[derive(Debug, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub enum KindDump {
@@ -125,7 +125,6 @@ pub enum KindDump {
tasks: RoaringBitmap,
},
DumpCreation {
dump_uid: String,
keys: Vec<Key>,
instance_uid: Option<InstanceUid>,
},
@@ -188,8 +187,8 @@ impl From<KindWithContent> for KindDump {
KindWithContent::TaskDeletion { query, tasks } => {
KindDump::TasksDeletion { query, tasks }
}
KindWithContent::DumpCreation { dump_uid, keys, instance_uid } => {
KindDump::DumpCreation { dump_uid, keys, instance_uid }
KindWithContent::DumpCreation { keys, instance_uid } => {
KindDump::DumpCreation { keys, instance_uid }
}
KindWithContent::SnapshotCreation => KindDump::SnapshotCreation,
}
@@ -417,7 +416,6 @@ pub(crate) mod test {
}
#[test]
#[ignore]
fn test_creating_and_read_dump() {
let mut file = create_test_dump();
let mut dump = DumpReader::open(&mut file).unwrap();

View File

@@ -1,3 +1,4 @@
pub mod v1_to_v2;
pub mod v2_to_v3;
pub mod v3_to_v4;
pub mod v4_to_v5;

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/compat/v1_to_v2.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}
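This file and the similar ones that follow are insta snapshot files: the tests further down in this compare call `insta::assert_json_snapshot!` without an inline value, so insta renders the settings to JSON, writes the result into a snapshot file like the one above on the first run, and diffs later runs against it. A small sketch of that usage, assuming the insta and serde crates (not Meilisearch code):

```rust
// Illustration only; assumes the `insta` and `serde` crates as dev-dependencies.
use serde::Serialize;

#[derive(Serialize)]
struct Settings {
    displayed_attributes: Vec<String>,
}

#[test]
fn settings_snapshot() {
    let settings = Settings { displayed_attributes: vec!["*".to_string()] };
    // With no inline `@"..."` argument, insta stores the JSON rendering in a
    // snapshot file under a `snapshots/` directory next to the test module and
    // compares future runs against it; `cargo insta review` accepts changes.
    insta::assert_json_snapshot!(settings);
}
```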

View File

@@ -0,0 +1,37 @@
---
source: dump/src/reader/compat/v1_to_v2.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,27 @@
---
source: dump/src/reader/compat/v1_to_v2.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness",
"asc(release_date)"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,27 @@
---
source: dump/src/reader/compat/v1_to_v2.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness",
"asc(release_date)"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/compat/v1_to_v2.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/compat/v2_to_v3.rs
expression: movies2.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/compat/v2_to_v3.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,37 @@
---
source: dump/src/reader/compat/v2_to_v3.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,24 @@
---
source: dump/src/reader/compat/v2_to_v3.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,25 @@
---
source: dump/src/reader/compat/v3_to_v4.rs
expression: movies2.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,25 @@
---
source: dump/src/reader/compat/v3_to_v4.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,39 @@
---
source: dump/src/reader/compat/v3_to_v4.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,31 @@
---
source: dump/src/reader/compat/v3_to_v4.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,56 @@
---
source: dump/src/reader/compat/v4_to_v5.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": "Reset",
"searchableAttributes": "Reset",
"filterableAttributes": {
"Set": []
},
"sortableAttributes": {
"Set": []
},
"rankingRules": {
"Set": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
]
},
"stopWords": {
"Set": []
},
"synonyms": {
"Set": {}
},
"distinctAttribute": "Reset",
"typoTolerance": {
"Set": {
"enabled": {
"Set": true
},
"minWordSizeForTypos": {
"Set": {
"oneTypo": {
"Set": 5
},
"twoTypos": {
"Set": 9
}
}
},
"disableOnWords": {
"Set": []
},
"disableOnAttributes": {
"Set": []
}
}
},
"faceting": "NotSet",
"pagination": "NotSet"
}

View File

@@ -0,0 +1,70 @@
---
source: dump/src/reader/compat/v4_to_v5.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": "Reset",
"searchableAttributes": "Reset",
"filterableAttributes": {
"Set": []
},
"sortableAttributes": {
"Set": []
},
"rankingRules": {
"Set": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
]
},
"stopWords": {
"Set": []
},
"synonyms": {
"Set": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
}
},
"distinctAttribute": "Reset",
"typoTolerance": {
"Set": {
"enabled": {
"Set": true
},
"minWordSizeForTypos": {
"Set": {
"oneTypo": {
"Set": 5
},
"twoTypos": {
"Set": 9
}
}
},
"disableOnWords": {
"Set": []
},
"disableOnAttributes": {
"Set": []
}
}
},
"faceting": "NotSet",
"pagination": "NotSet"
}

View File

@@ -0,0 +1,62 @@
---
source: dump/src/reader/compat/v4_to_v5.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": "Reset",
"searchableAttributes": "Reset",
"filterableAttributes": {
"Set": [
"genres",
"id"
]
},
"sortableAttributes": {
"Set": [
"release_date"
]
},
"rankingRules": {
"Set": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
]
},
"stopWords": {
"Set": []
},
"synonyms": {
"Set": {}
},
"distinctAttribute": "Reset",
"typoTolerance": {
"Set": {
"enabled": {
"Set": true
},
"minWordSizeForTypos": {
"Set": {
"oneTypo": {
"Set": 5
},
"twoTypos": {
"Set": 9
}
}
},
"disableOnWords": {
"Set": []
},
"disableOnAttributes": {
"Set": []
}
}
},
"faceting": "NotSet",
"pagination": "NotSet"
}

View File

@@ -0,0 +1,40 @@
---
source: dump/src/reader/compat/v5_to_v6.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
},
"faceting": {
"maxValuesPerFacet": 100
},
"pagination": {
"maxTotalHits": 1000
}
}

View File

@@ -0,0 +1,54 @@
---
source: dump/src/reader/compat/v5_to_v6.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
},
"faceting": {
"maxValuesPerFacet": 100
},
"pagination": {
"maxTotalHits": 1000
}
}

View File

@@ -0,0 +1,46 @@
---
source: dump/src/reader/compat/v5_to_v6.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
},
"faceting": {
"maxValuesPerFacet": 100
},
"pagination": {
"maxTotalHits": 1000
}
}

View File

@@ -0,0 +1,414 @@
use std::collections::BTreeSet;
use std::str::FromStr;
use super::v2_to_v3::CompatV2ToV3;
use crate::reader::{v1, v2, Document};
use crate::Result;
pub struct CompatV1ToV2 {
pub from: v1::V1Reader,
}
impl CompatV1ToV2 {
pub fn new(v1: v1::V1Reader) -> Self {
Self { from: v1 }
}
pub fn to_v3(self) -> CompatV2ToV3 {
CompatV2ToV3::Compat(self)
}
pub fn version(&self) -> crate::Version {
self.from.version()
}
pub fn date(&self) -> Option<time::OffsetDateTime> {
self.from.date()
}
pub fn index_uuid(&self) -> Vec<v2::meta::IndexUuid> {
self.from
.index_uuid()
.into_iter()
.enumerate()
// we use the index of the index 😬 as UUID for the index, so that we can link each v2::Task to its index
.map(|(index, index_uuid)| v2::meta::IndexUuid {
uid: index_uuid.uid,
uuid: uuid::Uuid::from_u128(index as u128),
})
.collect()
}
pub fn indexes(&self) -> Result<impl Iterator<Item = Result<CompatIndexV1ToV2>> + '_> {
Ok(self.from.indexes()?.map(|index_reader| Ok(CompatIndexV1ToV2 { from: index_reader? })))
}
pub fn tasks(
&mut self,
) -> Box<dyn Iterator<Item = Result<(v2::Task, Option<v2::UpdateFile>)>> + '_> {
// Convert an error here to an iterator yielding the error
let indexes = match self.from.indexes() {
Ok(indexes) => indexes,
Err(err) => return Box::new(std::iter::once(Err(err))),
};
let it = indexes.enumerate().flat_map(
move |(index, index_reader)| -> Box<dyn Iterator<Item = _>> {
let index_reader = match index_reader {
Ok(index_reader) => index_reader,
Err(err) => return Box::new(std::iter::once(Err(err))),
};
Box::new(
index_reader
.tasks()
// Filter out the UpdateStatus::Customs variant that is not supported in v2
// and enqueued tasks, which don't contain the necessary update file in v1
.filter_map(move |task| -> Option<_> {
let task = match task {
Ok(task) => task,
Err(err) => return Some(Err(err)),
};
Some(Ok((
v2::Task {
uuid: uuid::Uuid::from_u128(index as u128),
update: Option::from(task)?,
},
None,
)))
}),
)
},
);
Box::new(it)
}
}
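The `tasks()` method above uses a small idiom worth spelling out: when a fallible setup step fails inside a function that must return an iterator, the error is wrapped in a one-item iterator (`std::iter::once(Err(err))`) so callers still consume a uniform stream of `Result`s. A standalone sketch of that idiom, with invented names:

```rust
type Entry = Result<u32, String>;

fn numbers(fail: bool) -> Box<dyn Iterator<Item = Entry>> {
    // Fallible setup step: on failure, yield the error as the only item so the
    // caller still gets a uniform stream of `Result`s, mirroring `tasks()` above.
    if fail {
        let err: Entry = Err("setup failed".to_string());
        return Box::new(std::iter::once(err));
    }
    Box::new(vec![1, 2, 3].into_iter().map(|n| -> Entry { Ok(n) }))
}

fn main() {
    assert_eq!(numbers(false).count(), 3);
    assert!(numbers(true).next().unwrap().is_err());
}
```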
pub struct CompatIndexV1ToV2 {
pub from: v1::V1IndexReader,
}
impl CompatIndexV1ToV2 {
pub fn metadata(&self) -> &crate::IndexMetadata {
self.from.metadata()
}
pub fn documents(&mut self) -> Result<Box<dyn Iterator<Item = Result<Document>> + '_>> {
self.from.documents().map(|it| Box::new(it) as Box<dyn Iterator<Item = _>>)
}
pub fn settings(&mut self) -> Result<v2::settings::Settings<v2::settings::Checked>> {
Ok(v2::settings::Settings::<v2::settings::Unchecked>::from(self.from.settings()?).check())
}
}
impl From<v1::settings::Settings> for v2::Settings<v2::Unchecked> {
fn from(source: v1::settings::Settings) -> Self {
let displayed_attributes = source
.displayed_attributes
.map(|opt| opt.map(|displayed_attributes| displayed_attributes.into_iter().collect()));
let attributes_for_faceting = source.attributes_for_faceting.map(|opt| {
opt.map(|attributes_for_faceting| attributes_for_faceting.into_iter().collect())
});
let ranking_rules = source.ranking_rules.map(|opt| {
opt.map(|ranking_rules| {
ranking_rules
.into_iter()
.filter_map(|ranking_rule| {
match v1::settings::RankingRule::from_str(&ranking_rule) {
Ok(ranking_rule) => {
let criterion: Option<v2::settings::Criterion> =
ranking_rule.into();
criterion.as_ref().map(ToString::to_string)
}
Err(()) => Some(ranking_rule),
}
})
.collect()
})
});
Self {
displayed_attributes,
searchable_attributes: source.searchable_attributes,
filterable_attributes: attributes_for_faceting,
ranking_rules,
stop_words: source.stop_words,
synonyms: source.synonyms,
distinct_attribute: source.distinct_attribute,
_kind: std::marker::PhantomData,
}
}
}
impl From<v1::update::UpdateStatus> for Option<v2::updates::UpdateStatus> {
fn from(source: v1::update::UpdateStatus) -> Self {
use v1::update::UpdateStatus as UpdateStatusV1;
use v2::updates::UpdateStatus as UpdateStatusV2;
Some(match source {
UpdateStatusV1::Enqueued { content } => {
log::warn!(
"Cannot import task {} (importing enqueued tasks from v1 dumps is unsupported)",
content.update_id
);
log::warn!("Task will be skipped in the queue of imported tasks.");
return None;
}
UpdateStatusV1::Failed { content } => UpdateStatusV2::Failed(v2::updates::Failed {
from: v2::updates::Processing {
from: v2::updates::Enqueued {
update_id: content.update_id,
meta: Option::from(content.update_type)?,
enqueued_at: content.enqueued_at,
content: None,
},
started_processing_at: content.processed_at
- std::time::Duration::from_secs_f64(content.duration),
},
error: v2::ResponseError {
// The error code is ignored by serialization, so it is always the default in deserialized v2 dumps.
// That's a good thing, because we don't have them in v1 dumps 😅
code: http::StatusCode::default(),
message: content.error.unwrap_or_default(),
// error codes are unchanged between v1 and v2
error_code: content.error_code.unwrap_or_default(),
// error types are unchanged between v1 and v2
error_type: content.error_type.unwrap_or_default(),
// error links are unchanged between v1 and v2
error_link: content.error_link.unwrap_or_default(),
},
failed_at: content.processed_at,
}),
UpdateStatusV1::Processed { content } => {
UpdateStatusV2::Processed(v2::updates::Processed {
success: match &content.update_type {
v1::update::UpdateType::ClearAll => {
v2::updates::UpdateResult::DocumentDeletion { deleted: u64::MAX }
}
v1::update::UpdateType::Customs => v2::updates::UpdateResult::Other,
v1::update::UpdateType::DocumentsAddition { number } => {
v2::updates::UpdateResult::DocumentsAddition(
v2::updates::DocumentAdditionResult { nb_documents: *number },
)
}
v1::update::UpdateType::DocumentsPartial { number } => {
v2::updates::UpdateResult::DocumentsAddition(
v2::updates::DocumentAdditionResult { nb_documents: *number },
)
}
v1::update::UpdateType::DocumentsDeletion { number } => {
v2::updates::UpdateResult::DocumentDeletion { deleted: *number as u64 }
}
v1::update::UpdateType::Settings { .. } => v2::updates::UpdateResult::Other,
},
processed_at: content.processed_at,
from: v2::updates::Processing {
from: v2::updates::Enqueued {
update_id: content.update_id,
meta: Option::from(content.update_type)?,
enqueued_at: content.enqueued_at,
content: None,
},
started_processing_at: content.processed_at
- std::time::Duration::from_secs_f64(content.duration),
},
})
}
})
}
}
impl From<v1::update::UpdateType> for Option<v2::updates::UpdateMeta> {
fn from(source: v1::update::UpdateType) -> Self {
Some(match source {
v1::update::UpdateType::ClearAll => v2::updates::UpdateMeta::ClearDocuments,
v1::update::UpdateType::Customs => {
log::warn!("Ignoring task with type 'Customs' that is no longer supported");
return None;
}
v1::update::UpdateType::DocumentsAddition { .. } => {
v2::updates::UpdateMeta::DocumentsAddition {
method: v2::updates::IndexDocumentsMethod::ReplaceDocuments,
format: v2::updates::UpdateFormat::Json,
primary_key: None,
}
}
v1::update::UpdateType::DocumentsPartial { .. } => {
v2::updates::UpdateMeta::DocumentsAddition {
method: v2::updates::IndexDocumentsMethod::UpdateDocuments,
format: v2::updates::UpdateFormat::Json,
primary_key: None,
}
}
v1::update::UpdateType::DocumentsDeletion { .. } => {
v2::updates::UpdateMeta::DeleteDocuments { ids: vec![] }
}
v1::update::UpdateType::Settings { settings } => {
v2::updates::UpdateMeta::Settings((*settings).into())
}
})
}
}
impl From<v1::settings::SettingsUpdate> for v2::Settings<v2::Unchecked> {
fn from(source: v1::settings::SettingsUpdate) -> Self {
let displayed_attributes: Option<Option<BTreeSet<String>>> =
source.displayed_attributes.into();
let attributes_for_faceting: Option<Option<Vec<String>>> =
source.attributes_for_faceting.into();
let ranking_rules: Option<Option<Vec<v1::settings::RankingRule>>> =
source.ranking_rules.into();
// go from the concrete types of v1 (RankingRule) to the concrete type of v2 (Criterion),
// and then back to string as this is what the settings manipulate
let ranking_rules = ranking_rules.map(|opt| {
opt.map(|ranking_rules| {
ranking_rules
.into_iter()
// filter out the WordsPosition ranking rule that exists in v1 but not v2
.filter_map(|ranking_rule| {
Option::<v2::settings::Criterion>::from(ranking_rule)
})
.map(|criterion| criterion.to_string())
.collect()
})
});
Self {
displayed_attributes: displayed_attributes.map(|opt| {
opt.map(|displayed_attributes| displayed_attributes.into_iter().collect())
}),
searchable_attributes: source.searchable_attributes.into(),
filterable_attributes: attributes_for_faceting.map(|opt| {
opt.map(|attributes_for_faceting| attributes_for_faceting.into_iter().collect())
}),
ranking_rules,
stop_words: source.stop_words.into(),
synonyms: source.synonyms.into(),
distinct_attribute: source.distinct_attribute.into(),
_kind: std::marker::PhantomData,
}
}
}
impl From<v1::settings::RankingRule> for Option<v2::settings::Criterion> {
fn from(source: v1::settings::RankingRule) -> Self {
match source {
v1::settings::RankingRule::Typo => Some(v2::settings::Criterion::Typo),
v1::settings::RankingRule::Words => Some(v2::settings::Criterion::Words),
v1::settings::RankingRule::Proximity => Some(v2::settings::Criterion::Proximity),
v1::settings::RankingRule::Attribute => Some(v2::settings::Criterion::Attribute),
v1::settings::RankingRule::WordsPosition => {
log::warn!("Removing the 'WordsPosition' ranking rule that is no longer supported, please check the resulting ranking rules of your indexes");
None
}
v1::settings::RankingRule::Exactness => Some(v2::settings::Criterion::Exactness),
v1::settings::RankingRule::Asc(field_name) => {
Some(v2::settings::Criterion::Asc(field_name))
}
v1::settings::RankingRule::Desc(field_name) => {
Some(v2::settings::Criterion::Desc(field_name))
}
}
}
}
impl<T> From<v1::settings::UpdateState<T>> for Option<Option<T>> {
fn from(source: v1::settings::UpdateState<T>) -> Self {
match source {
v1::settings::UpdateState::Update(new_value) => Some(Some(new_value)),
v1::settings::UpdateState::Clear => Some(None),
v1::settings::UpdateState::Nothing => None,
}
}
}
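The blanket impl above encodes v1's three-way setting state as a nested `Option`: the outer layer records whether the setting was touched at all, the inner layer whether it was set to a value or explicitly cleared. A small standalone sketch of that mapping, using a stand-in enum rather than the real `v1::settings::UpdateState`:

```rust
// Stand-in enum to illustrate the conversion above; not the real v1 type.
#[derive(Debug, PartialEq)]
enum UpdateState<T> {
    Update(T),
    Clear,
    Nothing,
}

fn to_nested<T>(state: UpdateState<T>) -> Option<Option<T>> {
    match state {
        UpdateState::Update(v) => Some(Some(v)), // outer: touched, inner: set to `v`
        UpdateState::Clear => Some(None),        // outer: touched, inner: cleared
        UpdateState::Nothing => None,            // outer: untouched, keep the current value
    }
}

fn main() {
    assert_eq!(to_nested(UpdateState::Update(3)), Some(Some(3)));
    assert_eq!(to_nested(UpdateState::<i32>::Clear), Some(None));
    assert_eq!(to_nested(UpdateState::<i32>::Nothing), None);
}
```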
#[cfg(test)]
pub(crate) mod test {
use std::fs::File;
use std::io::BufReader;
use flate2::bufread::GzDecoder;
use meili_snap::insta;
use tempfile::TempDir;
use super::*;
#[test]
fn compat_v1_v2() {
let dump = File::open("tests/assets/v1.dump").unwrap();
let dir = TempDir::new().unwrap();
let mut dump = BufReader::new(dump);
let gz = GzDecoder::new(&mut dump);
let mut archive = tar::Archive::new(gz);
archive.unpack(dir.path()).unwrap();
let mut dump = v1::V1Reader::open(dir).unwrap().to_v2();
// top level infos
assert_eq!(dump.date(), None);
// tasks
let tasks = dump.tasks().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"ad6245d98d1a8e30535f3339a9a8d223");
assert_eq!(update_files.len(), 9);
assert!(update_files[..].iter().all(|u| u.is_none())); // no update file in dumps v1
// indexes
let mut indexes = dump.indexes().unwrap().collect::<Result<Vec<_>>>().unwrap();
// the indexes are not ordered in any way by default
indexes.sort_by_key(|index| index.metadata().uid.to_string());
let mut products = indexes.pop().unwrap();
let mut movies = indexes.pop().unwrap();
let mut spells = indexes.pop().unwrap();
assert!(indexes.is_empty());
// products
insta::assert_json_snapshot!(products.metadata(), @r###"
{
"uid": "products",
"primaryKey": "sku",
"createdAt": "2022-10-02T13:23:39.976870431Z",
"updatedAt": "2022-10-02T13:27:54.353262482Z"
}
"###);
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
// movies
insta::assert_json_snapshot!(movies.metadata(), @r###"
{
"uid": "movies",
"primaryKey": "id",
"createdAt": "2022-10-02T13:15:29.477512777Z",
"updatedAt": "2022-10-02T13:21:12.671204856Z"
}
"###);
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b63dbed5bbc059f3e32bc471ae699bf5");
// spells
insta::assert_json_snapshot!(spells.metadata(), @r###"
{
"uid": "dnd_spells",
"primaryKey": "index",
"createdAt": "2022-10-02T13:38:26.358882984Z",
"updatedAt": "2022-10-02T13:38:26.385609433Z"
}
"###);
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"aa24c0cfc733d66c396237ad44263bed");
}
}

View File

@@ -4,22 +4,28 @@ use std::str::FromStr;
use time::OffsetDateTime;
use uuid::Uuid;
use super::v1_to_v2::{CompatIndexV1ToV2, CompatV1ToV2};
use super::v3_to_v4::CompatV3ToV4;
use crate::reader::{v2, v3, Document};
use crate::Result;
pub struct CompatV2ToV3 {
pub from: v2::V2Reader,
pub enum CompatV2ToV3 {
V2(v2::V2Reader),
Compat(CompatV1ToV2),
}
impl CompatV2ToV3 {
pub fn new(v2: v2::V2Reader) -> CompatV2ToV3 {
CompatV2ToV3 { from: v2 }
CompatV2ToV3::V2(v2)
}
pub fn index_uuid(&self) -> Vec<v3::meta::IndexUuid> {
self.from
.index_uuid()
let v2_uuids = match self {
CompatV2ToV3::V2(from) => from.index_uuid(),
CompatV2ToV3::Compat(compat) => compat.index_uuid(),
};
v2_uuids
.into_iter()
.into_iter()
.map(|index| v3::meta::IndexUuid { uid: index.uid, uuid: index.uuid })
.collect()
@@ -30,11 +36,17 @@ impl CompatV2ToV3 {
}
pub fn version(&self) -> crate::Version {
self.from.version()
match self {
CompatV2ToV3::V2(from) => from.version(),
CompatV2ToV3::Compat(compat) => compat.version(),
}
}
pub fn date(&self) -> Option<time::OffsetDateTime> {
self.from.date()
match self {
CompatV2ToV3::V2(from) => from.date(),
CompatV2ToV3::Compat(compat) => compat.date(),
}
}
pub fn instance_uid(&self) -> Result<Option<uuid::Uuid>> {
@@ -42,10 +54,18 @@ impl CompatV2ToV3 {
}
pub fn indexes(&self) -> Result<impl Iterator<Item = Result<CompatIndexV2ToV3>> + '_> {
Ok(self.from.indexes()?.map(|index_reader| -> Result<_> {
let compat = CompatIndexV2ToV3::new(index_reader?);
Ok(compat)
}))
Ok(match self {
CompatV2ToV3::V2(from) => Box::new(from.indexes()?.map(|index_reader| -> Result<_> {
let compat = CompatIndexV2ToV3::new(index_reader?);
Ok(compat)
}))
as Box<dyn Iterator<Item = Result<CompatIndexV2ToV3>> + '_>,
CompatV2ToV3::Compat(compat) => Box::new(compat.indexes()?.map(|index_reader| {
let compat = CompatIndexV2ToV3::Compat(Box::new(index_reader?));
Ok(compat)
}))
as Box<dyn Iterator<Item = Result<CompatIndexV2ToV3>> + '_>,
})
}
pub fn tasks(
@@ -54,11 +74,13 @@ impl CompatV2ToV3 {
dyn Iterator<Item = Result<(v3::Task, Option<Box<dyn Iterator<Item = Result<Document>>>>)>>
+ '_,
> {
let _indexes = self.from.index_uuid.clone();
let tasks = match self {
CompatV2ToV3::V2(from) => from.tasks(),
CompatV2ToV3::Compat(compat) => compat.tasks(),
};
Box::new(
self.from
.tasks()
tasks
.map(move |task| {
task.map(|(task, content_file)| {
let task = v3::Task { uuid: task.uuid, update: task.update.into() };
@@ -76,27 +98,38 @@ impl CompatV2ToV3 {
}
}
pub struct CompatIndexV2ToV3 {
from: v2::V2IndexReader,
pub enum CompatIndexV2ToV3 {
V2(v2::V2IndexReader),
Compat(Box<CompatIndexV1ToV2>),
}
impl CompatIndexV2ToV3 {
pub fn new(v2: v2::V2IndexReader) -> CompatIndexV2ToV3 {
CompatIndexV2ToV3 { from: v2 }
CompatIndexV2ToV3::V2(v2)
}
pub fn metadata(&self) -> &crate::IndexMetadata {
self.from.metadata()
match self {
CompatIndexV2ToV3::V2(from) => from.metadata(),
CompatIndexV2ToV3::Compat(compat) => compat.metadata(),
}
}
pub fn documents(&mut self) -> Result<Box<dyn Iterator<Item = Result<Document>> + '_>> {
self.from
.documents()
.map(|iter| Box::new(iter) as Box<dyn Iterator<Item = Result<Document>> + '_>)
match self {
CompatIndexV2ToV3::V2(from) => from
.documents()
.map(|iter| Box::new(iter) as Box<dyn Iterator<Item = Result<Document>> + '_>),
CompatIndexV2ToV3::Compat(compat) => compat.documents(),
}
}
pub fn settings(&mut self) -> Result<v3::Settings<v3::Checked>> {
Ok(v3::Settings::<v3::Unchecked>::from(self.from.settings()?).check())
let settings = match self {
CompatIndexV2ToV3::V2(from) => from.settings()?,
CompatIndexV2ToV3::Compat(compat) => compat.settings()?,
};
Ok(v3::Settings::<v3::Unchecked>::from(settings).check())
}
}
@@ -381,7 +414,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn compat_v2_v3() {
let dump = File::open("tests/assets/v2.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -427,7 +459,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"54b3d7a0d96de35427d867fa17164a99");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"548284a84de510f71e88e6cdea495cf5");
@@ -442,7 +474,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"ae7c5ade2243a553152dab2f354e9095");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d153b5a81d8b3cdcbe1dec270b574022");
@@ -457,7 +489,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies2.settings()), @"1be82b894556d23953af557b6a328a58");
insta::assert_json_snapshot!(movies2.settings().unwrap());
let documents = movies2.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 0);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d751713988987e9331980363e24189ce");
@@ -472,7 +504,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"1be82b894556d23953af557b6a328a58");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");

View File

@@ -347,7 +347,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn compat_v3_v4() {
let dump = File::open("tests/assets/v3.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -397,7 +396,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"d3402aff19b90acea9e9a07c466690aa");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"548284a84de510f71e88e6cdea495cf5");
@@ -412,7 +411,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"687aaab250f01b55d57bc69aa313b581");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d153b5a81d8b3cdcbe1dec270b574022");
@@ -427,7 +426,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies2.settings()), @"cd9fedbd7e3492831a94da62c90013ea");
insta::assert_json_snapshot!(movies2.settings().unwrap());
let documents = movies2.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 0);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d751713988987e9331980363e24189ce");
@@ -442,7 +441,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"cd9fedbd7e3492831a94da62c90013ea");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");

View File

@@ -383,7 +383,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn compat_v4_v5() {
let dump = File::open("tests/assets/v4.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -430,7 +429,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"26947283836ee4cdf0974f82efcc5332");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
@@ -445,7 +444,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"156871410d17e23803d0c90ddc6a66cb");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"786022a66ecb992c8a2a60fee070a5ab");
@@ -460,7 +459,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"69c9916142612cf4a2da9b9ed9455e9e");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");

View File

@@ -119,11 +119,10 @@ impl CompatV5ToV6 {
allow_index_creation,
settings: Box::new(settings.into()),
},
v5::tasks::TaskContent::Dump { uid } => v6::Kind::DumpCreation {
dump_uid: uid,
keys: keys.clone(),
instance_uid,
},
v5::tasks::TaskContent::Dump { uid: _ } => {
// in v6 we compute the dump_uid from the started_at processing time
v6::Kind::DumpCreation { keys: keys.clone(), instance_uid }
}
},
canceled_by: None,
details: task_view.details.map(|details| match details {
@@ -143,13 +142,15 @@ impl CompatV5ToV6 {
received_document_ids,
deleted_documents,
} => v6::Details::DocumentDeletion {
matched_documents: received_document_ids,
provided_ids: received_document_ids,
deleted_documents,
},
v5::Details::ClearAll { deleted_documents } => {
v6::Details::ClearAll { deleted_documents }
}
v5::Details::Dump { dump_uid } => v6::Details::Dump { dump_uid },
v5::Details::Dump { dump_uid } => {
v6::Details::Dump { dump_uid: Some(dump_uid) }
}
}),
error: task_view.error.map(|e| e.into()),
enqueued_at: task_view.enqueued_at,
@@ -401,7 +402,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn compat_v5_v6() {
let dump = File::open("tests/assets/v5.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -419,7 +419,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"42d4200cf6d92a6449989ca48cd8e28a");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"6519f7064c45d2196dd59b71350a9bf5");
assert_eq!(update_files.len(), 22);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_some()); // the enqueued document addition
@@ -449,7 +449,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"8e5cadabf74aebe1160bf51c3d489efe");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
@@ -464,7 +464,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"4894ac1e74b9e1069ed5ee262b7a1aca");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 200);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"e962baafd2fbae4cdd14e876053b0c5a");
@@ -479,7 +479,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"054dbf08a79e08bb9becba6f5d090f13");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
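Several hunks above drop `dump_uid` from the task payload because, as the comment in the v5 to v6 compat layer notes, v6 derives the dump uid from the task's `started_at` processing time. A minimal sketch of deriving such an identifier from a timestamp with the `time` crate; the exact layout Meilisearch uses is not shown in this diff, so the format below is an assumption:

```rust
// Illustration only: assumes the `time` crate with the "formatting" and "macros"
// features enabled. The real uid layout is not shown in this compare.
use time::macros::format_description;
use time::OffsetDateTime;

fn dump_uid_from(started_at: OffsetDateTime) -> String {
    // Hypothetical layout: date, dash, time with millisecond precision.
    let format =
        format_description!("[year][month][day]-[hour][minute][second][subsecond digits:3]");
    started_at.format(&format).expect("formatting a plain date-time layout cannot fail")
}

fn main() {
    println!("dump uid: {}", dump_uid_from(OffsetDateTime::now_utc()));
}
```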

View File

@@ -1,42 +0,0 @@
use meilisearch_auth::error::AuthControllerError;
use meilisearch_types::error::{Code, ErrorCode};
use meilisearch_types::internal_error;
use crate::{index_resolver::error::IndexResolverError, tasks::error::TaskError};
pub type Result<T> = std::result::Result<T, DumpError>;
#[derive(thiserror::Error, Debug)]
pub enum DumpError {
#[error("An internal error has occurred. `{0}`.")]
Internal(Box<dyn std::error::Error + Send + Sync + 'static>),
#[error("{0}")]
IndexResolver(Box<IndexResolverError>),
}
internal_error!(
DumpError: milli::heed::Error,
std::io::Error,
tokio::task::JoinError,
tokio::sync::oneshot::error::RecvError,
serde_json::error::Error,
tempfile::PersistError,
fs_extra::error::Error,
AuthControllerError,
TaskError
);
impl From<IndexResolverError> for DumpError {
fn from(e: IndexResolverError) -> Self {
Self::IndexResolver(Box::new(e))
}
}
impl ErrorCode for DumpError {
fn error_code(&self) -> Code {
match self {
DumpError::Internal(_) => Code::Internal,
DumpError::IndexResolver(e) => e.error_code(),
}
}
}

View File

@@ -9,11 +9,11 @@ use self::compat::v4_to_v5::CompatV4ToV5;
use self::compat::v5_to_v6::{CompatIndexV5ToV6, CompatV5ToV6};
use self::v5::V5Reader;
use self::v6::{V6IndexReader, V6Reader};
use crate::{Error, Result, Version};
use crate::{Result, Version};
mod compat;
// pub(self) mod v1;
pub(self) mod v1;
pub(self) mod v2;
pub(self) mod v3;
pub(self) mod v4;
@@ -45,8 +45,9 @@ impl DumpReader {
let MetadataVersion { dump_version } = serde_json::from_reader(&mut meta_file)?;
match dump_version {
// Version::V1 => Ok(Box::new(v1::Reader::open(path)?)),
Version::V1 => Err(Error::DumpV1Unsupported),
Version::V1 => {
Ok(v1::V1Reader::open(path)?.to_v2().to_v3().to_v4().to_v5().to_v6().into())
}
Version::V2 => Ok(v2::V2Reader::open(path)?.to_v3().to_v4().to_v5().to_v6().into()),
Version::V3 => Ok(v3::V3Reader::open(path)?.to_v4().to_v5().to_v6().into()),
Version::V4 => Ok(v4::V4Reader::open(path)?.to_v5().to_v6().into()),
@@ -189,7 +190,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn import_dump_v5() {
let dump = File::open("tests/assets/v5.dump").unwrap();
let mut dump = DumpReader::open(dump).unwrap();
@@ -201,7 +201,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"42d4200cf6d92a6449989ca48cd8e28a");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"6519f7064c45d2196dd59b71350a9bf5");
assert_eq!(update_files.len(), 22);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_some()); // the enqueued document addition
@@ -231,7 +231,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"8e5cadabf74aebe1160bf51c3d489efe");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
@@ -246,7 +246,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"4894ac1e74b9e1069ed5ee262b7a1aca");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 200);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"e962baafd2fbae4cdd14e876053b0c5a");
@@ -261,14 +261,13 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"054dbf08a79e08bb9becba6f5d090f13");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
}
#[test]
#[ignore]
fn import_dump_v4() {
let dump = File::open("tests/assets/v4.dump").unwrap();
let mut dump = DumpReader::open(dump).unwrap();
@@ -300,53 +299,52 @@ pub(crate) mod test {
assert!(indexes.is_empty());
// products
insta::assert_json_snapshot!(products.metadata(), { ".createdAt" => "[now]", ".updatedAt" => "[now]" }, @r###"
insta::assert_json_snapshot!(products.metadata(), @r###"
{
"uid": "products",
"primaryKey": "sku",
"createdAt": "[now]",
"updatedAt": "[now]"
"createdAt": "2022-10-06T12:53:39.360187055Z",
"updatedAt": "2022-10-06T12:53:40.603035979Z"
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"1f9da51a4518166fb440def5437eafdb");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
// movies
insta::assert_json_snapshot!(movies.metadata(), { ".createdAt" => "[now]", ".updatedAt" => "[now]" }, @r###"
insta::assert_json_snapshot!(movies.metadata(), @r###"
{
"uid": "movies",
"primaryKey": "id",
"createdAt": "[now]",
"updatedAt": "[now]"
"createdAt": "2022-10-06T12:53:38.710611568Z",
"updatedAt": "2022-10-06T12:53:49.785862546Z"
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"488816aba82c1bd65f1609630055c611");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"786022a66ecb992c8a2a60fee070a5ab");
// spells
insta::assert_json_snapshot!(spells.metadata(), { ".createdAt" => "[now]", ".updatedAt" => "[now]" }, @r###"
insta::assert_json_snapshot!(spells.metadata(), @r###"
{
"uid": "dnd_spells",
"primaryKey": "index",
"createdAt": "[now]",
"updatedAt": "[now]"
"createdAt": "2022-10-06T12:53:40.831649057Z",
"updatedAt": "2022-10-06T12:53:41.116036186Z"
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"7b4f66dad597dc651650f35fe34be27f");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
}
#[test]
#[ignore]
fn import_dump_v3() {
let dump = File::open("tests/assets/v3.dump").unwrap();
let mut dump = DumpReader::open(dump).unwrap();
@@ -388,7 +386,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"855f3165dec609b919171ff83f82b364");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"548284a84de510f71e88e6cdea495cf5");
@@ -403,7 +401,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"43e0bf1746c3ea1d64c1e10ea544c190");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d153b5a81d8b3cdcbe1dec270b574022");
@@ -418,7 +416,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies2.settings()), @"5fd06a5038f49311600379d43412b655");
insta::assert_json_snapshot!(movies2.settings().unwrap());
let documents = movies2.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 0);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d751713988987e9331980363e24189ce");
@@ -433,14 +431,13 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"5fd06a5038f49311600379d43412b655");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
}
#[test]
#[ignore]
fn import_dump_v2() {
let dump = File::open("tests/assets/v2.dump").unwrap();
let mut dump = DumpReader::open(dump).unwrap();
@@ -482,7 +479,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"b15b71f56dd082d8e8ec5182e688bf36");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"548284a84de510f71e88e6cdea495cf5");
@@ -497,7 +494,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"5389153ddf5527fa79c54b6a6e9c21f6");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d153b5a81d8b3cdcbe1dec270b574022");
@@ -512,7 +509,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies2.settings()), @"8aebab01301d266acf3e18dd449c008f");
insta::assert_json_snapshot!(movies2.settings().unwrap());
let documents = movies2.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 0);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d751713988987e9331980363e24189ce");
@@ -527,9 +524,86 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"8aebab01301d266acf3e18dd449c008f");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
}
#[test]
fn import_dump_v1() {
let dump = File::open("tests/assets/v1.dump").unwrap();
let mut dump = DumpReader::open(dump).unwrap();
// top level infos
assert_eq!(dump.date(), None);
assert_eq!(dump.instance_uid().unwrap(), None);
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"b3e3652bfc10a76670be157d2507d761");
assert_eq!(update_files.len(), 9);
assert!(update_files[..].iter().all(|u| u.is_none())); // no update file in dump v1
// keys
let keys = dump.keys().unwrap().collect::<Result<Vec<_>>>().unwrap();
meili_snap::snapshot!(meili_snap::json_string!(keys), @"[]");
meili_snap::snapshot_hash!(meili_snap::json_string!(keys), @"d751713988987e9331980363e24189ce");
// indexes
let mut indexes = dump.indexes().unwrap().collect::<Result<Vec<_>>>().unwrap();
// the indexes are not ordered in any way by default
indexes.sort_by_key(|index| index.metadata().uid.to_string());
let mut products = indexes.pop().unwrap();
let mut movies = indexes.pop().unwrap();
let mut spells = indexes.pop().unwrap();
assert!(indexes.is_empty());
// products
insta::assert_json_snapshot!(products.metadata(), @r###"
{
"uid": "products",
"primaryKey": "sku",
"createdAt": "2022-10-02T13:23:39.976870431Z",
"updatedAt": "2022-10-02T13:27:54.353262482Z"
}
"###);
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
// movies
insta::assert_json_snapshot!(movies.metadata(), @r###"
{
"uid": "movies",
"primaryKey": "id",
"createdAt": "2022-10-02T13:15:29.477512777Z",
"updatedAt": "2022-10-02T13:21:12.671204856Z"
}
"###);
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b63dbed5bbc059f3e32bc471ae699bf5");
// spells
insta::assert_json_snapshot!(spells.metadata(), @r###"
{
"uid": "dnd_spells",
"primaryKey": "index",
"createdAt": "2022-10-02T13:38:26.358882984Z",
"updatedAt": "2022-10-02T13:38:26.385609433Z"
}
"###);
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"aa24c0cfc733d66c396237ad44263bed");
}
}

View File

@@ -0,0 +1,27 @@
---
source: dump/src/reader/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,37 @@
---
source: dump/src/reader/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,37 @@
---
source: dump/src/reader/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,27 @@
---
source: dump/src/reader/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/mod.rs
expression: movies2.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,37 @@
---
source: dump/src/reader/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,24 @@
---
source: dump/src/reader/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,25 @@
---
source: dump/src/reader/mod.rs
expression: movies2.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,25 @@
---
source: dump/src/reader/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,39 @@
---
source: dump/src/reader/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,31 @@
---
source: dump/src/reader/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,34 @@
---
source: dump/src/reader/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
}
}

View File

@@ -0,0 +1,48 @@
---
source: dump/src/reader/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
}
}

View File

@@ -0,0 +1,40 @@
---
source: dump/src/reader/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
}
}

View File

@@ -0,0 +1,40 @@
---
source: dump/src/reader/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
},
"faceting": {
"maxValuesPerFacet": 100
},
"pagination": {
"maxTotalHits": 1000
}
}

View File

@@ -0,0 +1,54 @@
---
source: dump/src/reader/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
},
"faceting": {
"maxValuesPerFacet": 100
},
"pagination": {
"maxTotalHits": 1000
}
}

View File

@@ -0,0 +1,46 @@
---
source: dump/src/reader/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
},
"faceting": {
"maxValuesPerFacet": 100
},
"pagination": {
"maxTotalHits": 1000
}
}

View File

@@ -1,173 +1,262 @@
use std::{
convert::Infallible,
fs::{self, File},
io::{BufRead, BufReader},
path::Path,
};
use std::fs::{self, File};
use std::io::{BufRead, BufReader};
use std::path::{Path, PathBuf};
use serde::Deserialize;
use tempfile::TempDir;
use time::OffsetDateTime;
use self::update::UpdateStatus;
use super::{DumpReader, IndexReader};
use crate::{Error, Result, Version};
use super::compat::v1_to_v2::CompatV1ToV2;
use super::Document;
use crate::{IndexMetadata, Result, Version};
pub mod settings;
pub mod update;
pub mod v1;
pub struct V1Reader {
dump: TempDir,
metadata: v1::Metadata,
indexes: Vec<V1IndexReader>,
pub dump: TempDir,
pub db_version: String,
pub dump_version: crate::Version,
indexes: Vec<V1Index>,
}
struct V1IndexReader {
name: String,
pub struct IndexUuid {
pub name: String,
pub uid: String,
}
pub type Task = self::update::UpdateStatus;
struct V1Index {
metadata: IndexMetadataV1,
path: PathBuf,
}
impl V1Index {
pub fn new(path: PathBuf, metadata: Index) -> Self {
Self { metadata: metadata.into(), path }
}
pub fn open(&self) -> Result<V1IndexReader> {
V1IndexReader::new(&self.path, self.metadata.clone())
}
pub fn metadata(&self) -> &IndexMetadata {
&self.metadata.metadata
}
}
pub struct V1IndexReader {
metadata: IndexMetadataV1,
documents: BufReader<File>,
settings: BufReader<File>,
updates: BufReader<File>,
current_update: Option<UpdateStatus>,
}
impl V1IndexReader {
pub fn new(name: String, path: &Path) -> Result<Self> {
let mut ret = V1IndexReader {
name,
pub fn new(path: &Path, metadata: IndexMetadataV1) -> Result<Self> {
Ok(V1IndexReader {
metadata,
documents: BufReader::new(File::open(path.join("documents.jsonl"))?),
settings: BufReader::new(File::open(path.join("settings.json"))?),
updates: BufReader::new(File::open(path.join("updates.jsonl"))?),
current_update: None,
};
ret.next_update();
Ok(ret)
})
}
pub fn next_update(&mut self) -> Result<Option<UpdateStatus>> {
let current_update = if let Some(line) = self.updates.lines().next() {
Some(serde_json::from_str(&line?)?)
} else {
None
};
pub fn metadata(&self) -> &IndexMetadata {
&self.metadata.metadata
}
Ok(std::mem::replace(&mut self.current_update, current_update))
pub fn documents(&mut self) -> Result<impl Iterator<Item = Result<Document>> + '_> {
Ok((&mut self.documents)
.lines()
.map(|line| -> Result<_> { Ok(serde_json::from_str(&line?)?) }))
}
pub fn settings(&mut self) -> Result<self::settings::Settings> {
Ok(serde_json::from_reader(&mut self.settings)?)
}
pub fn tasks(self) -> impl Iterator<Item = Result<Task>> {
self.updates.lines().map(|line| -> Result<_> { Ok(serde_json::from_str(&line?)?) })
}
}
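For orientation, here is a hypothetical caller of the new V1IndexReader API shown above (illustration only, not part of the diff; V1Index, Document and Task are the types defined in this file, and error handling is simplified):

// Hypothetical helper: read one index out of a v1 dump through the new reader.
fn read_one_index(v1_index: &V1Index) -> Result<()> {
    let mut index = v1_index.open()?;                      // yields a V1IndexReader
    let _settings = index.settings()?;                     // settings.json
    let _documents: Vec<Document> =
        index.documents()?.collect::<Result<Vec<_>>>()?;   // documents.jsonl
    let _tasks: Vec<Task> =
        index.tasks().collect::<Result<Vec<_>>>()?;        // updates.jsonl; consumes the reader
    Ok(())
}

Note that tasks() takes the reader by value, so it has to be the last call, as in the read_dump_v1 test later in this diff.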
impl V1Reader {
pub fn open(dump: TempDir) -> Result<Self> {
let mut meta_file = fs::read(dump.path().join("metadata.json"))?;
let metadata = serde_json::from_reader(&*meta_file)?;
let meta_file = fs::read(dump.path().join("metadata.json"))?;
let metadata: Metadata = serde_json::from_reader(&*meta_file)?;
let mut indexes = Vec::new();
let entries = fs::read_dir(dump.path())?;
for entry in entries {
let entry = entry?;
if entry.file_type()?.is_dir() {
indexes.push(V1IndexReader::new(
entry
.file_name()
.to_str()
.ok_or(Error::BadIndexName)?
.to_string(),
&entry.path(),
)?);
}
for index in metadata.indexes.into_iter() {
let index_path = dump.path().join(&index.uid);
indexes.push(V1Index::new(index_path, index));
}
Ok(V1Reader {
dump,
metadata,
indexes,
db_version: metadata.db_version,
dump_version: metadata.dump_version,
})
}
fn next_update(&mut self) -> Result<Option<UpdateStatus>> {
if let Some((idx, _)) = self
.indexes
pub fn to_v2(self) -> CompatV1ToV2 {
CompatV1ToV2 { from: self }
}
pub fn index_uuid(&self) -> Vec<IndexUuid> {
self.indexes
.iter()
.map(|index| index.current_update)
.enumerate()
.filter_map(|(idx, update)| update.map(|u| (idx, u)))
.min_by_key(|(_, update)| update.enqueued_at())
{
self.indexes[idx].next_update()
} else {
Ok(None)
.map(|index| IndexUuid {
name: index.metadata.name.to_owned(),
uid: index.metadata().uid.to_owned(),
})
.collect()
}
pub fn version(&self) -> Version {
Version::V1
}
pub fn date(&self) -> Option<OffsetDateTime> {
None
}
pub fn indexes(&self) -> Result<impl Iterator<Item = Result<V1IndexReader>> + '_> {
Ok(self.indexes.iter().map(|index| index.open()))
}
}
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Index {
pub name: String,
pub uid: String,
#[serde(with = "time::serde::rfc3339")]
created_at: OffsetDateTime,
#[serde(with = "time::serde::rfc3339")]
updated_at: OffsetDateTime,
pub primary_key: Option<String>,
}
#[derive(Clone)]
pub struct IndexMetadataV1 {
pub name: String,
pub metadata: crate::IndexMetadata,
}
impl From<Index> for IndexMetadataV1 {
fn from(index: Index) -> Self {
IndexMetadataV1 {
name: index.name,
metadata: crate::IndexMetadata {
uid: index.uid,
primary_key: index.primary_key,
created_at: index.created_at,
updated_at: index.updated_at,
},
}
}
}
impl IndexReader for &V1IndexReader {
type Document = serde_json::Map<String, serde_json::Value>;
type Settings = settings::Settings;
fn name(&self) -> &str {
todo!()
}
fn documents(&self) -> Result<Box<dyn Iterator<Item = Result<Self::Document>>>> {
todo!()
}
fn settings(&self) -> Result<Self::Settings> {
todo!()
}
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Metadata {
pub indexes: Vec<Index>,
pub db_version: String,
pub dump_version: crate::Version,
}
impl DumpReader for V1Reader {
type Document = serde_json::Map<String, serde_json::Value>;
type Settings = settings::Settings;
#[cfg(test)]
pub(crate) mod test {
use std::fs::File;
use std::io::BufReader;
type Task = update::UpdateStatus;
type UpdateFile = Infallible;
use flate2::bufread::GzDecoder;
use meili_snap::insta;
use tempfile::TempDir;
type Key = Infallible;
use super::*;
fn date(&self) -> Option<OffsetDateTime> {
None
}
#[test]
fn read_dump_v1() {
let dump = File::open("tests/assets/v1.dump").unwrap();
let dir = TempDir::new().unwrap();
let mut dump = BufReader::new(dump);
let gz = GzDecoder::new(&mut dump);
let mut archive = tar::Archive::new(gz);
archive.unpack(dir.path()).unwrap();
fn version(&self) -> Version {
Version::V1
}
let dump = V1Reader::open(dir).unwrap();
fn indexes(
&self,
) -> Result<
Box<
dyn Iterator<
Item = Result<
Box<
dyn super::IndexReader<
Document = Self::Document,
Settings = Self::Settings,
>,
>,
>,
>,
>,
> {
Ok(Box::new(self.indexes.iter().map(|index| {
let index = Box::new(index)
as Box<dyn IndexReader<Document = Self::Document, Settings = Self::Settings>>;
Ok(index)
})))
}
// top level infos
assert_eq!(dump.date(), None);
fn tasks(&self) -> Box<dyn Iterator<Item = Result<(Self::Task, Option<Self::UpdateFile>)>>> {
Box::new(std::iter::from_fn(|| {
self.next_update()
.transpose()
.map(|result| result.map(|task| (task, None)))
}))
}
// indexes
let mut indexes = dump.indexes().unwrap().collect::<Result<Vec<_>>>().unwrap();
fn keys(&self) -> Box<dyn Iterator<Item = Result<Self::Key>>> {
Box::new(std::iter::empty())
let mut products = indexes.pop().unwrap();
let mut movies = indexes.pop().unwrap();
let mut dnd_spells = indexes.pop().unwrap();
assert!(indexes.is_empty());
// products
insta::assert_json_snapshot!(products.metadata(), @r###"
{
"uid": "products",
"primaryKey": "sku",
"createdAt": "2022-10-02T13:23:39.976870431Z",
"updatedAt": "2022-10-02T13:27:54.353262482Z"
}
"###);
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
// products tasks
let tasks = products.tasks().collect::<Result<Vec<_>>>().unwrap();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"91de507f206ad21964584021932ba7a7");
// movies
insta::assert_json_snapshot!(movies.metadata(), @r###"
{
"uid": "movies",
"primaryKey": "id",
"createdAt": "2022-10-02T13:15:29.477512777Z",
"updatedAt": "2022-10-02T13:21:12.671204856Z"
}
"###);
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b63dbed5bbc059f3e32bc471ae699bf5");
// movies tasks
let tasks = movies.tasks().collect::<Result<Vec<_>>>().unwrap();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"55eef4de2bef7e84c5ce0bee47488f56");
// spells
insta::assert_json_snapshot!(dnd_spells.metadata(), @r###"
{
"uid": "dnd_spells",
"primaryKey": "index",
"createdAt": "2022-10-02T13:38:26.358882984Z",
"updatedAt": "2022-10-02T13:38:26.385609433Z"
}
"###);
insta::assert_json_snapshot!(dnd_spells.settings().unwrap());
let documents = dnd_spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"aa24c0cfc733d66c396237ad44263bed");
// spells tasks
let tasks = dnd_spells.tasks().collect::<Result<Vec<_>>>().unwrap();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"836dd7d64d5ad20ad901c44b1b161a4c");
}
}

View File

@@ -1,6 +1,9 @@
use std::collections::{BTreeMap, BTreeSet};
use std::result::Result as StdResult;
use std::str::FromStr;
use once_cell::sync::Lazy;
use regex::Regex;
use serde::{Deserialize, Deserializer, Serialize};
#[derive(Default, Clone, Serialize, Deserialize, Debug)]
@@ -53,6 +56,34 @@ pub enum RankingRule {
Desc(String),
}
static ASC_DESC_REGEX: Lazy<Regex> =
Lazy::new(|| Regex::new(r#"(asc|desc)\(([\w_-]+)\)"#).unwrap());
impl FromStr for RankingRule {
type Err = ();
fn from_str(s: &str) -> Result<Self, Self::Err> {
Ok(match s {
"typo" => Self::Typo,
"words" => Self::Words,
"proximity" => Self::Proximity,
"attribute" => Self::Attribute,
"wordsPosition" => Self::WordsPosition,
"exactness" => Self::Exactness,
text => {
let caps = ASC_DESC_REGEX.captures(text).ok_or(())?;
let order = caps.get(1).unwrap().as_str();
let field_name = caps.get(2).unwrap().as_str();
match order {
"asc" => Self::Asc(field_name.to_string()),
"desc" => Self::Desc(field_name.to_string()),
_ => return Err(()),
}
}
})
}
}
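A minimal sketch of the round-trip this FromStr enables (not part of the diff; it assumes the surrounding RankingRule enum and ASC_DESC_REGEX from this file):

#[test]
fn parses_v1_ranking_rules() {
    assert!(matches!("typo".parse::<RankingRule>(), Ok(RankingRule::Typo)));
    // "asc(release_date)" is captured by ASC_DESC_REGEX and split into an order and a field name.
    assert!(matches!(
        "asc(release_date)".parse::<RankingRule>(),
        Ok(RankingRule::Asc(field)) if field == "release_date"
    ));
    // Anything that matches neither a named rule nor the regex is rejected with Err(()).
    assert!("relevance".parse::<RankingRule>().is_err());
}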
// Any value that is present is considered Some value, including null.
fn deserialize_some<'de, T, D>(deserializer: D) -> StdResult<Option<T>, D::Error>
where
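The hunk cuts off here; for context, this helper typically has the well-known serde shape sketched below (an assumption, not necessarily the exact body in this file): wrap any present value, including null, in Some, so that a field declared as Option<Option<T>> with #[serde(default, deserialize_with = "deserialize_some")] can tell an absent key apart from an explicit null.

// Sketch of the usual implementation, using the StdResult, Deserialize and
// Deserializer imports already in scope in this file.
fn deserialize_some<'de, T, D>(deserializer: D) -> StdResult<Option<T>, D::Error>
where
    T: Deserialize<'de>,
    D: Deserializer<'de>,
{
    // Any value that is present is considered Some value, including null.
    T::deserialize(deserializer).map(Some)
}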

View File

@@ -0,0 +1,24 @@
---
source: dump/src/reader/v1/mod.rs
expression: dnd_spells.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": []
}

View File

@@ -0,0 +1,24 @@
---
source: dump/src/reader/v1/mod.rs
expression: dnd_spells.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": []
}

View File

@@ -0,0 +1,38 @@
---
source: dump/src/reader/v1/mod.rs
expression: products.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"attributesForFaceting": []
}

View File

@@ -0,0 +1,28 @@
---
source: dump/src/reader/v1/mod.rs
expression: movies.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness",
"asc(release_date)"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [
"id",
"genres"
]
}

View File

@@ -0,0 +1,28 @@
---
source: dump/src/reader/v1/mod.rs
expression: movies.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness",
"asc(release_date)"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [
"id",
"genres"
]
}

View File

@@ -0,0 +1,28 @@
---
source: dump/src/reader/v1/mod.rs
expression: movies.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness",
"asc(release_date)"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [
"id",
"genres"
]
}

View File

@@ -0,0 +1,24 @@
---
source: dump/src/reader/v1/mod.rs
expression: dnd_spells.settings().unwrap()
---
{
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [
"*"
],
"displayedAttributes": [
"*"
],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": []
}

View File

@@ -1,54 +1,8 @@
use serde::{Deserialize, Serialize};
use serde_json::Value;
use time::OffsetDateTime;
use super::settings::SettingsUpdate;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Update {
data: UpdateData,
#[serde(with = "time::serde::rfc3339")]
enqueued_at: OffsetDateTime,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum UpdateData {
ClearAll,
Customs(Vec<u8>),
// (primary key, documents)
DocumentsAddition {
primary_key: Option<String>,
documents: Vec<serde_json::Map<String, Value>>,
},
DocumentsPartial {
primary_key: Option<String>,
documents: Vec<serde_json::Map<String, Value>>,
},
DocumentsDeletion(Vec<String>),
Settings(Box<SettingsUpdate>),
}
impl UpdateData {
pub fn update_type(&self) -> UpdateType {
match self {
UpdateData::ClearAll => UpdateType::ClearAll,
UpdateData::Customs(_) => UpdateType::Customs,
UpdateData::DocumentsAddition { documents, .. } => UpdateType::DocumentsAddition {
number: documents.len(),
},
UpdateData::DocumentsPartial { documents, .. } => UpdateType::DocumentsPartial {
number: documents.len(),
},
UpdateData::DocumentsDeletion(deletion) => UpdateType::DocumentsDeletion {
number: deletion.len(),
},
UpdateData::Settings(update) => UpdateType::Settings {
settings: update.clone(),
},
}
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(tag = "name")]
pub enum UpdateType {

View File

@@ -1,22 +0,0 @@
use serde::Deserialize;
use time::OffsetDateTime;
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Index {
pub name: String,
pub uid: String,
#[serde(with = "time::serde::rfc3339")]
created_at: OffsetDateTime,
#[serde(with = "time::serde::rfc3339")]
updated_at: OffsetDateTime,
pub primary_key: Option<String>,
}
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Metadata {
indexes: Vec<Index>,
db_version: String,
dump_version: crate::Version,
}

View File

@@ -211,7 +211,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn read_dump_v2() {
let dump = File::open("tests/assets/v2.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -257,7 +256,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"c41bf7315d404da46c99b9e3a2a3cc1e");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"548284a84de510f71e88e6cdea495cf5");
@@ -272,7 +271,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"3d1d96c85b6bab46e957bc8d2532a910");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d153b5a81d8b3cdcbe1dec270b574022");
@@ -287,7 +286,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies2.settings()), @"4f04afc086828d8da0da57a7d598ddba");
insta::assert_json_snapshot!(movies2.settings().unwrap());
let documents = movies2.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 0);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d751713988987e9331980363e24189ce");
@@ -302,7 +301,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"4f04afc086828d8da0da57a7d598ddba");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");

View File

@@ -1,4 +1,5 @@
use std::collections::{BTreeMap, BTreeSet};
use std::fmt::Display;
use std::marker::PhantomData;
use std::str::FromStr;
@@ -174,3 +175,17 @@ impl FromStr for Criterion {
}
}
}
impl Display for Criterion {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Criterion::Words => write!(f, "words"),
Criterion::Typo => write!(f, "typo"),
Criterion::Proximity => write!(f, "proximity"),
Criterion::Attribute => write!(f, "attribute"),
Criterion::Exactness => write!(f, "exactness"),
Criterion::Asc(field_name) => write!(f, "asc({})", field_name),
Criterion::Desc(field_name) => write!(f, "desc({})", field_name),
}
}
}
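A small sketch of what this Display implementation produces (illustration only, assuming the surrounding Criterion enum); the output is the same textual form the FromStr impl above this hunk should accept back:

#[test]
fn criterion_display() {
    assert_eq!(Criterion::Words.to_string(), "words");
    assert_eq!(Criterion::Asc("release_date".to_string()).to_string(), "asc(release_date)");
}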

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/v2/mod.rs
expression: movies2.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,23 @@
---
source: dump/src/reader/v2/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,37 @@
---
source: dump/src/reader/v2/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,24 @@
---
source: dump/src/reader/v2/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"exactness",
"asc(release_date)"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -227,7 +227,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn read_dump_v3() {
let dump = File::open("tests/assets/v3.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -273,7 +272,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"f309b009608cc0b770b2f74516f92647");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"548284a84de510f71e88e6cdea495cf5");
@@ -288,7 +287,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"95dff22ba3a7019616c12df9daa35e1e");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d153b5a81d8b3cdcbe1dec270b574022");
@@ -303,7 +302,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies2.settings()), @"1dafc4b123e3a8e14a889719cc01f6e5");
insta::assert_json_snapshot!(movies2.settings().unwrap());
let documents = movies2.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 0);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"d751713988987e9331980363e24189ce");
@@ -318,7 +317,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"1dafc4b123e3a8e14a889719cc01f6e5");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");

View File

@@ -0,0 +1,25 @@
---
source: dump/src/reader/v3/mod.rs
expression: movies2.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,25 @@
---
source: dump/src/reader/v3/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -0,0 +1,39 @@
---
source: dump/src/reader/v3/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null
}

View File

@@ -0,0 +1,31 @@
---
source: dump/src/reader/v3/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null
}

View File

@@ -13,7 +13,7 @@ pub mod meta;
pub mod settings;
pub mod tasks;
use self::meta::{DumpMeta, IndexUuid};
use self::meta::{DumpMeta, IndexMeta, IndexUuid};
use super::compat::v4_to_v5::CompatV4ToV5;
use crate::{Error, IndexMetadata, Result, Version};
@@ -100,6 +100,10 @@ impl V4Reader {
V4IndexReader::new(
index.uid.clone(),
&self.dump.path().join("indexes").join(index.index_meta.uuid.to_string()),
&index.index_meta,
BufReader::new(
File::open(self.dump.path().join("updates").join("data.jsonl")).unwrap(),
),
)
}))
}
@@ -147,16 +151,44 @@ pub struct V4IndexReader {
}
impl V4IndexReader {
pub fn new(name: String, path: &Path) -> Result<Self> {
pub fn new(
name: String,
path: &Path,
index_metadata: &IndexMeta,
tasks: BufReader<File>,
) -> Result<Self> {
let meta = File::open(path.join("meta.json"))?;
let meta: DumpMeta = serde_json::from_reader(meta)?;
let mut created_at = None;
let mut updated_at = None;
for line in tasks.lines() {
let task: Task = serde_json::from_str(&line?)?;
if task.index_uid.to_string() == name {
// The first task to match our index_uid that succeeded (i.e. processed_at returns Some)
// is our `last_updated_at`.
if updated_at.is_none() {
updated_at = task.processed_at()
}
// Once we reach the `creation_task_id` we can stop iterating on the task queue and
// this task represents our `created_at`.
if task.id as usize == index_metadata.creation_task_id {
created_at = task.created_at();
break;
}
}
}
let current_time = OffsetDateTime::now_utc();
let metadata = IndexMetadata {
uid: name,
primary_key: meta.primary_key,
// FIXME: Iterate over the whole task queue to find the creation and last update date.
created_at: OffsetDateTime::now_utc(),
updated_at: OffsetDateTime::now_utc(),
created_at: created_at.unwrap_or(current_time),
updated_at: updated_at.unwrap_or(current_time),
};
let ret = V4IndexReader {
@@ -219,7 +251,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn read_dump_v4() {
let dump = File::open("tests/assets/v4.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -260,46 +291,46 @@ pub(crate) mod test {
assert!(indexes.is_empty());
// products
insta::assert_json_snapshot!(products.metadata(), { ".createdAt" => "[now]", ".updatedAt" => "[now]" }, @r###"
insta::assert_json_snapshot!(products.metadata(), @r###"
{
"uid": "products",
"primaryKey": "sku",
"createdAt": "[now]",
"updatedAt": "[now]"
"createdAt": "2022-10-06T12:53:39.360187055Z",
"updatedAt": "2022-10-06T12:53:40.603035979Z"
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"65b139c6b9fc251e187073c8557803e2");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
// movies
insta::assert_json_snapshot!(movies.metadata(), { ".createdAt" => "[now]", ".updatedAt" => "[now]" }, @r###"
insta::assert_json_snapshot!(movies.metadata(), @r###"
{
"uid": "movies",
"primaryKey": "id",
"createdAt": "[now]",
"updatedAt": "[now]"
"createdAt": "2022-10-06T12:53:38.710611568Z",
"updatedAt": "2022-10-06T12:53:49.785862546Z"
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"06aa1988493485d9b2cda7c751e6bb15");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 110);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"786022a66ecb992c8a2a60fee070a5ab");
// spells
insta::assert_json_snapshot!(spells.metadata(), { ".createdAt" => "[now]", ".updatedAt" => "[now]" }, @r###"
insta::assert_json_snapshot!(spells.metadata(), @r###"
{
"uid": "dnd_spells",
"primaryKey": "index",
"createdAt": "[now]",
"updatedAt": "[now]"
"createdAt": "2022-10-06T12:53:40.831649057Z",
"updatedAt": "2022-10-06T12:53:41.116036186Z"
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"7d722fc2629eaa45032ed3deb0c9b4ce");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");
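For reference, the { ".createdAt" => "[now]", ".updatedAt" => "[now]" } arguments being removed in this hunk are insta's inline redaction syntax: each matching field is replaced by the placeholder before the value is compared to the inline snapshot, which hid the non-deterministic timestamps the v4 reader used to produce. They are no longer needed now that V4IndexReader::new derives real createdAt/updatedAt from the task queue, so the snapshots can assert exact timestamps. The old pattern, taken from the removed lines above, looked like this:

// Redact the timestamps, then compare against the inline snapshot.
insta::assert_json_snapshot!(products.metadata(),
    { ".createdAt" => "[now]", ".updatedAt" => "[now]" },
    @r###"
{
  "uid": "products",
  "primaryKey": "sku",
  "createdAt": "[now]",
  "updatedAt": "[now]"
}
"###);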

View File

@@ -0,0 +1,40 @@
---
source: dump/src/reader/v4/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [
"genres",
"id"
],
"sortableAttributes": [
"release_date"
],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
}
}

View File

@@ -0,0 +1,34 @@
---
source: dump/src/reader/v4/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
}
}

View File

@@ -0,0 +1,48 @@
---
source: dump/src/reader/v4/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": [
"*"
],
"searchableAttributes": [
"*"
],
"filterableAttributes": [],
"sortableAttributes": [],
"rankingRules": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
],
"stopWords": [],
"synonyms": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
},
"distinctAttribute": null,
"typoTolerance": {
"enabled": true,
"minWordSizeForTypos": {
"oneTypo": 5,
"twoTypos": 9
},
"disableOnWords": [],
"disableOnAttributes": []
}
}

View File

@@ -104,6 +104,41 @@ impl Task {
})
}
pub fn processed_at(&self) -> Option<OffsetDateTime> {
match self.events.last() {
Some(TaskEvent::Succeded { result: _, timestamp }) => Some(*timestamp),
_ => None,
}
}
pub fn created_at(&self) -> Option<OffsetDateTime> {
match &self.content {
TaskContent::IndexCreation { primary_key: _ } => match self.events.first() {
Some(TaskEvent::Created(ts)) => Some(*ts),
_ => None,
},
TaskContent::DocumentAddition {
content_uuid: _,
merge_strategy: _,
primary_key: _,
documents_count: _,
allow_index_creation: _,
} => match self.events.first() {
Some(TaskEvent::Created(ts)) => Some(*ts),
_ => None,
},
TaskContent::SettingsUpdate {
settings: _,
is_deletion: _,
allow_index_creation: _,
} => match self.events.first() {
Some(TaskEvent::Created(ts)) => Some(*ts),
_ => None,
},
_ => None,
}
}
/// Return the content_uuid of the `Task` if there is one.
pub fn get_content_uuid(&self) -> Option<Uuid> {
match self {

View File

@@ -261,7 +261,6 @@ pub(crate) mod test {
use super::*;
#[test]
#[ignore]
fn read_dump_v5() {
let dump = File::open("tests/assets/v5.dump").unwrap();
let dir = TempDir::new().unwrap();
@@ -312,7 +311,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", products.settings()), @"b392b928dab63468318b2bdaad844c5a");
insta::assert_json_snapshot!(products.settings().unwrap());
let documents = products.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"b01c8371aea4c7171af0d4d846a2bdca");
@@ -327,7 +326,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", movies.settings()), @"2f881248b7c3623e2ba2885dbf0b2c18");
insta::assert_json_snapshot!(movies.settings().unwrap());
let documents = movies.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 200);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"e962baafd2fbae4cdd14e876053b0c5a");
@@ -342,7 +341,7 @@ pub(crate) mod test {
}
"###);
meili_snap::snapshot_hash!(format!("{:#?}", spells.settings()), @"ade154e63ab713de67919892917d3d9d");
insta::assert_json_snapshot!(spells.settings().unwrap());
let documents = spells.documents().unwrap().collect::<Result<Vec<_>>>().unwrap();
assert_eq!(documents.len(), 10);
meili_snap::snapshot_hash!(format!("{:#?}", documents), @"235016433dd04262c7f2da01d1e808ce");

View File

@@ -0,0 +1,74 @@
---
source: dump/src/reader/v5/mod.rs
expression: movies.settings().unwrap()
---
{
"displayedAttributes": "Reset",
"searchableAttributes": "Reset",
"filterableAttributes": {
"Set": [
"genres",
"id"
]
},
"sortableAttributes": {
"Set": [
"release_date"
]
},
"rankingRules": {
"Set": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness",
"release_date:asc"
]
},
"stopWords": {
"Set": []
},
"synonyms": {
"Set": {}
},
"distinctAttribute": "Reset",
"typoTolerance": {
"Set": {
"enabled": {
"Set": true
},
"minWordSizeForTypos": {
"Set": {
"oneTypo": {
"Set": 5
},
"twoTypos": {
"Set": 9
}
}
},
"disableOnWords": {
"Set": []
},
"disableOnAttributes": {
"Set": []
}
}
},
"faceting": {
"Set": {
"maxValuesPerFacet": {
"Set": 100
}
}
},
"pagination": {
"Set": {
"maxTotalHits": {
"Set": 1000
}
}
}
}

View File

@@ -0,0 +1,68 @@
---
source: dump/src/reader/v5/mod.rs
expression: spells.settings().unwrap()
---
{
"displayedAttributes": "Reset",
"searchableAttributes": "Reset",
"filterableAttributes": {
"Set": []
},
"sortableAttributes": {
"Set": []
},
"rankingRules": {
"Set": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
]
},
"stopWords": {
"Set": []
},
"synonyms": {
"Set": {}
},
"distinctAttribute": "Reset",
"typoTolerance": {
"Set": {
"enabled": {
"Set": true
},
"minWordSizeForTypos": {
"Set": {
"oneTypo": {
"Set": 5
},
"twoTypos": {
"Set": 9
}
}
},
"disableOnWords": {
"Set": []
},
"disableOnAttributes": {
"Set": []
}
}
},
"faceting": {
"Set": {
"maxValuesPerFacet": {
"Set": 100
}
}
},
"pagination": {
"Set": {
"maxTotalHits": {
"Set": 1000
}
}
}
}

View File

@@ -0,0 +1,82 @@
---
source: dump/src/reader/v5/mod.rs
expression: products.settings().unwrap()
---
{
"displayedAttributes": "Reset",
"searchableAttributes": "Reset",
"filterableAttributes": {
"Set": []
},
"sortableAttributes": {
"Set": []
},
"rankingRules": {
"Set": [
"words",
"typo",
"proximity",
"attribute",
"sort",
"exactness"
]
},
"stopWords": {
"Set": []
},
"synonyms": {
"Set": {
"android": [
"phone",
"smartphone"
],
"iphone": [
"phone",
"smartphone"
],
"phone": [
"android",
"iphone",
"smartphone"
]
}
},
"distinctAttribute": "Reset",
"typoTolerance": {
"Set": {
"enabled": {
"Set": true
},
"minWordSizeForTypos": {
"Set": {
"oneTypo": {
"Set": 5
},
"twoTypos": {
"Set": 9
}
}
},
"disableOnWords": {
"Set": []
},
"disableOnAttributes": {
"Set": []
}
}
},
"faceting": {
"Set": {
"maxValuesPerFacet": {
"Set": 100
}
}
},
"pagination": {
"Set": {
"maxTotalHits": {
"Set": 1000
}
}
}
}

Some files were not shown because too many files have changed in this diff.