Compare commits

...

186 Commits

Author SHA1 Message Date
ba7500998e Fix getting a single batch through the GET route 2024-11-21 17:59:31 +01:00
9a08757a70 Merge #5070
Some checks failed
Test suite / Tests on ubuntu-20.04 (push) Failing after 12s
Test suite / Tests on ${{ matrix.os }} (windows-2022) (push) Failing after 11s
Test suite / Tests almost all features (push) Has been skipped
Test suite / Test disabled tokenization (push) Has been skipped
Test suite / Run tests in debug (push) Failing after 10s
Test suite / Run Clippy (push) Successful in 6m18s
Test suite / Run Rustfmt (push) Successful in 1m34s
Indexing bench (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of indexing (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for geo (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for songs (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for Wikipedia articles (push) / Run and upload benchmarks (push) Waiting to run
Run the indexing fuzzer / Setup the action (push) Successful in 1h4m33s
Test suite / Tests on ${{ matrix.os }} (macos-13) (push) Has been cancelled
5070: Improve the details and stats of the current batch processing r=Kerollmops a=irevoire

A small improvement we missed in https://github.com/meilisearch/meilisearch/pull/5060

The current batch processing had empty details and stats.

Co-authored-by: Tamo <tamo@meilisearch.com>
2024-11-20 16:56:01 +00:00
1e694ae432 improve the count of the number of tasks in a batch 2024-11-20 17:48:26 +01:00
71807cac6d makes clippy happy 2024-11-20 17:40:58 +01:00
21a2264782 improve the details and stats of the current batch processing 2024-11-20 17:25:55 +01:00
d4d8becfa7 Merge #5060
Some checks failed
Indexing bench (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of indexing (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for geo (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for songs (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for Wikipedia articles (push) / Run and upload benchmarks (push) Waiting to run
Publish binaries to GitHub release / Check the version validity (push) Successful in 11s
Publish binaries to GitHub release / Publish binary for ${{ matrix.os }} (meilisearch, meilisearch-macos-amd64, macos-13) (push) Waiting to run
Publish binaries to GitHub release / Publish binary for macOS silicon (meilisearch-macos-apple-silicon, aarch64-apple-darwin) (push) Waiting to run
Publish binaries to GitHub release / Publish binary for ${{ matrix.os }} (meilisearch.exe, meilisearch-windows-amd64.exe, windows-2022) (push) Failing after 21s
Publish binaries to GitHub release / Publish binary for Linux (push) Failing after 12s
Publish binaries to GitHub release / Publish binary for aarch64 (meilisearch-linux-aarch64, aarch64-unknown-linux-gnu) (push) Failing after 10s
Run the indexing fuzzer / Setup the action (push) Successful in 1h5m1s
Test suite / Tests on ubuntu-20.04 (push) Failing after 12s
Test suite / Tests on ${{ matrix.os }} (macos-13) (push) Waiting to run
Test suite / Tests almost all features (push) Failing after 9s
Test suite / Test disabled tokenization (push) Failing after 8s
Test suite / Run tests in debug (push) Failing after 10s
Test suite / Tests on ${{ matrix.os }} (windows-2022) (push) Failing after 40s
Test suite / Run Rustfmt (push) Successful in 1m28s
Test suite / Run Clippy (push) Successful in 5m29s
5060: Batch route r=Kerollmops a=irevoire

# Pull Request

See [usage](https://www.notion.so/meilisearch/Enhance-visibility-on-batched-tasks-1194b06b651f810b8fe0fab5d72846a8).

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/4977

## What does this PR do?
- For more detailed information, see the PRD.
- Added a `batchUid` to the tasks (that's the reason for all the dump updates):
  - For all enqueued tasks, it's set to `None`
  - For every other task, it must be set to something
  - ⚠️ For all the tasks imported in a dump, the `batchUid` will be set to `None` as well.
- Add two new routes (see the request sketch below):
  - `GET /batches/:uid` - to query a batch by its id
  - `GET /batches` - to retrieve a list of batches. It accepts all the same query parameters that are available on the `GET /tasks` route
- Add new databases to query the batches directly:
  - When doing a query against the batches, the rule of thumb is that we want to return a batch iff **at least one** task in it matches the provided filter.
- We don't need a `canceledBy` batch-specific database because we can just retrieve the task and, if it's a `taskCancelation`, retrieve its `batchUid`
- The task cancelation has been updated and simplified a bit:
  - Instead of updating the matching tasks on disk while processing the cancelation task, we retrieve the task and let the `tick` function do the work afterward.
  - In the `tick` function, we now have to take care of not missing any tasks
- All the tests applied to the tasks were duplicated and updated to work with the new batches routes
- The deletion of batches doesn't contain any tests because it's already tested in the deletion of tasks (and especially highlighted in the snapshots)
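
A minimal request sketch of the two new routes, assuming a local instance on the default port; the `reqwest`/`serde_json` client code and the chosen filters are illustrative only and not part of this PR.

```rust
// Hypothetical usage sketch of the new batches routes; the instance URL,
// the client code and the chosen filters are assumptions for illustration.
use reqwest::blocking::Client;

fn main() -> Result<(), reqwest::Error> {
    let client = Client::new();

    // List batches: a batch is returned iff at least one of its tasks
    // matches the filters, which are the same ones as on `GET /tasks`.
    let batches: serde_json::Value = client
        .get("http://localhost:7700/batches?statuses=succeeded&limit=10")
        .send()?
        .json()?;
    println!("{batches:#}");

    // Fetch a single batch by its uid.
    let batch: serde_json::Value = client
        .get("http://localhost:7700/batches/0")
        .send()?
        .json()?;
    println!("{batch:#}");

    Ok(())
}
```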


Currently, one part of the PRD is not implemented: the progress.

Co-authored-by: Tamo <tamo@meilisearch.com>
2024-11-20 14:07:48 +00:00
ec06879d28 apply review changes 2024-11-20 14:40:36 +01:00
83d1f858c1 Update crates/index-scheduler/src/lib.rs
Co-authored-by: Clément Renault <clement@meilisearch.com>
2024-11-20 14:36:05 +01:00
a7ac590e9e implements the reverse query parameter for the batches 2024-11-20 13:29:52 +01:00
8ad68dd708 stop leaking the update files of the canceled tasks 2024-11-20 13:17:54 +01:00
7e379b3d14 remove useless prints 2024-11-20 12:27:12 +01:00
56eacd221f update the tests after the rebase 2024-11-20 10:54:38 +01:00
bdb51a85fe now that the task cancelation shares their started at with all the tasks of their batch we don't need the trick of retrieving the previous batch anymore 2024-11-20 10:51:07 +01:00
b24a34830d fix the dump test -> the only change is that we now have a null batch_uid in all the tasks 2024-11-20 10:51:06 +01:00
e145d71a62 implements the two last TODOs 2024-11-20 10:51:06 +01:00
d9a4e69990 push a missing snapshot 2024-11-20 10:51:06 +01:00
b906e3ed70 improve the way we access the mutex 2024-11-20 10:51:06 +01:00
4abcd9c04e add some stats on the batches 2024-11-20 10:51:06 +01:00
229fa0f902 implements the batch details 2024-11-20 10:51:06 +01:00
5d10c2312b remove unused file 2024-11-20 10:51:06 +01:00
f1d38581e5 add the front end tests on the batches routes 2024-11-20 10:51:06 +01:00
62646af7b9 implements the automatic batch deletion 2024-11-20 10:51:06 +01:00
1fcb9526f5 fix the task cancelation 2024-11-20 10:51:06 +01:00
15eefa4fcc fixes a lot of small issues, the test about the cancellation is still failing 2024-11-20 10:51:05 +01:00
ad9763ffcd copy multiple task query tests to batches. Currently, they fail 2024-11-20 10:49:25 +01:00
d489f5635f add the mapping between the task and batches 2024-11-20 10:49:23 +01:00
a1251c3c83 Implements the get all batches route with filters working 2024-11-20 10:42:55 +01:00
6062914654 add the batch_id to the tasks 2024-11-20 10:42:54 +01:00
057fcb3993 Add indices field to _matchesPosition to specify where in an array a match comes from (#5005)
Some checks are pending
Indexing bench (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of indexing (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for geo (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for songs (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for Wikipedia articles (push) / Run and upload benchmarks (push) Waiting to run
Run the indexing fuzzer / Setup the action (push) Successful in 1h4m31s
* Remove unreachable code

* Add `indices` field to `MatchBounds`

For matches inside arrays, this field holds the indices of the array
elements that matched. For example, searching for `cat` inside
`{ "a": ["dog", "cat", "fox"] }` would return `indices: [1]`. For nested
arrays, this contains multiple indices, starting with the one for the
top-most array. For matches in fields without arrays, `indices` is not
serialized (does not exist) to save space.
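
A hedged sketch of the serialized shape this describes, assuming a serde-based type; apart from `MatchBounds` and `indices`, the names are illustrative rather than the actual milli definitions.

```rust
// Sketch only: `start`/`length` mirror the usual match-bounds shape, and
// `indices` is skipped when absent so non-array matches pay no cost.
use serde::Serialize;

#[derive(Serialize)]
struct MatchBounds {
    start: usize,
    length: usize,
    // Indices of the matching array elements, outermost array first;
    // `None` (and therefore not serialized) for matches outside arrays.
    #[serde(skip_serializing_if = "Option::is_none")]
    indices: Option<Vec<usize>>,
}

fn main() {
    // Searching for `cat` in `{ "a": ["dog", "cat", "fox"] }`:
    // the match lives in array element 1.
    let m = MatchBounds { start: 0, length: 3, indices: Some(vec![1]) };
    println!("{}", serde_json::to_string(&m).unwrap());
    // -> {"start":0,"length":3,"indices":[1]}
}
```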
2024-11-20 01:00:43 +01:00
c1d8ee2a8d Merge #5048
Some checks failed
Test suite / Tests almost all features (push) Has been skipped
Test suite / Test disabled tokenization (push) Has been skipped
Test suite / Tests on ubuntu-20.04 (push) Failing after 14s
Test suite / Run tests in debug (push) Failing after 24s
Test suite / Tests on ${{ matrix.os }} (windows-2022) (push) Failing after 57s
Test suite / Run Rustfmt (push) Successful in 1m36s
Test suite / Run Clippy (push) Successful in 6m8s
Indexing bench (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of indexing (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for geo (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for songs (push) / Run and upload benchmarks (push) Waiting to run
Benchmarks of search for Wikipedia articles (push) / Run and upload benchmarks (push) Waiting to run
Run the indexing fuzzer / Setup the action (push) Successful in 1h4m23s
Test suite / Tests on ${{ matrix.os }} (macos-13) (push) Has been cancelled
5048: Reverse the order of the task queue r=Kerollmops a=irevoire

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/5047

## What does this PR do?
- Provide a new parameter to reverse the order of the task queue (see the example below)
- Add tests
- Remove some unrelated tests that were duplicated in tests/tasks/mod.rs and tests/tasks/error.rs
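
A small usage sketch, assuming the new parameter is the `reverse` query parameter mentioned in the follow-up commits and that the instance runs locally on the default port; the client code is illustrative only.

```rust
// Hypothetical request: with `reverse=true`, `GET /tasks` would list the
// oldest tasks first instead of the newest. URL and port are assumptions.
fn main() -> Result<(), reqwest::Error> {
    let oldest_first: serde_json::Value =
        reqwest::blocking::get("http://localhost:7700/tasks?reverse=true")?.json()?;
    println!("{oldest_first:#}");
    Ok(())
}
```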


Co-authored-by: Tamo <tamo@meilisearch.com>
2024-11-18 16:24:12 +00:00
94fb55bb6f Merge #5049
Some checks failed
Test suite / Tests on ${{ matrix.os }} (macos-13) (push) Waiting to run
Test suite / Tests on ubuntu-20.04 (push) Failing after 59s
Test suite / Tests almost all features (push) Has been skipped
Test suite / Test disabled tokenization (push) Has been skipped
Test suite / Run tests in debug (push) Failing after 13s
Test suite / Tests on ${{ matrix.os }} (windows-2022) (push) Failing after 7m4s
Test suite / Run Clippy (push) Successful in 10m58s
Test suite / Run Rustfmt (push) Successful in 2m34s
Run the indexing fuzzer / Setup the action (push) Successful in 1h5m58s
Indexing bench (push) / Run and upload benchmarks (push) Has been cancelled
Benchmarks of indexing (push) / Run and upload benchmarks (push) Has been cancelled
Benchmarks of search for geo (push) / Run and upload benchmarks (push) Has been cancelled
Benchmarks of search for songs (push) / Run and upload benchmarks (push) Has been cancelled
Benchmarks of search for Wikipedia articles (push) / Run and upload benchmarks (push) Has been cancelled
5049: Fix the path used in the flaky tests CI r=irevoire a=Kerollmops

This PR fixes the path used in [the flaky tests CI](https://github.com/meilisearch/meilisearch/actions/runs/11741717787).

Co-authored-by: Clément Renault <clement@meilisearch.com>
2024-11-13 10:26:50 +00:00
009709eace Fix the path used in the flaky tests CI 2024-11-13 09:52:10 +01:00
2eb1801e85 reverse the order of the task queue 2024-11-07 19:19:44 +01:00
a5d7ae23bd Merge #5044
5044: Adds new metrics to prometheus r=irevoire a=PedroTurik

not 100% confident in this solution, especially because I couldn't make the "Search Queue searches waiting" metric give me any value other than 0 with my local testing 😆. But I believe it solves the issue.

# Pull Request

## Related issue
Fixes #4998 

## What does this PR do?
### Adds new metrics to Prometheus (sketched below):
- SearchQueue size
- SearchQueue searches running
- SearchQueue searches waiting
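
A rough sketch of what registering these three gauges could look like with the `prometheus` crate; the metric names, help strings, and the `once_cell` wiring are assumptions, not the identifiers actually added here.

```rust
// Illustrative only: gauge names and where they get updated are assumptions.
use once_cell::sync::Lazy;
use prometheus::{register_int_gauge, IntGauge};

static SEARCH_QUEUE_SIZE: Lazy<IntGauge> = Lazy::new(|| {
    register_int_gauge!("search_queue_size", "Maximum number of searches the queue can hold")
        .unwrap()
});
static SEARCHES_RUNNING: Lazy<IntGauge> = Lazy::new(|| {
    register_int_gauge!("searches_running", "Number of searches currently being processed")
        .unwrap()
});
static SEARCHES_WAITING: Lazy<IntGauge> = Lazy::new(|| {
    register_int_gauge!("searches_waiting", "Number of searches waiting in the queue").unwrap()
});

fn main() {
    // These would be set wherever the search queue changes state.
    SEARCH_QUEUE_SIZE.set(1000);
    SEARCHES_RUNNING.set(16);
    SEARCHES_WAITING.set(0);
}
```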

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Co-authored-by: Pedro Turik Firmino <pedroturik@gmail.com>
2024-11-07 17:05:43 +00:00
03886d0012 Applies optimizations to formatted integration tests (#5043) 2024-11-07 15:58:55 +01:00
b427b9e88f Merge #5025
5025: test: improve performance of get_documents.rs r=irevoire a=PedroTurik

# Pull Request

## Related issue
Fixes one item from #4840 

## What does this PR do?
- Applies the changes recommended on the issue for `meilisearch/tests/documents/get_documents.rs`

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Pedro Turik Firmino <pedroturik@gmail.com>
2024-11-07 09:46:34 +00:00
8b95f5ccc6 Adds new metrics to prometheus: SearchQueue size, SearchQueue searches running, and Search Queue searches waiting. 2024-11-06 15:37:16 -03:00
6b67f9fc4c Merge #5030
5030: Bump Swatinem/rust-cache from 2.7.1 to 2.7.5 r=curquiza a=dependabot[bot]

Bumps [Swatinem/rust-cache](https://github.com/swatinem/rust-cache) from 2.7.1 to 2.7.5.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/swatinem/rust-cache/releases">Swatinem/rust-cache's releases</a>.</em></p>
<blockquote>
<h2>v2.7.5</h2>
<h2>What's Changed</h2>
<ul>
<li>Upgrade checkout action from version 3 to 4 by <a href="https://github.com/carsten-wenderdel"><code>@carsten-wenderdel</code></a> in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/190">Swatinem/rust-cache#190</a></li>
<li>fix: usage of <code>deprecated</code> version of <code>node</code> by <a href="https://github.com/hamirmahal"><code>@hamirmahal</code></a> in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/197">Swatinem/rust-cache#197</a></li>
<li>Only run macOsWorkaround() on macOS by <a href="https://github.com/heksesang"><code>@heksesang</code></a> in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/206">Swatinem/rust-cache#206</a></li>
<li>Support Cargo.lock format cargo-lock v4 by <a href="https://github.com/NobodyXu"><code>@NobodyXu</code></a> in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/211">Swatinem/rust-cache#211</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/carsten-wenderdel"><code>`@​carsten-wenderdel</code></a>` made their first contribution in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/190">Swatinem/rust-cache#190</a></li>
<li><a href="https://github.com/hamirmahal"><code>`@​hamirmahal</code></a>` made their first contribution in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/197">Swatinem/rust-cache#197</a></li>
<li><a href="https://github.com/heksesang"><code>`@​heksesang</code></a>` made their first contribution in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/206">Swatinem/rust-cache#206</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/Swatinem/rust-cache/compare/v2.7.3...v2.7.5">https://github.com/Swatinem/rust-cache/compare/v2.7.3...v2.7.5</a></p>
<h2>v2.7.3</h2>
<ul>
<li>Work around upstream problem that causes cache saving to hang for minutes.</li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/Swatinem/rust-cache/compare/v2.7.2...v2.7.3">https://github.com/Swatinem/rust-cache/compare/v2.7.2...v2.7.3</a></p>
<h2>v2.7.2</h2>
<h2>What's Changed</h2>
<ul>
<li>Update action runtime to <code>node20</code> by <a href="https://github.com/rhysd"><code>@rhysd</code></a> in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/175">Swatinem/rust-cache#175</a></li>
<li>Only key by <code>Cargo.toml</code> and <code>Cargo.lock</code> files of workspace members by <a href="https://github.com/max-heller"><code>@max-heller</code></a> in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/180">Swatinem/rust-cache#180</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/rhysd"><code>`@​rhysd</code></a>` made their first contribution in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/175">Swatinem/rust-cache#175</a></li>
<li><a href="https://github.com/max-heller"><code>`@​max-heller</code></a>` made their first contribution in <a href="https://redirect.github.com/Swatinem/rust-cache/pull/180">Swatinem/rust-cache#180</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/Swatinem/rust-cache/compare/v2.7.1...v2.7.2">https://github.com/Swatinem/rust-cache/compare/v2.7.1...v2.7.2</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/Swatinem/rust-cache/blob/master/CHANGELOG.md">Swatinem/rust-cache's changelog</a>.</em></p>
<blockquote>
<h1>Changelog</h1>
<h2>2.7.3</h2>
<ul>
<li>Work around upstream problem that causes cache saving to hang for minutes.</li>
</ul>
<h2>2.7.2</h2>
<ul>
<li>Only key by <code>Cargo.toml</code> and <code>Cargo.lock</code> files of workspace members.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="82a92a6e8f"><code>82a92a6</code></a> 2.7.5</li>
<li><a href="598fe25fa1"><code>598fe25</code></a> update dependencies, rebuild</li>
<li><a href="8f842c2d45"><code>8f842c2</code></a> Support Cargo.lock format cargo-lock v4 (<a href="https://redirect.github.com/swatinem/rust-cache/issues/211">#211</a>)</li>
<li><a href="96a8d65dba"><code>96a8d65</code></a> Only run macOsWorkaround() on macOS (<a href="https://redirect.github.com/swatinem/rust-cache/issues/206">#206</a>)</li>
<li><a href="9bdad043e8"><code>9bdad04</code></a> fix: usage of <code>deprecated</code> version of <code>node</code> (<a href="https://redirect.github.com/swatinem/rust-cache/issues/197">#197</a>)</li>
<li><a href="f7a52f6914"><code>f7a52f6</code></a> &quot;add jsonpath test&quot;</li>
<li><a href="2bceda3912"><code>2bceda3</code></a> &quot;update dependencies&quot;</li>
<li><a href="640a22190e"><code>640a221</code></a> Upgrade checkout action from version 3 to 4 (<a href="https://redirect.github.com/swatinem/rust-cache/issues/190">#190</a>)</li>
<li><a href="1582741630"><code>1582741</code></a> update dependencies</li>
<li><a href="23bce251a8"><code>23bce25</code></a> 2.7.3</li>
<li>Additional commits viewable in <a href="https://github.com/swatinem/rust-cache/compare/v2.7.1...v2.7.5">compare view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=Swatinem/rust-cache&package-manager=github_actions&previous-version=2.7.1&new-version=2.7.5)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)


</details>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-06 12:59:36 +00:00
2e4d4b398d Bump Swatinem/rust-cache from 2.7.1 to 2.7.5
Bumps [Swatinem/rust-cache](https://github.com/swatinem/rust-cache) from 2.7.1 to 2.7.5.
- [Release notes](https://github.com/swatinem/rust-cache/releases)
- [Changelog](https://github.com/Swatinem/rust-cache/blob/master/CHANGELOG.md)
- [Commits](https://github.com/swatinem/rust-cache/compare/v2.7.1...v2.7.5)

---
updated-dependencies:
- dependency-name: Swatinem/rust-cache
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-11-06 12:57:04 +00:00
da59a043ba Fixes formatting issues 2024-11-06 09:55:48 -03:00
da4d47b5d0 Fixes formatting issues 2024-11-06 09:54:20 -03:00
0507f5d99b Merge #4928
4928: Make matches consider phrases as a single `Match` r=ManyTheFish a=flevi29

# Pull Request

## Related issue
Fixes #4732

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: F. Levi <55688616+flevi29@users.noreply.github.com>
2024-11-06 08:22:01 +00:00
be2a7c70f2 Merge #5037
5037: Fix the benchmarks r=Kerollmops a=irevoire

# Pull Request

## Related issue
https://github.com/meilisearch/meilisearch/pull/5016 broke all benchmarks. This PR fixes the benchmarks.


Co-authored-by: Tamo <tamo@meilisearch.com>
2024-11-05 15:37:55 +00:00
ede086bc30 Merge #5034
5034: Upgrade from v1.10 to v1.11 r=irevoire a=irevoire

# Pull Request

## Related issue
Parts of https://github.com/meilisearch/meilisearch/issues/4978

## What does this PR do?
- Move the code around the offline upgrade to its own module with a file per version
- Fix the upgrade from v1.9 to v1.10 because I couldn’t make it work anymore. It now uses a specified format instead of relying on cargo to get the right set of features
- ☝️ must be checked against docker
- Provide an update path from v1.10 to v1.11. Most of the code is boilerplate in meilitool, the real code is located here: 053807bf38/src/lib.rs (L161-L269)
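
A hedged sketch of the "one module per version" layout described above; the module names, signatures, and dispatch logic are illustrative, not the actual meilitool code.

```rust
// Sketch of an offline-upgrade dispatch with one module per version step.
// Everything here (names, signatures, steps) is an assumption for illustration.
use std::path::Path;

mod v1_9_to_v1_10 {
    // Rewrites the v1.9 datetimes with an explicit format instead of relying
    // on whichever serde features cargo happened to enable.
    pub fn upgrade(db_path: &std::path::Path) -> anyhow::Result<()> {
        let _ = db_path;
        Ok(())
    }
}

mod v1_10_to_v1_11 {
    // Moves the arroy databases from the v0.4 layout to the v0.5 layout.
    pub fn upgrade(db_path: &std::path::Path) -> anyhow::Result<()> {
        let _ = db_path;
        Ok(())
    }
}

/// Runs every upgrade step between the detected version and the target.
fn offline_upgrade(db_path: &Path, from: (u32, u32)) -> anyhow::Result<()> {
    if from <= (1, 9) {
        v1_9_to_v1_10::upgrade(db_path)?;
    }
    if from <= (1, 10) {
        v1_10_to_v1_11::upgrade(db_path)?;
    }
    Ok(())
}

fn main() -> anyhow::Result<()> {
    offline_upgrade(Path::new("./data.ms"), (1, 9))
}
```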


Co-authored-by: Tamo <tamo@meilisearch.com>
2024-11-05 14:49:56 +00:00
7415ef7ff5 Update crates/meilitool/src/upgrade/v1_11.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-11-05 15:37:59 +01:00
a5d138ac34 use a tag while importing arroy instead of a loose branch or rev 2024-11-05 15:24:02 +01:00
0f74a93346 Update crates/meilitool/src/upgrade/v1_11.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-11-05 15:14:02 +01:00
e4993aa705 Update crates/meilitool/src/upgrade/mod.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-11-05 15:13:50 +01:00
66b7e0824e Update crates/meilitool/src/upgrade/mod.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-11-05 15:13:40 +01:00
f193c3a67c Update crates/meilitool/src/main.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-11-05 15:13:32 +01:00
9799812b27 fix the benchmarks 2024-11-05 15:08:01 +01:00
48ab898ca2 fix the datetime of v1.9 2024-11-05 10:30:53 +01:00
a5dc783ffa Merge with main branch 2024-11-05 10:56:17 +02:00
1b49b60486 Merge #5026
5026: test: improve performance of update_documents.rs  r=dureuill a=PedroTurik

# Pull Request

## Related issue
Fixes one item from #4840 

## What does this PR do?
- Applies the changes recommended on the issue for `meilisearch/tests/documents/update_documents.rs`

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Pedro Turik Firmino <pedroturik@gmail.com>
2024-11-05 08:37:44 +00:00
d0b1ba20cb Improves usage of shared indexes 2024-11-04 17:26:50 -03:00
a1f228f662 remove the unneeded files after the rebase 2024-11-04 18:19:36 +01:00
99a9fde37f push back the removed files 2024-11-04 17:55:55 +01:00
106cc7fe3a fmt 2024-11-04 17:51:40 +01:00
4eef0cd332 fix the update from v1_9 to v1_10 by providing a custom datetime formatter myself 2024-11-04 17:47:10 +01:00
5f57306858 update the arroy version in meilitool 2024-11-04 17:47:10 +01:00
690eb42fc0 update the version of arroy 2024-11-04 17:47:10 +01:00
a9b61c8434 fix the version parsing and improve error handling 2024-11-04 17:47:10 +01:00
ddd03e9b37 implement the upgrade from v1.10 to v1.11 in meilitool 2024-11-04 17:47:10 +01:00
362836efb7 make an upgrade module where we'll be able to shove each version instead of putting everything in the same file 2024-11-04 17:47:10 +01:00
22229d3046 Merge #5022
5022: Bringing changes from v1.11.0 back to main r=irevoire a=Kerollmops

Fixes https://github.com/meilisearch/meilisearch/issues/5035

...and fixing merge conflicts.

Co-authored-by: Tamo <tamo@meilisearch.com>
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
Co-authored-by: meili-bors[bot] <89034592+meili-bors[bot]@users.noreply.github.com>
Co-authored-by: ManyTheFish <many@meilisearch.com>
Co-authored-by: curquiza <clementine@meilisearch.com>
2024-11-04 15:34:19 +00:00
186326fe40 update the macos version 2024-11-04 16:33:04 +01:00
cf6ad1ae5e Merge branch 'main' into tmp-release-v1.11.0 2024-11-04 16:14:44 +01:00
c79ca9679b Changes variable name to re-run CI 2024-11-02 18:25:33 -03:00
b02a72c0c0 Applies optimizations to some integration tests 2024-10-29 19:30:11 -03:00
a934b0ac6a Applies optimizations to some integration tests 2024-10-29 18:49:06 -03:00
28274292d8 Merge #5021
5021: Update benchmarks to match the new crates subfolder r=dureuill a=Kerollmops



Co-authored-by: Clément Renault <clement@meilisearch.com>
2024-10-29 08:06:35 +00:00
ee72f622c7 Update benchmarks to match the new crates subfolder 2024-10-28 14:06:46 +01:00
b0da626506 Merge #5016
5016: Hide code complexity into a subfolder r=Kerollmops a=Kerollmops

This PR moves the complexity and main code into a subfolder to make the main repository page more welcoming by reducing the number of visible files and showing the README earlier.

Co-authored-by: Clément Renault <clement@meilisearch.com>
2024-10-28 09:43:14 +00:00
f372ee505f Merge #5017
5017: Rollback the Meilisearch Kawaii logo r=curquiza a=Kerollmops

This PR reverts #4778 and brings back the official one. It's no longer the time to JOKE, OK !?

Co-authored-by: Clément Renault <clement@meilisearch.com>
2024-10-22 08:14:18 +00:00
3753f87fd8 Merge #5011
5011: Revamp analytics r=ManyTheFish a=irevoire

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/5009

## What does this PR do?
- Force every analytics event to go through a trait that forces you to handle aggregation correctly (a rough sketch follows the list below)
- Put the code to retrieve the `user-agent`, `timestamp` and `requests.total_received` in common between all aggregates, so there is no mistake
- Get rid of all the different channels for each kind of event in favor of an any map
- Ensure that we never [send empty events ever again](https://github.com/meilisearch/meilisearch/pull/5001)
- Merge all the sub-settings routes into a global « Settings Updated » event.
- Fix: when using one of the three following features, we were not sending any analytics IF they were set from the global route
  - /non-separator-tokens
  - /separator-tokens
  - /dictionary
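
A rough illustration of that aggregation trait; all names and shapes here are assumptions, not the real meilisearch analytics API.

```rust
// Illustrative sketch only: names and shapes are assumptions, not the real API.
trait Aggregate: Send {
    /// Name of the event sent to the analytics backend.
    fn event_name(&self) -> &'static str;
    /// Fold another occurrence of the same event into the accumulated one,
    /// so a whole batch of requests produces a single event.
    fn aggregate(self, new: Self) -> Self
    where
        Self: Sized;
}

/// Example: every sub-settings route contributes to one « Settings Updated » event.
struct SettingsUpdated {
    total_received: usize,
    user_agents: Vec<String>,
}

impl Aggregate for SettingsUpdated {
    fn event_name(&self) -> &'static str {
        "Settings Updated"
    }
    fn aggregate(mut self, new: Self) -> Self {
        self.total_received += new.total_received;
        self.user_agents.extend(new.user_agents);
        self
    }
}

fn main() {
    let a = SettingsUpdated { total_received: 1, user_agents: vec!["curl".into()] };
    let b = SettingsUpdated { total_received: 1, user_agents: vec!["meilisearch-js".into()] };
    let merged = a.aggregate(b);
    assert_eq!(merged.event_name(), "Settings Updated");
    assert_eq!(merged.total_received, 2);
}
```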

Co-authored-by: Tamo <tamo@meilisearch.com>
2024-10-21 15:08:49 +00:00
8ef8035bf2 Fix CI 2024-10-21 08:28:33 +02:00
3353bcd82d Revert "Change the Meilisearch logo to the kawaii version"
This reverts commit 13d1d78a2d.
2024-10-21 08:21:56 +02:00
9c1e54a2c8 Move crates under a sub folder to clean up the code 2024-10-21 08:18:43 +02:00
5675585fe8 move all the searches structures to new modules 2024-10-20 17:54:43 +02:00
af589c85ec reverse all the settings to keep the last one received instead of the first one received in case we receive the same setting multiple times 2024-10-20 17:40:31 +02:00
ac919df37d simplify the trait a bit more by getting rids of the downcast_aggregate method 2024-10-20 17:36:29 +02:00
73b5722896 rename the other parameter of the aggregate method to new to avoid confusion 2024-10-20 17:31:35 +02:00
c94679bde6 apply review comments 2024-10-20 17:24:12 +02:00
e51e6f902a Highlight partially cropped matches too 2024-10-19 13:42:02 +03:00
6c226a4580 Merge branch 'main' into change-matches-position-phrase-search 2024-10-17 21:25:42 +03:00
89e2d2b2b9 fix the doctest 2024-10-17 13:55:49 +02:00
3a7a20c716 remove the segment feature and always import segment 2024-10-17 11:21:14 +02:00
fa1db6b721 fix the tests 2024-10-17 09:55:30 +02:00
1ab6fec903 send all experimental features in the info event including the runtime one 2024-10-17 09:49:21 +02:00
18ac4032aa Remove the experimental feature seen 2024-10-17 09:35:11 +02:00
d9115b74f0 move the analytics settings code to a dedicated file 2024-10-17 09:32:54 +02:00
0fde49640a make clippy happy 2024-10-17 09:18:25 +02:00
4ee65d870e remove a lot of unused code 2024-10-17 09:14:34 +02:00
ef77c7699b add the required shared values between all the events and fix the timestamp 2024-10-17 09:06:23 +02:00
7382fb21e4 fix the main 2024-10-17 08:38:11 +02:00
e4ace98004 fix all the routes + move to a better version of mopa 2024-10-17 01:04:25 +02:00
aa7a34ffe8 make the aggregate method send 2024-10-17 00:43:34 +02:00
6728cfbfac fix the analytics 2024-10-17 00:38:18 +02:00
ea6883189e finish the analytics in all the routes 2024-10-16 21:17:06 +02:00
fdeb47fb54 implements all routes 2024-10-16 17:16:33 +02:00
e66fccc3f2 get rids of the analytics closure 2024-10-16 15:51:48 +02:00
73e87c152a rewrite most of the analytics especially the settings 2024-10-16 15:43:27 +02:00
75b2f22add Merge #5008
5008: Display vectors when no custom vectors were ever provided r=irevoire a=dureuill

# Pull Request

## Related issue
Fixes the issue reported on [Discord](https://discord.com/channels/1006923006964154428/1294653031958446080/1295336784896589967).

## What does this PR do?
- Normal behavior of Meilisearch is to hide `_vectors`, even when `retrieveVectors: true`, when there is an explicit list of displayed attributes that does not contain vectors
- However, this relied on the field id for the `_vectors` field existing, which wasn't the case when no `_vectors` was ever manually provided in documents. This would often be the case for people using autoembedders such as the OpenAI integration.
- This PR fixes the behavior by looking for the `_vectors` string in the `displayedAttributes` when there is no `_vectors` fid (see the sketch below).
- This PR also adds a test for this specific situation, which would fail before the PR and passes after it
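
A hedged sketch of that fallback under assumed names; the real milli code differs, this only shows the decision described above.

```rust
// Sketch of the fix described above, under assumed names: when the `_vectors`
// field id does not exist (no document ever carried vectors), fall back to
// looking for the literal string in the displayed attributes.
fn should_display_vectors(
    retrieve_vectors: bool,
    displayed_attributes: &[&str],
    vectors_fid: Option<u16>,
) -> bool {
    if !retrieve_vectors {
        return false;
    }
    match vectors_fid {
        // Normal path: a `_vectors` field id exists, so the regular
        // displayed-attributes filtering decides (simplified to `true` here).
        Some(_) => true,
        // The fix: no field id, so check the attribute list by name
        // (a `*` wildcard also allows it).
        None => displayed_attributes.iter().any(|a| *a == "_vectors" || *a == "*"),
    }
}

fn main() {
    assert!(!should_display_vectors(true, &["title"], None));
    assert!(should_display_vectors(true, &["title", "_vectors"], None));
    assert!(should_display_vectors(true, &["*"], None));
}
```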


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-10-15 13:08:47 +00:00
5a74d4729c Add test failing before this PR, OK now 2024-10-14 16:23:28 +02:00
e44e7b5e81 Fix retrieveVectors when explicitly passed in displayed attributes without any document containing _vectors 2024-10-14 16:17:19 +02:00
a0b3887709 Merge #5006
5006: Bring back changes from v1.10.3 r=Kerollmops a=irevoire

# Pull Request

## Related issue
Port the following PR to the latest version: https://github.com/meilisearch/meilisearch/pull/5000
See its description for more information

Co-authored-by: Tamo <tamo@meilisearch.com>
2024-10-14 14:06:35 +00:00
4b4a6c7863 Update meilisearch/src/option.rs
Co-authored-by: Clément Renault <clement@meilisearch.com>
2024-10-14 14:39:34 +02:00
3085092e04 Update meilisearch/src/option.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-10-14 14:39:34 +02:00
c4efd1df4e Update meilisearch/src/option.rs
Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-10-14 14:39:34 +02:00
c32282acb1 improve doc 2024-10-14 14:39:34 +02:00
92070a3578 Implement the experimental drop search after and nb search per core 2024-10-14 14:39:33 +02:00
a90563df3f Merge #5001
5001: Do not send empty edit document by function r=Kerollmops a=irevoire

# Pull Request

We realized that we had a huge usage of the feature from users who didn’t enable the feature at all. That shouldn’t be possible.
After a big investigation with `@gmourier` 
![image](https://github.com/user-attachments/assets/eae3e851-dc5b-4616-80ee-7237a4871522)
We found the issue: it was in the engine.

## What does this PR do?
- Do not send the edit by function event to segment if no event was received during this batch
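
A tiny sketch of that guard; the struct and field names are assumptions, not the actual analytics code.

```rust
// Illustrative guard: no requests received during the batch => no event sent.
struct EditDocumentsByFunctionAggregate {
    total_received: usize,
}

impl EditDocumentsByFunctionAggregate {
    fn into_event(self) -> Option<serde_json::Value> {
        if self.total_received == 0 {
            // Nothing happened in this batch: send nothing to segment.
            return None;
        }
        Some(serde_json::json!({ "requests": { "total_received": self.total_received } }))
    }
}

fn main() {
    assert!(EditDocumentsByFunctionAggregate { total_received: 0 }.into_event().is_none());
    assert!(EditDocumentsByFunctionAggregate { total_received: 2 }.into_event().is_some());
}
```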

Co-authored-by: Tamo <tamo@meilisearch.com>
2024-10-11 08:27:16 +00:00
466604725e Do not send empty edit document by function 2024-10-10 23:47:15 +02:00
995394a516 Merge #4993
4993: Update mini-dashboard r=ManyTheFish a=curquiza

Remove the forced capitalized attribute name

Co-authored-by: curquiza <clementine@meilisearch.com>
2024-10-10 05:57:45 +00:00
6e37ae8619 Update mini-dashboard 2024-10-09 19:13:14 +02:00
657c645603 Merge #4992
4992: fix the bad experimental search queue size r=dureuill a=irevoire

# Pull Request

## Related issue
Fixes #4991 

## What does this PR do?
- Set the right default value for the experimental search queue size in the config file


Co-authored-by: Tamo <tamo@meilisearch.com>
2024-10-09 10:45:48 +00:00
7f5d0837c3 fix the bad experimental search queue size 2024-10-09 11:46:57 +02:00
30f3c30389 Merge #4962
4962: test: improve performance of create_index.rs r=irevoire a=DerTimonius

# Pull Request

## Related issue
related to #4840 

## What does this PR do?
This PR follows the instructions in #4840 and improves the performance of `meilisearch/tests/index/create_index.rs`. The tests pass locally; if they fail in the CI, I'll try to fix them

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Timon Jurschitsch <timon.jurschitsch@gmail.com>
2024-10-08 13:00:56 +00:00
d907d1b22d Merge #4990
4990: Add image source label to dockerfiles r=curquiza a=wuast94

To get changelogs shown with Renovate, a Docker container has to add the source label described in the OCI Image Format Specification.

For reference: https://github.com/renovatebot/renovate/blob/main/lib/modules/datasource/docker/readme.md

Co-authored-by: Marc <github@wuast24.de>
Co-authored-by: Clémentine <clementine@meilisearch.com>
2024-10-08 12:19:38 +00:00
ed267fa063 Apply suggestions from code review 2024-10-08 14:14:16 +02:00
6af55b1a80 Update Dockerfile 2024-10-08 11:59:43 +02:00
5b04189f7a remove flaky assert 2024-10-07 16:50:57 +02:00
c0912aa685 add missing shared servers 2024-10-07 16:29:47 +02:00
af38f46621 Merge branch 'main' of https://github.com/meilisearch/meilisearch into test/improve-create-index 2024-10-07 16:27:57 +02:00
03579aba13 Adjust test 2024-10-04 11:38:47 +03:00
c3de3a9ab7 Refactor 2024-10-04 11:30:31 +03:00
386ca86297 Merge #4963
4963: test: improve performance of delete_index.rs r=curquiza a=DerTimonius

# Pull Request

## Related issue
related to #4840

## What does this PR do?
This PR follows the instructions in #4840 and improves the performance of `meilisearch/tests/index/delete_index.rs`. The tests pass locally; if they fail in the CI, I'll try to fix them

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: Timon Jurschitsch <timon.jurschitsch@gmail.com>
2024-10-03 15:40:07 +00:00
8221c94e7f Split into multiple files, refactor 2024-10-03 15:37:51 +03:00
c427d9e2ad Merge branch 'main' into change-matches-position-phrase-search 2024-10-03 10:42:34 +03:00
40336ce87d Fix and refactor crop_bounds 2024-10-03 10:40:14 +03:00
2a18917af3 add delete_index_fail function 2024-10-02 16:23:21 +02:00
0566f2549d Merge #4972
4972: Add binary quantized to error messages r=irevoire a=dureuill

was missing in error messages

Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-10-02 09:23:55 +00:00
0c2661ea90 Fix tests 2024-10-02 11:20:29 +02:00
62dfbd6255 Add binary quantized to allowed fields for source adds its sources 2024-10-02 11:20:02 +02:00
cc669f90d5 Merge #4971
4971: update arroy r=dureuill a=irevoire

# Pull Request

Fix part of https://github.com/meilisearch/meilisearch/issues/3715


## What does this PR do?
- Update arroy to the latest version; most changes are maintenance changes
- The performance of adding vectors to arroy should slightly improve
- Forward the build cancellation function to arroy so it can stop building trees when we have to stop an indexing process


Co-authored-by: Tamo <tamo@meilisearch.com>
2024-10-02 05:53:51 +00:00
37a9d64c44 Fix failing test, refactor 2024-10-01 22:52:01 +03:00
b1dc10e771 uses the new cancellation method in arroy 2024-10-01 17:45:49 +02:00
4b598fa648 update arroy 2024-10-01 17:31:12 +02:00
17571805b4 use shared servers 2024-10-01 17:27:27 +02:00
2654ce6e6c use shared servers 2024-10-01 17:01:47 +02:00
d9e4db9983 Refactor 2024-10-01 17:50:59 +03:00
6d16230f17 Refactor 2024-10-01 17:19:15 +03:00
eabc14c268 Refactor, handle more cases for phrases 2024-09-30 21:24:41 +03:00
e78da35287 Merge #4930
4930: Return `UserError::InvalidDocumentId` for primary keys with a length greater than 512 bytes r=curquiza a=flevi29

# Pull Request

## Related issue
Fixes #4843

## PR checklist
Please check if your PR fulfills the following requirements:
- [x] Does this PR fix an existing issue, or have you listed the changes applied in the PR description (and why they are needed)?
- [x] Have you read the contributing guidelines?
- [x] Have you made sure that the title is accurate and descriptive of the changes?

Thank you so much for contributing to Meilisearch!


Co-authored-by: F. Levi <55688616+flevi29@users.noreply.github.com>
2024-09-30 15:55:05 +00:00
84b4219a4f test: improve delete_index.rs 2024-09-29 10:16:31 +02:00
5539a1904a test: improve performance of create_index.rs 2024-09-28 11:05:52 +02:00
00ccf53ffa Merge branch 'main' into change-matches-position-phrase-search 2024-09-27 15:52:05 +03:00
d20a39b959 Refactor find_best_match_interval 2024-09-27 15:44:30 +03:00
71b364286b Merge #4957
4957: Update charabia feature flags r=dureuill a=ManyTheFish

# Pull Request

Add charabia's `turkish` feature flag into Meilisearch default tokenization flag



[All tests pipeline](https://github.com/meilisearch/meilisearch/actions/runs/11030036031)

Co-authored-by: ManyTheFish <many@meilisearch.com>
2024-09-26 20:19:21 +00:00
86183e0807 Merge #4960
4960: Update rhai r=dureuill a=irevoire

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/4956

A fix has been implemented in https://github.com/rhaiscript/rhai/issues/916

## What does this PR do?
- Use the latest version of rhai containing the fix

Co-authored-by: Tamo <tamo@meilisearch.com>
2024-09-26 15:03:01 +00:00
78a4b7949d update rhai to a version that shouldn’t panic 2024-09-26 15:04:03 +02:00
dc2cb58cf1 use charabia default for all-tokenization 2024-09-25 11:12:30 +02:00
e9580fe619 Add turkish normalization 2024-09-25 11:03:17 +02:00
8205254f4c Merge #4955
4955: Upgrade "batch failed" log to error level r=irevoire a=dureuill

# Pull Request

## Related issue
Fixes #4916 


Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-09-25 08:18:44 +00:00
efdc5739d7 Merge #4953
4953: Move the multi arroy index logic to the arroy wrapper r=irevoire a=irevoire

# Pull Request

## Related issue
Fixes https://github.com/meilisearch/meilisearch/issues/4948

## What does this PR do?
- Make the `ArroyWrapper` we introduced in the last PR handle all the embeddings for a specific docid itself.


Co-authored-by: Tamo <tamo@meilisearch.com>
2024-09-24 15:02:24 +00:00
b31e9bea26 while retrieving the readers on an arroywrapper, stops at the first empty reader 2024-09-24 16:33:17 +02:00
7f048b9732 early exit in the clear and contains 2024-09-24 15:02:38 +02:00
8b4e2c7b17 Remove now unused method 2024-09-24 15:00:25 +02:00
645a55317a merge the build and quantize method 2024-09-24 14:54:24 +02:00
8caf97db86 Merge #4954
4954: Fix bench by adding embedder r=ManyTheFish a=dureuill

Fix benchmark workloads following breaking change on embedders

Co-authored-by: Louis Dureuil <louis@meilisearch.com>
2024-09-24 12:53:34 +00:00
b8a74e0464 fix comments 2024-09-24 10:59:15 +02:00
fd8447c521 fix the del items thing 2024-09-24 10:52:05 +02:00
f2d187ba3e rename the index method to embedder_index 2024-09-24 10:39:40 +02:00
79d8a7a51a rename the embedder index for clarity 2024-09-24 10:36:28 +02:00
86da0e83fe Upgrade "batch failed" log to ERROR level 2024-09-24 10:02:53 +02:00
0704fb71e9 Fix bench by adding embedder 2024-09-24 09:56:47 +02:00
1e4d4e69c4 finish the arroywrapper 2024-09-23 18:56:15 +02:00
6ba4baecbf first ugly step 2024-09-23 15:15:26 +02:00
afa3ae0cbd WIP 2024-09-19 17:42:52 +02:00
0ffeea5a52 Remove wrong comments 2024-09-19 09:06:40 +03:00
30aa1f6dea Merge with main 2024-09-18 11:03:33 +03:00
83113998f9 Add more test assertions 2024-09-18 10:35:23 +03:00
f7337affd6 Adjust tests to changes 2024-09-17 17:31:09 +03:00
e098cc8320 Make comparison simpler, add IndexUid error details similarly 2024-09-17 00:16:15 +03:00
ec815fa368 Format 2024-09-16 23:59:48 +03:00
4a922a176f Add test for > 512 byte ID 2024-09-16 23:53:34 +03:00
51bc7b3173 Update tests 2024-09-16 22:22:24 +03:00
993408d3ba Change closure to fn 2024-09-15 16:15:09 +03:00
dcb61f8b3a Return error for primary keys with a length greater than 512 bytes 2024-09-14 11:34:13 +03:00
51085206cc Misc adjustments 2024-09-14 10:14:07 +03:00
a2a16bf846 Move MatchPosition impl to Match, adjust counting score for phrases 2024-09-13 21:20:06 +03:00
cab63abc84 Improve MatchesPosition enum with an impl 2024-09-13 14:35:28 +03:00
65e3d61a95 Make use of helper function in one more place 2024-09-13 13:35:58 +03:00
cc6a2aec06 Improve changes to Matcher 2024-09-13 13:31:07 +03:00
e7af499314 Improve changes to Matcher 2024-09-12 16:58:13 +03:00
edcb4c60ba Change Matcher so that phrases are counted as one instead of word by word 2024-09-12 09:46:08 +03:00
1186 changed files with 19413 additions and 9907 deletions


@ -43,7 +43,7 @@ jobs:
# Run benchmarks
- name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}
run: |
cd benchmarks
cd crates/benchmarks
cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}
# Generate critcmp files


@ -88,7 +88,7 @@ jobs:
# Run benchmarks
- name: Run benchmarks - Dataset ${{ steps.command.outputs.command-arguments }} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}
run: |
cd benchmarks
cd crates/benchmarks
cargo bench --bench ${{ steps.command.outputs.command-arguments }} -- --save-baseline ${{ steps.file.outputs.basename }}
# Generate critcmp files


@ -41,7 +41,7 @@ jobs:
# Run benchmarks
- name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}
run: |
cd benchmarks
cd crates/benchmarks
cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}
# Generate critcmp files


@ -40,7 +40,7 @@ jobs:
# Run benchmarks
- name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}
run: |
cd benchmarks
cd crates/benchmarks
cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}
# Generate critcmp files


@ -40,7 +40,7 @@ jobs:
# Run benchmarks
- name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}
run: |
cd benchmarks
cd crates/benchmarks
cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}
# Generate critcmp files


@ -40,7 +40,7 @@ jobs:
# Run benchmarks
- name: Run benchmarks - Dataset ${BENCH_NAME} - Branch ${{ steps.current_branch.outputs.name }} - Commit ${{ steps.commit_sha.outputs.short }}
run: |
cd benchmarks
cd crates/benchmarks
cargo bench --bench ${BENCH_NAME} -- --save-baseline ${{ steps.file.outputs.basename }}
# Generate critcmp files


@ -21,10 +21,10 @@ jobs:
- name: Install cargo-flaky
run: cargo install cargo-flaky
- name: Run cargo flaky in the dumps
run: cd dump; cargo flaky -i 100 --release
run: cd crates/dump; cargo flaky -i 100 --release
- name: Run cargo flaky in the index-scheduler
run: cd index-scheduler; cargo flaky -i 100 --release
run: cd crates/index-scheduler; cargo flaky -i 100 --release
- name: Run cargo flaky in the auth
run: cd meilisearch-auth; cargo flaky -i 100 --release
run: cd crates/meilisearch-auth; cargo flaky -i 100 --release
- name: Run cargo flaky in meilisearch
run: cd meilisearch; cargo flaky -i 100 --release
run: cd crates/meilisearch; cargo flaky -i 100 --release


@ -65,9 +65,9 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [macos-12, windows-2022]
os: [macos-13, windows-2022]
include:
- os: macos-12
- os: macos-13
artifact_name: meilisearch
asset_name: meilisearch-macos-amd64
- os: windows-2022
@ -90,7 +90,7 @@ jobs:
publish-macos-apple-silicon:
name: Publish binary for macOS silicon
runs-on: macos-12
runs-on: macos-13
needs: check-version
strategy:
matrix:


@ -33,7 +33,7 @@ jobs:
- name: Setup test with Rust stable
uses: dtolnay/rust-toolchain@1.79
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.7.1
uses: Swatinem/rust-cache@v2.7.5
- name: Run cargo check without any default features
uses: actions-rs/cargo@v1
with:
@ -51,11 +51,11 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [macos-12, windows-2022]
os: [macos-13, windows-2022]
steps:
- uses: actions/checkout@v3
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.7.1
uses: Swatinem/rust-cache@v2.7.5
- uses: dtolnay/rust-toolchain@1.79
- name: Run cargo check without any default features
uses: actions-rs/cargo@v1
@ -127,7 +127,7 @@ jobs:
apt-get install build-essential -y
- uses: dtolnay/rust-toolchain@1.79
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.7.1
uses: Swatinem/rust-cache@v2.7.5
- name: Run tests in debug
uses: actions-rs/cargo@v1
with:
@ -144,7 +144,7 @@ jobs:
profile: minimal
components: clippy
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.7.1
uses: Swatinem/rust-cache@v2.7.5
- name: Run cargo clippy
uses: actions-rs/cargo@v1
with:
@ -163,11 +163,11 @@ jobs:
override: true
components: rustfmt
- name: Cache dependencies
uses: Swatinem/rust-cache@v2.7.1
uses: Swatinem/rust-cache@v2.7.5
- name: Run cargo fmt
# Since we never ran the `build.rs` script in the benchmark directory we are missing one auto-generated import file.
# Since we want to trigger (and fail) this action as fast as possible, instead of building the benchmark crate
# we are going to create an empty file where rustfmt expects it.
run: |
echo -ne "\n" > benchmarks/benches/datasets_paths.rs
echo -ne "\n" > crates/benchmarks/benches/datasets_paths.rs
cargo fmt --all -- --check

.gitignore

@ -5,7 +5,6 @@
**/*.json_lines
**/*.rs.bk
/*.mdb
/query-history.txt
/data.ms
/snapshots
/dumps
@ -19,4 +18,4 @@
*.snap.new
# Fuzzcheck data for the facet indexing fuzz test
milli/fuzz/update::facet::incremental::fuzz::fuzz/
crates/milli/fuzz/update::facet::incremental::fuzz::fuzz/

Cargo.lock

@ -386,8 +386,28 @@ checksum = "96d30a06541fbafbc7f82ed10c06164cfbd2c401138f6addd8404629c4b16711"
[[package]]
name = "arroy"
version = "0.4.0"
source = "git+https://github.com/meilisearch/arroy/?rev=2386594dfb009ce08821a925ccc89fb8e30bf73d#2386594dfb009ce08821a925ccc89fb8e30bf73d"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dfc5f272f38fa063bbff0a7ab5219404e221493de005e2b4078c62d626ef567e"
dependencies = [
"bytemuck",
"byteorder",
"heed",
"log",
"memmap2",
"nohash",
"ordered-float",
"rand",
"rayon",
"roaring",
"tempfile",
"thiserror",
]
[[package]]
name = "arroy"
version = "0.5.0"
source = "git+https://github.com/meilisearch/arroy/?tag=DO-NOT-DELETE-upgrade-v04-to-v05#053807bf38dc079f25b003f19fc30fbf3613f6e7"
dependencies = [
"bytemuck",
"byteorder",
@ -706,9 +726,9 @@ checksum = "2c676a478f63e9fa2dd5368a42f28bba0d6c560b775f38583c8bbaa7fcd67c9c"
[[package]]
name = "bytemuck"
version = "1.16.1"
version = "1.19.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b236fc92302c97ed75b38da1f4917b5cdda4984745740f153a5d3059e48d725e"
checksum = "8334215b81e418a0a7bdb8ef0849474f40bb10c8b71f1c4ed315cff49f32494d"
dependencies = [
"bytemuck_derive",
]
@ -2555,7 +2575,7 @@ name = "index-scheduler"
version = "1.11.0"
dependencies = [
"anyhow",
"arroy",
"arroy 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)",
"big_s",
"bincode",
"crossbeam",
@ -3414,6 +3434,7 @@ dependencies = [
"meilisearch-types",
"mimalloc",
"mime",
"mopa-maintained",
"num_cpus",
"obkv",
"once_cell",
@ -3515,6 +3536,7 @@ name = "meilitool"
version = "1.11.0"
dependencies = [
"anyhow",
"arroy 0.5.0 (git+https://github.com/meilisearch/arroy/?tag=DO-NOT-DELETE-upgrade-v04-to-v05)",
"clap",
"dump",
"file-store",
@ -3545,7 +3567,7 @@ dependencies = [
name = "milli"
version = "1.11.0"
dependencies = [
"arroy",
"arroy 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)",
"big_s",
"bimap",
"bincode",
@ -3680,6 +3702,12 @@ dependencies = [
"syn 2.0.60",
]
[[package]]
name = "mopa-maintained"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "79b7f3e22167862cc7c95b21a6f326c22e4bf40da59cbf000b368a310173ba11"
[[package]]
name = "mutually_exclusive_features"
version = "0.0.3"
@ -4581,9 +4609,8 @@ dependencies = [
[[package]]
name = "rhai"
version = "1.19.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "61797318be89b1a268a018a92a7657096d83f3ecb31418b9e9c16dcbb043b702"
version = "1.20.0"
source = "git+https://github.com/rhaiscript/rhai?rev=ef3df63121d27aacd838f366f2b83fd65f20a1e4#ef3df63121d27aacd838f366f2b83fd65f20a1e4"
dependencies = [
"ahash 0.8.11",
"bitflags 2.6.0",
@ -4600,8 +4627,7 @@ dependencies = [
[[package]]
name = "rhai_codegen"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a5a11a05ee1ce44058fa3d5961d05194fdbe3ad6b40f904af764d81b86450e6b"
source = "git+https://github.com/rhaiscript/rhai?rev=ef3df63121d27aacd838f366f2b83fd65f20a1e4#ef3df63121d27aacd838f366f2b83fd65f20a1e4"
dependencies = [
"proc-macro2",
"quote",


@ -1,24 +1,24 @@
[workspace]
resolver = "2"
members = [
"meilisearch",
"meilitool",
"meilisearch-types",
"meilisearch-auth",
"meili-snap",
"index-scheduler",
"dump",
"file-store",
"permissive-json-pointer",
"milli",
"filter-parser",
"flatten-serde-json",
"json-depth-checker",
"benchmarks",
"fuzzers",
"tracing-trace",
"xtask",
"build-info",
"crates/meilisearch",
"crates/meilitool",
"crates/meilisearch-types",
"crates/meilisearch-auth",
"crates/meili-snap",
"crates/index-scheduler",
"crates/dump",
"crates/file-store",
"crates/permissive-json-pointer",
"crates/milli",
"crates/filter-parser",
"crates/flatten-serde-json",
"crates/json-depth-checker",
"crates/benchmarks",
"crates/fuzzers",
"crates/tracing-trace",
"crates/xtask",
"crates/build-info",
]
[workspace.package]


@ -21,6 +21,7 @@ RUN set -eux; \
# Run
FROM alpine:3.20
LABEL org.opencontainers.image.source="https://github.com/meilisearch/meilisearch"
ENV MEILI_HTTP_ADDR 0.0.0.0:7700
ENV MEILI_SERVER_PROVIDER docker


@ -1,6 +1,9 @@
<p align="center">
<a href="https://www.meilisearch.com/?utm_campaign=oss&utm_source=github&utm_medium=meilisearch&utm_content=logo" target="_blank">
<img src="assets/meilisearch-logo-kawaii.png">
<a href="https://www.meilisearch.com/?utm_campaign=oss&utm_source=github&utm_medium=meilisearch&utm_content=logo#gh-light-mode-only" target="_blank">
<img src="assets/meilisearch-logo-light.svg?sanitize=true#gh-light-mode-only">
</a>
<a href="https://www.meilisearch.com/?utm_campaign=oss&utm_source=github&utm_medium=meilisearch&utm_content=logo#gh-dark-mode-only" target="_blank">
<img src="assets/meilisearch-logo-dark.svg?sanitize=true#gh-dark-mode-only">
</a>
</p>

Binary image file removed (previously 98 KiB).


@ -1,6 +1,6 @@
status = [
'Tests on ubuntu-20.04',
'Tests on macos-12',
'Tests on macos-13',
'Tests on windows-2022',
'Run Clippy',
'Run Rustfmt',


@ -1,6 +1,7 @@
#![allow(clippy::type_complexity)]
#![allow(clippy::wrong_self_convention)]
use meilisearch_types::batches::BatchId;
use meilisearch_types::error::ResponseError;
use meilisearch_types::keys::Key;
use meilisearch_types::milli::update::IndexDocumentsMethod;
@ -57,6 +58,9 @@ pub enum Version {
#[serde(rename_all = "camelCase")]
pub struct TaskDump {
pub uid: TaskId,
// The batch ID were introduced in v1.12, everything prior to this version will be `None`.
#[serde(default)]
pub batch_uid: Option<BatchId>,
#[serde(default)]
pub index_uid: Option<String>,
pub status: Status,
@ -143,6 +147,7 @@ impl From<Task> for TaskDump {
fn from(task: Task) -> Self {
TaskDump {
uid: task.uid,
batch_uid: task.batch_uid,
index_uid: task.index_uid().map(|uid| uid.to_string()),
status: task.status,
kind: task.kind.into(),
@ -297,6 +302,7 @@ pub(crate) mod test {
(
TaskDump {
uid: 0,
batch_uid: Some(0),
index_uid: Some(S("doggo")),
status: Status::Succeeded,
kind: KindDump::DocumentImport {
@ -320,6 +326,7 @@ pub(crate) mod test {
(
TaskDump {
uid: 1,
batch_uid: None,
index_uid: Some(S("doggo")),
status: Status::Enqueued,
kind: KindDump::DocumentImport {
@ -346,6 +353,7 @@ pub(crate) mod test {
(
TaskDump {
uid: 5,
batch_uid: None,
index_uid: Some(S("catto")),
status: Status::Enqueued,
kind: KindDump::IndexDeletion,


@ -70,6 +70,7 @@ impl CompatV5ToV6 {
let task = v6::Task {
uid: task_view.uid,
batch_uid: None,
index_uid: task_view.index_uid,
status: match task_view.status {
v5::Status::Enqueued => v6::Status::Enqueued,
@ -449,7 +450,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"41f91d3a94911b2735ec41b07540df5c");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"4b03e23e740b27bfb9d2a1faffe512e2");
assert_eq!(update_files.len(), 22);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_some()); // the enqueued document addition


@ -222,7 +222,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"278f63325ef06ca04d01df98d8207b94");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"2b8a72d6bc6ba79980491966437daaf9");
assert_eq!(update_files.len(), 10);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_none());
@ -345,7 +345,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"d45cd8571703e58ae53c7bd7ce3f5c22");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"3ddf6169b0a3703c5d770971f036fc5d");
assert_eq!(update_files.len(), 2);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_none()); // the processed document addition
@ -391,7 +391,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"41f91d3a94911b2735ec41b07540df5c");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"4b03e23e740b27bfb9d2a1faffe512e2");
assert_eq!(update_files.len(), 22);
assert!(update_files[0].is_none()); // the dump creation
assert!(update_files[1].is_some()); // the enqueued document addition
@ -471,7 +471,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"c2445ddd1785528b80f2ba534d3bd00c");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"c1b06a5ca60d5805483c16c5b3ff61ef");
assert_eq!(update_files.len(), 10);
assert!(update_files[0].is_some()); // the enqueued document addition
assert!(update_files[1..].iter().all(|u| u.is_none())); // everything already processed
@ -548,7 +548,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"cd12efd308fe3ed226356a727ab42ed3");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"0e203b6095f7c68dbdf788321dcc8215");
assert_eq!(update_files.len(), 10);
assert!(update_files[0].is_some()); // the enqueued document addition
assert!(update_files[1..].iter().all(|u| u.is_none())); // everything already processed
@ -641,7 +641,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"bc616290adfe7d09a624cf6065ca9069");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"d216c7f90f538ffbb2a059531d7ac89a");
assert_eq!(update_files.len(), 9);
assert!(update_files[0].is_some()); // the enqueued document addition
assert!(update_files[1..].iter().all(|u| u.is_none())); // everything already processed
@ -734,7 +734,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"2db37756d8af1fb7623436b76e8956a6");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"e27999f1112632222cb84f6cffff7c5f");
assert_eq!(update_files.len(), 8);
assert!(update_files[0..].iter().all(|u| u.is_none())); // everything already processed
@ -810,7 +810,7 @@ pub(crate) mod test {
// tasks
let tasks = dump.tasks().unwrap().collect::<Result<Vec<_>>>().unwrap();
let (tasks, update_files): (Vec<_>, Vec<_>) = tasks.into_iter().unzip();
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"8df6eab075a44b3c1af6b726f9fd9a43");
meili_snap::snapshot_hash!(meili_snap::json_string!(tasks), @"0155a664b0cf62aae23db5138b6b03d7");
assert_eq!(update_files.len(), 9);
assert!(update_files[..].iter().all(|u| u.is_none())); // no update file in dump v1

Some files were not shown because too many files have changed in this diff.