Compare commits

...

74 Commits

Author SHA1 Message Date
44cb7f68f9 Merge #878
878: Bump meilisearch v0.13.0 r=MarinPostma a=MarinPostma



Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-22 09:18:56 +00:00
25dc2ad66f update changelog 2020-07-22 10:56:19 +02:00
624bd56459 bump meilisearch version 2020-07-22 10:56:19 +02:00
7a6615cfa7 Merge #785
785: Adding a tracking issue template r=MarinPostma a=qdequele



Co-authored-by: Quentin de Quelen <quentin@dequelen.me>
2020-07-22 08:49:27 +00:00
bcad3ffd7c Merge #873
873: Update CI for new workflow r=MarinPostma a=MarinPostma

This PR implements the necessary automation for our new release workflow.

## Pre-releases

Whenever something is pushed to a `release-v*` branch, tests are triggered. If all tests pass, the current reference is checked to see whether it is a release branch; if so, a pre-release is created for this branch and assets are automatically generated for it. The pre-release is tagged `vx.x.xrcn`, where `x.x.x` is the version extracted from the branch name and `n` is the number of commits since the branch was forked from master (starting from rc0).
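
A minimal sketch of the tag derivation, assuming the `release-vX.Y.Z` branch naming above and a fetched `origin/master` (the authoritative logic is in the workflow file later in this diff):

```bash
# Derive the pre-release tag vX.Y.ZrcN from a release branch name.
branch="release-v0.13.0"                          # example branch name
version="${branch#release-}"                      # -> v0.13.0
count=$(git rev-list "origin/master..origin/${branch}" --count)
echo "${version}rc${count}"                       # -> v0.13.0rc0, rc1, ...
```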

## Releases

Whenever something is pushed to stable and tagged `vx.x.x`, where `x.x.x` is the version, tests are run, a release containing the assets is generated, and binaries are published to Docker, Homebrew, APT, etc.
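
For illustration, a hedged sketch of cutting a release under this workflow (branch and tag names assumed from the description above):

```bash
# Tagging stable with vX.Y.Z triggers the tests, the release with assets,
# and the binary publication described above.
git checkout stable
git tag v0.13.0
git push origin v0.13.0
```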

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-22 08:24:24 +00:00
98d87fa1ff Merge #868
868: Update error.rs r=MarinPostma a=tpayet



Co-authored-by: Thomas Payet <thomas@meilisearch.com>
2020-07-21 16:54:56 +00:00
7e00bf4bfa update ci to new workflow 2020-07-21 16:52:01 +02:00
c39b358518 Update error.rs 2020-07-20 14:42:47 +02:00
982fb7b786 Merge #858
858: update error url r=LegendreM a=MarinPostma

@bidoubiwa 

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-16 14:55:52 +00:00
7dc628965c Merge #846
846: Change settings behavior r=LegendreM a=MarinPostma

Partially implements #824.

Returning the field distribution for all known fields is more complicated than anticipated; see https://github.com/meilisearch/MeiliSearch/issues/824#issuecomment-657656561

If we decide to do it anyway and find a reasonable solution, I will make another PR.

Fixes #853 by resetting displayed and searchable attributes to the wildcard when attributes are set to `[]` in the all-settings route. @curquiza @bidoubiwa, can you confirm that this is the expected behavior?
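
A hedged sketch of the behavior being confirmed, assuming a local instance on the default port and an index named `movies` (both hypothetical):

```bash
# Setting the attributes to [] on the all-settings route should reset
# displayed and searchable attributes to the wildcard (*).
curl -X POST 'http://localhost:7700/indexes/movies/settings' \
  --data '{ "searchableAttributes": [], "displayedAttributes": [] }'
```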

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-16 14:31:06 +00:00
d114250ebb requested changes 2020-07-16 16:19:15 +02:00
8eec3bcdc2 update error url 2020-07-16 15:14:53 +02:00
0583cd8e5d Merge pull request #810 from MarinPostma/remove-sys-info
remove the sys-info routes
2020-07-15 20:24:18 +02:00
83b6fc48e1 remove the sys-info routes 2020-07-15 19:33:29 +02:00
4b5437a882 fix displayed attrs empty array bug 2020-07-15 19:25:24 +02:00
de4caef468 test reset attributes to wildcard 2020-07-15 18:56:19 +02:00
36b763b84e test setting attributes before adding documents 2020-07-15 18:56:19 +02:00
c06dd35af1 fix tests 2020-07-15 18:56:19 +02:00
51b7cb2722 remove accept new fields / add indexed * 2020-07-15 18:56:19 +02:00
7f5fb50307 add displayed attributes wildcard 2020-07-15 18:56:19 +02:00
4262561596 Merge #819
819: run clippy during tests r=MarinPostma a=MarinPostma



Co-authored-by: marin <postma.marin@protonmail.com>
Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-15 08:07:42 +00:00
8471796987 add clippy component 2020-07-13 18:53:19 +02:00
2775aeb6ac Merge #794
794: Check database version mismatch r=MarinPostma a=MarinPostma

Checks if the versions of the database and the engine are compatible.

The database and the engine are compatible if they share the same major and minor version.

The engine will refuse to start if there is a mismatch.

@bidoubiwa do we need to document this?
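
The compatibility rule, restated as a shell sketch; the authoritative check is the Rust `version_guard` function later in this diff, and `MEILI_DB_PATH` is a placeholder:

```bash
# Database and engine are compatible iff their major.minor versions match.
db_version=$(cat "$MEILI_DB_PATH/VERSION")   # e.g. 0.13.0
engine_version=0.13.2                        # hypothetical engine version
if [ "${db_version%.*}" != "${engine_version%.*}" ]; then
  echo "version mismatch: database $db_version vs engine $engine_version" >&2
  exit 1
fi
```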

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-13 15:08:33 +00:00
a747e79e5d run clippy during tests 2020-07-13 16:15:32 +02:00
5773c5c865 check version file against regex 2020-07-13 16:06:28 +02:00
51d7c84e73 better exit on error
Update meilisearch-core/src/database.rs

Co-authored-by: Clément Renault <renault.cle@gmail.com>

Update meilisearch-core/src/database.rs

Co-authored-by: Clément Renault <renault.cle@gmail.com>
2020-07-13 16:06:28 +02:00
6f0b6933e6 update changelog 2020-07-13 16:05:56 +02:00
f5a936614a error on meili database version mismatch 2020-07-13 16:05:08 +02:00
308630c094 Merge #841
841: Unique docid bugfix r=LegendreM a=MarinPostma

fix #827 

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-13 13:36:32 +00:00
f54397e0cf test unique document id bug 2020-07-13 15:14:07 +02:00
754efe1f42 fix document id uniqueness bug 2020-07-13 15:14:07 +02:00
05c30c879f Merge #791
791: Create tests for error codes r=LegendreM a=MarinPostma

- create tests for error codes
- fix the primary key error that returned an internal error instead of the correct one
- add bits of documentation for errors
- change a bunch of error types for better accuracy; @curquiza, @eskombro, @bidoubiwa, you may want to take a look at `meilisearch-error/src/lib.rs`
- fix #836

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-13 13:12:21 +00:00
99e8d4adae fix missing primary key 2020-07-13 14:54:25 +02:00
ac63f1cd7a fix typo in error code 2020-07-13 14:54:25 +02:00
169749396b update error types to be more accurate 2020-07-13 14:54:25 +02:00
a0637c2c6d Merge #842
842: bors setup r=LegendreM a=MarinPostma

Set up bors to run the tests and merge automatically.

The tests are now run only on the staging and trying branches.

You can use `bors r+` to test and merge the branch into master if the tests succeed,

or

you can use `bors try` to just run the tests on the trying branch (synced with master).

Co-authored-by: mpostma <postma.marin@protonmail.com>
2020-07-10 13:27:21 +00:00
edbba64711 fix bors.yaml 2020-07-08 21:04:07 +02:00
9ba711dfe5 update readme with bors badge 2020-07-08 14:33:15 +02:00
6bce83dde8 set bors timeout 2020-07-08 13:36:33 +02:00
629a658c75 bors setup 2020-07-08 09:50:07 +02:00
2f6c55ef78 Merge pull request #771 from MarinPostma/placeholder-search
Placeholder search
2020-07-03 18:56:55 +02:00
a6457718f2 update changelog 2020-07-03 17:17:28 +02:00
3bf23a7c59 test placeholder search
move search test macro to common module
2020-07-03 17:17:28 +02:00
bbe3a10107 implement placeholder search 2020-07-03 17:17:28 +02:00
37ee0f36c1 Merge pull request #792 from MarinPostma/error-codes-in-updates
Error codes in updates
2020-07-02 16:17:57 +02:00
e92f544fd1 add test for update errors 2020-07-02 15:18:30 +02:00
d7b49fa671 fix potential infinite loop 2020-07-02 15:18:30 +02:00
41707e3245 fix error on missing document id in document 2020-07-02 15:18:30 +02:00
3c51e9f5ed Enable error code reporting for update errors 2020-07-02 15:18:30 +02:00
7d3e937134 add tests for error codes 2020-07-02 15:18:30 +02:00
6445eea946 update error types to be more accurate 2020-07-02 15:18:28 +02:00
ced6cc0e23 fix bad error report when primary key exists 2020-07-02 15:16:48 +02:00
944a3943e5 Merge pull request #820 from MarinPostma/readme-update
update readme
2020-07-02 15:16:37 +02:00
d419f151a0 update readme 2020-07-02 15:14:05 +02:00
b2124822a3 Merge pull request #825 from Rio/log-analytics-usage
feat(analytics): log if analytics are enabled
2020-07-02 15:02:19 +02:00
f60b912f12 feat(analytics): log if analytics are enabled 2020-07-02 14:33:25 +02:00
e1f956ce18 Merge pull request #821 from aeriksson/patch-1
Fix typo in option.rs
2020-07-02 14:05:00 +02:00
ab16e2eff1 fix merge error 2020-07-02 14:04:15 +02:00
3da607749f Merge branch 'master' into patch-1 2020-07-02 13:57:52 +02:00
a626e5e935 Merge pull request #737 from balajisivaraman/wip_655
Improve test suite performance using Test Dataset
2020-07-02 13:51:38 +02:00
3d73a4895e cleanup movies dataset and related functions 2020-07-02 16:52:39 +05:30
979b01a1c0 update index status test to use the test dataset 2020-07-02 16:52:39 +05:30
38cf489acf update remaining search tests to use the test dataset 2020-07-02 16:52:39 +05:30
60264763f4 update search_settings tests to use the test dataset 2020-07-02 16:52:39 +05:30
d55124e524 update settings_ranking_rules tests to use the test dataset 2020-07-02 16:52:39 +05:30
643933c3b0 update settings tests to use the test dataset 2020-07-02 16:52:39 +05:30
44fd9384bd update stop_words tests to use the test dataset 2020-07-02 16:52:39 +05:30
75d0d2df6c update documents_delete tests to use the test dataset 2020-07-02 16:52:39 +05:30
92d9283d1a Merge pull request #823 from Rio/public-health-endpoint
chore(http): do not require auth on /health endpoint
2020-07-01 17:01:23 +02:00
9b46887f75 chore(http): do not require auth on /health endpoint
This makes it easier to determine the health of the server over HTTP.

closes #822
2020-07-01 16:33:01 +02:00
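
A quick check, assuming a locally running instance on the default port:

```bash
# /health no longer requires the API key, so a bare request suffices.
curl -i http://localhost:7700/health
```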
ad267cbe59 Merge pull request #813 from Rio/remove-hardcoded-sentry-dsn
feat(sentry): make sentry dsn customizable
2020-07-01 16:15:21 +02:00
029772e11f Fix typo in option.rs 2020-07-01 13:45:00 +02:00
2ef888d100 chore(sentry): make sentry dsn customizable
By removing the hardcoded value, the Sentry client falls back to pulling
it from the SENTRY_DSN environment variable. The hardcoded value has been
moved to the default value of the command-line option, so the default
behavior stays the same.

`--no-sentry` and `MEILI_NO_SENTRY` options have also been introduced
that effectively disable Sentry reporting.
2020-07-01 12:55:14 +02:00
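
Hedged usage examples of the options described above (flag and variable spellings taken from the commit message; the DSN value is a placeholder):

```bash
# Point the Sentry client at your own project instead of the default DSN.
SENTRY_DSN="https://public@sentry.example.com/1" ./meilisearch

# Disable Sentry reporting entirely, via flag or environment variable.
./meilisearch --no-sentry
MEILI_NO_SENTRY=true ./meilisearch
```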
ce7a9073e1 Adding a tracking issue template 2020-06-18 11:09:00 +02:00
52 changed files with 2938 additions and 102611 deletions

View File

@ -0,0 +1,40 @@
---
name: Tracking issue
about: Template for a tracking issue
title: ''
labels: tracking-issue
assignees: ''
---
# Summary
One paragraph to explain the feature.
# Motivations
Why are we doing this? What use cases does it support? What is the expected outcome?
# Explanation
Explain the proposal like it was the final documentation of this proposal.
- What is changing for end-users.
- How it works.
- What is breaking?
- Examples.
# Implementation
Explain the technical specificities that will need to be known or done in order to implement this proposal.
## Steps
Describe each step to create the feature with its associated issue/PR.
# Related
- [ ] Validated by the team (@people needed)
- [ ] Test added
- [ ] [Documentation](https://github.com/meilisearch/documentation/issues/#xxx) //Change xxx or remove the line
- [ ] [SDK/Integrations](https://github.com/meilisearch/integration-guides/issues/#xxx) //Change xxx or remove the line

View File

@ -1,9 +1,8 @@
name: Publish binaries to GitHub release
on:
push:
tags:
- '*'
release:
types: [published]
name: Publish binaries to release
jobs:
publish:

View File

@ -2,7 +2,7 @@ name: Publish deb pkg to GitHub release & APT repository & Homebrew
on:
release:
types: [published]
types: [released]
jobs:
debian:

View File

@ -1,7 +1,7 @@
---
on:
release:
types: [published]
types: [released]
name: Publish latest image to Docker Hub

View File

@ -1,5 +1,12 @@
---
on: [pull_request]
on:
push:
branches:
- release-v*
- trying
- staging
tags:
- 'v[0-9]+.[0-9]+.[0-9]+' # this only concerns tags on stable
name: Test binaries with cargo test
@ -10,7 +17,6 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest, macos-latest]
steps:
- uses: actions/checkout@v1
- uses: actions-rs/toolchain@v1
@ -18,11 +24,17 @@ jobs:
profile: minimal
toolchain: stable
override: true
components: clippy
- name: Run cargo test
uses: actions-rs/cargo@v1
with:
command: test
args: --locked --release
- name: Run cargo clippy
uses: actions-rs/cargo@v1
with:
command: clippy
build-image:
name: Test the build of Docker image
runs-on: ubuntu-latest
@ -30,3 +42,52 @@ jobs:
- uses: actions/checkout@v1
- run: docker build . --file Dockerfile -t meilisearch
name: Docker build
## When a push occurs on a release branch, a prerelease is created and assets are generated
prerelease:
name: create prerelease
needs: [check, build-image]
if: ${{ contains(github.ref, 'release-') && github.event_name == 'push' }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Get version number
id: version-number
run: echo "##[set-output name=number;]$(echo ${{ github.ref }} | sed 's/.*\(v.*\)/\1/')"
- name: Get commit count
id: commit-count
run: echo "##[set-output name=count;]$(git rev-list remotes/origin/master..remotes/origin/release-${{ steps.version-number.outputs.number }} --count)"
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.PUBLISH_TOKEN }} # Personal Access Token
with:
tag_name: ${{ steps.version-number.outputs.number }}rc${{ steps.commit-count.outputs.count }}
release_name: Pre-release ${{ steps.version-number.outputs.number }}-rc${{ steps.commit-count.outputs.count }}
prerelease: true
## If a tag is pushed, a release is created for this tag, and assets will be generated
release:
name: create release
needs: [check, build-image]
if: ${{ contains(github.ref, 'tags/v') }}
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v2
- name: Get version number
id: version-number
run: echo "##[set-output name=number;]$(echo ${{ github.ref }} | sed 's/.*\(v.*\)/\1/')"
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.PUBLISH_TOKEN }} # PAT
with:
tag_name: ${{ steps.version-number.outputs.number }}
release_name: Meilisearch ${{ steps.version-number.outputs.number }}
prerelease: false

View File

@ -1,6 +1,19 @@
## v0.13.0
- placeholder search (#771)
- Add database version mismatch check (#794)
- Displayed and searchable attributes wildcard (#846)
- Remove sys-info route (#810)
- Fix facet distribution case (#797)
- Check database version mismatch (#794)
- Fix unique docid bug (#841)
- Error codes in updates (#792)
- Sentry disable argument (#813)
- Log analytics if enabled (#825)
## v0.12.0
- Fix long documents not being indexed completely bug (#816)
- Fix distinct attribute returning id instead of name (#800)
- error code rename (#805)

Cargo.lock generated
View File

@ -804,12 +804,6 @@ dependencies = [
"generic-array",
]
[[package]]
name = "doc-comment"
version = "0.3.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fea41bba32d969b513997752735605054bc0dfa92b4c56bf1189f2e174be7a10"
[[package]]
name = "dtoa"
version = "0.4.5"
@ -1059,15 +1053,6 @@ dependencies = [
"typenum",
]
[[package]]
name = "getopts"
version = "0.2.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "14dbbfd5c71d70241ecf9e6f13737f7b5ce823821063188d7e46c41d371eebd5"
dependencies = [
"unicode-width",
]
[[package]]
name = "getrandom"
version = "0.1.14"
@ -1496,7 +1481,7 @@ checksum = "60302e4db3a61da70c0cb7991976248362f30319e88850c487b9b95bbf059e00"
[[package]]
name = "meilisearch-core"
version = "0.12.0"
version = "0.13.0"
dependencies = [
"arc-swap",
"assert_matches",
@ -1543,14 +1528,14 @@ dependencies = [
[[package]]
name = "meilisearch-error"
version = "0.12.0"
version = "0.13.0"
dependencies = [
"actix-http",
]
[[package]]
name = "meilisearch-http"
version = "0.12.0"
version = "0.13.0"
dependencies = [
"actix-cors",
"actix-http",
@ -1574,7 +1559,6 @@ dependencies = [
"meilisearch-schema",
"meilisearch-tokenizer",
"mime",
"pretty-bytes",
"rand 0.7.3",
"regex",
"rustls 0.16.0",
@ -1587,7 +1571,6 @@ dependencies = [
"siphasher",
"slice-group-by",
"structopt",
"sysinfo",
"tempdir",
"tokio",
"ureq",
@ -1598,7 +1581,7 @@ dependencies = [
[[package]]
name = "meilisearch-schema"
version = "0.12.0"
version = "0.13.0"
dependencies = [
"indexmap",
"meilisearch-error",
@ -1609,7 +1592,7 @@ dependencies = [
[[package]]
name = "meilisearch-tokenizer"
version = "0.12.0"
version = "0.13.0"
dependencies = [
"deunicode",
"slice-group-by",
@ -1617,7 +1600,7 @@ dependencies = [
[[package]]
name = "meilisearch-types"
version = "0.12.0"
version = "0.13.0"
dependencies = [
"serde",
"zerocopy",
@ -1729,15 +1712,6 @@ dependencies = [
"void",
]
[[package]]
name = "ntapi"
version = "0.3.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a31937dea023539c72ddae0e3571deadc1414b300483fa7aaec176168cfa9d2"
dependencies = [
"winapi 0.3.8",
]
[[package]]
name = "num-integer"
version = "0.1.42"
@ -1954,16 +1928,6 @@ version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "237a5ed80e274dbc66f86bd59c1e25edc039660be53194b5fe0a482e0f2612ea"
[[package]]
name = "pretty-bytes"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "009d6edd2c1dbf2e1c0cd48a2f7766e03498d49ada7109a01c6911815c685316"
dependencies = [
"atty",
"getopts",
]
[[package]]
name = "proc-macro-error"
version = "1.0.2"
@ -2621,21 +2585,6 @@ dependencies = [
"unicode-xid",
]
[[package]]
name = "sysinfo"
version = "0.14.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5b796215da5a4b2a1a5db53ee55866c13b74a89acd259ab762eb10e28e937cb5"
dependencies = [
"cfg-if",
"doc-comment",
"libc",
"ntapi",
"once_cell",
"rayon",
"winapi 0.3.8",
]
[[package]]
name = "tempdir"
version = "0.3.7"

View File

@ -20,6 +20,7 @@
<a href="https://github.com/meilisearch/MeiliSearch/blob/master/LICENSE"><img src="https://img.shields.io/badge/license-MIT-informational" alt="License"></a>
<a href="https://slack.meilisearch.com"><img src="https://img.shields.io/badge/slack-MeiliSearch-blue.svg?logo=slack" alt="Slack"></a>
<a href="https://github.com/meilisearch/MeiliSearch/discussions" alt="Discussions"><img src="https://img.shields.io/badge/github-discussions-red" /></a>
<a href="https://app.bors.tech/repositories/26457"><img src="https://bors.tech/images/badge_small.svg" alt="Bors enabled"></a>
</p>
<p align="center">⚡ Lightning Fast, Ultra Relevant, and Typo-Tolerant Search Engine 🔍</p>
@ -43,6 +44,7 @@ For more information about features go to [our documentation](https://docs.meili
* Whole documents are returned
* Highly customizable
* RESTful API
* Faceted search and filtering
## Get started
@ -92,6 +94,8 @@ cd MeiliSearch
In the cloned repository, compile MeiliSearch.
```bash
rustup override set stable
rustup update stable
cargo run --release
```

bors.toml Normal file
View File

@ -0,0 +1,3 @@
status = ["Test on macos-latest", "Test on ubuntu-latest"]
# 4 hours timeout
timeout-sec = 14400

View File

@ -1,6 +1,6 @@
[package]
name = "meilisearch-core"
version = "0.12.0"
version = "0.13.0"
license = "MIT"
authors = ["Kerollmops <clement@meilisearch.com>"]
edition = "2018"
@ -24,10 +24,10 @@ intervaltree = "0.2.5"
itertools = "0.9.0"
levenshtein_automata = { version = "0.2.0", features = ["fst_automaton"] }
log = "0.4.8"
meilisearch-error = { path = "../meilisearch-error", version = "0.12.0" }
meilisearch-schema = { path = "../meilisearch-schema", version = "0.12.0" }
meilisearch-tokenizer = { path = "../meilisearch-tokenizer", version = "0.12.0" }
meilisearch-types = { path = "../meilisearch-types", version = "0.12.0" }
meilisearch-error = { path = "../meilisearch-error", version = "0.13.0" }
meilisearch-schema = { path = "../meilisearch-schema", version = "0.13.0" }
meilisearch-tokenizer = { path = "../meilisearch-tokenizer", version = "0.13.0" }
meilisearch-types = { path = "../meilisearch-types", version = "0.13.0" }
once_cell = "1.3.1"
ordered-float = { version = "1.0.2", features = ["serde"] }
pest = { git = "https://github.com/MarinPostma/pest.git", tag = "meilisearch-patch1" }

View File

@ -368,7 +368,7 @@ fn search_command(command: SearchCommand, database: Database) -> Result<(), Box<
});
}
let result = builder.query(ref_reader, &query, 0..command.number_results)?;
let result = builder.query(ref_reader, Some(&query), 0..command.number_results)?;
let mut retrieve_duration = Duration::default();

View File

@ -10,16 +10,16 @@ use std::fmt;
use compact_arena::{SmallArena, Idx32, mk_arena};
use log::debug;
use meilisearch_types::DocIndex;
use sdset::{Set, SetBuf, exponential_search, SetOperation, Counter, duo::OpBuilder};
use slice_group_by::{GroupBy, GroupByMut};
use crate::error::Error;
use meilisearch_types::DocIndex;
use crate::criterion::{Criteria, Context, ContextMut};
use crate::distinct_map::{BufferedDistinctMap, DistinctMap};
use crate::raw_document::RawDocument;
use crate::{database::MainT, reordered_attrs::ReorderedAttrs};
use crate::{Document, DocumentId, MResult, Index};
use crate::{store, Document, DocumentId, MResult, Index, RankedMap, MainReader, Error};
use crate::query_tree::{create_query_tree, traverse_query_tree};
use crate::query_tree::{Operation, QueryResult, QueryKind, QueryId, PostingsKey};
use crate::query_tree::Context as QTContext;
@ -588,8 +588,55 @@ impl Deref for PostingsListView<'_> {
}
}
/// Sorts document ids according to user-defined ranking rules.
pub fn placeholder_document_sort(
document_ids: &mut [DocumentId],
index: &store::Index,
reader: &MainReader,
ranked_map: &RankedMap
) -> MResult<()> {
use crate::settings::RankingRule;
use std::cmp::Ordering;
enum SortOrder {
Asc,
Desc,
}
if let Some(ranking_rules) = index.main.ranking_rules(reader)? {
let schema = index.main.schema(reader)?
.ok_or(Error::SchemaMissing)?;
// Select custom rules from ranking rules, and map them to custom rules
// containing a field_id
let ranking_rules = ranking_rules.iter().filter_map(|r|
match r {
RankingRule::Asc(name) => schema.id(name).map(|f| (f, SortOrder::Asc)),
RankingRule::Desc(name) => schema.id(name).map(|f| (f, SortOrder::Desc)),
_ => None,
}).collect::<Vec<_>>();
document_ids.sort_unstable_by(|a, b| {
for (field_id, order) in &ranking_rules {
let a_value = ranked_map.get(*a, *field_id);
let b_value = ranked_map.get(*b, *field_id);
let (a, b) = match order {
SortOrder::Asc => (a_value, b_value),
SortOrder::Desc => (b_value, a_value),
};
match a.cmp(&b) {
Ordering::Equal => continue,
ordering => return ordering,
}
}
Ordering::Equal
});
}
Ok(())
}
/// For each entry in facet_docids, calculates the number of documents in the intersection with candidate_docids.
fn facet_count(
pub fn facet_count(
facet_docids: HashMap<String, HashMap<String, Cow<Set<DocumentId>>>>,
candidate_docids: &Set<DocumentId>,
) -> HashMap<String, HashMap<String, usize>> {

View File

@ -3,13 +3,15 @@ use std::fs::File;
use std::path::Path;
use std::sync::{Arc, RwLock};
use std::{fs, thread};
use std::io::{Read, Write, ErrorKind};
use chrono::{DateTime, Utc};
use crossbeam_channel::{Receiver, Sender};
use heed::types::{Str, Unit, SerdeBincode};
use heed::CompactionOption;
use heed::types::{Str, Unit, SerdeBincode};
use log::{debug, error};
use meilisearch_schema::Schema;
use regex::Regex;
use crate::{store, update, Index, MResult, Error};
@ -161,11 +163,69 @@ fn update_awaiter(
Ok(())
}
/// Ensures the Meilisearch version is compatible with the database, returning an error on version mismatch.
/// If create is set to true, a VERSION file is created with the current version.
fn version_guard(path: &Path, create: bool) -> MResult<()> {
let current_version_major = env!("CARGO_PKG_VERSION_MAJOR");
let current_version_minor = env!("CARGO_PKG_VERSION_MINOR");
let current_version_patch = env!("CARGO_PKG_VERSION_PATCH");
let version_path = path.join("VERSION");
match File::open(&version_path) {
Ok(mut file) => {
let mut version = String::new();
file.read_to_string(&mut version)?;
// Matches strings like XX.XX.XX
let re = Regex::new(r"(\d+).(\d+).(\d+)").unwrap();
// Make sure there is a result
let version = re
.captures_iter(&version)
.next()
.ok_or(Error::VersionMismatch("bad VERSION file".to_string()))?;
// the first is always the complete match, safe to unwrap because we have a match
let version_major = version.get(1).unwrap().as_str();
let version_minor = version.get(2).unwrap().as_str();
if version_major != current_version_major || version_minor != current_version_minor {
return Err(Error::VersionMismatch(format!("{}.{}.XX", version_major, version_minor)));
}
}
Err(error) => {
match error.kind() {
ErrorKind::NotFound => {
if create {
// when no version file is found, and we've been told to create one,
// create a new file with the current version in it.
let mut version_file = File::create(&version_path)?;
version_file.write_all(format!("{}.{}.{}",
current_version_major,
current_version_minor,
current_version_patch).as_bytes())?;
} else {
// when no version file is found and we were not told to create one, this
// means that the database predates the version in which this check was added.
return Err(Error::VersionMismatch(format!("<0.12.0")));
}
}
_ => return Err(error.into())
}
}
}
Ok(())
}
impl Database {
pub fn open_or_create(path: impl AsRef<Path>, options: DatabaseOptions) -> MResult<Database> {
let main_path = path.as_ref().join("main");
let update_path = path.as_ref().join("update");
// create db directory
fs::create_dir_all(&path)?;
// create file only if main db wasn't created before (first run)
version_guard(path.as_ref(), !main_path.exists() && !update_path.exists())?;
fs::create_dir_all(&main_path)?;
let env = heed::EnvOpenOptions::new()
.map_size(options.main_map_size)
@ -814,7 +874,7 @@ mod tests {
// even try to search for a document
let reader = db.main_read_txn().unwrap();
let SortResult {documents, .. } = index.query_builder().query(&reader, "21 ", 0..20).unwrap();
let SortResult {documents, .. } = index.query_builder().query(&reader, Some("21 "), 0..20).unwrap();
assert_matches!(documents.len(), 1);
reader.abort().unwrap();
@ -1212,7 +1272,7 @@ mod tests {
let builder = index.query_builder_with_criteria(criteria);
let SortResult {documents, .. } = builder.query(&reader, "Kevin", 0..20).unwrap();
let SortResult {documents, .. } = builder.query(&reader, Some("Kevin"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(

View File

@ -15,22 +15,23 @@ pub type MResult<T> = Result<T, Error>;
#[derive(Debug)]
pub enum Error {
Io(io::Error),
IndexAlreadyExists,
MissingPrimaryKey,
SchemaMissing,
WordIndexMissing,
MissingDocumentId,
MaxFieldsLimitExceeded,
Schema(meilisearch_schema::Error),
Heed(heed::Error),
Fst(fst::Error),
SerdeJson(SerdeJsonError),
Bincode(bincode::Error),
Serializer(SerializerError),
Deserializer(DeserializerError),
FilterParseError(PestError<Rule>),
FacetError(FacetError),
FilterParseError(PestError<Rule>),
Fst(fst::Error),
Heed(heed::Error),
IndexAlreadyExists,
Io(io::Error),
MaxFieldsLimitExceeded,
MissingDocumentId,
MissingPrimaryKey,
Schema(meilisearch_schema::Error),
SchemaMissing,
SerdeJson(SerdeJsonError),
Serializer(SerializerError),
VersionMismatch(String),
WordIndexMissing,
}
impl ErrorCode for Error {
@ -41,7 +42,7 @@ impl ErrorCode for Error {
FacetError(_) => Code::Facet,
FilterParseError(_) => Code::Filter,
IndexAlreadyExists => Code::IndexAlreadyExists,
MissingPrimaryKey => Code::InvalidState,
MissingPrimaryKey => Code::MissingPrimaryKey,
MissingDocumentId => Code::MissingDocumentId,
MaxFieldsLimitExceeded => Code::MaxFieldsLimitExceeded,
Schema(s) => s.error_code(),
@ -53,6 +54,7 @@ impl ErrorCode for Error {
| Bincode(_)
| Serializer(_)
| Deserializer(_)
| VersionMismatch(_)
| Io(_) => Code::Internal,
}
}
@ -124,7 +126,10 @@ impl From<BincodeError> for Error {
impl From<SerializerError> for Error {
fn from(error: SerializerError) -> Error {
Error::Serializer(error)
match error {
SerializerError::DocumentIdNotFound => Error::MissingDocumentId,
e => Error::Serializer(e),
}
}
}
@ -138,22 +143,27 @@ impl fmt::Display for Error {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
use self::Error::*;
match self {
Io(e) => write!(f, "{}", e),
IndexAlreadyExists => write!(f, "index already exists"),
MissingPrimaryKey => write!(f, "schema cannot be built without a primary key"),
SchemaMissing => write!(f, "this index does not have a schema"),
WordIndexMissing => write!(f, "this index does not have a word index"),
MissingDocumentId => write!(f, "document id is missing"),
MaxFieldsLimitExceeded => write!(f, "maximum number of fields in a document exceeded"),
Schema(e) => write!(f, "schema error; {}", e),
Heed(e) => write!(f, "heed error; {}", e),
Fst(e) => write!(f, "fst error; {}", e),
SerdeJson(e) => write!(f, "serde json error; {}", e),
Bincode(e) => write!(f, "bincode error; {}", e),
Serializer(e) => write!(f, "serializer error; {}", e),
Deserializer(e) => write!(f, "deserializer error; {}", e),
FilterParseError(e) => write!(f, "error parsing filter; {}", e),
FacetError(e) => write!(f, "error processing facet filter: {}", e),
FilterParseError(e) => write!(f, "error parsing filter; {}", e),
Fst(e) => write!(f, "fst error; {}", e),
Heed(e) => write!(f, "heed error; {}", e),
IndexAlreadyExists => write!(f, "index already exists"),
Io(e) => write!(f, "{}", e),
MaxFieldsLimitExceeded => write!(f, "maximum number of fields in a document exceeded"),
MissingDocumentId => write!(f, "document id is missing"),
MissingPrimaryKey => write!(f, "schema cannot be built without a primary key"),
Schema(e) => write!(f, "schema error; {}", e),
SchemaMissing => write!(f, "this index does not have a schema"),
SerdeJson(e) => write!(f, "serde json error; {}", e),
Serializer(e) => write!(f, "serializer error; {}", e),
VersionMismatch(version) => write!(f, "Cannot open database, expected MeiliSearch engine version: {}, current engine version: {}.{}.{}",
version,
env!("CARGO_PKG_VERSION_MAJOR"),
env!("CARGO_PKG_VERSION_MINOR"),
env!("CARGO_PKG_VERSION_PATCH")),
WordIndexMissing => write!(f, "this index does not have a word index"),
}
}
}

View File

@ -1,18 +1,20 @@
use std::borrow::Cow;
use std::collections::HashMap;
use std::ops::{Range, Deref};
use std::ops::{Deref, Range};
use std::time::Duration;
use either::Either;
use sdset::SetOperation;
use sdset::{SetOperation, SetBuf, Set};
use meilisearch_schema::FieldId;
use crate::bucket_sort::{bucket_sort, bucket_sort_with_distinct, SortResult, placeholder_document_sort, facet_count};
use crate::database::MainT;
use crate::bucket_sort::{bucket_sort, bucket_sort_with_distinct, SortResult};
use crate::{criterion::Criteria, DocumentId};
use crate::{reordered_attrs::ReorderedAttrs, store, MResult};
use crate::facets::FacetFilter;
use crate::distinct_map::{DistinctMap, BufferedDistinctMap};
use crate::Document;
use crate::{criterion::Criteria, DocumentId};
use crate::{reordered_attrs::ReorderedAttrs, store, MResult, MainReader};
pub struct QueryBuilder<'c, 'f, 'd, 'i> {
criteria: Criteria<'c>,
@ -27,10 +29,7 @@ pub struct QueryBuilder<'c, 'f, 'd, 'i> {
impl<'c, 'f, 'd, 'i> QueryBuilder<'c, 'f, 'd, 'i> {
pub fn new(index: &'i store::Index) -> Self {
QueryBuilder::with_criteria(
index,
Criteria::default(),
)
QueryBuilder::with_criteria(index, Criteria::default())
}
/// sets facet attributes to filter on
@ -43,10 +42,7 @@ impl<'c, 'f, 'd, 'i> QueryBuilder<'c, 'f, 'd, 'i> {
self.facets = facets;
}
pub fn with_criteria(
index: &'i store::Index,
criteria: Criteria<'c>,
) -> Self {
pub fn with_criteria(index: &'i store::Index, criteria: Criteria<'c>) -> Self {
QueryBuilder {
criteria,
searchable_attrs: None,
@ -82,14 +78,11 @@ impl<'c, 'f, 'd, 'i> QueryBuilder<'c, 'f, 'd, 'i> {
reorders.insert_attribute(attribute);
}
pub fn query(
self,
reader: &heed::RoTxn<MainT>,
query: &str,
range: Range<usize>,
) -> MResult<SortResult> {
let facets_docids = match self.facet_filter {
Some(facets) => {
/// returns the document ids associated with a facet filter by computing the union and
/// intersection of the document sets
fn facets_docids(&self, reader: &MainReader) -> MResult<Option<SetBuf<DocumentId>>> {
let facet_docids = match self.facet_filter {
Some(ref facets) => {
let mut ands = Vec::with_capacity(facets.len());
let mut ors = Vec::new();
for f in facets.deref() {
@ -97,48 +90,50 @@ impl<'c, 'f, 'd, 'i> QueryBuilder<'c, 'f, 'd, 'i> {
Either::Left(keys) => {
ors.reserve(keys.len());
for key in keys {
let docids = self.index.facets.facet_document_ids(reader, &key)?.unwrap_or_default();
let docids = self
.index
.facets
.facet_document_ids(reader, &key)?
.unwrap_or_default();
ors.push(docids);
}
let sets: Vec<_> = ors.iter().map(Cow::deref).collect();
let or_result = sdset::multi::OpBuilder::from_vec(sets).union().into_set_buf();
let or_result = sdset::multi::OpBuilder::from_vec(sets)
.union()
.into_set_buf();
ands.push(Cow::Owned(or_result));
ors.clear();
}
Either::Right(key) =>{
Either::Right(key) => {
match self.index.facets.facet_document_ids(reader, &key)? {
Some(docids) => ands.push(docids),
// no candidates for search, early return.
None => return Ok(SortResult::default()),
None => return Ok(Some(SetBuf::default())),
}
}
};
}
let ands: Vec<_> = ands.iter().map(Cow::deref).collect();
Some(sdset::multi::OpBuilder::from_vec(ands).intersection().into_set_buf())
}
None => None
};
// for each field to retrieve the count for, create a HashMap associating the attribute
// value to a set of matching documents. The HashMaps are then collected in another
// HashMap, associating each HashMap to its field.
let facet_count_docids = match self.facets {
Some(field_ids) => {
let mut facet_count_map = HashMap::new();
for (field_id, field_name) in field_ids {
let mut key_map = HashMap::new();
for pair in self.index.facets.field_document_ids(reader, field_id)? {
let (facet_key, document_ids) = pair?;
let value = facet_key.value();
key_map.insert(value.to_string(), document_ids);
}
facet_count_map.insert(field_name, key_map);
}
Some(facet_count_map)
Some(
sdset::multi::OpBuilder::from_vec(ands)
.intersection()
.into_set_buf(),
)
}
None => None,
};
Ok(facet_docids)
}
fn standard_query(self, reader: &MainReader, query: &str, range: Range<usize>) -> MResult<SortResult> {
let facets_docids = match self.facets_docids(reader)? {
Some(ids) if ids.is_empty() => return Ok(SortResult::default()),
other => other
};
// for each field to retrieve the count for, create a HashMap associating the attribute
// value to a set of matching documents. The HashMaps are then collected in another
// HashMap, associating each HashMap to its field.
let facet_count_docids = self.facet_count_docids(reader)?;
match self.distinct {
Some((distinct, distinct_size)) => bucket_sort_with_distinct(
@ -167,6 +162,117 @@ impl<'c, 'f, 'd, 'i> QueryBuilder<'c, 'f, 'd, 'i> {
),
}
}
fn placeholder_query(self, reader: &heed::RoTxn<MainT>, range: Range<usize>) -> MResult<SortResult> {
match self.facets_docids(reader)? {
Some(docids) => {
// We sort the docids from facets according to the criteria set by the user
let mut sorted_docids = docids.clone().into_vec();
let mut sort_result = match self.index.main.ranked_map(reader)? {
Some(ranked_map) => {
placeholder_document_sort(&mut sorted_docids, self.index, reader, &ranked_map)?;
self.sort_result_from_docids(&sorted_docids, range)
},
// if we can't perform a sort, we return documents unordered
None => self.sort_result_from_docids(&docids, range),
};
if let Some(f) = self.facet_count_docids(reader)? {
sort_result.exhaustive_facets_count = Some(true);
sort_result.facets = Some(facet_count(f, &docids));
}
Ok(sort_result)
},
None => {
match self.index.main.sorted_document_ids_cache(reader)? {
// build result from cached document ids
Some(docids) => {
let mut sort_result = self.sort_result_from_docids(&docids, range);
if let Some(f) = self.facet_count_docids(reader)? {
sort_result.exhaustive_facets_count = Some(true);
// document ids are not sorted in natural order, we need to construct a new set
let document_set = SetBuf::from_dirty(Vec::from(docids));
sort_result.facets = Some(facet_count(f, &document_set));
}
Ok(sort_result)
},
// no document id cached, return empty result
None => Ok(SortResult::default()),
}
}
}
}
fn facet_count_docids<'a>(&self, reader: &'a MainReader) -> MResult<Option<HashMap<String, HashMap<String, Cow<'a, Set<DocumentId>>>>>> {
match self.facets {
Some(ref field_ids) => {
let mut facet_count_map = HashMap::new();
for (field_id, field_name) in field_ids {
let mut key_map = HashMap::new();
for pair in self.index.facets.field_document_ids(reader, *field_id)? {
let (facet_key, document_ids) = pair?;
let value = facet_key.value();
key_map.insert(value.to_string(), document_ids);
}
facet_count_map.insert(field_name.clone(), key_map);
}
Ok(Some(facet_count_map))
}
None => Ok(None),
}
}
fn sort_result_from_docids(&self, docids: &[DocumentId], range: Range<usize>) -> SortResult {
let mut sort_result = SortResult::default();
let mut result = match self.filter {
Some(ref filter) => docids
.iter()
.filter(|item| (filter)(**item))
.skip(range.start)
.take(range.end - range.start)
.map(|&id| Document::from_highlights(id, &[]))
.collect::<Vec<_>>(),
None => docids
.iter()
.skip(range.start)
.take(range.end - range.start)
.map(|&id| Document::from_highlights(id, &[]))
.collect::<Vec<_>>(),
};
// distinct is set, remove duplicates with the distinct function
if let Some((distinct, distinct_size)) = &self.distinct {
let mut distinct_map = DistinctMap::new(*distinct_size);
let mut distinct_map = BufferedDistinctMap::new(&mut distinct_map);
result.retain(|doc| {
let id = doc.id;
let key = (distinct)(id);
match key {
Some(key) => distinct_map.register(key),
None => distinct_map.register_without_key(),
}
});
}
sort_result.documents = result;
sort_result.nb_hits = docids.len();
sort_result
}
pub fn query(
self,
reader: &heed::RoTxn<MainT>,
query: Option<&str>,
range: Range<usize>,
) -> MResult<SortResult> {
match query {
Some(query) => self.standard_query(reader, query, range),
None => self.placeholder_query(reader, range),
}
}
}
#[cfg(test)]
@ -181,12 +287,12 @@ mod tests {
use sdset::SetBuf;
use tempfile::TempDir;
use crate::DocIndex;
use crate::Document;
use crate::automaton::normalize_str;
use crate::bucket_sort::SimpleMatch;
use crate::database::{Database,DatabaseOptions};
use crate::database::{Database, DatabaseOptions};
use crate::store::Index;
use crate::DocIndex;
use crate::Document;
use meilisearch_schema::Schema;
fn set_from_stream<'f, I, S>(stream: I) -> fst::Set<Vec<u8>>
@ -366,7 +472,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "iphone from apple", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("iphone from apple"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -389,7 +495,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "hello", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("hello"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -400,7 +506,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "bonjour", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("bonjour"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -504,7 +610,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "hello", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("hello"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -525,7 +631,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "bonjour", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("bonjour"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -546,7 +652,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "salut", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("salut"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -592,7 +698,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "NY subway", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NY subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -614,7 +720,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult { documents, .. } = builder.query(&reader, "NYC subway", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NYC subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -656,7 +762,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NY", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NY"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(2), matches, .. }) => {
@ -680,7 +786,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "new york", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("new york"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -714,7 +820,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NY subway", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NY subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -731,7 +837,8 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "new york subway", 0..20).unwrap();
let SortResult { documents, .. } =
builder.query(&reader, Some("new york subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -779,7 +886,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NY subway", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NY subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -801,7 +908,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NYC subway", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NYC subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -854,7 +961,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NY subway broken", 0..20).unwrap();
let SortResult {documents, .. } = builder.query(&reader, Some("NY subway broken"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -870,7 +977,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NYC subway", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NYC subway"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -926,8 +1033,8 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder
.query(&reader, "new york underground train broken", 0..20)
let SortResult { documents, .. } = builder
.query(&reader, Some("new york underground train broken"), 0..20)
.unwrap();
let mut iter = documents.into_iter();
@ -956,8 +1063,8 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder
.query(&reader, "new york city underground train broken", 0..20)
let SortResult { documents, .. } = builder
.query(&reader, Some("new york city underground train broken"), 0..20)
.unwrap();
let mut iter = documents.into_iter();
@ -1000,7 +1107,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "new york big ", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("new york big "), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1034,7 +1141,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "NY subway ", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("NY subway "), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1084,8 +1191,8 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder
.query(&reader, "new york city long subway cool ", 0..20)
let SortResult { documents, .. } = builder
.query(&reader, Some("new york city long subway cool "), 0..20)
.unwrap();
let mut iter = documents.into_iter();
@ -1117,7 +1224,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "telephone", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("telephone"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1134,7 +1241,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "téléphone", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("téléphone"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1151,7 +1258,7 @@ mod tests {
assert_matches!(iter.next(), None);
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "télephone", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("télephone"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(1), matches, .. }) => {
@ -1178,7 +1285,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "i phone case", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("i phone case"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1207,7 +1314,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "searchengine", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("searchengine"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1247,7 +1354,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "searchengine", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("searchengine"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {
@ -1279,7 +1386,7 @@ mod tests {
let reader = db.main_read_txn().unwrap();
let builder = store.query_builder();
let SortResult {documents, .. } = builder.query(&reader, "searchengine", 0..20).unwrap();
let SortResult { documents, .. } = builder.query(&reader, Some("searchengine"), 0..20).unwrap();
let mut iter = documents.into_iter();
assert_matches!(iter.next(), Some(Document { id: DocumentId(0), matches, .. }) => {

View File

@ -29,8 +29,6 @@ pub struct Settings {
#[serde(default, deserialize_with = "deserialize_some")]
pub synonyms: Option<Option<BTreeMap<String, Vec<String>>>>,
#[serde(default, deserialize_with = "deserialize_some")]
pub accept_new_fields: Option<Option<bool>>,
#[serde(default, deserialize_with = "deserialize_some")]
pub attributes_for_faceting: Option<Option<Vec<String>>>,
}
@ -60,7 +58,6 @@ impl Settings {
displayed_attributes: settings.displayed_attributes.into(),
stop_words: settings.stop_words.into(),
synonyms: settings.synonyms.into(),
accept_new_fields: settings.accept_new_fields.into(),
attributes_for_faceting: settings.attributes_for_faceting.into(),
})
}
@ -167,7 +164,6 @@ pub struct SettingsUpdate {
pub displayed_attributes: UpdateState<HashSet<String>>,
pub stop_words: UpdateState<BTreeSet<String>>,
pub synonyms: UpdateState<BTreeMap<String, Vec<String>>>,
pub accept_new_fields: UpdateState<bool>,
pub attributes_for_faceting: UpdateState<Vec<String>>,
}
@ -181,7 +177,6 @@ impl Default for SettingsUpdate {
displayed_attributes: UpdateState::Nothing,
stop_words: UpdateState::Nothing,
synonyms: UpdateState::Nothing,
accept_new_fields: UpdateState::Nothing,
attributes_for_faceting: UpdateState::Nothing,
}
}

View File

@ -2,7 +2,7 @@ use std::borrow::Cow;
use std::collections::HashMap;
use chrono::{DateTime, Utc};
use heed::types::{ByteSlice, OwnedType, SerdeBincode, Str};
use heed::types::{ByteSlice, OwnedType, SerdeBincode, Str, CowSlice};
use meilisearch_schema::{FieldId, Schema};
use meilisearch_types::DocumentId;
use sdset::Set;
@ -25,6 +25,7 @@ const NUMBER_OF_DOCUMENTS_KEY: &str = "number-of-documents";
const RANKED_MAP_KEY: &str = "ranked-map";
const RANKING_RULES_KEY: &str = "ranking-rules";
const SCHEMA_KEY: &str = "schema";
const SORTED_DOCUMENT_IDS_CACHE_KEY: &str = "sorted-document-ids-cache";
const STOP_WORDS_KEY: &str = "stop-words";
const SYNONYMS_KEY: &str = "synonyms";
const UPDATED_AT_KEY: &str = "updated-at";
@ -165,6 +166,14 @@ impl Main {
Ok(self.main.put::<_, Str, ByteSlice>(writer, WORDS_KEY, fst.as_fst().as_bytes())?)
}
pub fn put_sorted_document_ids_cache(self, writer: &mut heed::RwTxn<MainT>, documents_ids: &[DocumentId]) -> MResult<()> {
Ok(self.main.put::<_, Str, CowSlice<DocumentId>>(writer, SORTED_DOCUMENT_IDS_CACHE_KEY, documents_ids)?)
}
pub fn sorted_document_ids_cache(self, reader: &heed::RoTxn<MainT>) -> MResult<Option<Cow<[DocumentId]>>> {
Ok(self.main.get::<_, Str, CowSlice<DocumentId>>(reader, SORTED_DOCUMENT_IDS_CACHE_KEY)?)
}
pub fn put_schema(self, writer: &mut heed::RwTxn<MainT>, schema: &Schema) -> MResult<()> {
Ok(self.main.put::<_, Str, SerdeBincode<Schema>>(writer, SCHEMA_KEY, schema)?)
}

View File

@ -171,15 +171,23 @@ pub fn apply_addition<'a, 'b>(
let mut new_internal_docids = Vec::with_capacity(new_documents.len());
for mut document in new_documents {
let external_docids_get = |docid: &str| {
match (external_docids.get(docid), new_external_docids.get(docid)) {
(_, Some(&id))
| (Some(id), _) => Some(id as u32),
(None, None) => None,
}
};
let (internal_docid, external_docid) =
extract_document_id(
&primary_key,
&document,
&external_docids,
&external_docids_get,
&mut available_ids,
)?;
new_external_docids.insert(external_docid, internal_docid.0);
new_external_docids.insert(external_docid, internal_docid.0 as u64);
new_internal_docids.push(internal_docid);
if partial {
@ -217,7 +225,7 @@ pub fn apply_addition<'a, 'b>(
let mut indexer = RawIndexer::new(stop_words);
// For each document in this update
for (document_id, document) in documents_additions {
for (document_id, document) in &documents_additions {
// For each key-value pair in the document.
for (attribute, value) in document {
let field_id = schema.insert_and_index(&attribute)?;
@ -229,7 +237,7 @@ pub fn apply_addition<'a, 'b>(
&mut indexer,
&schema,
field_id,
document_id,
*document_id,
&value,
)?;
}
@ -257,6 +265,10 @@ pub fn apply_addition<'a, 'b>(
index.facets.add(writer, facet_map)?;
}
// update is finished; update sorted document id cache with new state
let mut document_ids = index.main.internal_docids(writer)?.to_vec();
super::cache_document_ids_sorted(writer, &ranked_map, index, &mut document_ids)?;
Ok(())
}
@ -313,8 +325,8 @@ pub fn reindex_all_documents(writer: &mut heed::RwTxn<MainT>, index: &store::Ind
index.facets.add(writer, facet_map)?;
}
// ^-- https://github.com/meilisearch/MeiliSearch/pull/631#issuecomment-626624470 --v
for document_id in documents_ids_to_reindex {
for result in index.documents_fields.document_fields(writer, document_id)? {
for document_id in &documents_ids_to_reindex {
for result in index.documents_fields.document_fields(writer, *document_id)? {
let (field_id, bytes) = result?;
let value: Value = serde_json::from_slice(bytes)?;
ram_store.insert((document_id, field_id), value);
@ -330,7 +342,7 @@ pub fn reindex_all_documents(writer: &mut heed::RwTxn<MainT>, index: &store::Ind
&mut indexer,
&schema,
field_id,
document_id,
*document_id,
&value,
)?;
}
@ -354,6 +366,10 @@ pub fn reindex_all_documents(writer: &mut heed::RwTxn<MainT>, index: &store::Ind
index.facets.add(writer, facet_map)?;
}
// update is finished; update sorted document id cache with new state
let mut document_ids = index.main.internal_docids(writer)?.to_vec();
super::cache_document_ids_sorted(writer, &ranked_map, index, &mut document_ids)?;
Ok(())
}

View File

@ -8,7 +8,7 @@ use crate::database::{UpdateEvent, UpdateEventsEmitter};
use crate::facets;
use crate::store;
use crate::update::{next_update_id, compute_short_prefixes, Update};
use crate::{DocumentId, Error, MResult, RankedMap};
use crate::{DocumentId, Error, MResult, RankedMap, MainWriter, Index};
pub struct DocumentsDeletion {
updates_store: store::Updates,
@ -153,8 +153,8 @@ pub fn apply_documents_deletion(
}
let deleted_documents_len = deleted_documents.len() as u64;
for id in deleted_documents {
index.docs_words.del_doc_words(writer, id)?;
for id in &deleted_documents {
index.docs_words.del_doc_words(writer, *id)?;
}
let removed_words = fst::Set::from_iter(removed_words).unwrap();
@ -180,5 +180,28 @@ pub fn apply_documents_deletion(
compute_short_prefixes(writer, &words, index)?;
// update is finished; update sorted document id cache with new state
document_cache_remove_deleted(writer, index, &ranked_map, &deleted_documents)?;
Ok(())
}
/// rebuilds the document id cache by either removing deleted documents from the existing cache,
/// or generating a new one from the documents in the store
fn document_cache_remove_deleted(writer: &mut MainWriter, index: &Index, ranked_map: &RankedMap, documents_to_delete: &HashSet<DocumentId>) -> MResult<()> {
let new_cache = match index.main.sorted_document_ids_cache(writer)? {
// only keep documents that are not in the list of deleted documents. Order is preserved,
// no need to resort
Some(old_cache) => {
old_cache.iter().filter(|docid| !documents_to_delete.contains(docid)).cloned().collect::<Vec<_>>()
}
// couldn't find cached documents, try building a new cache from documents in store
None => {
let mut document_ids = index.main.internal_docids(writer)?.to_vec();
super::cache_document_ids_sorted(writer, ranked_map, index, &mut document_ids)?;
document_ids
}
};
index.main.put_sorted_document_ids_cache(writer, &new_cache)?;
Ok(())
}

View File

@ -6,7 +6,7 @@ use meilisearch_types::DocumentId;
use ordered_float::OrderedFloat;
use serde_json::Value;
use crate::{Number, FstMapCow};
use crate::Number;
use crate::raw_indexer::RawIndexer;
use crate::serde::SerializerError;
use crate::store::DiscoverIds;
@ -98,15 +98,17 @@ pub fn value_to_number(value: &Value) -> Option<Number> {
/// Validates a string representation of a document id and returns the corresponding id,
/// or generates a new one; this is how we produce document ids.
pub fn discover_document_id(
pub fn discover_document_id<F>(
docid: &str,
external_docids: &FstMapCow,
external_docids_get: F,
available_docids: &mut DiscoverIds<'_>,
) -> Result<DocumentId, SerializerError>
where
F: FnOnce(&str) -> Option<u32>
{
if docid.chars().all(|x| x.is_ascii_alphanumeric() || x == '-' || x == '_') {
match external_docids.get(docid) {
Some(id) => Ok(DocumentId(id as u32)),
match external_docids_get(docid) {
Some(id) => Ok(DocumentId(id)),
None => {
let internal_id = available_docids.next().expect("no more ids available");
Ok(internal_id)
@ -118,12 +120,14 @@ pub fn discover_document_id(
}
/// Extracts and validates the document id of a document.
pub fn extract_document_id(
pub fn extract_document_id<F>(
primary_key: &str,
document: &IndexMap<String, Value>,
external_docids: &FstMapCow,
external_docids_get: F,
available_docids: &mut DiscoverIds<'_>,
) -> Result<(DocumentId, String), SerializerError>
where
F: FnOnce(&str) -> Option<u32>
{
match document.get(primary_key) {
Some(value) => {
@ -132,7 +136,7 @@ pub fn extract_document_id(
Value::String(string) => string.clone(),
_ => return Err(SerializerError::InvalidDocumentIdFormat),
};
discover_document_id(&docid, external_docids, available_docids).map(|id| (id, docid))
discover_document_id(&docid, external_docids_get, available_docids).map(|id| (id, docid))
}
None => Err(SerializerError::DocumentIdNotFound),
}
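The signature change above swaps the concrete `&FstMapCow` for an injected `F: FnOnce(&str) -> Option<u32>`, so the caller decides how external ids are looked up. A simplified, self-contained sketch of the pattern — the `HashMap` and the id allocation below are illustrative stand-ins, not the crate's actual stores:

```rust
use std::collections::HashMap;

// same shape as the new signature: the lookup strategy is a closure
fn discover_document_id<F>(docid: &str, external_docids_get: F, next_id: &mut u32) -> u32
where
    F: FnOnce(&str) -> Option<u32>,
{
    match external_docids_get(docid) {
        // the external id is already known
        Some(id) => id,
        // otherwise allocate a fresh internal id
        None => {
            let id = *next_id;
            *next_id += 1;
            id
        }
    }
}

fn main() {
    let external: HashMap<&str, u32> = [("doc-1", 42)].into_iter().collect();
    let mut next_id = 100;
    // the caller injects its own lookup:
    let id = discover_document_id("doc-1", |k| external.get(k).copied(), &mut next_id);
    assert_eq!(id, 42);
}
```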

View File

@ -24,7 +24,10 @@ use sdset::Set;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use crate::{store, MResult};
use meilisearch_error::ErrorCode;
use meilisearch_types::DocumentId;
use crate::{store, MResult, RankedMap};
use crate::database::{MainT, UpdateT};
use crate::settings::SettingsUpdate;
@ -128,6 +131,12 @@ pub struct ProcessedUpdateResult {
pub update_type: UpdateType,
#[serde(skip_serializing_if = "Option::is_none")]
pub error: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub error_type: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub error_code: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub error_link: Option<String>,
pub duration: f64, // in seconds
pub enqueued_at: DateTime<Utc>,
pub processed_at: DateTime<Utc>,
@ -288,7 +297,10 @@ pub fn update_task<'a, 'b>(
let status = ProcessedUpdateResult {
update_id,
update_type,
error: result.map_err(|e| e.to_string()).err(),
error: result.as_ref().map_err(|e| e.to_string()).err(),
error_code: result.as_ref().map_err(|e| e.error_name()).err(),
error_type: result.as_ref().map_err(|e| e.error_type()).err(),
error_link: result.as_ref().map_err(|e| e.error_url()).err(),
duration: duration.as_secs_f64(),
enqueued_at,
processed_at: Utc::now(),
@ -360,3 +372,13 @@ where A: AsRef<[u8]>,
Ok(())
}
fn cache_document_ids_sorted(
writer: &mut heed::RwTxn<MainT>,
ranked_map: &RankedMap,
index: &store::Index,
document_ids: &mut [DocumentId],
) -> MResult<()> {
crate::bucket_sort::placeholder_document_sort(document_ids, index, writer, ranked_map)?;
index.main.put_sorted_document_ids_cache(writer, &document_ids)
}
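With the three new `error_*` fields on `ProcessedUpdateResult`, a failed update's status payload carries machine-readable error metadata alongside the message. A sketch of the shape such a payload could take, assuming the camelCase field names that the error tests below assert on (the values are illustrative, not captured output):

```rust
use serde_json::json;

fn main() {
    // hypothetical failed-update status, mirroring the struct above
    let status = json!({
        "updateId": 4,
        "status": "failed",
        "error": "document id is missing",
        "errorType": "invalid_request_error",
        "errorCode": "missing_document_id",
        "errorLink": "https://docs.meilisearch.com/errors#missing_document_id",
        "duration": 0.002
    });
    println!("{}", serde_json::to_string_pretty(&status).unwrap());
}
```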

View File

@ -68,19 +68,13 @@ pub fn apply_settings_update(
UpdateState::Nothing => (),
}
match settings.accept_new_fields {
UpdateState::Update(v) => {
schema.set_accept_new_fields(v);
},
UpdateState::Clear => {
schema.set_accept_new_fields(true);
},
UpdateState::Nothing => (),
}
match settings.searchable_attributes.clone() {
UpdateState::Update(v) => {
schema.update_indexed(v)?;
if v.iter().any(|e| e == "*") || v.is_empty() {
schema.set_all_fields_as_indexed();
} else {
schema.update_indexed(v)?;
}
must_reindex = true;
},
UpdateState::Clear => {
@ -90,7 +84,13 @@ pub fn apply_settings_update(
UpdateState::Nothing => (),
}
match settings.displayed_attributes.clone() {
UpdateState::Update(v) => schema.update_displayed(v)?,
UpdateState::Update(v) => {
if v.contains("*") || v.is_empty() {
schema.set_all_fields_as_displayed();
} else {
schema.update_displayed(v)?
}
},
UpdateState::Clear => {
schema.set_all_fields_as_displayed();
},
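Both branches above apply the same rule: a `"*"` entry or an empty list now means "all fields". A tiny sketch of that check in isolation (the helper name is made up; the semantics are inferred from the two diffs above):

```rust
fn means_all_fields(attrs: &[&str]) -> bool {
    // empty list or any wildcard entry selects every field
    attrs.is_empty() || attrs.iter().any(|a| *a == "*")
}

fn main() {
    assert!(means_all_fields(&[]));
    assert!(means_all_fields(&["*"]));
    assert!(means_all_fields(&["title", "*"]));
    assert!(!means_all_fields(&["title"]));
}
```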

View File

@ -1,6 +1,6 @@
[package]
name = "meilisearch-error"
version = "0.12.0"
version = "0.13.0"
authors = ["marin <postma.marin@protonmail.com>"]
edition = "2018"

View File

@ -86,26 +86,33 @@ impl Code {
match self {
// index related errors
CreateIndex => ErrCode::invalid("index_creation_failed", StatusCode::BAD_REQUEST),
// CreateIndex is thrown on an internal error while creating an index.
CreateIndex => ErrCode::internal("index_creation_failed", StatusCode::BAD_REQUEST),
IndexAlreadyExists => ErrCode::invalid("index_already_exists", StatusCode::BAD_REQUEST),
IndexNotFound => ErrCode::invalid("index_not_found", StatusCode::NOT_FOUND), InvalidIndexUid => ErrCode::invalid("invalid_index_uid", StatusCode::BAD_REQUEST),
// thrown when requesting a nonexistent index
IndexNotFound => ErrCode::invalid("index_not_found", StatusCode::NOT_FOUND),
InvalidIndexUid => ErrCode::invalid("invalid_index_uid", StatusCode::BAD_REQUEST),
OpenIndex => ErrCode::internal("index_not_accessible", StatusCode::INTERNAL_SERVER_ERROR),
// invalid state error
InvalidState => ErrCode::internal("invalid_state", StatusCode::INTERNAL_SERVER_ERROR),
MissingPrimaryKey => ErrCode::internal("missing_primary_key", StatusCode::INTERNAL_SERVER_ERROR),
PrimaryKeyAlreadyPresent => ErrCode::internal("primary_key_already_present", StatusCode::INTERNAL_SERVER_ERROR),
// thrown when no primary key has been set
MissingPrimaryKey => ErrCode::invalid("missing_primary_key", StatusCode::BAD_REQUEST),
// error thrown when trying to set an already existing primary key
PrimaryKeyAlreadyPresent => ErrCode::invalid("primary_key_already_present", StatusCode::BAD_REQUEST),
// invalid document
MaxFieldsLimitExceeded => ErrCode::invalid("max_field_limit_exceeded", StatusCode::BAD_REQUEST),
MaxFieldsLimitExceeded => ErrCode::invalid("max_fields_limit_exceeded", StatusCode::BAD_REQUEST),
MissingDocumentId => ErrCode::invalid("missing_document_id", StatusCode::BAD_REQUEST),
// error related to facets
Facet => ErrCode::invalid("invalid_facet", StatusCode::BAD_REQUEST),
// error related to filters
Filter => ErrCode::invalid("invalid_filter", StatusCode::BAD_REQUEST),
BadParameter => ErrCode::invalid("bad_parameter", StatusCode::BAD_REQUEST),
BadRequest => ErrCode::invalid("bad_request", StatusCode::BAD_REQUEST),
DocumentNotFound => ErrCode::internal("document_not_found", StatusCode::NOT_FOUND),
DocumentNotFound => ErrCode::invalid("document_not_found", StatusCode::NOT_FOUND),
Internal => ErrCode::internal("internal", StatusCode::INTERNAL_SERVER_ERROR),
InvalidToken => ErrCode::authentication("invalid_token", StatusCode::FORBIDDEN),
Maintenance => ErrCode::internal("maintenance", StatusCode::SERVICE_UNAVAILABLE),
@ -135,7 +142,7 @@ impl Code {
/// returns the doc url associated with the error
fn url(&self) -> String {
format!("https://docs.meilisearch.com/error/{}", self.name())
format!("https://docs.meilisearch.com/errors#{}", self.name())
}
}
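The URL change is small but breaking for anyone parsing the link: every error now points at an anchor on a single errors page instead of a per-error path. A one-liner illustrating the new format, using the format string from the diff:

```rust
fn url(name: &str) -> String {
    format!("https://docs.meilisearch.com/errors#{}", name)
}

fn main() {
    assert_eq!(
        url("index_not_found"),
        "https://docs.meilisearch.com/errors#index_not_found"
    );
}
```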

View File

@ -1,7 +1,7 @@
[package]
name = "meilisearch-http"
description = "MeiliSearch HTTP server"
version = "0.12.0"
version = "0.13.0"
license = "MIT"
authors = [
"Quentin de Quelen <quentin@dequelen.me>",
@ -32,12 +32,11 @@ http = "0.1.19"
indexmap = { version = "1.3.2", features = ["serde-1"] }
log = "0.4.8"
main_error = "0.1.0"
meilisearch-core = { path = "../meilisearch-core", version = "0.12.0" }
meilisearch-error = { path = "../meilisearch-error", version = "0.12.0" }
meilisearch-schema = { path = "../meilisearch-schema", version = "0.12.0" }
meilisearch-tokenizer = {path = "../meilisearch-tokenizer", version = "0.12.0"}
meilisearch-core = { path = "../meilisearch-core", version = "0.13.0" }
meilisearch-error = { path = "../meilisearch-error", version = "0.13.0" }
meilisearch-schema = { path = "../meilisearch-schema", version = "0.13.0" }
meilisearch-tokenizer = {path = "../meilisearch-tokenizer", version = "0.13.0"}
mime = "0.3.16"
pretty-bytes = "0.2.2"
rand = "0.7.3"
regex = "1.3.6"
rustls = "0.16.0"
@ -48,7 +47,6 @@ sha2 = "0.8.1"
siphasher = "0.3.2"
slice-group-by = "0.2.6"
structopt = "0.3.12"
sysinfo = "0.14.5"
tokio = { version = "0.2.18", features = ["macros"] }
ureq = { version = "0.12.0", features = ["tls"], default-features = false }
walkdir = "2.3.1"
@ -70,9 +68,9 @@ features = [
optional = true
[dev-dependencies]
serde_url_params = "0.2.0"
tempdir = "0.3.7"
tokio = { version = "0.2.18", features = ["macros", "time"] }
serde_url_params = "0.2.0"
[dev-dependencies.assert-json-diff]
git = "https://github.com/qdequele/assert-json-diff"

View File

@ -1,9 +1,9 @@
use std::error::Error;
use std::ops::Deref;
use std::sync::Arc;
use meilisearch_core::{Database, DatabaseOptions};
use sha2::Digest;
use sysinfo::Pid;
use crate::index_update_callback;
use crate::option::Opt;
@ -26,7 +26,7 @@ pub struct DataInner {
pub db: Arc<Database>,
pub db_path: String,
pub api_keys: ApiKeys,
pub server_pid: Pid,
pub server_pid: u32,
pub http_payload_size_limit: usize,
}
@ -55,9 +55,9 @@ impl ApiKeys {
}
impl Data {
pub fn new(opt: Opt) -> Data {
pub fn new(opt: Opt) -> Result<Data, Box<dyn Error>> {
let db_path = opt.db_path.clone();
let server_pid = sysinfo::get_current_pid().unwrap();
let server_pid = std::process::id();
let db_opt = DatabaseOptions {
main_map_size: opt.main_map_size,
@ -66,7 +66,7 @@ impl Data {
let http_payload_size_limit = opt.http_payload_size_limit;
let db = Arc::new(Database::open_or_create(opt.db_path, db_opt).unwrap());
let db = Arc::new(Database::open_or_create(opt.db_path, db_opt)?);
let mut api_keys = ApiKeys {
master: opt.master_key,
@ -93,6 +93,6 @@ impl Data {
index_update_callback(&index_uid, &callback_context, status);
}));
data
Ok(data)
}
}

View File

@ -20,11 +20,11 @@ use slice_group_by::GroupBy;
use crate::error::{Error, ResponseError};
pub trait IndexSearchExt {
fn new_search(&self, query: String) -> SearchBuilder;
fn new_search(&self, query: Option<String>) -> SearchBuilder;
}
impl IndexSearchExt for Index {
fn new_search(&self, query: String) -> SearchBuilder {
fn new_search(&self, query: Option<String>) -> SearchBuilder {
SearchBuilder {
index: self,
query,
@ -43,7 +43,7 @@ impl IndexSearchExt for Index {
pub struct SearchBuilder<'a> {
index: &'a Index,
query: String,
query: Option<String>,
offset: usize,
limit: usize,
attributes_to_crop: Option<HashMap<String, usize>>,
@ -156,7 +156,7 @@ impl<'a> SearchBuilder<'a> {
query_builder.set_facets(self.facets);
let start = Instant::now();
let result = query_builder.query(reader, &self.query, self.offset..(self.offset + self.limit));
let result = query_builder.query(reader, self.query.as_deref(), self.offset..(self.offset + self.limit));
let search_result = result.map_err(Error::search_documents)?;
let time_ms = start.elapsed().as_millis() as usize;
@ -245,7 +245,7 @@ impl<'a> SearchBuilder<'a> {
nb_hits: search_result.nb_hits,
exhaustive_nb_hits: search_result.exhaustive_nb_hit,
processing_time_ms: time_ms,
query: self.query.to_string(),
query: self.query.unwrap_or_default(),
facets_distribution: search_result.facets,
exhaustive_facets_count: search_result.exhaustive_facets_count,
};
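Making the query an `Option<String>` is what enables placeholder search end to end: `as_deref()` borrows it as the borrowed optional string that `query_builder.query(...)` takes per the diff, and `unwrap_or_default()` serializes a missing query as an empty string in the response. A minimal sketch of those two conversions:

```rust
fn main() {
    // placeholder search: no query at all
    let query: Option<String> = None;
    assert_eq!(query.as_deref(), None::<&str>);
    assert_eq!(query.unwrap_or_default(), "");

    // regular search
    let query = Some(String::from("exercitation"));
    assert_eq!(query.as_deref(), Some("exercitation"));
    assert_eq!(query.unwrap_or_default(), "exercitation");
}
```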

View File

@ -4,7 +4,7 @@ use actix_cors::Cors;
use actix_web::{middleware, HttpServer};
use main_error::MainError;
use meilisearch_http::helpers::NormalizePath;
use meilisearch_http::{Data, Opt, create_app, index_update_callback};
use meilisearch_http::{create_app, index_update_callback, Data, Opt};
use structopt::StructOpt;
mod analytics;
@ -19,7 +19,11 @@ async fn main() -> Result<(), MainError> {
#[cfg(all(not(debug_assertions), feature = "sentry"))]
let _sentry = sentry::init((
"https://5ddfa22b95f241198be2271aaf028653@sentry.io/3060337",
if !opt.no_sentry {
Some(opt.sentry_dsn.clone())
} else {
None
},
sentry::ClientOptions {
release: sentry::release_name!(),
..Default::default()
@ -36,8 +40,8 @@ async fn main() -> Result<(), MainError> {
}
#[cfg(all(not(debug_assertions), feature = "sentry"))]
if !opt.no_analytics {
sentry::integrations::panic::register_panic_handler();
if !opt.no_sentry && _sentry.is_enabled() {
sentry::integrations::panic::register_panic_handler(); // TODO: This shouldn't be needed when upgrading to sentry 0.19.0. These integrations are turned on by default when using `sentry::init`.
sentry::integrations::env_logger::init(None, Default::default());
}
}
@ -47,14 +51,12 @@ async fn main() -> Result<(), MainError> {
_ => unreachable!(),
}
let data = Data::new(opt.clone());
let data = Data::new(opt.clone())?;
if !opt.no_analytics {
let analytics_data = data.clone();
let analytics_opt = opt.clone();
thread::spawn(move|| {
analytics::analytics_sender(analytics_data, analytics_opt)
});
thread::spawn(move || analytics::analytics_sender(analytics_data, analytics_opt));
}
let data_cloned = data.clone();
@ -69,7 +71,7 @@ async fn main() -> Result<(), MainError> {
.wrap(
Cors::new()
.send_wildcard()
.allowed_headers(vec!["content-type","x-meili-api-key"])
.allowed_headers(vec!["content-type", "x-meili-api-key"])
.max_age(86_400) // 24h
.finish(),
)
@ -117,6 +119,25 @@ pub fn print_launch_resume(opt: &Opt, data: &Data) {
env!("CARGO_PKG_VERSION").to_string()
);
#[cfg(all(not(debug_assertions), feature = "sentry"))]
eprintln!(
"Sentry DSN:\t\t{:?}",
if !opt.no_sentry {
&opt.sentry_dsn
} else {
"Disabled"
}
);
eprintln!(
"Amplitude Analytics:\t{:?}",
if !opt.no_analytics {
"Enabled"
} else {
"Disabled"
}
);
eprintln!();
if data.api_keys.master.is_some() {

View File

@ -26,7 +26,18 @@ pub struct Opt {
#[structopt(long, env = "MEILI_MASTER_KEY")]
pub master_key: Option<String>,
/// This environment variable must be set to `production` if your are running in production.
/// The Sentry DSN to use for error reporting. This defaults to the MeiliSearch Sentry project.
/// You can disable sentry all together using the `--no-sentry` flag or `MEILI_NO_SENTRY` environment variable.
#[cfg(all(not(debug_assertions), feature = "sentry"))]
#[structopt(long, env = "SENTRY_DSN", default_value = "https://5ddfa22b95f241198be2271aaf028653@sentry.io/3060337")]
pub sentry_dsn: String,
/// Disable Sentry error reporting.
#[cfg(all(not(debug_assertions), feature = "sentry"))]
#[structopt(long, env = "MEILI_NO_SENTRY")]
pub no_sentry: bool,
/// This environment variable must be set to `production` if you are running in production.
/// If the server is running in development mode more logs will be displayed,
/// and the master key can be avoided which implies that there is no security on the updates routes.
/// This is useful to debug when integrating the engine with another service.
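For context, the two new fields follow the same structopt pattern as the rest of `Opt`. A trimmed-down, hypothetical sketch — the struct is reduced to the two options, the DSN value is a placeholder, and it assumes the `env` attribute behaves as the project's structopt version allows:

```rust
use structopt::StructOpt;

#[derive(Debug, StructOpt)]
struct Opt {
    /// The Sentry DSN to use for error reporting.
    #[structopt(long, env = "SENTRY_DSN", default_value = "https://public@sentry.example.com/1")]
    sentry_dsn: String,

    /// Disable Sentry error reporting.
    #[structopt(long, env = "MEILI_NO_SENTRY")]
    no_sentry: bool,
}

fn main() {
    let opt = Opt::from_args();
    // passing `--no-sentry` (or setting MEILI_NO_SENTRY) turns reporting off
    println!("sentry enabled: {}", !opt.no_sentry);
}
```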

View File

@ -156,7 +156,7 @@ async fn update_multiple_documents(
let mut schema = index
.main
.schema(&reader)?
.ok_or(Error::internal("Impossible to retrieve the schema"))?;
.ok_or(meilisearch_core::Error::SchemaMissing)?;
if schema.primary_key().is_none() {
let id = match &params.primary_key {
@ -164,7 +164,7 @@ async fn update_multiple_documents(
None => body
.first()
.and_then(find_primary_key)
.ok_or(Error::bad_request("Could not infer a primary key"))?,
.ok_or(meilisearch_core::Error::MissingPrimaryKey)?
};
schema

View File

@ -10,7 +10,7 @@ pub fn services(cfg: &mut web::ServiceConfig) {
cfg.service(get_health).service(change_healthyness);
}
#[get("/health", wrap = "Authentication::Private")]
#[get("/health")]
async fn get_health(data: web::Data<Data>) -> Result<HttpResponse, ResponseError> {
let reader = data.db.main_read_txn()?;
if let Ok(Some(_)) = data.db.get_health(&reader) {

View File

@ -253,17 +253,8 @@ async fn update_index(
if let Some(id) = body.primary_key.clone() {
if let Some(mut schema) = index.main.schema(writer)? {
match schema.primary_key() {
Some(_) => {
return Err(Error::bad_request(
"The primary key cannot be updated",
).into());
}
None => {
schema.set_primary_key(&id)?;
index.main.put_schema(writer, &schema)?;
}
}
schema.set_primary_key(&id)?;
index.main.put_schema(writer, &schema)?;
}
}
index.main.put_updated_at(writer)?;

View File

@ -24,7 +24,7 @@ pub fn services(cfg: &mut web::ServiceConfig) {
#[derive(Serialize, Deserialize)]
#[serde(rename_all = "camelCase", deny_unknown_fields)]
pub struct SearchQuery {
q: String,
q: Option<String>,
offset: Option<usize>,
limit: Option<usize>,
attributes_to_retrieve: Option<String>,
@ -50,7 +50,7 @@ async fn search_with_url_query(
#[derive(Deserialize)]
#[serde(rename_all = "camelCase", deny_unknown_fields)]
pub struct SearchQueryPost {
q: String,
q: Option<String>,
offset: Option<usize>,
limit: Option<usize>,
attributes_to_retrieve: Option<Vec<String>>,
@ -177,7 +177,6 @@ impl SearchQuery {
None => (),
}
}
search_builder.attributes_to_crop(final_attributes);
}

View File

@ -1,6 +1,7 @@
use actix_web::{web, HttpResponse};
use actix_web_macros::{delete, get, post};
use meilisearch_core::settings::{Settings, SettingsUpdate, UpdateState, DEFAULT_RANKING_RULES};
use meilisearch_schema::Schema;
use std::collections::{BTreeMap, BTreeSet, HashSet};
use crate::error::{Error, ResponseError};
@ -24,8 +25,6 @@ pub fn services(cfg: &mut web::ServiceConfig) {
.service(get_displayed)
.service(update_displayed)
.service(delete_displayed)
.service(get_accept_new_fields)
.service(update_accept_new_fields)
.service(get_attributes_for_faceting)
.service(delete_attributes_for_faceting)
.service(update_attributes_for_faceting);
@ -108,23 +107,8 @@ async fn get_all(
_ => vec![],
};
println!("{:?}", attributes_for_faceting);
let searchable_attributes = schema.as_ref().map(|s| {
s.indexed_name()
.iter()
.map(|s| s.to_string())
.collect::<Vec<String>>()
});
let displayed_attributes = schema.as_ref().map(|s| {
s.displayed_name()
.iter()
.map(|s| s.to_string())
.collect::<HashSet<String>>()
});
let accept_new_fields = schema.map(|s| s.accept_new_fields());
let searchable_attributes = schema.as_ref().map(get_indexed_attributes);
let displayed_attributes = schema.as_ref().map(get_displayed_attributes);
let settings = Settings {
ranking_rules: Some(Some(ranking_rules)),
@ -133,7 +117,6 @@ async fn get_all(
displayed_attributes: Some(displayed_attributes),
stop_words: Some(Some(stop_words)),
synonyms: Some(Some(synonyms)),
accept_new_fields: Some(accept_new_fields),
attributes_for_faceting: Some(Some(attributes_for_faceting)),
};
@ -158,7 +141,6 @@ async fn delete_all(
displayed_attributes: UpdateState::Clear,
stop_words: UpdateState::Clear,
synonyms: UpdateState::Clear,
accept_new_fields: UpdateState::Clear,
attributes_for_faceting: UpdateState::Clear,
};
@ -326,7 +308,7 @@ async fn get_searchable(
let reader = data.db.main_read_txn()?;
let schema = index.main.schema(&reader)?;
let searchable_attributes: Option<Vec<String>> =
schema.map(|s| s.indexed_name().iter().map(|i| i.to_string()).collect());
schema.as_ref().map(get_indexed_attributes);
Ok(HttpResponse::Ok().json(searchable_attributes))
}
@ -396,8 +378,7 @@ async fn get_displayed(
let schema = index.main.schema(&reader)?;
let displayed_attributes: Option<HashSet<String>> =
schema.map(|s| s.displayed_name().iter().map(|i| i.to_string()).collect());
let displayed_attributes = schema.as_ref().map(get_displayed_attributes);
Ok(HttpResponse::Ok().json(displayed_attributes))
}
@ -450,52 +431,6 @@ async fn delete_displayed(
Ok(HttpResponse::Accepted().json(IndexUpdateResponse::with_id(update_id)))
}
#[get(
"/indexes/{index_uid}/settings/accept-new-fields",
wrap = "Authentication::Private"
)]
async fn get_accept_new_fields(
data: web::Data<Data>,
path: web::Path<IndexParam>,
) -> Result<HttpResponse, ResponseError> {
let index = data
.db
.open_index(&path.index_uid)
.ok_or(Error::index_not_found(&path.index_uid))?;
let reader = data.db.main_read_txn()?;
let schema = index.main.schema(&reader)?;
let accept_new_fields = schema.map(|s| s.accept_new_fields());
Ok(HttpResponse::Ok().json(accept_new_fields))
}
#[post(
"/indexes/{index_uid}/settings/accept-new-fields",
wrap = "Authentication::Private"
)]
async fn update_accept_new_fields(
data: web::Data<Data>,
path: web::Path<IndexParam>,
body: web::Json<Option<bool>>,
) -> Result<HttpResponse, ResponseError> {
let index = data
.db
.open_index(&path.index_uid)
.ok_or(Error::index_not_found(&path.index_uid))?;
let settings = Settings {
accept_new_fields: Some(body.into_inner()),
..Settings::default()
};
let settings = settings.to_update().map_err(Error::bad_request)?;
let update_id = data.db.update_write(|w| index.settings_update(w, settings))?;
Ok(HttpResponse::Accepted().json(IndexUpdateResponse::with_id(update_id)))
}
#[get(
"/indexes/{index_uid}/settings/attributes-for-faceting",
wrap = "Authentication::Private"
@ -577,3 +512,25 @@ async fn delete_attributes_for_faceting(
Ok(HttpResponse::Accepted().json(IndexUpdateResponse::with_id(update_id)))
}
fn get_indexed_attributes(schema: &Schema) -> Vec<String> {
if schema.is_indexed_all() {
["*"].iter().map(|s| s.to_string()).collect()
} else {
schema.indexed_name()
.iter()
.map(|s| s.to_string())
.collect()
}
}
fn get_displayed_attributes(schema: &Schema) -> HashSet<String> {
if schema.is_displayed_all() {
["*"].iter().map(|s| s.to_string()).collect()
} else {
schema.displayed_name()
.iter()
.map(|s| s.to_string())
.collect()
}
}

View File

@ -5,9 +5,7 @@ use actix_web::HttpResponse;
use actix_web_macros::get;
use chrono::{DateTime, Utc};
use log::error;
use pretty_bytes::converter::convert;
use serde::Serialize;
use sysinfo::{NetworkExt, ProcessExt, ProcessorExt, System, SystemExt};
use walkdir::WalkDir;
use crate::error::{Error, ResponseError};
@ -18,9 +16,7 @@ use crate::Data;
pub fn services(cfg: &mut web::ServiceConfig) {
cfg.service(index_stats)
.service(get_stats)
.service(get_version)
.service(get_sys_info)
.service(get_sys_info_pretty);
.service(get_version);
}
#[derive(Serialize)]
@ -136,204 +132,3 @@ async fn get_version() -> HttpResponse {
pkg_version: env!("CARGO_PKG_VERSION").to_string(),
})
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct SysGlobal {
total_memory: u64,
used_memory: u64,
total_swap: u64,
used_swap: u64,
input_data: u64,
output_data: u64,
}
impl SysGlobal {
fn new() -> SysGlobal {
SysGlobal {
total_memory: 0,
used_memory: 0,
total_swap: 0,
used_swap: 0,
input_data: 0,
output_data: 0,
}
}
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct SysProcess {
memory: u64,
cpu: f32,
}
impl SysProcess {
fn new() -> SysProcess {
SysProcess {
memory: 0,
cpu: 0.0,
}
}
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct SysInfo {
memory_usage: f64,
processor_usage: Vec<f32>,
global: SysGlobal,
process: SysProcess,
}
impl SysInfo {
fn new() -> SysInfo {
SysInfo {
memory_usage: 0.0,
processor_usage: Vec::new(),
global: SysGlobal::new(),
process: SysProcess::new(),
}
}
}
#[get("/sys-info", wrap = "Authentication::Private")]
async fn get_sys_info(data: web::Data<Data>) -> HttpResponse {
let mut sys = System::new();
let mut info = SysInfo::new();
// need to refresh twice for cpu usage
sys.refresh_all();
sys.refresh_all();
for processor in sys.get_processors() {
info.processor_usage.push(processor.get_cpu_usage());
}
info.global.total_memory = sys.get_total_memory();
info.global.used_memory = sys.get_used_memory();
info.global.total_swap = sys.get_total_swap();
info.global.used_swap = sys.get_used_swap();
info.global.input_data = sys
.get_networks()
.into_iter()
.map(|(_, n)| n.get_received())
.sum::<u64>();
info.global.output_data = sys
.get_networks()
.into_iter()
.map(|(_, n)| n.get_transmitted())
.sum::<u64>();
info.memory_usage = sys.get_used_memory() as f64 / sys.get_total_memory() as f64 * 100.0;
if let Some(process) = sys.get_process(data.server_pid) {
info.process.memory = process.memory();
println!("cpu usafe: {}", process.cpu_usage());
info.process.cpu = process.cpu_usage();
}
HttpResponse::Ok().json(info)
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct SysGlobalPretty {
total_memory: String,
used_memory: String,
total_swap: String,
used_swap: String,
input_data: String,
output_data: String,
}
impl SysGlobalPretty {
fn new() -> SysGlobalPretty {
SysGlobalPretty {
total_memory: "None".to_owned(),
used_memory: "None".to_owned(),
total_swap: "None".to_owned(),
used_swap: "None".to_owned(),
input_data: "None".to_owned(),
output_data: "None".to_owned(),
}
}
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct SysProcessPretty {
memory: String,
cpu: String,
}
impl SysProcessPretty {
fn new() -> SysProcessPretty {
SysProcessPretty {
memory: "None".to_owned(),
cpu: "None".to_owned(),
}
}
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct SysInfoPretty {
memory_usage: String,
processor_usage: Vec<String>,
global: SysGlobalPretty,
process: SysProcessPretty,
}
impl SysInfoPretty {
fn new() -> SysInfoPretty {
SysInfoPretty {
memory_usage: "None".to_owned(),
processor_usage: Vec::new(),
global: SysGlobalPretty::new(),
process: SysProcessPretty::new(),
}
}
}
#[get("/sys-info/pretty", wrap = "Authentication::Private")]
async fn get_sys_info_pretty(data: web::Data<Data>) -> HttpResponse {
let mut sys = System::new();
let mut info = SysInfoPretty::new();
sys.refresh_all();
sys.refresh_all();
info.memory_usage = format!(
"{:.1} %",
sys.get_used_memory() as f64 / sys.get_total_memory() as f64 * 100.0
);
for processor in sys.get_processors() {
info.processor_usage
.push(format!("{:.1} %", processor.get_cpu_usage()));
}
info.global.total_memory = convert(sys.get_total_memory() as f64 * 1024.0);
info.global.used_memory = convert(sys.get_used_memory() as f64 * 1024.0);
info.global.total_swap = convert(sys.get_total_swap() as f64 * 1024.0);
info.global.used_swap = convert(sys.get_used_swap() as f64 * 1024.0);
info.global.input_data = convert(
sys.get_networks()
.into_iter()
.map(|(_, n)| n.get_received())
.sum::<u64>() as f64,
);
info.global.output_data = convert(
sys.get_networks()
.into_iter()
.map(|(_, n)| n.get_transmitted())
.sum::<u64>() as f64,
);
if let Some(process) = sys.get_process(data.server_pid) {
info.process.memory = convert(process.memory() as f64 * 1024.0);
info.process.cpu = format!("{:.1} %", process.cpu_usage());
}
HttpResponse::Ok().json(info)
}

File diff suppressed because it is too large

View File

@ -1,16 +1,32 @@
#![allow(dead_code)]
use actix_web::{http::StatusCode, test};
use serde_json::{json, Value};
use std::time::Duration;
use actix_web::{http::StatusCode, test};
use meilisearch_core::DatabaseOptions;
use meilisearch_http::data::Data;
use meilisearch_http::option::Opt;
use meilisearch_http::helpers::NormalizePath;
use tempdir::TempDir;
use tokio::time::delay_for;
use meilisearch_core::DatabaseOptions;
use meilisearch_http::data::Data;
use meilisearch_http::helpers::NormalizePath;
use meilisearch_http::option::Opt;
/// Performs a search test on both the POST and GET routes
#[macro_export]
macro_rules! test_post_get_search {
($server:expr, $query:expr, |$response:ident, $status_code:ident | $block:expr) => {
let post_query: meilisearch_http::routes::search::SearchQueryPost = serde_json::from_str(&$query.clone().to_string()).unwrap();
let get_query: meilisearch_http::routes::search::SearchQuery = post_query.into();
let get_query = ::serde_url_params::to_string(&get_query).unwrap();
let ($response, $status_code) = $server.search_get(&get_query).await;
let _ = ::std::panic::catch_unwind(|| $block)
.map_err(|e| panic!("panic in get route: {:?}", e.downcast_ref::<&str>().unwrap()));
let ($response, $status_code) = $server.search_post($query).await;
let _ = ::std::panic::catch_unwind(|| $block)
.map_err(|e| panic!("panic in post route: {:?}", e.downcast_ref::<&str>().unwrap()));
};
}
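In short, the macro converts the POST body into GET query parameters via `serde_url_params`, runs the same assertion block against both routes, and wraps each run in `catch_unwind` so a failure reports which of the two routes panicked.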
pub struct Server {
uid: String,
data: Data,
@ -34,7 +50,7 @@ impl Server {
..Opt::default()
};
let data = Data::new(opt.clone());
let data = Data::new(opt.clone()).unwrap();
Server {
uid: uid.to_string(),
@ -96,7 +112,6 @@ impl Server {
"longitude",
"tags",
],
"acceptNewFields": false,
});
server.update_all_settings(body).await;
@ -111,17 +126,19 @@ impl Server {
pub async fn wait_update_id(&mut self, update_id: u64) {
loop {
// try 10 times to get status, or panic to not wait forever
for _ in 0..10 {
let (response, status_code) = self.get_update_status(update_id).await;
assert_eq!(status_code, 200);
if response["status"] == "processed" || response["status"] == "error" {
if response["status"] == "processed" || response["status"] == "failed" {
eprintln!("{:#?}", response);
return;
}
delay_for(Duration::from_secs(1)).await;
}
panic!("Timeout waiting for update id");
}
// Global Http request GET/POST/DELETE async or sync
@ -408,16 +425,6 @@ impl Server {
self.delete_request_async(&url).await
}
pub async fn get_accept_new_fields(&mut self) -> (Value, StatusCode) {
let url = format!("/indexes/{}/settings/accept-new-fields", self.uid);
self.get_request(&url).await
}
pub async fn update_accept_new_fields(&mut self, body: Value) {
let url = format!("/indexes/{}/settings/accept-new-fields", self.uid);
self.post_request_async(&url, body).await;
}
pub async fn get_synonyms(&mut self) -> (Value, StatusCode) {
let url = format!("/indexes/{}/settings/synonyms", self.uid);
self.get_request(&url).await
@ -476,59 +483,4 @@ impl Server {
pub async fn get_sys_info_pretty(&mut self) -> (Value, StatusCode) {
self.get_request("/sys-info/pretty").await
}
// Populate routes
pub async fn populate_movies(&mut self) {
let body = json!({
"uid": "movies",
"primaryKey": "id",
});
self.create_index(body).await;
let body = json!({
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"exactness",
"desc(vote_average)",
],
"searchableAttributes": [
"title",
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres",
],
"displayedAttributes": [
"title",
"director",
"producer",
"tagline",
"genres",
"id",
"overview",
"vote_count",
"vote_average",
"poster_path",
"popularity",
],
"acceptNewFields": false,
});
self.update_all_settings(body).await;
let dataset = include_bytes!("assets/movies.json");
let body: Value = serde_json::from_slice(dataset).unwrap();
self.add_or_replace_multiple_documents(body).await;
}
}

View File

@ -195,3 +195,23 @@ async fn add_document_with_long_field() {
let (response, _status) = server.search_post(json!({ "q": "request_buffering" })).await;
assert!(!response["hits"].as_array().unwrap().is_empty());
}
#[actix_rt::test]
async fn documents_with_same_id_are_overwritten() {
let mut server = common::Server::with_uid("test");
server.create_index(json!({ "uid": "test"})).await;
let documents = json!([
{
"id": 1,
"content": "test1"
},
{
"id": 1,
"content": "test2"
},
]);
server.add_or_replace_multiple_documents(documents).await;
let (response, _status) = server.get_all_documents().await;
assert_eq!(response.as_array().unwrap().len(), 1);
assert_eq!(response.as_array().unwrap()[0].as_object().unwrap()["content"], "test2");
}

View File

@ -2,30 +2,33 @@ mod common;
#[actix_rt::test]
async fn delete() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let (_response, status_code) = server.get_document(419704).await;
let (_response, status_code) = server.get_document(50).await;
assert_eq!(status_code, 200);
server.delete_document(419704).await;
server.delete_document(50).await;
let (_response, status_code) = server.get_document(419704).await;
let (_response, status_code) = server.get_document(50).await;
assert_eq!(status_code, 404);
}
// Resolve the issue https://github.com/meilisearch/MeiliSearch/issues/493
#[actix_rt::test]
async fn delete_batch() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let (_response, status_code) = server.get_document(419704).await;
assert_eq!(status_code, 200);
let doc_ids = vec!(50, 55, 60);
for doc_id in &doc_ids {
let (_response, status_code) = server.get_document(doc_id).await;
assert_eq!(status_code, 200);
}
let body = serde_json::json!([419704, 512200, 181812]);
let body = serde_json::json!(&doc_ids);
server.delete_multiple_documents(body).await;
let (_response, status_code) = server.get_document(419704).await;
assert_eq!(status_code, 404);
for doc_id in &doc_ids {
let (_response, status_code) = server.get_document(doc_id).await;
assert_eq!(status_code, 404);
}
}

View File

@ -0,0 +1,196 @@
mod common;
use std::thread;
use std::time::Duration;
use actix_http::http::StatusCode;
use serde_json::{json, Map, Value};
macro_rules! assert_error {
($code:literal, $type:literal, $status:path, $req:expr) => {
let (response, status_code) = $req;
assert_eq!(status_code, $status);
assert_eq!(response["errorCode"].as_str().unwrap(), $code);
assert_eq!(response["errorType"].as_str().unwrap(), $type);
};
}
macro_rules! assert_error_async {
($code:literal, $type:literal, $server:expr, $req:expr) => {
let (response, _) = $req;
let update_id = response["updateId"].as_u64().unwrap();
for _ in 1..10 {
let (response, status_code) = $server.get_update_status(update_id).await;
assert_eq!(status_code, StatusCode::OK);
if response["status"] == "processed" || response["status"] == "failed" {
println!("response: {}", response);
assert_eq!(response["status"], "failed");
assert_eq!(response["errorCode"], $code);
assert_eq!(response["errorType"], $type);
return
}
thread::sleep(Duration::from_secs(1));
}
};
}
#[actix_rt::test]
async fn index_already_exists_error() {
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "test"
});
let (response, status_code) = server.create_index(body.clone()).await;
println!("{}", response);
assert_eq!(status_code, StatusCode::CREATED);
assert_error!(
"index_already_exists",
"invalid_request_error",
StatusCode::BAD_REQUEST,
server.create_index(body).await);
}
#[actix_rt::test]
async fn index_not_found_error() {
let mut server = common::Server::with_uid("test");
assert_error!(
"index_not_found",
"invalid_request_error",
StatusCode::NOT_FOUND,
server.get_index().await);
}
#[actix_rt::test]
async fn primary_key_already_present_error() {
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "test",
"primaryKey": "test"
});
server.create_index(body.clone()).await;
let body = json!({
"primaryKey": "t"
});
assert_error!(
"primary_key_already_present",
"invalid_request_error",
StatusCode::BAD_REQUEST,
server.update_index(body).await);
}
#[actix_rt::test]
async fn max_field_limit_exceeded_error() {
let mut server = common::Server::test_server().await;
let body = json!({
"uid": "test",
});
server.create_index(body).await;
let mut doc = Map::with_capacity(70_000);
doc.insert("id".into(), Value::String("foo".into()));
for i in 0..69_999 {
doc.insert(format!("field{}", i), Value::String("foo".into()));
}
let docs = json!([doc]);
assert_error_async!(
"max_fields_limit_exceeded",
"invalid_request_error",
server,
server.add_or_replace_multiple_documents_sync(docs).await);
}
#[actix_rt::test]
async fn missing_document_id() {
let mut server = common::Server::test_server().await;
let body = json!({
"uid": "test",
"primaryKey": "test"
});
server.create_index(body).await;
let docs = json!([
{
"foo": "bar",
}
]);
assert_error_async!(
"missing_document_id",
"invalid_request_error",
server,
server.add_or_replace_multiple_documents_sync(docs).await);
}
#[actix_rt::test]
async fn facet_error() {
let mut server = common::Server::test_server().await;
let search = json!({
"q": "foo",
"facetFilters": ["test:hello"]
});
assert_error!(
"invalid_facet",
"invalid_request_error",
StatusCode::BAD_REQUEST,
server.search_post(search).await);
}
#[actix_rt::test]
async fn filters_error() {
let mut server = common::Server::test_server().await;
let search = json!({
"q": "foo",
"filters": "fo:12"
});
assert_error!(
"invalid_filter",
"invalid_request_error",
StatusCode::BAD_REQUEST,
server.search_post(search).await);
}
#[actix_rt::test]
async fn bad_request_error() {
let mut server = common::Server::with_uid("test");
let body = json!({
"foo": "bar",
});
assert_error!(
"bad_request",
"invalid_request_error",
StatusCode::BAD_REQUEST,
server.search_post(body).await);
}
#[actix_rt::test]
async fn document_not_found_error() {
let mut server = common::Server::with_uid("test");
server.create_index(json!({"uid": "test"})).await;
assert_error!(
"document_not_found",
"invalid_request_error",
StatusCode::NOT_FOUND,
server.get_document(100).await);
}
#[actix_rt::test]
async fn payload_too_large_error() {
let mut server = common::Server::with_uid("test");
let bigvec = vec![0u64; 10_000_000]; // 80mb
assert_error!(
"payload_too_large",
"invalid_request_error",
StatusCode::PAYLOAD_TOO_LARGE,
server.create_index(json!(bigvec)).await);
}
#[actix_rt::test]
async fn missing_primary_key_error() {
let mut server = common::Server::with_uid("test");
server.create_index(json!({"uid": "test"})).await;
let document = json!([{
"content": "test"
}]);
assert_error!(
"missing_primary_key",
"invalid_request_error",
StatusCode::BAD_REQUEST,
server.add_or_replace_multiple_documents_sync(document).await);
}

View File

@ -658,18 +658,17 @@ async fn check_add_documents_without_primary_key() {
let (response, status_code) = server.add_or_replace_multiple_documents_sync(body).await;
let message = response["message"].as_str().unwrap();
assert_eq!(response.as_object().unwrap().len(), 4);
assert_eq!(message, "Could not infer a primary key");
assert_eq!(response["errorCode"], "missing_primary_key");
assert_eq!(status_code, 400);
}
#[actix_rt::test]
async fn check_first_update_should_bring_up_processed_status_after_first_docs_addition() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
});
// 1. Create Index
@ -677,7 +676,7 @@ async fn check_first_update_should_bring_up_processed_status_after_first_docs_ad
assert_eq!(status_code, 201);
assert_eq!(response["primaryKey"], json!(null));
let dataset = include_bytes!("assets/movies.json");
let dataset = include_bytes!("assets/test_set.json");
let body: Value = serde_json::from_slice(dataset).unwrap();

View File

@ -0,0 +1,497 @@
use std::convert::Into;
use serde_json::json;
use serde_json::Value;
use std::sync::Mutex;
use std::cell::RefCell;
#[macro_use] mod common;
#[actix_rt::test]
async fn placeholder_search_with_limit() {
let mut server = common::Server::test_server().await;
let query = json! ({
"limit": 3
});
test_post_get_search!(server, query, |response, status_code| {
assert_eq!(status_code, 200);
assert_eq!(response["hits"].as_array().unwrap().len(), 3);
});
}
#[actix_rt::test]
async fn placeholder_search_with_offset() {
let mut server = common::Server::test_server().await;
let query = json!({
"limit": 6,
});
// hack to take a value out of macro (must implement UnwindSafe)
let expected = Mutex::new(RefCell::new(Vec::new()));
test_post_get_search!(server, query, |response, status_code| {
assert_eq!(status_code, 200);
// take results at offset 3 as reference
let lock = expected.lock().unwrap();
lock.replace(response["hits"].as_array().unwrap()[3..6].iter().cloned().collect());
});
let expected = expected.into_inner().unwrap().into_inner();
let query = json!({
"limit": 3,
"offset": 3,
});
test_post_get_search!(server, query, |response, status_code| {
assert_eq!(status_code, 200);
let response = response["hits"].as_array().unwrap();
assert_eq!(&expected, response);
});
}
#[actix_rt::test]
async fn placeholder_search_with_attribute_to_highlight_wildcard() {
// there should be no highlight in placeholder search
let mut server = common::Server::test_server().await;
let query = json!({
"limit": 1,
"attributesToHighlight": ["*"]
});
test_post_get_search!(server, query, |response, status_code| {
assert_eq!(status_code, 200);
let result = response["hits"]
.as_array()
.unwrap()[0]
.as_object()
.unwrap();
for value in result.values() {
assert!(value.to_string().find("<em>").is_none());
}
});
}
#[actix_rt::test]
async fn placeholder_search_with_matches() {
// matches is always empty
let mut server = common::Server::test_server().await;
let query = json!({
"matches": true
});
test_post_get_search!(server, query, |response, status_code| {
assert_eq!(status_code, 200);
let result = response["hits"]
.as_array()
.unwrap()
.iter()
.map(|v| v.as_object().unwrap()["_matchesInfo"].clone())
.all(|m| m.as_object().unwrap().is_empty());
assert!(result);
});
}
#[actix_rt::test]
async fn placeholder_search_witch_crop() {
// placeholder search crop always crops from the beginning
let mut server = common::Server::test_server().await;
let query = json!({
"attributesToCrop": ["about"],
"cropLength": 20
});
test_post_get_search!(server, query, |response, status_code| {
assert_eq!(status_code, 200);
let hits = response["hits"].as_array().unwrap();
for hit in hits {
let hit = hit.as_object().unwrap();
let formatted = hit["_formatted"].as_object().unwrap();
let about = hit["about"].as_str().unwrap();
let about_formatted = formatted["about"].as_str().unwrap();
// the formatted about length should be about 20 characters long
assert!(about_formatted.len() < 20 + 10);
// the formatted part should be located at the beginning of the original one
assert_eq!(about.find(&about_formatted).unwrap(), 0);
}
});
}
#[actix_rt::test]
async fn placeholder_search_with_attributes_to_retrieve() {
let mut server = common::Server::test_server().await;
let query = json!({
"limit": 1,
"attributesToRetrieve": ["gender", "about"],
});
test_post_get_search!(server, query, |response, _status_code| {
let hit = response["hits"]
.as_array()
.unwrap()[0]
.as_object()
.unwrap();
assert_eq!(hit.values().count(), 2);
let _ = hit["gender"];
let _ = hit["about"];
});
}
#[actix_rt::test]
async fn placeholder_search_with_filter() {
let mut server = common::Server::test_server().await;
let query = json!({
"filters": "color='green'"
});
test_post_get_search!(server, query, |response, _status_code| {
let hits = response["hits"].as_array().unwrap();
assert!(hits.iter().all(|v| v["color"].as_str().unwrap() == "green"));
});
let query = json!({
"filters": "tags=bug"
});
test_post_get_search!(server, query, |response, _status_code| {
let hits = response["hits"].as_array().unwrap();
let value = Value::String(String::from("bug"));
assert!(hits.iter().all(|v| v["tags"].as_array().unwrap().contains(&value)));
});
let query = json!({
"filters": "color='green' AND (tags='bug' OR tags='wontfix')"
});
test_post_get_search!(server, query, |response, _status_code| {
let hits = response["hits"].as_array().unwrap();
let bug = Value::String(String::from("bug"));
let wontfix = Value::String(String::from("wontfix"));
assert!(hits.iter().all(|v|
v["color"].as_str().unwrap() == "green" &&
v["tags"].as_array().unwrap().contains(&bug) ||
v["tags"].as_array().unwrap().contains(&wontfix)));
});
}
#[actix_rt::test]
async fn placeholder_test_faceted_search_valid() {
let mut server = common::Server::test_server().await;
// simple tests on attributes with string value
let body = json!({
"attributesForFaceting": ["color"]
});
server.update_all_settings(body).await;
let query = json!({
"facetFilters": ["color:green"]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value| value.get("color").unwrap() == "green"));
});
let query = json!({
"facetFilters": [["color:blue"]]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value| value.get("color").unwrap() == "blue"));
});
let query = json!({
"facetFilters": ["color:Blue"]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value| value.get("color").unwrap() == "blue"));
});
// test on arrays: ["tags:bug"]
let body = json!({
"attributesForFaceting": ["color", "tags"]
});
server.update_all_settings(body).await;
let query = json!({
"facetFilters": ["tags:bug"]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value| value.get("tags").unwrap().as_array().unwrap().contains(&Value::String("bug".to_owned()))));
});
// test and: ["color:blue", "tags:bug"]
let query = json!({
"facetFilters": ["color:blue", "tags:bug"]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value| value
.get("color")
.unwrap() == "blue"
&& value.get("tags").unwrap().as_array().unwrap().contains(&Value::String("bug".to_owned()))));
});
// test or: [["color:blue", "color:green"]]
let query = json!({
"facetFilters": [["color:blue", "color:green"]]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value|
value
.get("color")
.unwrap() == "blue"
|| value
.get("color")
.unwrap() == "green"));
});
// test and-or: ["tags:bug", ["color:blue", "color:green"]]
let query = json!({
"facetFilters": ["tags:bug", ["color:blue", "color:green"]]
});
test_post_get_search!(server, query, |response, _status_code| {
assert!(!response.get("hits").unwrap().as_array().unwrap().is_empty());
assert!(response
.get("hits")
.unwrap()
.as_array()
.unwrap()
.iter()
.all(|value|
value
.get("tags")
.unwrap()
.as_array()
.unwrap()
.contains(&Value::String("bug".to_owned()))
&& (value
.get("color")
.unwrap() == "blue"
|| value
.get("color")
.unwrap() == "green")));
});
}
#[actix_rt::test]
async fn placeholder_test_faceted_search_invalid() {
let mut server = common::Server::test_server().await;
// no faceted attributes set
let query = json!({
"facetFilters": ["color:blue"]
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
let body = json!({
"attributesForFaceting": ["color", "tags"]
});
server.update_all_settings(body).await;
// empty arrays are an error
// []
let query = json!({
"facetFilters": []
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
// [[]]
let query = json!({
"facetFilters": [[]]
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
// ["color:green", []]
let query = json!({
"facetFilters": ["color:green", []]
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
// too much depth
// [[[]]]
let query = json!({
"facetFilters": [[[]]]
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
// [["color:green", ["color:blue"]]]
let query = json!({
"facetFilters": [["color:green", ["color:blue"]]]
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
// "color:green"
let query = json!({
"facetFilters": "color:green"
});
test_post_get_search!(server, query, |_response, status_code| assert_ne!(status_code, 202));
}
#[actix_rt::test]
async fn placeholder_test_facet_count() {
let mut server = common::Server::test_server().await;
// test without facet distribution
let query = json!({
});
test_post_get_search!(server, query, |response, _status_code|{
assert!(response.get("exhaustiveFacetsCount").is_none());
assert!(response.get("facetsDistribution").is_none());
});
// test no facets set, search on color
let query = json!({
"facetsDistribution": ["color"]
});
test_post_get_search!(server, query.clone(), |_response, status_code|{
assert_eq!(status_code, 400);
});
let body = json!({
"attributesForFaceting": ["color", "tags"]
});
server.update_all_settings(body).await;
// same as before, but now facets are set:
test_post_get_search!(server, query, |response, _status_code|{
println!("{}", response);
assert!(response.get("exhaustiveFacetsCount").is_some());
assert_eq!(response.get("facetsDistribution").unwrap().as_object().unwrap().values().count(), 1);
});
// searching on color and tags
let query = json!({
"facetsDistribution": ["color", "tags"]
});
test_post_get_search!(server, query, |response, _status_code|{
let facets = response.get("facetsDistribution").unwrap().as_object().unwrap();
assert_eq!(facets.values().count(), 2);
assert_ne!(!facets.get("color").unwrap().as_object().unwrap().values().count(), 0);
assert_ne!(!facets.get("tags").unwrap().as_object().unwrap().values().count(), 0);
});
// wildcard
let query = json!({
"facetsDistribution": ["*"]
});
test_post_get_search!(server, query, |response, _status_code|{
assert_eq!(response.get("facetsDistribution").unwrap().as_object().unwrap().values().count(), 2);
});
// wildcard with other attributes:
let query = json!({
"facetsDistribution": ["color", "*"]
});
test_post_get_search!(server, query, |response, _status_code|{
assert_eq!(response.get("facetsDistribution").unwrap().as_object().unwrap().values().count(), 2);
});
// empty facet list
let query = json!({
"facetsDistribution": []
});
test_post_get_search!(server, query, |response, _status_code|{
assert_eq!(response.get("facetsDistribution").unwrap().as_object().unwrap().values().count(), 0);
});
// passing an attribute that is not set as a facet:
let query = json!({
"facetsDistribution": ["gender"]
});
test_post_get_search!(server, query, |_response, status_code|{
assert_eq!(status_code, 400);
});
}
#[actix_rt::test]
#[should_panic]
async fn placeholder_test_bad_facet_distribution() {
let mut server = common::Server::test_server().await;
// string instead of array:
let query = json!({
"facetsDistribution": "color"
});
test_post_get_search!(server, query, |_response, _status_code| {});
// invalid value in array:
let query = json!({
"facetsDistribution": ["color", true]
});
test_post_get_search!(server, query, |_response, _status_code| {});
}
#[actix_rt::test]
async fn placeholder_test_sort() {
let mut server = common::Server::test_server().await;
let body = json!({
"rankingRules": ["asc(age)"],
"attributesForFaceting": ["color"]
});
server.update_all_settings(body).await;
let query = json!({ });
test_post_get_search!(server, query, |response, _status_code| {
let hits = response["hits"].as_array().unwrap();
hits.iter().map(|v| v["age"].as_u64().unwrap()).fold(0, |prev, cur| {
assert!(cur >= prev);
cur
});
});
let query = json!({
"facetFilters": ["color:green"]
});
test_post_get_search!(server, query, |response, _status_code| {
let hits = response["hits"].as_array().unwrap();
hits.iter().map(|v| v["age"].as_u64().unwrap()).fold(0, |prev, cur| {
assert!(cur >= prev);
cur
});
});
}

File diff suppressed because it is too large

View File

@ -6,8 +6,7 @@ mod common;
#[actix_rt::test]
async fn search_with_settings_basic() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@ -16,93 +15,68 @@ async fn search_with_settings_basic() {
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"desc(age)",
"exactness",
"desc(vote_average)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres"
"name",
"age",
"color",
"gender",
"email",
"address",
"about"
],
"displayedAttributes": [
"title",
"director",
"producer",
"tagline",
"genres",
"id",
"overview",
"vote_count",
"vote_average",
"poster_path",
"popularity"
"name",
"age",
"gender",
"color",
"email",
"phone",
"address",
"balance"
],
"stopWords": null,
"synonyms": null,
"acceptNewFields": false,
});
server.update_all_settings(config).await;
let query = "q=the%20avangers&limit=3";
let query = "q=ea%20exercitation&limit=3";
let expect = json!([
{
"id": 24428,
"popularity": 44.506,
"vote_average": 7.7,
"title": "The Avengers",
"tagline": "Some assembly required.",
"overview": "When an unexpected enemy emerges and threatens global safety and security, Nick Fury, director of the international peacekeeping agency known as S.H.I.E.L.D., finds himself in need of a team to pull the world back from the brink of disaster. Spanning the globe, a daring recruitment effort begins!",
"director": "Joss Whedon",
"producer": "Kevin Feige",
"genres": [
"Science Fiction",
"Action",
"Adventure"
],
"poster_path": "https://image.tmdb.org/t/p/w500/cezWGskPY5x7GaglTTRN4Fugfb8.jpg",
"vote_count": 21079
"balance": "$2,467.47",
"age": 34,
"color": "blue",
"name": "Patricia Goff",
"gender": "female",
"email": "patriciagoff@chorizon.com",
"phone": "+1 (864) 463-2277",
"address": "866 Hornell Loop, Cresaptown, Ohio, 1700"
},
{
"id": 299534,
"popularity": 38.659,
"vote_average": 8.3,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Science Fiction",
"Action"
],
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg",
"vote_count": 10497
"balance": "$3,344.40",
"age": 35,
"color": "blue",
"name": "Adeline Flynn",
"gender": "female",
"email": "adelineflynn@chorizon.com",
"phone": "+1 (994) 600-2840",
"address": "428 Paerdegat Avenue, Hollymead, Pennsylvania, 948"
},
{
"id": 299536,
"popularity": 65.013,
"vote_average": 8.3,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Action",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg",
"vote_count": 16056
"balance": "$3,394.96",
"age": 25,
"color": "blue",
"name": "Aida Kirby",
"gender": "female",
"email": "aidakirby@chorizon.com",
"phone": "+1 (942) 532-2325",
"address": "797 Engert Avenue, Wilsonia, Idaho, 6532"
}
]);
@ -112,8 +86,7 @@ async fn search_with_settings_basic() {
#[actix_rt::test]
async fn search_with_settings_stop_words() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@ -122,93 +95,67 @@ async fn search_with_settings_stop_words() {
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"desc(age)",
"exactness",
"desc(vote_average)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres"
"name",
"age",
"color",
"gender",
"email",
"address",
"about"
],
"displayedAttributes": [
"title",
"director",
"producer",
"tagline",
"genres",
"id",
"overview",
"vote_count",
"vote_average",
"poster_path",
"popularity"
"name",
"age",
"gender",
"color",
"email",
"phone",
"address",
"balance"
],
"stopWords": ["the"],
"stopWords": ["ea"],
"synonyms": null,
"acceptNewFields": false,
});
server.update_all_settings(config).await;
let query = "q=the%20avangers&limit=3";
let query = "q=ea%20exercitation&limit=3";
let expect = json!([
{
"id": 299536,
"popularity": 65.013,
"vote_average": 8.3,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Action",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg",
"vote_count": 16056
"balance": "$1,921.58",
"age": 31,
"color": "green",
"name": "Harper Carson",
"gender": "male",
"email": "harpercarson@chorizon.com",
"phone": "+1 (912) 430-3243",
"address": "883 Dennett Place, Knowlton, New Mexico, 9219"
},
{
"id": 299534,
"popularity": 38.659,
"vote_average": 8.3,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Science Fiction",
"Action"
],
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg",
"vote_count": 10497
"balance": "$1,706.13",
"age": 27,
"color": "green",
"name": "Cherry Orr",
"gender": "female",
"email": "cherryorr@chorizon.com",
"phone": "+1 (995) 479-3174",
"address": "442 Beverly Road, Ventress, New Mexico, 3361"
},
{
"id": 99861,
"popularity": 33.938,
"vote_average": 7.3,
"title": "Avengers: Age of Ultron",
"tagline": "A New Age Has Come.",
"overview": "When Tony Stark tries to jumpstart a dormant peacekeeping program, things go awry and Earth’s Mightiest Heroes are put to the ultimate test as the fate of the planet hangs in the balance. As the villainous Ultron emerges, it is up to The Avengers to stop him from enacting his terrible plans, and soon uneasy alliances and unexpected action pave the way for an epic and unique global adventure.",
"director": "Joss Whedon",
"producer": "Kevin Feige",
"genres": [
"Action",
"Adventure",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/t90Y3G8UGQp0f0DrP60wRu9gfrH.jpg",
"vote_count": 14661
"balance": "$1,476.39",
"age": 28,
"color": "brown",
"name": "Maureen Dale",
"gender": "female",
"email": "maureendale@chorizon.com",
"phone": "+1 (984) 538-3684",
"address": "817 Newton Street, Bannock, Wyoming, 1468"
}
]);
@@ -218,8 +165,7 @@ async fn search_with_settings_stop_words() {
#[actix_rt::test]
async fn search_with_settings_synonyms() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@@ -228,98 +174,71 @@ async fn search_with_settings_synonyms() {
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"desc(age)",
"exactness",
"desc(vote_average)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres"
"name",
"age",
"color",
"gender",
"email",
"address",
"about"
],
"displayedAttributes": [
"title",
"director",
"producer",
"tagline",
"genres",
"id",
"overview",
"vote_count",
"vote_average",
"poster_path",
"popularity"
"name",
"age",
"gender",
"color",
"email",
"phone",
"address",
"balance"
],
"stopWords": null,
"synonyms": {
"avangers": [
"Captain America",
"Iron Man"
]
"application": [
"exercitation"
]
},
"acceptNewFields": false,
});
server.update_all_settings(config).await;
let query = "q=avangers&limit=3";
let query = "q=application&limit=3";
let expect = json!([
{
"id": 299536,
"popularity": 65.013,
"vote_average": 8.3,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Action",
"Science Fiction"
],
"vote_count": 16056,
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg"
"balance": "$1,921.58",
"age": 31,
"color": "green",
"name": "Harper Carson",
"gender": "male",
"email": "harpercarson@chorizon.com",
"phone": "+1 (912) 430-3243",
"address": "883 Dennett Place, Knowlton, New Mexico, 9219"
},
{
"id": 299534,
"popularity": 38.659,
"vote_average": 8.3,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Science Fiction",
"Action"
],
"vote_count": 10497,
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg"
"balance": "$1,706.13",
"age": 27,
"color": "green",
"name": "Cherry Orr",
"gender": "female",
"email": "cherryorr@chorizon.com",
"phone": "+1 (995) 479-3174",
"address": "442 Beverly Road, Ventress, New Mexico, 3361"
},
{
"id": 99861,
"popularity": 33.938,
"vote_average": 7.3,
"title": "Avengers: Age of Ultron",
"tagline": "A New Age Has Come.",
"overview": "When Tony Stark tries to jumpstart a dormant peacekeeping program, things go awry and Earth’s Mightiest Heroes are put to the ultimate test as the fate of the planet hangs in the balance. As the villainous Ultron emerges, it is up to The Avengers to stop him from enacting his terrible plans, and soon uneasy alliances and unexpected action pave the way for an epic and unique global adventure.",
"director": "Joss Whedon",
"producer": "Kevin Feige",
"genres": [
"Action",
"Adventure",
"Science Fiction"
],
"vote_count": 14661,
"poster_path": "https://image.tmdb.org/t/p/w500/t90Y3G8UGQp0f0DrP60wRu9gfrH.jpg"
"balance": "$1,476.39",
"age": 28,
"color": "brown",
"name": "Maureen Dale",
"gender": "female",
"email": "maureendale@chorizon.com",
"phone": "+1 (984) 538-3684",
"address": "817 Newton Street, Bannock, Wyoming, 1468"
}
]);
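
With the synonym { "application": ["exercitation"] } registered above, a query for "application" also returns documents that only contain "exercitation". A minimal standalone sketch of that query-time expansion (the helper below is illustrative, not MeiliSearch's internal API):

use std::collections::HashMap;

fn expand_query(word: &str, synonyms: &HashMap<&str, Vec<&str>>) -> Vec<String> {
    // search for the word itself plus every synonym registered for it
    let mut terms = vec![word.to_string()];
    if let Some(alternatives) = synonyms.get(word) {
        terms.extend(alternatives.iter().map(|s| s.to_string()));
    }
    terms
}

fn main() {
    let mut synonyms = HashMap::new();
    synonyms.insert("application", vec!["exercitation"]);
    assert_eq!(expand_query("application", &synonyms), ["application", "exercitation"]);
}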
@@ -329,8 +248,7 @@ async fn search_with_settings_synonyms() {
#[actix_rt::test]
async fn search_with_settings_ranking_rules() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@@ -339,104 +257,78 @@ async fn search_with_settings_ranking_rules() {
"proximity",
"attribute",
"wordsPosition",
"asc(vote_average)",
"desc(age)",
"exactness",
"desc(popularity)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres"
"name",
"age",
"color",
"gender",
"email",
"address",
"about"
],
"displayedAttributes": [
"title",
"director",
"producer",
"tagline",
"genres",
"id",
"overview",
"vote_count",
"vote_average",
"poster_path",
"popularity"
"name",
"age",
"gender",
"color",
"email",
"phone",
"address",
"balance"
],
"stopWords": null,
"synonyms": null,
"acceptNewFields": false,
});
server.update_all_settings(config).await;
let query = "q=avangers&limit=3";
let query = "q=exarcitation&limit=3";
let expect = json!([
{
"id": 99861,
"popularity": 33.938,
"vote_average": 7.3,
"title": "Avengers: Age of Ultron",
"tagline": "A New Age Has Come.",
"overview": "When Tony Stark tries to jumpstart a dormant peacekeeping program, things go awry and Earth’s Mightiest Heroes are put to the ultimate test as the fate of the planet hangs in the balance. As the villainous Ultron emerges, it is up to The Avengers to stop him from enacting his terrible plans, and soon uneasy alliances and unexpected action pave the way for an epic and unique global adventure.",
"director": "Joss Whedon",
"producer": "Kevin Feige",
"genres": [
"Action",
"Adventure",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/t90Y3G8UGQp0f0DrP60wRu9gfrH.jpg",
"vote_count": 14661
"balance": "$1,921.58",
"age": 31,
"color": "green",
"name": "Harper Carson",
"gender": "male",
"email": "harpercarson@chorizon.com",
"phone": "+1 (912) 430-3243",
"address": "883 Dennett Place, Knowlton, New Mexico, 9219"
},
{
"id": 299536,
"popularity": 65.013,
"vote_average": 8.3,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Action",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg",
"vote_count": 16056
"balance": "$1,706.13",
"age": 27,
"color": "green",
"name": "Cherry Orr",
"gender": "female",
"email": "cherryorr@chorizon.com",
"phone": "+1 (995) 479-3174",
"address": "442 Beverly Road, Ventress, New Mexico, 3361"
},
{
"id": 299534,
"popularity": 38.659,
"vote_average": 8.3,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Science Fiction",
"Action"
],
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg",
"vote_count": 10497
"balance": "$1,476.39",
"age": 28,
"color": "brown",
"name": "Maureen Dale",
"gender": "female",
"email": "maureendale@chorizon.com",
"phone": "+1 (984) 538-3684",
"address": "817 Newton Street, Bannock, Wyoming, 1468"
}
]);
let (response, _status_code) = server.search_get(query).await;
println!("{}", response["hits"].clone());
assert_json_eq!(expect, response["hits"].clone(), ordered: false);
}
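
The custom rule desc(age) used above only applies when the built-in rules (typo, words, proximity, attribute, wordsPosition) leave candidates tied; it then orders them by the age field, highest first. A rough sketch of that tie-breaking comparator (illustrative only, not the engine's actual bucket sort):

use std::cmp::Ordering;

struct Hit {
    name: &'static str,
    builtin_rank: u32, // score from the earlier, built-in ranking rules
    age: u32,
}

fn main() {
    let mut hits = vec![
        Hit { name: "Cherry Orr", builtin_rank: 1, age: 27 },
        Hit { name: "Harper Carson", builtin_rank: 1, age: 31 },
        Hit { name: "Maureen Dale", builtin_rank: 2, age: 28 },
    ];
    // earlier rules decide first; desc(age) breaks the remaining ties
    hits.sort_by(|a, b| match a.builtin_rank.cmp(&b.builtin_rank) {
        Ordering::Equal => b.age.cmp(&a.age),
        other => other,
    });
    assert_eq!(hits[0].name, "Harper Carson");
}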
#[actix_rt::test]
async fn search_with_settings_searchable_attributes() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@@ -445,92 +337,59 @@ async fn search_with_settings_searchable_attributes() {
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"desc(age)",
"exactness",
"desc(vote_average)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres"
"age",
"color",
"gender",
"address",
"about"
],
"displayedAttributes": [
"title",
"director",
"producer",
"tagline",
"genres",
"id",
"overview",
"vote_count",
"vote_average",
"poster_path",
"popularity"
"name",
"age",
"gender",
"color",
"email",
"phone",
"address",
"balance"
],
"stopWords": null,
"synonyms": null,
"acceptNewFields": false,
"synonyms": {
"exarcitation": [
"exercitation"
]
},
});
server.update_all_settings(config).await;
let query = "q=avangers&limit=3";
let query = "q=Carol&limit=3";
let expect = json!([
{
"id": 299536,
"popularity": 65.013,
"vote_average": 8.3,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Action",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg",
"vote_count": 16056
"balance": "$1,440.09",
"age": 40,
"color": "blue",
"name": "Levy Whitley",
"gender": "male",
"email": "levywhitley@chorizon.com",
"phone": "+1 (911) 458-2411",
"address": "187 Thomas Street, Hachita, North Carolina, 2989"
},
{
"id": 299534,
"popularity": 38.659,
"vote_average": 8.3,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Adventure",
"Science Fiction",
"Action"
],
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg",
"vote_count": 10497
},
{
"id": 100402,
"popularity": 16.418,
"vote_average": 7.7,
"title": "Captain America: The Winter Soldier",
"tagline": "In heroes we trust.",
"overview": "After the cataclysmic events in New York with The Avengers, Steve Rogers, aka Captain America is living quietly in Washington, D.C. and trying to adjust to the modern world. But when a S.H.I.E.L.D. colleague comes under attack, Steve becomes embroiled in a web of intrigue that threatens to put the world at risk. Joining forces with the Black Widow, Captain America struggles to expose the ever-widening conspiracy while fighting off professional assassins sent to silence him at every turn. When the full scope of the villainous plot is revealed, Captain America and the Black Widow enlist the help of a new ally, the Falcon. However, they soon find themselves up against an unexpected and formidable enemy—the Winter Soldier.",
"director": "Anthony Russo",
"producer": "Kevin Feige",
"genres": [
"Action",
"Adventure",
"Science Fiction"
],
"poster_path": "https://image.tmdb.org/t/p/w500/5TQ6YDmymBpnF005OyoB7ohZps9.jpg",
"vote_count": 11972
"balance": "$1,977.66",
"age": 36,
"color": "brown",
"name": "Combs Stanley",
"gender": "male",
"email": "combsstanley@chorizon.com",
"phone": "+1 (827) 419-2053",
"address": "153 Beverley Road, Siglerville, South Carolina, 3666"
}
]);
@@ -540,8 +399,7 @@ async fn search_with_settings_searchable_attributes() {
#[actix_rt::test]
async fn search_with_settings_displayed_attributes() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@@ -550,57 +408,57 @@ async fn search_with_settings_displayed_attributes() {
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"desc(age)",
"exactness",
"desc(vote_average)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"tagline",
"overview",
"cast",
"director",
"producer",
"production_companies",
"genres"
"age",
"color",
"gender",
"address",
"about"
],
"displayedAttributes": [
"title",
"tagline",
"id",
"overview",
"poster_path"
"name",
"age",
"gender",
"color",
"email",
"phone"
],
"stopWords": null,
"synonyms": null,
"acceptNewFields": false,
});
server.update_all_settings(config).await;
let query = "q=avangers&limit=3";
let query = "q=exercitation&limit=3";
let expect = json!([
{
"id": 299536,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg"
"age": 31,
"color": "green",
"name": "Harper Carson",
"gender": "male",
"email": "harpercarson@chorizon.com",
"phone": "+1 (912) 430-3243"
},
{
"id": 299534,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg"
"age": 27,
"color": "green",
"name": "Cherry Orr",
"gender": "female",
"email": "cherryorr@chorizon.com",
"phone": "+1 (995) 479-3174"
},
{
"id": 99861,
"title": "Avengers: Age of Ultron",
"tagline": "A New Age Has Come.",
"overview": "When Tony Stark tries to jumpstart a dormant peacekeeping program, things go awry and Earth’s Mightiest Heroes are put to the ultimate test as the fate of the planet hangs in the balance. As the villainous Ultron emerges, it is up to The Avengers to stop him from enacting his terrible plans, and soon uneasy alliances and unexpected action pave the way for an epic and unique global adventure.",
"poster_path": "https://image.tmdb.org/t/p/w500/t90Y3G8UGQp0f0DrP60wRu9gfrH.jpg"
"age": 28,
"color": "brown",
"name": "Maureen Dale",
"gender": "female",
"email": "maureendale@chorizon.com",
"phone": "+1 (984) 538-3684"
}
]);
@@ -610,8 +468,7 @@ async fn search_with_settings_displayed_attributes() {
#[actix_rt::test]
async fn search_with_settings_searchable_attributes_2() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
let config = json!({
"rankingRules": [
@@ -620,57 +477,45 @@ async fn search_with_settings_searchable_attributes_2() {
"proximity",
"attribute",
"wordsPosition",
"desc(popularity)",
"desc(age)",
"exactness",
"desc(vote_average)"
"desc(balance)"
],
"distinctAttribute": null,
"searchableAttributes": [
"tagline",
"overview",
"title",
"cast",
"director",
"producer",
"production_companies",
"genres"
"age",
"color",
"gender",
"address",
"about"
],
"displayedAttributes": [
"title",
"tagline",
"id",
"overview",
"poster_path"
"name",
"age",
"gender"
],
"stopWords": null,
"synonyms": null,
"acceptNewFields": false,
});
server.update_all_settings(config).await;
let query = "q=avangers&limit=3";
let query = "q=exercitation&limit=3";
let expect = json!([
{
"id": 299536,
"title": "Avengers: Infinity War",
"tagline": "An entire universe. Once and for all.",
"overview": "As the Avengers and their allies have continued to protect the world from threats too large for any one hero to handle, a new danger has emerged from the cosmic shadows: Thanos. A despot of intergalactic infamy, his goal is to collect all six Infinity Stones, artifacts of unimaginable power, and use them to inflict his twisted will on all of reality. Everything the Avengers have fought for has led up to this moment - the fate of Earth and existence itself has never been more uncertain.",
"poster_path": "https://image.tmdb.org/t/p/w500/7WsyChQLEftFiDOVTGkv3hFpyyt.jpg"
"age": 31,
"name": "Harper Carson",
"gender": "male"
},
{
"id": 299534,
"title": "Avengers: Endgame",
"tagline": "Part of the journey is the end.",
"overview": "After the devastating events of Avengers: Infinity War, the universe is in ruins due to the efforts of the Mad Titan, Thanos. With the help of remaining allies, the Avengers must assemble once more in order to undo Thanos' actions and restore order to the universe once and for all, no matter what consequences may be in store.",
"poster_path": "https://image.tmdb.org/t/p/w500/or06FN3Dka5tukK1e9sl16pB3iy.jpg"
"age": 27,
"name": "Cherry Orr",
"gender": "female"
},
{
"id": 100402,
"title": "Captain America: The Winter Soldier",
"tagline": "In heroes we trust.",
"overview": "After the cataclysmic events in New York with The Avengers, Steve Rogers, aka Captain America is living quietly in Washington, D.C. and trying to adjust to the modern world. But when a S.H.I.E.L.D. colleague comes under attack, Steve becomes embroiled in a web of intrigue that threatens to put the world at risk. Joining forces with the Black Widow, Captain America struggles to expose the ever-widening conspiracy while fighting off professional assassins sent to silence him at every turn. When the full scope of the villainous plot is revealed, Captain America and the Black Widow enlist the help of a new ally, the Falcon. However, they soon find themselves up against an unexpected and formidable enemy—the Winter Soldier.",
"poster_path": "https://image.tmdb.org/t/p/w500/5TQ6YDmymBpnF005OyoB7ohZps9.jpg"
"age": 28,
"name": "Maureen Dale",
"gender": "female"
}
]);
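
Note that searchableAttributes and displayedAttributes are independent: in this test, name is displayed but not searchable, so matching happens on fields like about while each hit exposes only name, age and gender. A small sketch of that projection step (hypothetical helper, not the server's code):

use std::collections::HashMap;

fn main() {
    let mut document = HashMap::new();
    document.insert("name", "Harper Carson");
    document.insert("age", "31");
    document.insert("email", "harpercarson@chorizon.com");

    // keep only the displayed attributes when building the returned hit
    let displayed = ["name", "age", "gender"];
    let hit: HashMap<_, _> = document
        .into_iter()
        .filter(|(key, _)| displayed.contains(key))
        .collect();

    assert_eq!(hit.len(), 2); // "email" is filtered out, "gender" is absent from the document
}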


@@ -1,13 +1,11 @@
use assert_json_diff::assert_json_eq;
use serde_json::json;
use std::convert::Into;
mod common;
#[actix_rt::test]
async fn write_all_and_delete() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
// 2 - Send the settings
let body = json!({
@@ -18,40 +16,40 @@ async fn write_all_and_delete() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(rank)",
"desc(registered)",
"desc(age)",
],
"distinctAttribute": "movie_id",
"distinctAttribute": "id",
"searchableAttributes": [
"id",
"movie_id",
"title",
"description",
"poster",
"release_date",
"rank",
"name",
"color",
"gender",
"email",
"phone",
"address",
"registered",
"about"
],
"displayedAttributes": [
"title",
"description",
"poster",
"release_date",
"rank",
"name",
"gender",
"email",
"registered",
"age",
],
"stopWords": [
"the",
"a",
"an",
"ad",
"in",
"ut",
],
"synonyms": {
"wolverine": ["xmen", "logan"],
"logan": ["wolverine"],
"road": ["street", "avenue"],
"street": ["avenue"],
},
"attributesForFaceting": ["title"],
"acceptNewFields": false,
"attributesForFaceting": ["name"],
});
server.update_all_settings(body.clone()).await;
// 3 - Get all settings and compare to the previous one
@@ -78,50 +76,11 @@ async fn write_all_and_delete() {
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [
"poster_path",
"director",
"id",
"production_companies",
"producer",
"poster",
"movie_id",
"vote_count",
"cast",
"release_date",
"vote_average",
"rank",
"genres",
"overview",
"description",
"tagline",
"popularity",
"title"
],
"displayedAttributes": [
"poster_path",
"poster",
"vote_count",
"id",
"movie_id",
"title",
"rank",
"tagline",
"cast",
"producer",
"production_companies",
"description",
"director",
"genres",
"release_date",
"overview",
"vote_average",
"popularity"
],
"searchableAttributes": ["*"],
"displayedAttributes": ["*"],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [],
"acceptNewFields": true,
});
assert_json_eq!(expect, response, ordered: false);
@@ -129,8 +88,7 @@ async fn write_all_and_delete() {
#[actix_rt::test]
async fn write_all_and_update() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
// 2 - Send the settings
@@ -142,37 +100,38 @@ async fn write_all_and_update() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(rank)",
"desc(registered)",
"desc(age)",
],
"distinctAttribute": "movie_id",
"distinctAttribute": "id",
"searchableAttributes": [
"uid",
"movie_id",
"title",
"description",
"poster",
"release_date",
"rank",
"id",
"name",
"color",
"gender",
"email",
"phone",
"address",
"registered",
"about"
],
"displayedAttributes": [
"title",
"description",
"poster",
"release_date",
"rank",
"name",
"gender",
"email",
"registered",
"age",
],
"stopWords": [
"the",
"a",
"an",
"ad",
"in",
"ut",
],
"synonyms": {
"wolverine": ["xmen", "logan"],
"logan": ["wolverine"],
"road": ["street", "avenue"],
"street": ["avenue"],
},
"attributesForFaceting": ["title"],
"acceptNewFields": false,
"attributesForFaceting": ["name"],
});
server.update_all_settings(body.clone()).await;
@@ -193,28 +152,27 @@ async fn write_all_and_update() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(age)",
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"description",
"uid",
"name",
"color",
"age",
],
"displayedAttributes": [
"title",
"description",
"release_date",
"rank",
"poster",
"name",
"color",
"age",
"registered",
"picture",
],
"stopWords": [],
"synonyms": {
"wolverine": ["xmen", "logan"],
"logan": ["wolverine", "xmen"],
"road": ["street", "avenue"],
"street": ["avenue"],
},
"attributesForFaceting": ["title"],
"acceptNewFields": false,
});
server.update_all_settings(body).await;
@@ -231,28 +189,27 @@ async fn write_all_and_update() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(age)",
],
"distinctAttribute": null,
"searchableAttributes": [
"title",
"description",
"uid",
"name",
"color",
"age",
],
"displayedAttributes": [
"title",
"description",
"release_date",
"rank",
"poster",
"name",
"color",
"age",
"registered",
"picture",
],
"stopWords": [],
"synonyms": {
"wolverine": ["xmen", "logan"],
"logan": ["wolverine", "xmen"],
"road": ["street", "avenue"],
"street": ["avenue"],
},
"attributesForFaceting": ["title"],
"acceptNewFields": false
});
assert_json_eq!(expected, response, ordered: false);
@@ -260,9 +217,9 @@ async fn write_all_and_update() {
#[actix_rt::test]
async fn test_default_settings() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
});
server.create_index(body).await;
@@ -278,12 +235,11 @@ async fn test_default_settings() {
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [],
"displayedAttributes": [],
"searchableAttributes": ["*"],
"displayedAttributes": ["*"],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [],
"acceptNewFields": true,
});
let (response, _status_code) = server.get_all_settings().await;
@@ -293,9 +249,9 @@ async fn test_default_settings() {
#[actix_rt::test]
async fn test_default_settings_2() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
"primaryKey": "id",
});
server.create_index(body).await;
@@ -312,16 +268,11 @@ async fn test_default_settings_2() {
"exactness"
],
"distinctAttribute": null,
"searchableAttributes": [
"id"
],
"displayedAttributes": [
"id"
],
"searchableAttributes": ["*"],
"displayedAttributes": ["*"],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [],
"acceptNewFields": true,
});
let (response, _status_code) = server.get_all_settings().await;
@@ -332,9 +283,9 @@ async fn test_default_settings_2() {
// Test issue https://github.com/meilisearch/MeiliSearch/issues/516
#[actix_rt::test]
async fn write_setting_and_update_partial() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
});
server.create_index(body).await;
@@ -342,20 +293,21 @@ async fn write_setting_and_update_partial() {
let body = json!({
"searchableAttributes": [
"uid",
"movie_id",
"title",
"description",
"poster",
"release_date",
"rank",
"id",
"name",
"color",
"gender",
"email",
"phone",
"address",
"about"
],
"displayedAttributes": [
"title",
"description",
"poster",
"release_date",
"rank",
"name",
"gender",
"email",
"registered",
"age",
]
});
@@ -371,20 +323,19 @@ async fn write_setting_and_update_partial() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(rank)",
"desc(age)",
"desc(registered)",
],
"distinctAttribute": "movie_id",
"distinctAttribute": "id",
"stopWords": [
"the",
"a",
"an",
"ad",
"in",
"ut",
],
"synonyms": {
"wolverine": ["xmen", "logan"],
"logan": ["wolverine"],
"road": ["street", "avenue"],
"street": ["avenue"],
},
"acceptNewFields": false,
});
server.update_all_settings(body.clone()).await;
@@ -399,37 +350,37 @@ async fn write_setting_and_update_partial() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(rank)",
"desc(age)",
"desc(registered)",
],
"distinctAttribute": "movie_id",
"distinctAttribute": "id",
"searchableAttributes": [
"uid",
"movie_id",
"title",
"description",
"poster",
"release_date",
"rank",
"id",
"name",
"color",
"gender",
"email",
"phone",
"address",
"about"
],
"displayedAttributes": [
"title",
"description",
"poster",
"release_date",
"rank",
"name",
"gender",
"email",
"registered",
"age",
],
"stopWords": [
"the",
"a",
"an",
"ad",
"in",
"ut",
],
"synonyms": {
"wolverine": ["xmen", "logan"],
"logan": ["wolverine"],
"road": ["street", "avenue"],
"street": ["avenue"],
},
"attributesForFaceting": [],
"acceptNewFields": false,
});
let (response, _status_code) = server.get_all_settings().await;
@@ -471,17 +422,49 @@ async fn setting_ranking_rules_dont_mess_with_other_settings() {
}
#[actix_rt::test]
async fn distinct_attribute_recorded_as_known_field() {
let mut server = common::Server::test_server().await;
let body = json!({
"distinctAttribute": "foobar",
"acceptNewFields": true
});
server.update_all_settings(body).await;
let document = json!([{"id": 9348127, "foobar": "hello", "foo": "bar"}]);
server.add_or_update_multiple_documents(document).await;
// foobar should not be added to the searchable attributes because it is already known, but "foo" should
let (response, _) = server.get_all_settings().await;
assert!(response["searchableAttributes"].as_array().unwrap().iter().any(|v| v.as_str().unwrap() == "foo"));
assert!(!response["searchableAttributes"].as_array().unwrap().iter().any(|v| v.as_str().unwrap() == "foobar"));
async fn displayed_and_searchable_attributes_reset_to_wildcard() {
let mut server = common::Server::test_server().await;
server.update_all_settings(json!({ "searchableAttributes": ["color"], "displayedAttributes": ["color"] })).await;
let (response, _) = server.get_all_settings().await;
assert_eq!(response["searchableAttributes"].as_array().unwrap()[0], "color");
assert_eq!(response["displayedAttributes"].as_array().unwrap()[0], "color");
server.delete_searchable_attributes().await;
server.delete_displayed_attributes().await;
let (response, _) = server.get_all_settings().await;
assert_eq!(response["searchableAttributes"].as_array().unwrap().len(), 1);
assert_eq!(response["displayedAttributes"].as_array().unwrap().len(), 1);
assert_eq!(response["searchableAttributes"].as_array().unwrap()[0], "*");
assert_eq!(response["displayedAttributes"].as_array().unwrap()[0], "*");
let mut server = common::Server::test_server().await;
server.update_all_settings(json!({ "searchableAttributes": ["color"], "displayedAttributes": ["color"] })).await;
let (response, _) = server.get_all_settings().await;
assert_eq!(response["searchableAttributes"].as_array().unwrap()[0], "color");
assert_eq!(response["displayedAttributes"].as_array().unwrap()[0], "color");
server.update_all_settings(json!({ "searchableAttributes": [], "displayedAttributes": [] })).await;
let (response, _) = server.get_all_settings().await;
assert_eq!(response["searchableAttributes"].as_array().unwrap().len(), 1);
assert_eq!(response["displayedAttributes"].as_array().unwrap().len(), 1);
assert_eq!(response["searchableAttributes"].as_array().unwrap()[0], "*");
assert_eq!(response["displayedAttributes"].as_array().unwrap()[0], "*");
}
#[actix_rt::test]
async fn settings_that_contains_wildcard_is_wildcard() {
let mut server = common::Server::test_server().await;
server.update_all_settings(json!({ "searchableAttributes": ["color", "*"], "displayedAttributes": ["color", "*"] })).await;
let (response, _) = server.get_all_settings().await;
assert_eq!(response["searchableAttributes"].as_array().unwrap().len(), 1);
assert_eq!(response["displayedAttributes"].as_array().unwrap().len(), 1);
assert_eq!(response["searchableAttributes"].as_array().unwrap()[0], "*");
assert_eq!(response["displayedAttributes"].as_array().unwrap()[0], "*");
}
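
The tests above pin down one normalization rule: deleting the setting, sending an empty list, or sending any list that contains "*" all collapse to the single wildcard ["*"]. A standalone sketch of that rule (hypothetical helper, not the actual server code):

fn normalize_attributes(attributes: Vec<String>) -> Vec<String> {
    // an empty list or any occurrence of "*" means "all fields"
    if attributes.is_empty() || attributes.iter().any(|a| a == "*") {
        vec!["*".to_string()]
    } else {
        attributes
    }
}

fn main() {
    assert_eq!(normalize_attributes(vec![]), ["*"]);
    assert_eq!(normalize_attributes(vec!["color".into(), "*".into()]), ["*"]);
    assert_eq!(normalize_attributes(vec!["color".into()]), ["color"]);
}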


@@ -1,349 +0,0 @@
use assert_json_diff::assert_json_eq;
use serde_json::json;
mod common;
#[actix_rt::test]
async fn index_new_fields_default() {
let mut server = common::Server::with_uid("movies");
let body = json!({
"uid": "movies",
"primaryKey": "id",
});
server.create_index(body).await;
// 1 - Add a document
let body = json!([{
"id": 1,
"title": "I'm a legend",
}]);
server.add_or_replace_multiple_documents(body).await;
// 2 - Get the complete document
let expected = json!({
"id": 1,
"title": "I'm a legend",
});
let (response, status_code) = server.get_document(1).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
// 3 - Add a document with more fields
let body = json!([{
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
}]);
server.add_or_replace_multiple_documents(body).await;
// 4 - Get the complete document
let expected = json!({
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
});
let (response, status_code) = server.get_document(2).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
}
#[actix_rt::test]
async fn index_new_fields_true() {
let mut server = common::Server::with_uid("movies");
let body = json!({
"uid": "movies",
"primaryKey": "id",
});
server.create_index(body).await;
// 1 - Set indexNewFields = true
server.update_accept_new_fields(json!(true)).await;
// 2 - Add a document
let body = json!([{
"id": 1,
"title": "I'm a legend",
}]);
server.add_or_replace_multiple_documents(body).await;
// 3 - Get the complete document
let expected = json!({
"id": 1,
"title": "I'm a legend",
});
let (response, status_code) = server.get_document(1).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
// 4 - Add a document with more fields
let body = json!([{
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
}]);
server.add_or_replace_multiple_documents(body).await;
// 5 - Get the complete document
let expected = json!({
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
});
let (response, status_code) = server.get_document(2).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
}
#[actix_rt::test]
async fn index_new_fields_false() {
let mut server = common::Server::with_uid("movies");
let body = json!({
"uid": "movies",
"primaryKey": "id",
});
server.create_index(body).await;
// 1 - Set indexNewFields = false
server.update_accept_new_fields(json!(false)).await;
// 2 - Add a document
let body = json!([{
"id": 1,
"title": "I'm a legend",
}]);
server.add_or_replace_multiple_documents(body).await;
// 3 - Get the complete document
let expected = json!({
"id": 1,
});
let (response, status_code) = server.get_document(1).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
// 4 - Add a document with more fields
let body = json!([{
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
}]);
server.add_or_replace_multiple_documents(body).await;
// 5 - Get the complete document
let expected = json!({
"id": 2,
});
let (response, status_code) = server.get_document(2).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
}
#[actix_rt::test]
async fn index_new_fields_true_then_false() {
let mut server = common::Server::with_uid("movies");
let body = json!({
"uid": "movies",
"primaryKey": "id",
});
server.create_index(body).await;
// 1 - Set indexNewFields = true
server.update_accept_new_fields(json!(true)).await;
// 2 - Add a document
let body = json!([{
"id": 1,
"title": "I'm a legend",
}]);
server.add_or_replace_multiple_documents(body).await;
// 3 - Get the complete document
let expected = json!({
"id": 1,
"title": "I'm a legend",
});
let (response, status_code) = server.get_document(1).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
// 4 - Set indexNewFields = false
server.update_accept_new_fields(json!(false)).await;
// 5 - Add a document with more fields
let body = json!([{
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
}]);
server.add_or_replace_multiple_documents(body).await;
// 6 - Get the complete document
let expected = json!({
"id": 2,
"title": "I'm not a legend",
});
let (response, status_code) = server.get_document(2).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
}
#[actix_rt::test]
async fn index_new_fields_false_then_true() {
let mut server = common::Server::with_uid("movies");
let body = json!({
"uid": "movies",
"primaryKey": "id",
});
server.create_index(body).await;
// 1 - Set indexNewFields = false
server.update_accept_new_fields(json!(false)).await;
// 2 - Add a document
let body = json!([{
"id": 1,
"title": "I'm a legend",
}]);
server.add_or_replace_multiple_documents(body).await;
// 3 - Get the complete document
let expected = json!({
"id": 1,
});
let (response, status_code) = server.get_document(1).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
// 4 - Set indexNewFields = true
server.update_accept_new_fields(json!(true)).await;
// 5 - Add a document with more fields
let body = json!([{
"id": 2,
"title": "I'm not a legend",
"description": "A bad copy of the original movie I'm a lengend"
}]);
server.add_or_replace_multiple_documents(body).await;
// 6 - Get the complete document
let expected = json!({
"id": 1,
});
let (response, status_code) = server.get_document(1).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
let expected = json!({
"id": 2,
"description": "A bad copy of the original movie I'm a lengend"
});
let (response, status_code) = server.get_document(2).await;
assert_eq!(status_code, 200);
assert_json_eq!(response, expected);
}
// Fix issue https://github.com/meilisearch/MeiliSearch/issues/518
#[actix_rt::test]
async fn accept_new_fields_does_not_take_into_account_the_primary_key() {
let mut server = common::Server::with_uid("movies");
// 1 - Create an index with no primary-key
let body = json!({
"uid": "movies",
});
let (response, status_code) = server.create_index(body).await;
assert_eq!(status_code, 201);
assert_eq!(response["primaryKey"], json!(null));
// 2 - Set searchable and displayed attributes to ["title"] and acceptNewFields to false
let body = json!({
"searchableAttributes": ["title"],
"displayedAttributes": ["title"],
"acceptNewFields": false,
});
server.update_all_settings(body).await;
// 4 - Add a document
let body = json!([{
"id": 1,
"title": "Test",
"comment": "comment test"
}]);
server.add_or_replace_multiple_documents(body).await;
// 5 - Get settings, they should not have changed
let (response, _status_code) = server.get_all_settings().await;
let expected = json!({
"rankingRules": [
"typo",
"words",
"proximity",
"attribute",
"wordsPosition",
"exactness",
],
"distinctAttribute": null,
"searchableAttributes": ["title"],
"displayedAttributes": ["title"],
"stopWords": [],
"synonyms": {},
"attributesForFaceting": [],
"acceptNewFields": false,
});
assert_json_eq!(response, expected, ordered: false);
}


@@ -5,8 +5,7 @@ mod common;
#[actix_rt::test]
async fn write_all_and_delete() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
// 2 - Send the settings
@@ -17,8 +16,8 @@ async fn write_all_and_delete() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(rank)",
"desc(registered)",
"desc(age)",
]);
server.update_ranking_rules(body.clone()).await;
@@ -51,8 +50,7 @@ async fn write_all_and_delete() {
#[actix_rt::test]
async fn write_all_and_update() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
// 2 - Send the settings
@@ -63,8 +61,8 @@ async fn write_all_and_update() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(rank)",
"desc(registered)",
"desc(age)",
]);
server.update_ranking_rules(body.clone()).await;
@@ -84,7 +82,7 @@ async fn write_all_and_update() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(registered)",
]);
server.update_ranking_rules(body).await;
@@ -100,7 +98,7 @@ async fn write_all_and_update() {
"attribute",
"wordsPosition",
"exactness",
"desc(release_date)",
"desc(registered)",
]);
assert_json_eq!(expected, response, ordered: false);
@@ -108,9 +106,9 @@ async fn write_all_and_update() {
#[actix_rt::test]
async fn send_undefined_rule() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
"primaryKey": "id",
});
server.create_index(body).await;
@@ -123,9 +121,9 @@ async fn send_undefined_rule() {
#[actix_rt::test]
async fn send_malformed_custom_rule() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
"primaryKey": "id",
});
server.create_index(body).await;
@@ -139,16 +137,16 @@ async fn send_malformed_custom_rule() {
// Test issue https://github.com/meilisearch/MeiliSearch/issues/521
#[actix_rt::test]
async fn write_custom_ranking_and_index_documents() {
let mut server = common::Server::with_uid("movies");
let mut server = common::Server::with_uid("test");
let body = json!({
"uid": "movies",
"uid": "test",
"primaryKey": "id",
});
server.create_index(body).await;
// 1 - Add ranking rules with one custom ranking on a string
let body = json!(["asc(title)", "typo"]);
let body = json!(["asc(name)", "typo"]);
server.update_ranking_rules(body).await;
@@ -157,13 +155,13 @@ async fn write_custom_ranking_and_index_documents() {
let body = json!([
{
"id": 1,
"title": "Le Petit Prince",
"author": "Exupéry"
"name": "Cherry Orr",
"color": "green"
},
{
"id": 2,
"title": "Pride and Prejudice",
"author": "Jane Austen"
"name": "Lucas Hess",
"color": "yellow"
}
]);
@@ -173,7 +171,8 @@ async fn write_custom_ranking_and_index_documents() {
let expected = json!({
"id": 1,
"author": "Exupéry"
"name": "Cherry Orr",
"color": "green"
});
let (response, status_code) = server.get_document(1).await;


@@ -5,8 +5,7 @@ mod common;
#[actix_rt::test]
async fn update_stop_words() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
// 1 - Get stop words
@@ -15,7 +14,7 @@ async fn update_stop_words() {
// 2 - Update stop words
let body = json!(["the", "a"]);
let body = json!(["ut", "ea"]);
server.update_stop_words(body.clone()).await;
// 3 - Get all stop words and compare to the previous one
@@ -35,22 +34,21 @@ async fn update_stop_words() {
#[actix_rt::test]
async fn add_documents_and_stop_words() {
let mut server = common::Server::with_uid("movies");
server.populate_movies().await;
let mut server = common::Server::test_server().await;
// 2 - Update stop words
let body = json!(["the", "of"]);
let body = json!(["ad", "in"]);
server.update_stop_words(body.clone()).await;
// 3 - Search for a document with stop words
let (response, _status_code) = server.search_get("q=the%20mask").await;
let (response, _status_code) = server.search_get("q=in%20exercitation").await;
assert!(!response["hits"].as_array().unwrap().is_empty());
// 4 - Search for documents with *only* stop words
let (response, _status_code) = server.search_get("q=the%20of").await;
let (response, _status_code) = server.search_get("q=ad%20in").await;
assert!(response["hits"].as_array().unwrap().is_empty());
// 5 - Delete all stop words
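
The two searches above show both sides of stop-word handling: a query mixing stop words with real terms still matches, while a query made only of stop words returns nothing. That behavior amounts to dropping stop words before matching, roughly as follows (hypothetical helper, not the tokenizer's real code):

use std::collections::HashSet;

fn strip_stop_words<'a>(query: &'a str, stop_words: &HashSet<&str>) -> Vec<&'a str> {
    // remove stop words; whatever remains is what gets matched
    query
        .split_whitespace()
        .filter(|word| !stop_words.contains(word))
        .collect()
}

fn main() {
    let stop_words: HashSet<&str> = ["ad", "in"].into_iter().collect();
    assert_eq!(strip_stop_words("in exercitation", &stop_words), ["exercitation"]);
    assert!(strip_stop_words("ad in", &stop_words).is_empty()); // only stop words left nothing
}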


@@ -1,13 +1,13 @@
[package]
name = "meilisearch-schema"
version = "0.12.0"
version = "0.13.0"
license = "MIT"
authors = ["Kerollmops <renault.cle@gmail.com>"]
edition = "2018"
[dependencies]
indexmap = { version = "1.3.2", features = ["serde-1"] }
meilisearch-error = { path = "../meilisearch-error", version = "0.12.0" }
meilisearch-error = { path = "../meilisearch-error", version = "0.13.0" }
serde = { version = "1.0.105", features = ["derive"] }
serde_json = { version = "1.0.50", features = ["preserve_order"] }
zerocopy = "0.3.0"


@@ -1,6 +1,42 @@
use crate::{FieldsMap, FieldId, SResult, Error, IndexedPos};
use serde::{Serialize, Deserialize};
use std::collections::{HashMap, HashSet};
use std::borrow::Cow;
#[derive(Clone, Debug, Serialize, Deserialize)]
enum OptionAll<T> {
All,
Some(T),
None,
}
impl<T> OptionAll<T> {
// replace the value with None and return the previous value
fn take(&mut self) -> OptionAll<T> {
std::mem::replace(self, OptionAll::None)
}
fn map<U, F: FnOnce(T) -> U>(self, f: F) -> OptionAll<U> {
match self {
OptionAll::Some(x) => OptionAll::Some(f(x)),
OptionAll::All => OptionAll::All,
OptionAll::None => OptionAll::None,
}
}
pub fn is_all(&self) -> bool {
match self {
OptionAll::All => true,
_ => false,
}
}
}
impl<T> Default for OptionAll<T> {
fn default() -> OptionAll<T> {
OptionAll::All
}
}
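
OptionAll is the type behind the new wildcard semantics: All stands for the "*" setting, Some(v) for an explicit attribute list, and None for an empty one, with the Default impl making All the starting state, which is how fresh indexes report searchableAttributes and displayedAttributes as ["*"]. A small standalone illustration of those semantics (the enum is re-declared locally only so the snippet compiles on its own):

enum OptionAll<T> {
    All,
    Some(T),
    None,
}

fn displayed_names(displayed: &OptionAll<Vec<String>>, all_fields: &[String]) -> Vec<String> {
    match displayed {
        OptionAll::All => all_fields.to_vec(),     // "*": every known field
        OptionAll::Some(fields) => fields.clone(), // explicit list
        OptionAll::None => Vec::new(),             // empty setting
    }
}

fn main() {
    let all_fields = vec!["name".to_string(), "age".to_string()];
    assert_eq!(displayed_names(&OptionAll::All, &all_fields), all_fields);
    assert!(displayed_names(&OptionAll::None, &all_fields).is_empty());
}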
#[derive(Clone, Debug, Serialize, Deserialize, Default)]
pub struct Schema {
@@ -8,20 +44,15 @@ pub struct Schema {
primary_key: Option<FieldId>,
ranked: HashSet<FieldId>,
displayed: HashSet<FieldId>,
displayed: OptionAll<HashSet<FieldId>>,
indexed: Vec<FieldId>,
indexed: OptionAll<Vec<FieldId>>,
indexed_map: HashMap<FieldId, IndexedPos>,
accept_new_fields: bool,
}
impl Schema {
pub fn new() -> Schema {
Schema {
accept_new_fields: true,
..Default::default()
}
Schema::default()
}
pub fn with_primary_key(name: &str) -> Schema {
@@ -29,21 +60,18 @@ impl Schema {
let field_id = fields_map.insert(name).unwrap();
let mut displayed = HashSet::new();
let mut indexed = Vec::new();
let mut indexed_map = HashMap::new();
displayed.insert(field_id);
indexed.push(field_id);
indexed_map.insert(field_id, 0.into());
Schema {
fields_map,
primary_key: Some(field_id),
ranked: HashSet::new(),
displayed,
indexed,
displayed: OptionAll::All,
indexed: OptionAll::All,
indexed_map,
accept_new_fields: true,
}
}
@@ -58,10 +86,8 @@ impl Schema {
let id = self.insert(name)?;
self.primary_key = Some(id);
if self.accept_new_fields {
self.set_indexed(name)?;
self.set_displayed(name)?;
}
self.set_indexed(name)?;
self.set_displayed(name)?;
Ok(id)
}
@@ -92,12 +118,8 @@ impl Schema {
Ok(id)
}
None => {
if self.accept_new_fields {
self.set_indexed(name)?;
self.set_displayed(name)
} else {
self.fields_map.insert(name)
}
self.set_indexed(name)?;
self.set_displayed(name)
}
}
}
@@ -110,20 +132,50 @@ impl Schema {
self.ranked.iter().filter_map(|a| self.name(*a)).collect()
}
pub fn displayed(&self) -> &HashSet<FieldId> {
&self.displayed
pub fn displayed(&self) -> Cow<HashSet<FieldId>> {
match self.displayed {
OptionAll::Some(ref v) => Cow::Borrowed(v),
OptionAll::All => {
let fields = self
.fields_map
.iter()
.map(|(_, &v)| v)
.collect::<HashSet<_>>();
Cow::Owned(fields)
}
OptionAll::None => Cow::Owned(HashSet::new())
}
}
pub fn is_displayed_all(&self) -> bool {
self.displayed.is_all()
}
pub fn displayed_name(&self) -> HashSet<&str> {
self.displayed.iter().filter_map(|a| self.name(*a)).collect()
match self.displayed {
OptionAll::All => self.fields_map.iter().filter_map(|(_, &v)| self.name(v)).collect(),
OptionAll::Some(ref v) => v.iter().filter_map(|a| self.name(*a)).collect(),
OptionAll::None => HashSet::new(),
}
}
pub fn indexed(&self) -> &Vec<FieldId> {
&self.indexed
pub fn indexed(&self) -> Cow<[FieldId]> {
match self.indexed {
OptionAll::Some(ref v) => Cow::Borrowed(v),
OptionAll::All => {
let fields = self
.fields_map
.iter()
.map(|(_, &f)| f)
.collect();
Cow::Owned(fields)
},
OptionAll::None => Cow::Owned(Vec::new())
}
}
pub fn indexed_name(&self) -> Vec<&str> {
self.indexed.iter().filter_map(|a| self.name(*a)).collect()
self.indexed().iter().filter_map(|a| self.name(*a)).collect()
}
pub fn set_ranked(&mut self, name: &str) -> SResult<FieldId> {
@@ -134,18 +186,33 @@ impl Schema {
pub fn set_displayed(&mut self, name: &str) -> SResult<FieldId> {
let id = self.fields_map.insert(name)?;
self.displayed.insert(id);
self.displayed = match self.displayed.take() {
OptionAll::All => OptionAll::All,
OptionAll::None => {
let mut displayed = HashSet::new();
displayed.insert(id);
OptionAll::Some(displayed)
},
OptionAll::Some(mut v) => {
v.insert(id);
OptionAll::Some(v)
}
};
Ok(id)
}
pub fn set_indexed(&mut self, name: &str) -> SResult<(FieldId, IndexedPos)> {
let id = self.fields_map.insert(name)?;
if let Some(indexed_pos) = self.indexed_map.get(&id) {
return Ok((id, *indexed_pos))
};
let pos = self.indexed.len() as u16;
self.indexed.push(id);
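// the position now comes from indexed_map, which is kept in sync even when indexed is OptionAll::All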
let pos = self.indexed_map.len() as u16;
self.indexed_map.insert(id, pos.into());
self.indexed = self.indexed.take().map(|mut v| {
v.push(id);
v
});
Ok((id, pos.into()))
}
@@ -159,16 +226,47 @@ impl Schema {
}
}
/// Removes a field from the displayed attributes. If displayed attributes is OptionAll::All,
/// displayed attributes is turned into OptionAll::Some(v), where v contains all displayed
/// attributes except name.
pub fn remove_displayed(&mut self, name: &str) {
if let Some(id) = self.fields_map.id(name) {
self.displayed.remove(&id);
self.displayed = match self.displayed.take() {
OptionAll::Some(mut v) => {
v.remove(&id);
OptionAll::Some(v)
}
OptionAll::All => {
let displayed = self.fields_map
.iter()
.filter_map(|(key, &value)| {
if key != name {
Some(value)
} else {
None
}
})
.collect::<HashSet<_>>();
OptionAll::Some(displayed)
}
OptionAll::None => OptionAll::None,
};
}
}
pub fn remove_indexed(&mut self, name: &str) {
if let Some(id) = self.fields_map.id(name) {
self.indexed_map.remove(&id);
self.indexed.retain(|x| *x != id);
self.indexed = match self.indexed.take() {
// valid because indexed is All and indexed() returns the content of
// indexed_map, which is already updated
OptionAll::All => OptionAll::Some(self.indexed().into_owned()),
OptionAll::Some(mut v) => {
v.retain(|x| *x != id);
OptionAll::Some(v)
}
OptionAll::None => OptionAll::None,
}
}
}
@@ -177,20 +275,28 @@ impl Schema {
}
pub fn is_displayed(&self, id: FieldId) -> bool {
self.displayed.get(&id).is_some()
match self.displayed {
OptionAll::Some(ref v) => v.contains(&id),
OptionAll::All => true,
OptionAll::None => false,
}
}
pub fn is_indexed(&self, id: FieldId) -> Option<&IndexedPos> {
self.indexed_map.get(&id)
}
pub fn is_indexed_all(&self) -> bool {
self.indexed.is_all()
}
pub fn indexed_pos_to_field_id<I: Into<IndexedPos>>(&self, pos: I) -> Option<FieldId> {
let indexed_pos = pos.into().0 as usize;
if indexed_pos < self.indexed.len() {
Some(self.indexed[indexed_pos as usize])
} else {
None
}
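// resolving the position through indexed_map keeps the lookup correct when indexed is OptionAll::All and no Vec exists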
let indexed_pos = pos.into().0;
self
.indexed_map
.iter()
.find(|(_, &v)| v.0 == indexed_pos)
.map(|(&k, _)| k)
}
pub fn update_ranked<S: AsRef<str>>(&mut self, data: impl IntoIterator<Item = S>) -> SResult<()> {
@@ -202,7 +308,13 @@ impl Schema {
}
pub fn update_displayed<S: AsRef<str>>(&mut self, data: impl IntoIterator<Item = S>) -> SResult<()> {
self.displayed.clear();
self.displayed = match self.displayed.take() {
OptionAll::Some(mut v) => {
v.clear();
OptionAll::Some(v)
}
_ => OptionAll::Some(HashSet::new())
};
for name in data {
self.set_displayed(name.as_ref())?;
}
@@ -210,7 +322,13 @@ impl Schema {
}
pub fn update_indexed<S: AsRef<str>>(&mut self, data: Vec<S>) -> SResult<()> {
self.indexed.clear();
self.indexed = match self.indexed.take() {
OptionAll::Some(mut v) => {
v.clear();
OptionAll::Some(v)
},
_ => OptionAll::Some(Vec::new()),
};
self.indexed_map.clear();
for name in data {
self.set_indexed(name.as_ref())?;
@@ -219,29 +337,16 @@ impl Schema {
}
pub fn set_all_fields_as_indexed(&mut self) {
self.indexed.clear();
self.indexed = OptionAll::All;
self.indexed_map.clear();
for (_name, id) in self.fields_map.iter() {
let pos = self.indexed.len() as u16;
self.indexed.push(*id);
let pos = self.indexed_map.len() as u16;
self.indexed_map.insert(*id, pos.into());
}
}
pub fn set_all_fields_as_displayed(&mut self) {
self.displayed.clear();
for (_name, id) in self.fields_map.iter() {
self.displayed.insert(*id);
}
}
pub fn accept_new_fields(&self) -> bool {
self.accept_new_fields
}
pub fn set_accept_new_fields(&mut self, value: bool) {
self.accept_new_fields = value;
self.displayed = OptionAll::All
}
}


@@ -1,6 +1,6 @@
[package]
name = "meilisearch-tokenizer"
version = "0.12.0"
version = "0.13.0"
license = "MIT"
authors = ["Kerollmops <renault.cle@gmail.com>"]
edition = "2018"


@@ -1,6 +1,6 @@
[package]
name = "meilisearch-types"
version = "0.12.0"
version = "0.13.0"
license = "MIT"
authors = ["Clément Renault <renault.cle@gmail.com>"]
edition = "2018"