Squashed commit of the following:

commit 27413399
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 15 16:17:10 2024 +0200

    [VERSION] +1 to 0.0.7.3.3

commit f1d6a083
Merge: 0143f243 afdbe31d
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 15 15:45:40 2024 +0200

    Merge remote-tracking branch 'origin/improve-onboarding-experience' into dev

commit 0143f243
Author: Alfredo Di Napoli <alfredo.dinapoli@gmail.com>
Date:   Mon Oct 14 13:45:12 2024 +0200

    Fix step1c again

commit 43901901
Author: Alfredo Di Napoli <alfredo.dinapoli@gmail.com>
Date:   Mon Oct 14 12:39:43 2024 +0200

    Fix bug in GargPorter implementation

commit d44831d0
Author: Alfredo Di Napoli <alfredo.dinapoli@gmail.com>
Date:   Mon Oct 14 10:50:31 2024 +0200

    Introduce SpecContext, replicate search issue

    This commit refactors the test code ever so slightly to add a proper
    type called `SpecContext` rather than anonymous pairs to carry around
    the spec context/data. It also replicates the search problem around #415
    via a test.

commit b4260d9b
Author: Alfredo Di Napoli <alfredo.dinapoli@gmail.com>
Date:   Mon Oct 14 09:11:51 2024 +0200

    Add soysauce docs list to test-data

commit afdbe31d
Author: Grégoire Locqueville <gregoire.locqueville@cnrs.fr>
Date:   Thu Oct 10 18:51:46 2024 +0200

    Improve onboarding experience

    Edited the README to have a linear path to installation.
    Also made scripts to run the executables without having to worry
    whether they are on our path.

commit 3030272d
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Thu Oct 10 18:31:19 2024 +0200

    [VERSION] +1 to 0.0.7.3.2

commit bd33dd6c
Merge: 1d3417d9 592d966c
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Thu Oct 10 18:10:51 2024 +0200

    Merge remote-tracking branch 'origin/dev-websockets-node-update' into dev

commit 592d966c
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Thu Oct 10 18:06:20 2024 +0200

    [notifications] add missing test/Test/Core/Notifications.hs

commit 1d3417d9
Merge: 163304df a48fe0c8
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Thu Oct 10 17:54:39 2024 +0200

    Merge remote-tracking branch 'origin/dev-websockets-node-update' into dev

commit a48fe0c8
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Thu Oct 10 10:01:10 2024 +0200

    [ws] rename AsyncUpdates to Notifications

    This is bit more clear

commit cd831db4
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Wed Oct 9 16:00:48 2024 +0200

    [tests] first working notification test

commit 81af005d
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Wed Oct 9 11:05:11 2024 +0200

    Squashed commit of the following:

    commit 163304df
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 18:39:54 2024 +0200

        [FIX] conflict

    commit 82c68074
    Merge: f7b76918 5623161c
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 18:28:55 2024 +0200

        Merge remote-tracking branch 'origin/dev-websockets-node-update' into dev

    commit f7b76918
    Merge: fe7a92cc 88655f68
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 18:28:53 2024 +0200

        [FIX] conflicts

    commit fe7a92cc
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 17:19:53 2024 +0200

        fix: no longer update graphs and phylos on corpus update

    commit f775d4a3
    Merge: 76b557ea d2f4b89d
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 16:27:53 2024 +0200

        Merge remote-tracking branch 'origin/dev-guidelines-update' into dev

    commit 76b557ea
    Merge: 2925d008 50c77ea2
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 16:27:27 2024 +0200

        Merge remote-tracking branch 'origin/304-dev-pubmed-api-not-in-toml' into dev

    commit d2f4b89d
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 15:37:54 2024 +0200

        DEVELOPER_GUIDELINES: update about git amend

        This is the result of Autumn workshop 2024

    commit 50c77ea2
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 15:15:28 2024 +0200

        [notifications] fix for send

        sendNonblocking threw an error initially. I just do a compromise and
        timeout the normal send (which blocks infinitely sometimes)

    commit 025b80b6
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 14:10:56 2024 +0200

        [docker] fix network: host, fix caddyfile

    commit 2925d008
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 10:34:17 2024 +0200

        fix arbitrary instance

    commit e8fb3db6
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 10:13:40 2024 +0200

        fix: re-add lost instances

    commit b86d2e61
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 10:09:18 2024 +0200

        [toml] remove pubmed api key from config

        It's set up in user settings instead and has been for a long time.

    commit c06de5ef
    Merge: ab710337 a0ec337b
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 09:35:55 2024 +0200

        Merge remote-tracking branch 'gitlab/dev' into cm/update-corpus-button

    commit ab710337
    Author: Christian Merten <christian@merten.dev>
    Date:   Fri Apr 26 22:32:33 2024 +0200

        feat: update corpus endpoint

commit d4a9200e
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Wed Oct 9 11:01:43 2024 +0200

    [ws] notification action on node share

commit 163304df
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 8 18:39:54 2024 +0200

    [FIX] conflict

commit 82c68074
Merge: f7b76918 5623161c
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 8 18:28:55 2024 +0200

    Merge remote-tracking branch 'origin/dev-websockets-node-update' into dev

commit f7b76918
Merge: fe7a92cc 88655f68
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 8 18:28:53 2024 +0200

    [FIX] conflicts

commit 5623161c
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 18:18:23 2024 +0200

    [ws] implement node update (rename, move) with notifications to parents

commit 88655f68
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 18:05:25 2024 +0200

    Squashed commit of the following:

    commit f775d4a3
    Merge: 76b557ea d2f4b89d
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 16:27:53 2024 +0200

        Merge remote-tracking branch 'origin/dev-guidelines-update' into dev

    commit 76b557ea
    Merge: 2925d008 50c77ea2
    Author: Alexandre Delanoë <devel+git@delanoe.org>
    Date:   Tue Oct 8 16:27:27 2024 +0200

        Merge remote-tracking branch 'origin/304-dev-pubmed-api-not-in-toml' into dev

    commit d2f4b89d
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 15:37:54 2024 +0200

        DEVELOPER_GUIDELINES: update about git amend

        This is the result of Autumn workshop 2024

    commit 50c77ea2
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 15:15:28 2024 +0200

        [notifications] fix for send

        sendNonblocking threw an error initially. I just do a compromise and
        timeout the normal send (which blocks infinitely sometimes)

    commit 025b80b6
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 14:10:56 2024 +0200

        [docker] fix network: host, fix caddyfile

    commit 2925d008
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 10:34:17 2024 +0200

        fix arbitrary instance

    commit e8fb3db6
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 10:13:40 2024 +0200

        fix: re-add lost instances

    commit b86d2e61
    Author: Przemysław Kaminski <pk@intrepidus.pl>
    Date:   Tue Oct 8 10:09:18 2024 +0200

        [toml] remove pubmed api key from config

        It's set up in user settings instead and has been for a long time.

    commit c06de5ef
    Merge: ab710337 a0ec337b
    Author: Christian Merten <christian@merten.dev>
    Date:   Tue Oct 8 09:35:55 2024 +0200

        Merge remote-tracking branch 'gitlab/dev' into cm/update-corpus-button

    commit ab710337
    Author: Christian Merten <christian@merten.dev>
    Date:   Fri Apr 26 22:32:33 2024 +0200

        feat: update corpus endpoint

commit fe7a92cc
Author: Christian Merten <christian@merten.dev>
Date:   Tue Oct 8 17:19:53 2024 +0200

    fix: no longer update graphs and phylos on corpus update

commit f775d4a3
Merge: 76b557ea d2f4b89d
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 8 16:27:53 2024 +0200

    Merge remote-tracking branch 'origin/dev-guidelines-update' into dev

commit 76b557ea
Merge: 2925d008 50c77ea2
Author: Alexandre Delanoë <devel+git@delanoe.org>
Date:   Tue Oct 8 16:27:27 2024 +0200

    Merge remote-tracking branch 'origin/304-dev-pubmed-api-not-in-toml' into dev

commit d2f4b89d
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 15:37:54 2024 +0200

    DEVELOPER_GUIDELINES: update about git amend

    This is the result of Autumn workshop 2024

commit 50c77ea2
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 15:15:28 2024 +0200

    [notifications] fix for send

    sendNonblocking threw an error initially. I just do a compromise and
    timeout the normal send (which blocks infinitely sometimes)

commit 025b80b6
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 14:10:56 2024 +0200

    [docker] fix network: host, fix caddyfile

commit ee0db8c1
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 12:39:12 2024 +0200

    [nix] add nanomsg to nix pkgs

    Also, fix ./start to use gargantext-settings.toml

commit 2925d008
Author: Christian Merten <christian@merten.dev>
Date:   Tue Oct 8 10:34:17 2024 +0200

    fix arbitrary instance

commit e8fb3db6
Author: Christian Merten <christian@merten.dev>
Date:   Tue Oct 8 10:13:40 2024 +0200

    fix: re-add lost instances

commit b86d2e61
Author: Przemysław Kaminski <pk@intrepidus.pl>
Date:   Tue Oct 8 10:09:18 2024 +0200

    [toml] remove pubmed api key from config

    It's set up in user settings instead and has been for a long time.

commit c06de5ef
Merge: ab710337 a0ec337b
Author: Christian Merten <christian@merten.dev>
Date:   Tue Oct 8 09:35:55 2024 +0200

    Merge remote-tracking branch 'gitlab/dev' into cm/update-corpus-button

commit ab710337
Author: Christian Merten <christian@merten.dev>
Date:   Fri Apr 26 22:32:33 2024 +0200

    feat: update corpus endpoint
parent b2a6b741
## Version 0.0.7.3.3

* [FRONT][FIX][Display graph parameters in legend (#706)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/706)
* [BACK][FIX][Document Search (#415)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/415)
* [BACK][DOC+Scripts][Improving onboarding experience](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/merge_requests/360)

## Version 0.0.7.3.2

* [FRONT][FIX][[Node Graph] Legend tab improvements (#689)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/689)
#### Table of Contents

1. [About the project](#about)
2. [Installation](#install)
3. [Use cases](#use-cases)
4. [Notes for developers](#develop)

## About the project <a name="about"></a>
This software is free (as "Libre" in French) software, developed by the CNRS Com…

GarganText Project: this repo builds the backend for the frontend server built by [backend](https://gitlab.iscpif.fr/gargantext/haskell-gargantext).

## Installation <a name="install"></a>

Disclaimer: since this project is still in development, this document remains in progress. Please report and improve this documentation if you encounter any issues.

### Prerequisites

You need to have the following installed:

* [GHCup](https://www.haskell.org/ghcup/)
* [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
* [Nix](https://nixos.org/download/)
* [Docker Compose](https://docs.docker.com/compose/install/)

You will need to add yourself as a member of the `docker` group for Docker
Compose to work (replace `myusername` with your username in the following
command):
```bash
sudo usermod -a -G docker myusername
```
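To check whether the group change is already active in your current session, the following snippet (not part of the repo, just a convenience check) can be run in any shell:

```shell
# Check whether the current user is in the docker group. A freshly run
# usermod only takes effect after logging out and back in (or `newgrp docker`).
if id -nG | tr ' ' '\n' | grep -qx docker; then
  echo "docker group: OK"
else
  echo "docker group: not yet active (log out and back in, or run: newgrp docker)"
fi
```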
### Getting the source code

Clone the backend (`haskell-gargantext`), then the frontend (`purescript-gargantext`) inside it:
```bash
git clone https://gitlab.iscpif.fr/gargantext/haskell-gargantext.git
cd haskell-gargantext
git clone https://gitlab.iscpif.fr/gargantext/purescript-gargantext.git
cd ..
```
### Building Gargantext

#### Building the Backend

Run the install script from inside the `haskell-gargantext/` directory:
```bash
./bin/install
```
This will build Gargantext, run the tests, and build the documentation.

#### Building the Frontend

Move into the `purescript-gargantext/` directory, then run the install script:
```bash
cd purescript-gargantext
./bin/install
cd ..
```
### Initialization

Rename `gargantext-settings.toml_toModify` to `gargantext-settings.toml`,
and fill in the missing fields.

> `.gitignore` excludes this file, so you don't need to worry about committing
> it by mistake, and you can change the passwords in `gargantext-settings.toml`
> safely.

**NOTE** If you had the `gargantext.ini` file before, you can automatically
convert it into a `gargantext-settings.toml` file by running the following
from a Nix shell:
```shell
n$ ./bin/cli ini
```
### Launching and Initializing the Database

Launch the docker image from inside the dedicated directory:
```bash
cd devops/docker/
docker compose up
```

Then initialize the database using the dedicated command: from inside the
`haskell-gargantext/` directory, run
```bash
./bin/cli init
```
And provide a name and a password for the master user as requested.

### Running Gargantext

From inside the `haskell-gargantext/` directory, run
```bash
./bin/run
```
## Use Cases <a name="use-cases"></a>

### Multi-User with Graphical User Interface (Server Mode)

``` shell
$ ~/.local/bin/stack --docker exec gargantext-server -- --run Prod
```

Then you can log in with `user1` / `1resu`

### Command Line Mode tools

#### Simple cooccurrences computation and indexation from a list of Ngrams

``` shell
$ stack --docker exec gargantext-cli -- CorpusFromGarg.csv ListFromGarg.csv Output.json
```
### Analyzing the ngrams table repo

We store the repository in directory `repos` in the [CBOR](https://cbor.io/) file format. To decode it to JSON and analyze, say, using [jq](https://shapeshed.com/jq-json/), use the following command:

``` shell
$ cat repos/repo.cbor.v5 | stack exec gargantext-cbor2json | jq .
```

### Documentation

To build documentation, run:
```shell
$ stack build --haddock --no-haddock-deps --fast
```
(in `.stack-work/dist/x86_64-linux-nix/Cabal-3.2.1.0/doc/html/gargantext`).
## Notes for developers <a name="develop"></a>

### Developer Documentation

If you want to contribute to Gargantext, there is [some documentation geared to developers](https://write.frame.gargantext.org/s/805e1ee2bae24079554d24cdbc8ef5ba5c4ef7d83218c1e86c84af8ac269031d).

### Upgrading Haskell Packages

We use `gargantext.cabal`, `cabal.project` and `cabal.project.freeze`
as the source of truth. Out of that, we generate the `stack.yaml` file
[...]
with `cabal v2-build all`).

Also, here is a relevant discussion:
https://discourse.haskell.org/t/whats-your-workflow-to-update-cabal-dependencies/9475
### Running Tests

From nix shell:

[...]
export TMPDIR=$(pwd)/_build
### Working on libraries

When a development is needed on libraries (for instance, the HAL crawler in https://gitlab.iscpif.fr/gargantext/crawlers):

1. Ongoing development (on local repo):
   1. In `cabal.project`:

[...]
> Note: without `stack.yaml` we would only have to fix the `cabal.project` -> `source-repository-package` commit id. Sha256 is there to make sure CI reruns the tests.
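For reference, a `source-repository-package` stanza in `cabal.project` looks like the following sketch; the location and tag are placeholders, not the actual crawler pin:

```
source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/crawlers/<crawler>.git
  tag: <commit-id>
```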
### GraphQL

Some introspection information.

Playground is located at http://localhost:8008/gql

#### List all GraphQL types in the Playground
```
{
[...]
}
```

#### List details about a type in GraphQL

```
{
[...]
}
}
```
### PostgreSQL <a name="pgsql"></a>

#### Upgrading using Docker

https://www.cloudytuts.com/tutorials/docker/how-to-upgrade-postgresql-in-docker-and-kubernetes/

```shell
[...]
$ # now we can restore the dump
$ docker exec -i <new-container-id> psql -U gargantua -d gargandbV5 < 11-db.dump
```

#### Upgrading using

There is a solution using `pg_upgradecluster`, but then you need to manage both the version 13 and version 14 clusters. Hence, here is a simple way to upgrade.
#!/usr/bin/env bash
# (`/usr/bin/env bash` is more portable than a hard-coded `/bin/bash`.)

# A couple of hygienic options: exit on error, treat unset variables as errors.
set -e -u

# The following command runs `cabal run gargantext-cli --` followed by the
# options provided by the user, from inside a Nix shell. For instance, if the
# user types
#   $ ./bin/cli someCommand "some string argument"
# the following is run from inside a Nix shell:
#   $ cabal run gargantext-cli -- someCommand "some string argument"
# It's a little convoluted because we want to keep spaces that were enclosed in
# quotes or escaped by the user.
nix-shell --run "$(printf "%q " cabal run gargantext-cli -- "$@")"
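To see why the script quotes things this way, here is a small bash illustration (not part of the repo): `printf '%q '` re-escapes each argument so the assembled string survives the second round of parsing that happens inside `nix-shell --run "..."`.

```shell
# Each argument is re-quoted; note the escaped spaces in the quoted argument.
printf "%q " cabal run gargantext-cli -- someCommand "some string argument"; echo
# prints: cabal run gargantext-cli -- someCommand some\ string\ argument
```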
#!/usr/bin/env bash
# (`/usr/bin/env bash` is more portable than a hard-coded `/bin/bash`.)

# A couple of hygienic options: exit on error, treat unset variables as errors.
set -e -u

echo "Launching Gargantext..."
nix-shell --run "cabal run gargantext-server -- --run Prod --toml gargantext-settings.toml"
[...]
-- see: https://github.com/sol/hpack

name: gargantext
version: 0.0.7.3.3
synopsis: Search, map, share
description: Please see README.md
category: Data
[...]
data-files:
    ekg-assets/bootstrap-1.4.0.min.css
    ekg-assets/chart_line_add.png
    ekg-assets/cross.png
    test-data/ngrams/GarganText_DocsList-nodeId-177.json
    test-data/ngrams/GarganText_NgramsTerms-nodeId-177.json
    test-data/ngrams/GarganText_NgramsTerms-QuantumComputing.json
    test-data/ngrams/simple.json
    test-data/ngrams/simple.tsv
    test-data/phylo/187481.json
    test-data/phylo/bpa_phylo_test.json
    test-data/phylo/cleopatre.golden.json
    test-data/phylo/issue-290-small.golden.json
    test-data/phylo/nadal_docslist.golden.tsv
    test-data/phylo/nadal.golden.json
    test-data/phylo/nadal_ngramslist.golden.tsv
    test-data/phylo/open_science.json
    test-data/phylo/phylo2dot2json.golden.json
    test-data/phylo/small_phylo_docslist.tsv
    test-data/phylo/small-phylo.golden.json
    test-data/phylo/small_phylo_ngramslist.tsv
    test-data/search/GarganText_DocsList-soysauce.json
    test-data/stemming/lancaster.txt
    test-data/test_config.ini
    test-data/test_config.toml
[...]
common testDependencies
    [...]
    , epo-api-client
    , extra ^>= 1.7.9
    , fast-logger ^>= 3.2.2
    , filepath ^>= 1.4.2.2
    , fmt
    , gargantext
    , gargantext-prelude
[...]
test-suite garg-test-tasty
    [...]
    CLI.Phylo.Common
    Paths_gargantext
    Test.API.Private.Share
    Test.API.Private.Table
    Test.API.Authentication
    Test.API.Routes
    Test.API.Setup
[...]
test-suite garg-test-hspec
    [...]
    Test.API.Notifications
    Test.API.Private
    Test.API.Private.Share
    Test.API.Private.Table
    Test.API.Routes
    Test.API.Setup
    Test.API.UpdateList
{-|
Module      : Gargantext.Core.Text.Ngrams.Stem.En
Description : Porter Algorithm Implementation in pure Haskell
Copyright   : (c) CNRS, 2017-Present
License     : AGPL + CECILL v3
Maintainer  : team@gargantext.org
Stability   : experimental
Portability : POSIX

Adapted from:
  - source: https://hackage.haskell.org/package/porter
  - [Char] -> [Text]
  - added type signatures
  - fixed unseen cases
-}
module Gargantext.Core.Text.Terms.Mono.Stem.Internal.GargPorter (stem)
  where

import Control.Monad
import Data.Either
import Data.List ((!!))
import Data.List qualified as List hiding (map, head)
import Data.Maybe
import Data.Text (pack, unpack)
import Gargantext.Prelude

vowels :: [Char]
vowels = ['a','e','i','o','u']

isConsonant :: [Char] -> Int -> Bool
isConsonant str i
  | c `elem` vowels = False
  | c == 'y'        = i == 0 || isVowel str (i - 1)
  | otherwise       = True
  where
    c = str !! i

isVowel :: [Char] -> Int -> Bool
isVowel = (not .) . isConsonant

byIndex :: Foldable t1 => (t1 a -> [Int] -> t2) -> t1 a -> t2
byIndex fun str = fun str [0..length str - 1]

containsVowel :: [Char] -> Bool
containsVowel = byIndex (any . isVowel)

-- | /!\ unsafe fromJust
measure :: [Char] -> Int
measure = length . filter not . List.init . (True:)
        . map fromJust . map head
        . List.group . byIndex (map . isConsonant)

endsWithDouble :: [Char] -> Bool
endsWithDouble = startsWithDouble . reverse
  where
    startsWithDouble l = case l of
      (x:y:_) -> x == y && x `List.notElem` vowels
      _       -> False

cvc :: [Char] -> Bool
cvc word | length word < 3 = False
         | otherwise       = isConsonant word lastIndex &&
                             isVowel word (lastIndex - 1) &&
                             isConsonant word (lastIndex - 2) &&
                             List.last word `List.notElem` ['w','x','y']
  where lastIndex = length word - 1

statefulReplace :: Eq a => ([a] -> Bool)
                -> [a] -> [a] -> [a]
                -> Maybe (Data.Either.Either [a] [a])
statefulReplace predicate str end replacement
  | end `List.isSuffixOf` str = Just replaced
  | otherwise                 = Nothing
  where
    part = take (length str - length end) str
    replaced | predicate part = Right (part <> replacement)
             | otherwise      = Left str

replaceEnd :: Eq a => ([a] -> Bool) -> [a] -> [a] -> [a] -> Maybe [a]
replaceEnd predicate str end replacement = do
  result <- statefulReplace predicate str end replacement
  pure (either identity identity result)

findStem
  :: (Foldable t, Functor t, Eq a) =>
     ([a] -> Bool) -> [a] -> t ([a], [a]) -> Maybe [a]
findStem f word pairs = msum $ map (uncurry (replaceEnd f word)) pairs

measureGT :: Int -> [Char] -> Bool
measureGT = flip ((>) . measure)

step1a :: [Char] -> [Char]
step1a word = fromMaybe word result
  where
    result   = findStem (const True) word suffixes
    suffixes = [("sses", "ss"), ("ies", "i"), ("ss", "ss"), ("s", "")]

beforeStep1b :: [Char] -> Either [Char] [Char]
beforeStep1b word = fromMaybe (Left word) result
  where
    cond23 x = do { v <- x; either (const Nothing) (return . Right) v }
    cond1  x = do { v <- x; pure (Left v) }
    result =
         cond1  (replaceEnd (measureGT 0) word "eed" "ee") `mplus`
         cond23 (statefulReplace containsVowel word "ed" "" ) `mplus`
         cond23 (statefulReplace containsVowel word "ing" "" )

afterStep1b :: [Char] -> [Char]
afterStep1b word = fromMaybe word result
  where
    double       = endsWithDouble word && not (any ((`List.isSuffixOf` word) . return) ['l','s','z'])
    mEq1AndCvc   = measure word == 1 && cvc word
    iif cond val = if cond then Just val else Nothing
    result = findStem (const True) word [("at", "ate"), ("bl", "ble"), ("iz", "ize")]
             `mplus` iif double (List.init word)
             `mplus` iif mEq1AndCvc (word <> "e")

step1b :: [Char] -> [Char]
step1b = either identity afterStep1b . beforeStep1b

-- Issue #415: According to the Porter stemming rules, we need to replace `y` with `i` only if there
-- are no other vowels at the end.
step1c :: [Char] -> [Char]
step1c word
  | length word > 2 && List.last word == 'y' && isConsonant word (List.length word - 2)
  = List.init word <> "i"
  | otherwise
  = word

step1 :: [Char] -> [Char]
step1 = step1c . step1b . step1a

step2 :: [Char] -> [Char]
step2 word = fromMaybe word result
  where
    result = findStem (measureGT 0) word
               [ ("ational", "ate" )
               , ("tional",  "tion")
               , ("enci",    "ence")
               , ("anci",    "ance")
               , ("izer",    "ize" )
               , ("bli",     "ble" )
               , ("alli",    "al"  )
               , ("entli",   "ent" )
               , ("eli",     "e"   )
               , ("ousli",   "ous" )
               , ("ization", "ize" )
               , ("ation",   "ate" )
               , ("ator",    "ate" )
               , ("alism",   "al"  )
               , ("iveness", "ive" )
               , ("fulness", "ful" )
               , ("ousness", "ous" )
               , ("aliti",   "al"  )
               , ("iviti",   "ive" )
               , ("biliti",  "ble" )
               , ("logi",    "log" ) ]

step3 :: [Char] -> [Char]
step3 word = fromMaybe word result
  where
    result = findStem (measureGT 0) word
               [ ("icate", "ic")
               , ("ative", ""  )
               , ("alize", "al")
               , ("iciti", "ic")
               , ("ical" , "ic")
               , ("ful"  , ""  )
               , ("ness" , ""  ) ]

step4 :: [Char] -> [Char]
step4 word = fromMaybe word result
  where
    gt1andST str = (measureGT 1) str && any ((`List.isSuffixOf` str) . return) ['s','t']
    findGT1      = findStem (measureGT 1) word . map (flip (,) "")
    result = (findGT1 ["al", "ance", "ence", "er", "ic", "able", "ible", "ant", "ement", "ment", "ent"]) `mplus`
             (findStem gt1andST word [("ion","")]) `mplus`
             (findGT1 ["ou", "ism", "ate", "iti", "ous", "ive", "ize"])

step5a :: [Char] -> [Char]
step5a word = fromMaybe word result
  where
    test str = (measureGT 1 str) || ((measure str == 1) && (not $ cvc str))
    result   = replaceEnd test word "e" ""

step5b :: [Char] -> [Char]
step5b word = fromMaybe word result
  where
    cond s = List.last s == 'l' && measureGT 1 s
    result = replaceEnd cond word "l" ""

step5 :: [Char] -> [Char]
step5 = step5b . step5a

allSteps :: [Char] -> [Char]
allSteps = step5 . step4 . step3 . step2 . step1

stem :: Text -> Text
stem s = pack (stem' $ unpack s)

stem' :: [Char] -> [Char]
stem' s | length s < 3 = s
        | otherwise    = allSteps s

--fixpoint :: Eq t => (t -> t) -> t -> t
--fixpoint f x = let fx = f x in
--               if fx == x
--               then x
--               else fixpoint f fx
--
--fixstem :: [Char] -> [Char]
--fixstem = fixpoint stem'

{-
main :: IO ()
main = do
  content <- readFile "input.txt"
  writeFile "output.txt" $ unlines $ map stem $ lines content
-}
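
Worked examples, hand-traced against the definitions above (illustrative words, not taken from the repository's test suite), showing what the reworked `step1c` does and why it addresses #415:

```haskell
-- Hand-traced sketch of the new step1c guard:
-- a trailing 'y' becomes 'i' only when the letter before it is a consonant.
step1c "happy"  -- "happi"  ('p' is a consonant, so 'y' -> 'i')
step1c "cry"    -- "cri"    ('r' is a consonant, so 'y' -> 'i')
step1c "say"    -- "say"    ('a' is a vowel, so 'y' is kept)
step1c "by"     -- "by"     (guard requires length > 2)
```

The previous definition, `replaceEnd containsVowel word "y" "i"`, rewrote any trailing `y` as soon as the stem contained a vowel anywhere, so `"say"` came out as `"sai"`; the guarded version only inspects the letter immediately before the `y`.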
{
"documents": [
{
"document": {
"id": 1101563,
"hash_id": null,
"typename": 4,
"user_id": 58,
"parent_id": null,
"name": "THE EFFECT OF ANTIOXIDANTS ON FROZEN GROUND PORK",
"date": "1956-01-01T00:00:00Z",
"hyperdata": {
"abstract": "The relative effectiveness of monosodium glutamate (MSG), soybean flour, and butylated hydroxyanisole (BHA) as antioxidants for ground pork stored raw, and pork cooked prior to freezer storage, was studied. Peroxide determinations were made at intervals through 18 months of storage, and organoleptic judgments at corresponding intervals through 12 months. Peroxide determinations indicated that soybean flour, BHA, and the cooking process alone inhibited fat oxidation, but MSG did not. On palatability tests, soy-treated pork was rated down on flavor. Samples with MSG received the best scores, but showed rapid peroxide development after 12 months when stored raw. None of the samples became rancid during the first 12 months of storage. At 15 and 18 months, peroxide numbers indicated rancidity in the untreated and in the MSG treated pork stored raw.",
"authors": "NEILL, J; PAGE, L",
"bdd": "WOS",
"language_iso2": "EN",
"publication_date": "1956-01-01 00:00:00 UTC",
"publication_day": 1,
"publication_month": 1,
"publication_year": 1956,
"source": "FOOD TECHNOLOGY",
"title": "THE EFFECT OF ANTIOXIDANTS ON FROZEN GROUND PORK"
}
},
"ngrams": {
"ngrams": [],
"hash": ""
},
"hash": ""
},
{
"document": {
"id": 1101539,
"hash_id": null,
"typename": 4,
"user_id": 58,
"parent_id": null,
"name": "INFLUENCE OF DIETARY PROTEIN LEVEL AND AMINO ACID COMPOSITION ON CHICK; PERFORMANCE",
"date": "1965-01-01T00:00:00Z",
"hyperdata": {
"abstract": "Studies were designed to investigate the effects of altering dietary protein levels and/or amino acid composition on chick growth and feed efficiency. Contradictory observations in chick performance were made among a series of diets in which the crude protein content was increased from 18 to 22%. Chick weight and feed efficiency was unaffected as dietary protein was increased by replacing cellulose with monosodium glutamate or L-glutamic acid. Chick performance was improved, however, by supplementation of the deficient essential amino acids. Increasing dietary protein concomitantly with essential amino acid supplementation had no effect on chick weight or feed efficiency. In contrast, significant improvements in chick performance were observed in a series of diets where dietary protein was increased by replacing corn with soybean meal.",
"authors": "ASKELSON, CE; BALLOUN, SL",
"bdd": "WOS",
"doi": "10.3382/ps.0440193",
"language_iso2": "EN",
"publication_date": "1965-01-01 00:00:00 UTC",
"publication_day": 1,
"publication_month": 1,
"publication_year": 1965,
"source": "POULTRY SCIENCE",
"title": "INFLUENCE OF DIETARY PROTEIN LEVEL AND AMINO ACID COMPOSITION ON CHICK; PERFORMANCE"
}
},
"ngrams": {
"ngrams": [],
"hash": ""
},
"hash": ""
},
{
"document": {
"id": 1102103,
"hash_id": null,
"typename": 4,
"user_id": 58,
"parent_id": null,
"name": "DILUENT SENSITIVITY IN THERMALLY STRESSED CELLS OF; PSEUDOMONAS-FLUORESCENS",
"date": "1977-01-01T00:00:00Z",
"hyperdata": {
"abstract": "Thermally injured cells of P. fluorescens cannot produce colonies on Trypticase soy agar (TSA) after dilution with 0.1% peptone. Nutritional exigency could not be used as the criterion for this injury, since varying the composition of the plating medium had little effect on the number of colonies that developed. The injured cells had no requirement for compounds known to leak out during the heat treatment in order to recover. The cells did not exhibit injury if dilution preceded heat treatment on the plating medium, demonstrating that the heat treatment sensitized the cells to the trauma of dilution. Substitution of 0.1% peptone with growth medium as the diluent largely offset the previously observed drop in TSA count. Little difference in survival was observed when monosodium glutamate or the balance of the defined medium was used as the diluent. The diluent effect was ionic rather than osmotic. The presence of cations was important in maintaining the integrity of the injured cell, and divalent cations enhanced this protective effect. The role of these cations at the level of the cell envelope is discussed.",
"authors": "GRAY, RJH; ORDAL, ZJ; WITTER, LD",
"bdd": "WOS",
"doi": "10.1128/AEM.33.5.1074-1078.1977",
"language_iso2": "EN",
"publication_date": "1977-01-01 00:00:00 UTC",
"publication_day": 1,
"publication_month": 1,
"publication_year": 1977,
"source": "APPLIED AND ENVIRONMENTAL MICROBIOLOGY",
"title": "DILUENT SENSITIVITY IN THERMALLY STRESSED CELLS OF; PSEUDOMONAS-FLUORESCENS"
}
},
"ngrams": {
"ngrams": [],
"hash": ""
},
"hash": ""
},
{
"document": {
"id": 1101179,
"hash_id": null,
"typename": 4,
"user_id": 58,
"parent_id": null,
"name": "GROWTH OF BACILLUS-CEREUS IN MEDIA CONTAINING PLANT SEED MATERIALS AND; INGREDIENTS USED IN CHINESE COOKERY",
"date": "1980-01-01T00:00:00Z",
"hyperdata": {
"abstract": "Growth and sporulation of enterotoxigenic strains of B. cereus in media containing 20 different plant seed flours and meals, with and without added infusions of beef, pork, chicken and shrimp, monosodium glutamate (MSG) and soy sauce, were studied. Suspensions (2%; pH 7.1) of seed flours and meals from diverse botanical origins were excellent sources of nutrients for growth. No correlations could be made between composition of seed materials and rate of cell division. Mean generation times of B. cereus cultured in soy, peanut and rice flour media supplemented with animal flesh infusions were significantly faster (P .ltoreq. 0.05) than those of respective controls. Monosodium glutamate (1-2%) and soy sauce (5-10%) stimulated the rate of growth of B. cereus in rice flour medium. Test flours supporting slower growth rates appeared generally to support higher rates of sporulation.",
"authors": "BEUCHAT, LR; MALIN, CFA; CARPENTER, JA",
"bdd": "WOS",
"doi": "10.1111/j.1365-2672.1980.tb01028.x",
"language_iso2": "EN",
"publication_date": "1980-01-01 00:00:00 UTC",
"publication_day": 1,
"publication_month": 1,
"publication_year": 1980,
"source": "JOURNAL OF APPLIED BACTERIOLOGY",
"title": "GROWTH OF BACILLUS-CEREUS IN MEDIA CONTAINING PLANT SEED MATERIALS AND; INGREDIENTS USED IN CHINESE COOKERY"
}
},
"ngrams": {
"ngrams": [],
"hash": ""
},
"hash": ""
}
],
"garg_version": "0.0.7.3.1"
}
@@ -20,7 +20,7 @@ import Prelude qualified
 import Servant.Auth.Client ()
 import Servant.Client
 import Test.API.Routes (auth_api)
-import Test.API.Setup (withTestDBAndPort, setupEnvironment)
+import Test.API.Setup (withTestDBAndPort, setupEnvironment, SpecContext (..))
 import Test.Database.Types
 import Test.Hspec
 import Gargantext.API.Routes.Named
@@ -32,7 +32,7 @@ cannedToken = "eyJhbGciOiJIUzUxMiJ9.eyJkYXQiOnsiaWQiOjF9fQ.t49zZSqkPAulEkYEh4pW1
 tests :: Spec
 tests = sequential $ aroundAll withTestDBAndPort $ do
   describe "Prelude" $ do
-    it "setup DB triggers" $ \((testEnv, _), _) -> setupEnvironment testEnv
+    it "setup DB triggers" $ \SpecContext{..} -> setupEnvironment _sctx_env
   describe "Authentication" $ do
     baseUrl <- runIO $ parseBaseUrl "http://localhost"
     manager <- runIO $ newManager defaultManagerSettings
@@ -41,15 +41,15 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
   -- testing scenarios start here
   describe "GET /api/v1.0/version" $ do
     let version_api = gargVersionEp . gargAPIVersion . mkBackEndAPI $ genericClient
-    it "requires no auth and returns the current version" $ \((_testEnv, port), _) -> do
-      result <- runClientM version_api (clientEnv port)
+    it "requires no auth and returns the current version" $ \SpecContext{..} -> do
+      result <- runClientM version_api (clientEnv _sctx_port)
       case result of
         Left err -> Prelude.fail (show err)
         Right r -> r `shouldSatisfy` ((>= 1) . T.length) -- we got something back

   describe "POST /api/v1.0/auth" $ do
-    it "requires no auth and authenticates the user 'alice'" $ \((testEnv, port), _) -> do
+    it "requires no auth and authenticates the user 'alice'" $ \(SpecContext testEnv port _app _) -> do
       -- Let's create the Alice user.
       void $ flip runReaderT testEnv $ runTestMonad $ do
@@ -66,7 +66,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
       result `shouldBe` Right expected

-    it "denies login for user 'alice' if password is invalid" $ \((_testEnv, port), _) -> do
+    it "denies login for user 'alice' if password is invalid" $ \(SpecContext _testEnv port _app _) -> do
       let authPayload = AuthRequest "alice" (GargPassword "wrong")
       result <- runClientM (auth_api authPayload) (clientEnv port)
       putText $ "result: " <> show result
......
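
The hunks above replace the anonymous `((TestEnv, Int), Application)` pairs with a named record. The real definition lives in `Test.API.Setup`; judging only from the patterns used in these tests (`SpecContext{..}` exposing `_sctx_env` and `_sctx_port`, and positional matches like `SpecContext testEnv port _app _`), it is shaped roughly like the following sketch, where the `_sctx_app` and `_sctx_data` field names are guesses:

```haskell
-- Sketch only: reconstructed from the pattern matches in the test hunks.
-- The actual record is defined in Test.API.Setup; the last two field
-- names (_sctx_app, _sctx_data) are hypothetical.
data SpecContext a = SpecContext
  { _sctx_env  :: !TestEnv      -- database/test environment
  , _sctx_port :: !Int          -- port the test server listens on
  , _sctx_app  :: !Application  -- WAI application under test
  , _sctx_data :: !a            -- extra per-spec payload
  }
```

A named record lets individual tests bind only the fields they need (via `RecordWildCards` or positional wildcards) and lets `privateTests :: SpecWith (SpecContext a)` stay polymorphic in the extra payload.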
@@ -15,7 +15,7 @@ import Servant.Auth.Client ()
 import Servant.Client
 import Servant.Client.Generic (genericClient)
 import Test.API.Routes (mkUrl)
-import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob)
+import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob, SpecContext (..))
 import Test.Hspec
 import Test.Hspec.Wai.Internal (withApplication)
 import Test.Utils (protected, withValidLogin, protectedNewError)
@@ -26,7 +26,7 @@ tests :: Spec
 tests = sequential $ aroundAll withTestDBAndPort $ do
   describe "Errors API" $ do
     describe "Prelude" $ do
-      it "setup DB triggers and users" $ \((testEnv, port), _) -> do
+      it "setup DB triggers and users" $ \(SpecContext testEnv port _app _) -> do
         setupEnvironment testEnv
         baseUrl <- parseBaseUrl "http://localhost"
         manager <- newManager defaultManagerSettings
@@ -41,7 +41,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
     describe "GET /api/v1.0/node" $ do
-      it "returns the old error by default" $ \((_testEnv, port), app) -> do
+      it "returns the old error by default" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "gargantua" (GargPassword "secret_key") $ \_clientEnv token -> do
             res <- protected token "GET" (mkUrl port "/node/99") ""
@@ -52,7 +52,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
             statusCode `shouldBe` 404
             simpleBody `shouldBe` [r|{"error":"Node does not exist","node":99}|]

-      it "returns the new error if header X-Garg-Error-Scheme: new is passed" $ \((_testEnv, port), app) -> do
+      it "returns the new error if header X-Garg-Error-Scheme: new is passed" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "gargantua" (GargPassword "secret_key") $ \_clientEnv token -> do
             res <- protectedNewError token "GET" (mkUrl port "/node/99") ""
......
@@ -10,7 +10,7 @@ module Test.API.GraphQL (
 import Gargantext.Core.Types.Individu
 import Prelude
 import Servant.Auth.Client ()
-import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob)
+import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob, SpecContext (..))
 import Test.Hspec
 import Test.Hspec.Wai.Internal (withApplication)
 import Test.Hspec.Wai.JSON (json)
@@ -21,10 +21,10 @@ tests :: Spec
 tests = sequential $ aroundAll withTestDBAndPort $ do
   describe "GraphQL" $ do
     describe "Prelude" $ do
-      it "setup DB triggers" $ \((testEnv, _), _) -> setupEnvironment testEnv
+      it "setup DB triggers" $ \SpecContext{..} -> setupEnvironment _sctx_env

     describe "get_user_infos" $ do
-      it "allows 'alice' to see her own info" $ \((testEnv, port), app) -> do
+      it "allows 'alice' to see her own info" $ \(SpecContext testEnv port app _) -> do
         createAliceAndBob testEnv

         withApplication app $ do
@@ -34,7 +34,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
           protected token "POST" "/gql" query `shouldRespondWithFragment` expected

     describe "nodes" $ do
-      it "returns node_type" $ \((_testEnv, port), app) -> do
+      it "returns node_type" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "gargantua" (GargPassword "secret_key") $ \_clientEnv token -> do
             let query = [r| { "query": "{ nodes(node_id: 2) { node_type } }" } |]
@@ -42,21 +42,21 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
             protected token "POST" "/gql" query `shouldRespondWithFragment` expected

     describe "check error format" $ do
-      it "returns the new error if header X-Garg-Error-Scheme: new is passed" $ \((_testEnv, port), app) -> do
+      it "returns the new error if header X-Garg-Error-Scheme: new is passed" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "gargantua" (GargPassword "secret_key") $ \_clientEnv token -> do
             let query = [r| { "query": "{ languages(id:5) { lt_lang } }" } |]
             let expected = [json| {"errors": [{"locations":[{"column":13,"line":1}],"message":"Unknown Argument \"id\" on Field \"languages\"."}] } |]
             protectedNewError token "POST" "/gql" query `shouldRespondWithFragment` expected

-      it "returns the old error (though this is deprecated)" $ \((_testEnv, port), app) -> do
+      it "returns the old error (though this is deprecated)" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "gargantua" (GargPassword "secret_key") $ \_clientEnv token -> do
             let query = [r| { "query": "{ languages(id:5) { lt_lang } }" } |]
             let expected = [json| {"errors": [{"locations":[{"column":13,"line":1}],"message":"Unknown Argument \"id\" on Field \"languages\"."}] } |]
             protected token "POST" "/gql" query `shouldRespondWithFragment` expected

-      it "check new errors with 'type'" $ \((_testEnv, port), app) -> do
+      it "check new errors with 'type'" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "gargantua" (GargPassword "secret_key") $ \_clientEnv token -> do
             let query = [r| { "query": "mutation { delete_team_membership(shared_folder_id:1, team_node_id:1, token:\"abc\") }" } |]
......
@@ -9,26 +9,25 @@ module Test.API.Private (
 import Gargantext.API.Routes.Named.Node
 import Gargantext.API.Routes.Named.Private
-import Gargantext.Core.Types (Node)
 import Gargantext.Core.Types.Individu
+import Gargantext.Core.Types (Node)
 import Gargantext.Database.Admin.Types.Hyperdata (HyperdataUser)
 import Gargantext.Prelude hiding (get)
 import Network.HTTP.Client hiding (Proxy)
-import Network.Wai
 import Servant.Auth.Client ()
 import Servant.Client
 import Servant.Client.Generic (genericClient)
 import Test.API.Private.Share qualified as Share
+import Test.API.Private.Table qualified as Table
 import Test.API.Routes (mkUrl)
-import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob)
-import Test.Database.Types
+import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob, SpecContext (..))
 import Test.Hspec
 import Test.Hspec.Wai hiding (pendingWith)
 import Test.Hspec.Wai.Internal (withApplication)
 import Test.Hspec.Wai.JSON (json)
 import Test.Utils (protected, shouldRespondWithFragment, withValidLogin)

-privateTests :: SpecWith ((TestEnv, Int), Application)
+privateTests :: SpecWith (SpecContext a)
 privateTests =
   describe "Private API" $ do
     baseUrl <- runIO $ parseBaseUrl "http://localhost"
@@ -38,7 +37,7 @@ privateTests =
     describe "GET /api/v1.0/user" $ do

       -- FIXME(adn): unclear if this is useful at all. Doesn't do permission checking.
-      it "doesn't allow someone with an invalid token to show the results" $ \((testEnv, port), _) -> do
+      it "doesn't allow someone with an invalid token to show the results" $ \(SpecContext testEnv port _ _) -> do
         createAliceAndBob testEnv
@@ -49,7 +48,7 @@ privateTests =
         length result `shouldBe` 0

       -- FIXME(adn): unclear if this is useful at all. Doesn't do permission checking.
-      it "allows 'alice' to see the results" $ \((_testEnv, port), _) -> do
+      it "allows 'alice' to see the results" $ \(SpecContext _testEnv port _app _) -> do
         withValidLogin port "alice" (GargPassword "alice") $ \clientEnv _token -> do
           let gargAdminClient = (genericClient :: GargAdminAPI (AsClientT ClientM))
@@ -60,33 +59,33 @@ privateTests =
     describe "GET /api/v1.0/node" $ do
-      it "unauthorised users shouldn't see anything" $ \((_testEnv, port), app) -> do
+      it "unauthorised users shouldn't see anything" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           get (mkUrl port "/node/1") `shouldRespondWith` 401

-      it "allows 'alice' to see her own node info" $ \((_testEnv, port), app) -> do
+      it "allows 'alice' to see her own node info" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
             protected token "GET" (mkUrl port "/node/8") ""
               `shouldRespondWithFragment` [json| {"id":8,"user_id":2,"name":"alice" } |]

-      it "forbids 'alice' to see others node private info" $ \((_testEnv, port), app) -> do
+      it "forbids 'alice' to see others node private info" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
             protected token "GET" (mkUrl port "/node/1") "" `shouldRespondWith` 403

     describe "GET /api/v1.0/tree" $ do
-      it "unauthorised users shouldn't see anything" $ \((_testEnv, port), app) -> do
+      it "unauthorised users shouldn't see anything" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           get (mkUrl port "/tree/1") `shouldRespondWith` 401

-      it "allows 'alice' to see her own node info" $ \((_testEnv, port), app) -> do
+      it "allows 'alice' to see her own node info" $ \(SpecContext _testEnv port app _) -> do
         withApplication app $ do
           withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
             protected token "GET" (mkUrl port "/tree/8") ""
               `shouldRespondWithFragment` [json| { "node": {"id":8, "name":"alice", "type": "NodeUser" } } |]

-      it "forbids 'alice' to see others node private info" $ \((_testEnv, port), app) -> do
+      it "forbids 'alice' to see others node private info" $ \(SpecContext _testEnv port app _) -> do
withApplication app $ do withApplication app $ do
withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
protected token "GET" (mkUrl port "/tree/1") "" `shouldRespondWith` 403 protected token "GET" (mkUrl port "/tree/1") "" `shouldRespondWith` 403
...@@ -96,7 +95,9 @@ tests :: Spec ...@@ -96,7 +95,9 @@ tests :: Spec
tests = do tests = do
sequential $ aroundAll withTestDBAndPort $ do sequential $ aroundAll withTestDBAndPort $ do
describe "Prelude" $ do describe "Prelude" $ do
it "setup DB triggers" $ \((testEnv, _), _) -> setupEnvironment testEnv it "setup DB triggers" $ \SpecContext{..} -> setupEnvironment _sctx_env
privateTests privateTests
describe "Share API" $ do describe "Share API" $ do
Share.tests Share.tests
describe "Table API" $ do
Table.tests
@@ -43,12 +43,12 @@ shareURL token =
 tests :: Spec
 tests = sequential $ aroundAll withTestDBAndPort $ do
 describe "Prelude" $ do
-it "setup DB triggers" $ \((testEnv, _), _) -> do
-setupEnvironment testEnv
+it "setup DB triggers" $ \SpecContext{..} -> do
+setupEnvironment _sctx_env
 -- Let's create the Alice user.
-createAliceAndBob testEnv
+createAliceAndBob _sctx_env
-it "should fail if no node type is specified" $ \((_testEnv, serverPort), app) -> do
+it "should fail if no node type is specified" $ \(SpecContext _testEnv serverPort app _) -> do
 withApplication app $ do
 withValidLogin serverPort "alice" (GargPassword "alice") $ \clientEnv token -> do
 url <- liftIO $ runClientM (shareURL (toServantToken token) Nothing Nothing) clientEnv
@@ -57,7 +57,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 -> liftIO $ (CL8.unpack $ responseBody res) `shouldSatisfy` (T.isInfixOf "Invalid node Type" . T.pack)
 _ -> fail "Test did not fail as expected!"
-it "should fail if no node ID is specified" $ \((_testEnv, serverPort), app) -> do
+it "should fail if no node ID is specified" $ \(SpecContext _testEnv serverPort app _) -> do
 withApplication app $ do
 withValidLogin serverPort "alice" (GargPassword "alice") $ \clientEnv token -> do
 url <- liftIO $ runClientM (shareURL (toServantToken token) (Just NodeCorpus) Nothing) clientEnv
@@ -66,7 +66,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 -> liftIO $ (CL8.unpack $ responseBody res) `shouldSatisfy` (T.isInfixOf "Invalid node ID" . T.pack)
 _ -> fail "Test did not fail as expected!"
-it "should return a valid URL" $ \((testEnv, serverPort), app) -> do
+it "should return a valid URL" $ \(SpecContext testEnv serverPort app _) -> do
 withApplication app $ do
 withValidLogin serverPort "alice" (GargPassword "alice") $ \clientEnv token -> do
 cId <- liftIO $ newCorpusForUser testEnv "alice"
@@ -77,7 +77,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 Right (ShareLink _)
 -> pure ()
-it "should include the port if needed (like localhost)" $ \((testEnv, serverPort), app) -> do
+it "should include the port if needed (like localhost)" $ \(SpecContext testEnv serverPort app _) -> do
 withApplication app $ do
 withValidLogin serverPort "alice" (GargPassword "alice") $ \clientEnv token -> do
 cId <- liftIO $ newCorpusForUser testEnv "alice"
...
+{-# LANGUAGE BangPatterns #-}
+{-# LANGUAGE TypeApplications #-}
+{-# LANGUAGE TypeFamilies #-}
+
+module Test.API.Private.Table (
+    tests
+  ) where
+
+import Gargantext.API.HashedResponse
+import Gargantext.Core.Text.Corpus.Query
+import Gargantext.Core.Types
+import Gargantext.Core.Types.Individu
+import Gargantext.Prelude
+import qualified Gargantext.API.Ngrams.Types as APINgrams
+import qualified Gargantext.Database.Query.Facet as Facet
+import Servant.Client
+import Test.API.Routes
+import Test.API.Setup
+import Test.API.UpdateList (createDocsList, checkEither)
+import Test.Hspec
+import Test.Hspec.Wai.Internal (withApplication)
+import Test.Utils
+
+tests :: Spec
+tests = sequential $ aroundAll withTestDBAndPort $ do
+  describe "Prelude" $ do
+    it "setup DB triggers" $ \SpecContext{..} -> do
+      setupEnvironment _sctx_env
+      -- Let's create the Alice user.
+      createAliceAndBob _sctx_env
+
+  beforeAllWith createSoySauceCorpus $ do
+    it "should return sauce in the search (#415)" $ \SpecContext{..} -> do
+      let corpusId = _sctx_data
+      withApplication _sctx_app $ do
+        withValidLogin _sctx_port "alice" (GargPassword "alice") $ \clientEnv token -> do
+          liftIO $ do
+            (HashedResponse _ tr1)
+              <- checkEither $ runClientM (get_table token
+                                                     corpusId
+                                                     (Just APINgrams.Docs)
+                                                     (Just 10)
+                                                     (Just 0)
+                                                     (Just Facet.DateDesc)
+                                                     (Just $ RawQuery "sauce")
+                                                     Nothing
+                                          ) clientEnv
+            length (tr_docs tr1) `shouldBe` 1
+
+    it "should return soy in the search (#415)" $ \SpecContext{..} -> do
+      let corpusId = _sctx_data
+      withApplication _sctx_app $ do
+        withValidLogin _sctx_port "alice" (GargPassword "alice") $ \clientEnv token -> do
+          liftIO $ do
+            (HashedResponse _ tr1)
+              <- checkEither $ runClientM (get_table token
+                                                     corpusId
+                                                     (Just APINgrams.Docs)
+                                                     (Just 10)
+                                                     (Just 0)
+                                                     (Just Facet.DateDesc)
+                                                     (Just $ RawQuery "soy")
+                                                     Nothing
+                                          ) clientEnv
+            length (tr_docs tr1) `shouldBe` 3
+
+createSoySauceCorpus :: SpecContext () -> IO (SpecContext CorpusId)
+createSoySauceCorpus ctx@SpecContext{..} = do
+  withApplication _sctx_app $ do
+    withValidLogin _sctx_port "alice" (GargPassword "alice") $ \clientEnv token -> do
+      corpusId <- createDocsList "test-data/search/GarganText_DocsList-soysauce.json" _sctx_env _sctx_port clientEnv token
+      pure $ const corpusId <$> ctx
 {-# LANGUAGE BangPatterns #-}
-module Test.API.Setup where
+module Test.API.Setup (
+    SpecContext(..)
+  , withTestDBAndPort
+  , withTestDBAndNotifications
+  , withBackendServerAndProxy
+  , setupEnvironment
+  , createAliceAndBob
+  ) where
 import Control.Concurrent.Async qualified as Async
 import Control.Concurrent.MVar
@@ -51,6 +58,21 @@ import Test.Database.Types
 import UnliftIO qualified
+-- | The context that each spec will be carrying along. This type is
+-- polymorphic so that each test can embellish it with test-specific data.
+-- 'SpecContext' is a functor, so you can use 'fmap' to change the 'a'.
+data SpecContext a =
+  SpecContext {
+      _sctx_env  :: !TestEnv
+    , _sctx_port :: !Warp.Port
+    , _sctx_app  :: !Application
+    , _sctx_data :: !a
+    }
+
+instance Functor SpecContext where
+  fmap f (SpecContext e p a d) = SpecContext e p a (f d)
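The comment above explains the design: `SpecContext` is generic in its payload so each test can swap `()` for test-specific data via the `Functor` instance. A minimal standalone sketch of that pattern — `TestEnv` and `Application` are replaced here with `String` placeholders so it runs outside the test suite; `Ctx`, `cEnv`, `cPort`, `cData` are illustrative names, not the real ones:

```haskell
-- Standalone sketch of the SpecContext pattern: the real record carries
-- TestEnv, Warp.Port and Application; placeholders stand in for them here.
data Ctx a = Ctx
  { cEnv  :: !String
  , cPort :: !Int
  , cData :: !a
  } deriving (Show, Eq)

-- Same shape as the instance in the diff: only the payload changes.
instance Functor Ctx where
  fmap f (Ctx e p d) = Ctx e p (f d)

main :: IO ()
main = do
  let ctx0 = Ctx "testEnv" 8080 ()   -- what a withTestDBAndPort-style helper hands out
      ctx1 = (42 :: Int) <$ ctx0     -- embellish with test data, e.g. a corpus id,
                                     -- as createSoySauceCorpus does with `const corpusId <$> ctx`
  print (cData ctx1)                 -- 42
```

`x <$ ctx` is just `fmap (const x) ctx`, which is exactly the `const corpusId <$> ctx` trick used by `createSoySauceCorpus` in the Table tests.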
 newTestEnv :: TestEnv -> Logger (GargM Env BackendInternalError) -> Warp.Port -> IO Env
 newTestEnv testEnv logger port = do
 tomlFile@(SettingsFile sf) <- fakeTomlPath
@@ -94,7 +116,7 @@ newTestEnv testEnv logger port = do
 -- | Run the gargantext server on a random port, picked by Warp, which allows
 -- for concurrent tests to be executed in parallel, if we need to.
-withTestDBAndPort :: (((TestEnv, Warp.Port), Application) -> IO ()) -> IO ()
+withTestDBAndPort :: (SpecContext () -> IO ()) -> IO ()
 withTestDBAndPort action =
 withTestDB $ \testEnv -> do
 -- TODO Despite being cautious here only to start/kill dispatcher
@@ -123,7 +145,7 @@ withTestDBAndPort action =
 env <- newTestEnv testEnv ioLogger 8080
 makeApp env
 let stgs = Warp.defaultSettings { settingsOnExceptionResponse = showDebugExceptions }
-Warp.testWithApplicationSettings stgs (pure app) $ \port -> action ((testEnv, port), app)
+Warp.testWithApplicationSettings stgs (pure app) $ \port -> action (SpecContext testEnv port app ())
 withTestDBAndNotifications :: D.Dispatcher -> (((TestEnv, Warp.Port), Application) -> IO ()) -> IO ()
 withTestDBAndNotifications dispatcher action = do
...
@@ -7,12 +7,14 @@
 module Test.API.UpdateList (
 tests
-, newCorpusForUser
+-- * Useful helpers
 , JobPollHandle(..)
+, newCorpusForUser
 , pollUntilFinished
--- * Useful helpers
 , updateNode
+, createDocsList
+, checkEither
 ) where
 import Control.Lens (mapped, over)
@@ -57,11 +59,12 @@ import Gargantext.Prelude hiding (get)
 import Network.Wai.Handler.Warp qualified as Wai
 import Paths_gargantext (getDataFileName)
 import qualified Prelude
+import System.FilePath
 import Servant
 import Servant.Client
 import Servant.Job.Async
 import Test.API.Routes (mkUrl, gqlUrl, get_table_ngrams, put_table_ngrams, toServantToken, clientRoutes, get_table, update_node)
-import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob)
+import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob, SpecContext (..))
 import Test.Database.Types
 import Test.Hspec
 import Test.Hspec.Wai.Internal (withApplication, WaiSession)
@@ -114,13 +117,13 @@ uploadJSONList port token cId pathToNgrams = do
 tests :: Spec
 tests = sequential $ aroundAll withTestDBAndPort $ do
 describe "UpdateList API" $ do
-it "setup DB triggers and users" $ \((testEnv, _), _) -> do
+it "setup DB triggers and users" $ \(SpecContext testEnv _port _app _) -> do
 setupEnvironment testEnv
 createAliceAndBob testEnv
 describe "POST /api/v1.0/lists/:id/add/form/async (JSON)" $ do
-it "allows uploading a JSON ngrams file" $ \((testEnv, port), app) -> do
+it "allows uploading a JSON ngrams file" $ \(SpecContext testEnv port app _) -> do
 cId <- newCorpusForUser testEnv "alice"
 withApplication app $ do
 withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
@@ -142,7 +145,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 ]
 } |]
-it "does not create duplicates when uploading JSON (#313)" $ \((testEnv, port), app) -> do
+it "does not create duplicates when uploading JSON (#313)" $ \(SpecContext testEnv port app _) -> do
 cId <- newCorpusForUser testEnv "alice"
 withApplication app $ do
 withValidLogin port "alice" (GargPassword "alice") $ \clientEnv token -> do
@@ -206,7 +209,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 describe "POST /api/v1.0/lists/:id/csv/add/form/async (CSV)" $ do
-it "parses CSV via ngramsListFromCSVData" $ \((_testEnv, _port), _app) -> do
+it "parses CSV via ngramsListFromCSVData" $ \(SpecContext _testEnv _port _app _) -> do
 simpleNgrams <- liftIO (TIO.readFile =<< getDataFileName "test-data/ngrams/simple.tsv")
 ngramsListFromTSVData simpleNgrams `shouldBe`
 Right (Map.fromList [ (NgramsTerms, Versioned 0 $ Map.fromList [
@@ -214,7 +217,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 , (NgramsTerm "brazorf", NgramsRepoElement 1 StopTerm Nothing Nothing (MSet mempty))
 ])])
-it "allows uploading a CSV ngrams file" $ \((testEnv, port), app) -> do
+it "allows uploading a CSV ngrams file" $ \(SpecContext testEnv port app _) -> do
 cId <- newCorpusForUser testEnv "alice"
 withApplication app $ do
 withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
@@ -257,12 +260,12 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 describe "POST /api/v1.0/corpus/:id/add/form/async (JSON)" $ do
-it "allows uploading a JSON docs file" $ \((testEnv, port), app) ->
+it "allows uploading a JSON docs file" $ \(SpecContext testEnv port app _) ->
 withApplication app $ do
 withValidLogin port "alice" (GargPassword "alice") $ \clientEnv token -> do
 void $ createFortranDocsList testEnv port clientEnv token
-it "doesn't use trashed documents for score calculation (#385)" $ \((testEnv, port), app) -> do
+it "doesn't use trashed documents for score calculation (#385)" $ \(SpecContext testEnv port app _) -> do
 withApplication app $ do
 withValidLogin port "alice" (GargPassword "alice") $ \clientEnv token -> do
 corpusId <- createFortranDocsList testEnv port clientEnv token
@@ -336,21 +339,28 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
 ) clientEnv
 length (_ne_occurrences fortran_ngram') `shouldBe` 1
-createFortranDocsList :: TestEnv -> Int -> ClientEnv -> Token -> WaiSession () CorpusId
-createFortranDocsList testEnv port clientEnv token = do
+createDocsList :: FilePath
+               -> TestEnv
+               -> Int
+               -> ClientEnv
+               -> Token
+               -> WaiSession () CorpusId
+createDocsList testDataPath testEnv port clientEnv token = do
 folderId <- liftIO $ newPrivateFolderForUser testEnv "alice"
 ([corpusId] :: [NodeId]) <- protectedJSON token "POST" (mkUrl port ("/node/" <> build folderId)) [aesonQQ|{"pn_typename":"NodeCorpus","pn_name":"TestCorpus"}|]
 -- Import the docsList with only two documents, both containing a \"fortran\" term.
-simpleDocs <- liftIO (TIO.readFile =<< getDataFileName "test-data/ngrams/GarganText_DocsList-nodeId-177.json")
+simpleDocs <- liftIO (TIO.readFile =<< getDataFileName testDataPath)
-let newWithForm = mkNewWithForm simpleDocs "GarganText_DocsList-nodeId-177.json"
+let newWithForm = mkNewWithForm simpleDocs (T.pack $ takeBaseName testDataPath)
 (j :: JobPollHandle) <- checkEither $ fmap toJobPollHandle <$> liftIO (runClientM (add_file_async token corpusId newWithForm) clientEnv)
 let mkPollUrl jh = "/corpus/" <> fromString (show $ _NodeId corpusId) <> "/add/form/async/" +|_jph_id jh|+ "/poll?limit=1"
 j' <- pollUntilFinished token port mkPollUrl j
 liftIO (_jph_status j' `shouldBe` "IsFinished")
 pure corpusId
+
+createFortranDocsList :: TestEnv -> Int -> ClientEnv -> Token -> WaiSession () CorpusId
+createFortranDocsList testEnv port =
+  createDocsList "test-data/ngrams/GarganText_DocsList-nodeId-177.json" testEnv port
 updateNode :: Int -> ClientEnv -> Token -> NodeId -> WaiSession () ()
 updateNode port clientEnv token nodeId = do
 let params = UpdateNodeParamsTexts Both
...
@@ -135,6 +135,8 @@ stemmingTest :: TestEnv -> Assertion
 stemmingTest _env = do
 stem EN GargPorterAlgorithm "Ajeje" `shouldBe` "Ajeje"
 stem EN GargPorterAlgorithm "PyPlasm:" `shouldBe` "PyPlasm:"
+stem EN GargPorterAlgorithm "soy" `shouldBe` "soy"
+stem EN GargPorterAlgorithm "cry" `shouldBe` "cri"
 -- This test outlines the main differences between Porter and Lancaster.
 stem EN GargPorterAlgorithm "dancer" `shouldBe` "dancer"
 stem EN LancasterAlgorithm "dancer" `shouldBe` "dant"
...
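The two new expectations ("soy" unchanged, "cry" stemmed to "cri") match the behaviour of Porter's revised step 1c: a final 'y' becomes 'i' only when it is preceded by a consonant that is not the word's first letter. A hedged, standalone sketch of just that rule — the function name and shape are illustrative, not Gargantext's actual `GargPorterAlgorithm` implementation:

```haskell
-- Revised Porter step 1c only; the rest of the stemmer is omitted.
-- For this rule it is enough to treat anything outside "aeiou" as a consonant.
step1c :: String -> String
step1c w = case reverse w of
  ('y':c:_) | isCons c && length w > 2 -> init w ++ "i"
  _                                    -> w
  where
    isCons ch = ch `notElem` "aeiou"

main :: IO ()
main = mapM_ (putStrLn . step1c) ["soy", "cry", "by", "say"]
-- prints: soy / cri / by / say
```

"soy" and "say" keep their 'y' (preceded by a vowel), "by" keeps it (the preceding consonant is the first letter), while "cry" becomes "cri" — which is why a search for "soy" must not be stemmed away, the bug replicated by the Table tests for #415.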