Christian Merten / haskell-gargantext / Commits

Commit 59c23118 (Verified), authored Oct 04, 2024 by Przemyslaw Kaminski

    Merge branch 'dev' into 238-dev-async-job-worker

Parents: 494c0541, a0ec337b
Changes: 12 changed files with 424 additions and 124 deletions (+424 −124)
CHANGELOG.md                                               +11    −0
devops/postgres/schema.sql                                  +0   −32
gargantext.cabal                                            +3    −2
hie.yaml                                                   +76    −0
src/Gargantext/API.hs                                       +5    −4
src/Gargantext/Core/Viz/Graph/Types.hs                      +4    −3
test-data/ngrams/GarganText_DocsList-nodeId-177.json        +2    −1
test-data/ngrams/GarganText_NgramsTerms-nodeId-177.json     +2    −0
test/Test/API/Routes.hs                                   +104   −24
test/Test/API/UpdateList.hs                               +166   −44
test/Test/Utils.hs                                         +37    −9
test/Test/Utils/Jobs.hs                                    +14    −5
CHANGELOG.md  (view file @ 59c23118)

## Version 0.0.7.3.1

* [FRONT][FIX][Cannot build the project on latest `dev` (#701)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/701)
* [FRONT][FIX][Phylomemy panel reload after first query (#674)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/674)
* [BACK][FIX][Various test failures (#408)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/408)
* [BACK][FIX][Swagger documentation is down (#407)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/407)
* [BACK][ADMIN][Improve startup error from `runDbCheck`](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/merge_requests/347)
* [BACK][CLEAN] removing unused SQL function in schema.sql
* [BACK][TESTS][Terms are calculated over all documents, even those in trash (#385)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/385)

## Version 0.0.7.3 [/!\ Maintenance command inside]

* [BACK][FIX][Upgrade to GHC 9.4.8][Switch from .ini to TOML? (#304)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/304)
...
devops/postgres/schema.sql  (view file @ 59c23118)

...
@@ -404,35 +404,3 @@ FOR EACH ROW
 EXECUTE PROCEDURE check_node_stories_json();
-
-CREATE OR REPLACE FUNCTION check_ngrams_json()
-RETURNS TRIGGER AS $$
-DECLARE
-  missing_ngrams_exist boolean;
-BEGIN
-  WITH child_ngrams as
-    ( SELECT jsonb_array_elements_text(ngrams_repo_element->'children') AS term
-      FROM node_stories
-      WHERE term = OLD.terms ),
-  parent_ngrams AS
-    ( SELECT ngrams_repo_element->>'root' AS term
-      FROM node_stories
-      WHERE term = OLD.terms ),
-  child_parent_ngrams AS
-    ( SELECT * FROM child_ngrams
-      UNION
-      SELECT * FROM parent_ngrams )
-  SELECT EXISTS(SELECT * FROM child_parent_ngrams) INTO missing_ngrams_exist;
-
-  IF missing_ngrams_exist THEN
-    RAISE EXCEPTION 'ngrams are missing: %', row_to_json(OLD);
-  END IF;
-
-  RETURN OLD;
-END;
-$$ LANGUAGE plpgsql;
-
-CREATE OR REPLACE TRIGGER check_ngrams_json_trg
-  AFTER DELETE ON ngrams
-  FOR EACH ROW
-  EXECUTE PROCEDURE check_ngrams_json();
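The dropped trigger guarded ngram deletion against dangling repo references. A rough, in-memory sketch of that check (hypothetical types, not Gargantext code — the real check runs in PL/pgSQL over `node_stories` JSON):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
-- Model of the dropped check_ngrams_json trigger: before deleting an ngram,
-- look up its repo entry and refuse the delete while that entry still lists
-- children or a root term. Types here are illustrative stand-ins.
import qualified Data.Map.Strict as Map
import Data.Maybe (isJust)

-- One node_stories row, reduced to the two JSON fields the trigger inspected.
data RepoElement = RepoElement
  { reChildren :: [String]      -- ngrams_repo_element->'children'
  , reRoot     :: Maybe String  -- ngrams_repo_element->>'root'
  }

type NodeStories = Map.Map String RepoElement  -- keyed by term

-- Deleting 'term' is rejected when its repo element still references
-- children or a root (mirrors the EXISTS over child_ngrams UNION parent_ngrams).
checkDelete :: String -> NodeStories -> Either String ()
checkDelete term repo =
  case Map.lookup term repo of
    Just (RepoElement cs r)
      | not (null cs) || isJust r -> Left ("ngrams are missing: " ++ term)
    _ -> Right ()
```

The CHANGELOG entry "[BACK][CLEAN] removing unused SQL function" says the check was no longer exercised, which is why the whole function and its trigger could be deleted.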
gargantext.cabal  (view file @ 59c23118)

...
@@ -5,7 +5,7 @@ cabal-version: 3.4
 -- see: https://github.com/sol/hpack

 name:           gargantext
-version:        0.0.7.3
+version:        0.0.7.3.1
 synopsis:       Search, map, share
 description:    Please see README.md
 category:       Data
...
@@ -34,6 +34,7 @@ data-files:
   ekg-assets/cross.png
   test-data/ngrams/GarganText_NgramsTerms-QuantumComputing.json
   test-data/ngrams/GarganText_DocsList-nodeId-177.json
+  test-data/ngrams/GarganText_NgramsTerms-nodeId-177.json
   test-data/ngrams/simple.json
   test-data/ngrams/simple.tsv
   test-data/phylo/bpa_phylo_test.json
...
@@ -255,6 +256,7 @@ library
   Gargantext.Database.Admin.Types.Hyperdata
   Gargantext.Database.Admin.Types.Hyperdata.Corpus
   Gargantext.Database.Admin.Types.Hyperdata.Document
+  Gargantext.Database.Admin.Types.Hyperdata.Folder
   Gargantext.Database.Admin.Types.Node
   Gargantext.Database.Prelude
   Gargantext.Database.Query.Facet
...
@@ -425,7 +427,6 @@ library
   Gargantext.Database.Admin.Types.Hyperdata.Dashboard
   Gargantext.Database.Admin.Types.Hyperdata.Default
   Gargantext.Database.Admin.Types.Hyperdata.File
-  Gargantext.Database.Admin.Types.Hyperdata.Folder
   Gargantext.Database.Admin.Types.Hyperdata.Frame
   Gargantext.Database.Admin.Types.Hyperdata.List
   Gargantext.Database.Admin.Types.Hyperdata.Model
...
hie.yaml  (new file, 0 → 100644, view file @ 59c23118)

cradle:
  cabal:
    - path: "./src"
      component: "lib:gargantext"
    - path: "./bin/gargantext-cli/Main.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Admin.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/FileDiff.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/FilterTermsAndCooc.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Import.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Ini.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Init.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Invitations.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/ObfuscateDB.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Parsers.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Phylo.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Phylo/Common.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Phylo/Profile.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Server/Routes.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Types.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/CLI/Upgrade.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-cli/Paths_gargantext.hs"
      component: "gargantext:exe:gargantext-cli"
    - path: "./bin/gargantext-server/Main.hs"
      component: "gargantext:exe:gargantext-server"
    - path: "./bin/gargantext-server/Paths_gargantext.hs"
      component: "gargantext:exe:gargantext-server"
    - path: "./test"
      component: "gargantext:test:garg-test-tasty"
    - path: "./bin/gargantext-cli"
      component: "gargantext:test:garg-test-tasty"
    - path: "./test"
      component: "gargantext:test:garg-test-hspec"
    - path: "./bench/Main.hs"
      component: "gargantext:bench:garg-bench"
    - path: "./bench/Paths_gargantext.hs"
      component: "gargantext:bench:garg-bench"
src/Gargantext/API.hs  (view file @ 59c23118)

...
@@ -92,12 +92,13 @@ startGargantext mode port sf@(SettingsFile settingsFile) = withLoggerHoisted mod
   where runDbCheck env = do
           r <- runExceptT (runReaderT DB.dbCheck env) `catch`
-            (\(_ :: SomeException) -> pure $ Right False)
+            (\(err :: SomeException) -> pure $ Left err)
           case r of
             Right True -> pure ()
-            _ -> panicTrace $
-              "You must run 'gargantext-init " <> pack settingsFile <>
+            Right False -> panicTrace $
+              "You must run 'gargantext-cli init " <> pack settingsFile <>
               "' before running gargantext-server (only the first time)."
+            Left err -> panicTrace $ "Unexpected exception:" <> show err
         oneHour = Clock.fromNanoSecs 3600_000_000_000

 portRouteInfo :: NotificationsConfig -> PortNumber -> MicroServicesProxyStatus -> IO ()
...
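The change above stops collapsing exceptions into `Right False`, which made a real failure indistinguishable from an uninitialised database. A minimal standalone sketch of that pattern (names here are hypothetical stand-ins, not the Gargantext ones):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
-- Keep the exception as `Left err` so the caller can report the real error
-- instead of the misleading "run the init command" hint.
import Control.Exception (SomeException, catch, throwIO, ErrorCall(..))

runCheck :: IO Bool -> IO (Either SomeException Bool)
runCheck check = (Right <$> check)
  `catch` (\(err :: SomeException) -> pure (Left err))

report :: Either SomeException Bool -> String
report (Right True)  = "ok"
report (Right False) = "db check failed: run the init command first"
report (Left err)    = "unexpected exception: " ++ show err
```

With the old code, the `Left` branch could never be reached, so a connection error was reported as a missing initialisation.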
src/Gargantext/Core/Viz/Graph/Types.hs  (view file @ 59c23118)

...
@@ -14,9 +14,10 @@ Portability : POSIX
 module Gargantext.Core.Viz.Graph.Types
   where

+import Data.Aeson ( defaultOptions )
 import Data.HashSet (HashSet)
 import Data.HashSet qualified as HashSet
-import Data.Swagger ( ToSchema(..), genericDeclareNamedSchema )
+import Data.Swagger ( ToSchema(..), genericDeclareNamedSchema, defaultSchemaOptions )
 import Data.Text (pack)
 import Database.PostgreSQL.Simple.FromField ( FromField(..) )
 import Gargantext.API.Ngrams.Types (NgramsTerm)
...
@@ -245,9 +246,9 @@ instance FromField HyperdataGraphAPI
 data GraphLegendAPI = GraphLegendAPI [LegendField]
   deriving (Show, Generic)

-$(deriveJSON (unPrefix "_graphAPI") ''GraphLegendAPI)
+$(deriveJSON defaultOptions ''GraphLegendAPI)

 instance ToSchema GraphLegendAPI where
-  declareNamedSchema = genericDeclareNamedSchema (unPrefixSwagger "_graphAPI")
+  declareNamedSchema = genericDeclareNamedSchema defaultSchemaOptions

 makeLenses ''GraphLegendAPI
...
test-data/ngrams/GarganText_DocsList-nodeId-177.json  (view file @ 59c23118)

...
@@ -58,5 +58,6 @@
     },
     "hash": ""
   }
-  ]
+  ],
+  "garg_version": "0.0.7.1.16"
 }
test-data/ngrams/GarganText_NgramsTerms-nodeId-177.json  (new file, 0 → 100644, view file @ 59c23118)

{"NgramsTerms":{"version":1,"data":{"fortran":{"size":2,"list":"MapTerm","children":[]}}}}
test/Test/API/Routes.hs  (view file @ 59c23118)

...
@@ -8,24 +8,31 @@ module Test.API.Routes where
 import Data.Text.Encoding qualified as TE
 import Fmt (Builder, (+|), (|+))
 import Gargantext.API.Admin.Auth.Types (AuthRequest, AuthResponse, Token)
+import Gargantext.API.Admin.Orchestrator.Types (JobLog, asyncJobsAPI')
 import Gargantext.API.Errors
+import Gargantext.API.HashedResponse (HashedResponse)
 import Gargantext.API.Ngrams.Types (NgramsTable, NgramsTablePatch, OrderBy, TabType, Versioned, VersionedWithCount)
 import Gargantext.API.Routes.Named
 import Gargantext.API.Routes.Named.Node
 import Gargantext.API.Routes.Named.Private hiding (tableNgramsAPI)
 import Gargantext.API.Routes.Named.Table
 import Gargantext.API.Types () -- MimeUnrender instances
-import Gargantext.Core.Types (ListId, NodeId)
+import Gargantext.Core.Text.Corpus.Query (RawQuery)
+import Gargantext.Core.Types (ListId, NodeId, NodeType, NodeTableResult)
 import Gargantext.Core.Types.Main (ListType)
 import Gargantext.Core.Types.Query (Limit, MaxSize, MinSize, Offset)
+import Gargantext.Database.Admin.Types.Hyperdata
+import Gargantext.Database.Query.Facet qualified as Facet
 import Gargantext.Prelude
 import Network.HTTP.Types qualified as H
 import Network.Wai.Handler.Warp (Port)
+import Servant ((:<|>)(..))
 import Servant.API.WebSocket qualified as WS
 import Servant.Auth.Client qualified as S
 import Servant.Client (ClientM)
 import Servant.Client.Core (RunClient, HasClient(..), Request)
 import Servant.Client.Generic (genericClient, AsClientT)
+import Servant.Job.Async

 instance RunClient m => HasClient m WS.WebSocketPending where
...
@@ -47,6 +54,9 @@ mkUrl :: Port -> Builder -> ByteString
 mkUrl _port urlPiece =
   "/api/" +| curApi |+ urlPiece

+gqlUrl :: ByteString
+gqlUrl = "/gql"
+
 -- | The client for the full API. It also serves as a \"proof\" that our
 -- whole API has all the required instances to be used in a client.
...
@@ -64,7 +74,32 @@ auth_api = clientRoutes & apiWithCustomErrorScheme
              & gargAuthAPI
              & authEp

-table_ngrams_get_api :: Token
+toServantToken :: Token -> S.Token
+toServantToken = S.Token . TE.encodeUtf8
+
+update_node :: Token -> NodeId -> UpdateNodeParams -> ClientM (JobStatus 'Safe JobLog)
+update_node (toServantToken -> token) nodeId params =
+  clientRoutes & apiWithCustomErrorScheme
+               & ($ GES_new)
+               & backendAPI
+               & backendAPI'
+               & mkBackEndAPI
+               & gargAPIVersion
+               & gargPrivateAPI
+               & mkPrivateAPI
+               & ($ token)
+               & nodeEp
+               & nodeEndpointAPI
+               & ($ nodeId)
+               & updateAPI
+               & updateNodeEp
+               & asyncJobsAPI'
+               & (\(_ :<|> submitForm :<|> _) -> submitForm (JobInput params Nothing))
+
+get_table_ngrams :: Token
                  -> NodeId
                  -> TabType
                  -> ListId
...
@@ -76,7 +111,7 @@ table_ngrams_get_api :: Token
                  -> Maybe OrderBy
                  -> Maybe Text
                  -> ClientM (VersionedWithCount NgramsTable)
-table_ngrams_get_api (toServantToken -> token) nodeId =
+get_table_ngrams (toServantToken -> token) nodeId =
   clientRoutes & apiWithCustomErrorScheme
                & ($ GES_new)
                & backendAPI
...
@@ -93,16 +128,13 @@ table_ngrams_get_api (toServantToken -> token) nodeId =
                & tableNgramsGetAPI
                & getNgramsTableEp

-toServantToken :: Token -> S.Token
-toServantToken = S.Token . TE.encodeUtf8
-
-table_ngrams_put_api :: Token
+put_table_ngrams :: Token
                      -> NodeId
                      -> TabType
                      -> ListId
                      -> Versioned NgramsTablePatch
                      -> ClientM (Versioned NgramsTablePatch)
-table_ngrams_put_api (toServantToken -> token) nodeId =
+put_table_ngrams (toServantToken -> token) nodeId =
   clientRoutes & apiWithCustomErrorScheme
                & ($ GES_new)
                & backendAPI
...
@@ -118,3 +150,51 @@ table_ngrams_put_api (toServantToken -> token) nodeId =
                & tableNgramsAPI
                & tableNgramsPutAPI
                & putNgramsTableEp
+
+get_table :: Token
+          -> NodeId
+          -> Maybe TabType
+          -> Maybe Limit
+          -> Maybe Offset
+          -> Maybe Facet.OrderBy
+          -> Maybe RawQuery
+          -> Maybe Text
+          -> ClientM (HashedResponse FacetTableResult)
+get_table (toServantToken -> token) nodeId =
+  clientRoutes & apiWithCustomErrorScheme
+               & ($ GES_new)
+               & backendAPI
+               & backendAPI'
+               & mkBackEndAPI
+               & gargAPIVersion
+               & gargPrivateAPI
+               & mkPrivateAPI
+               & ($ token)
+               & nodeEp
+               & nodeEndpointAPI
+               & ($ nodeId)
+               & tableAPI
+               & getTableEp
+
+get_children :: Token
+             -> NodeId
+             -> Maybe NodeType
+             -> Maybe Offset
+             -> Maybe Limit
+             -> ClientM (NodeTableResult HyperdataAny)
+get_children (toServantToken -> token) nodeId =
+  clientRoutes & apiWithCustomErrorScheme
+               & ($ GES_new)
+               & backendAPI
+               & backendAPI'
+               & mkBackEndAPI
+               & gargAPIVersion
+               & gargPrivateAPI
+               & mkPrivateAPI
+               & ($ token)
+               & nodeEp
+               & nodeEndpointAPI
+               & ($ nodeId)
+               & childrenAPI
+               & summaryChildrenEp
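Every helper in this file shares one idiom: the view pattern `(toServantToken -> token)` converts the plain API token to the servant-auth-client token right in the argument position, so the function body only ever sees the converted form. A standalone sketch with stand-in types (the real ones are `Token` and `S.Token`):

```haskell
{-# LANGUAGE ViewPatterns #-}
-- Illustrative types; in the real module the conversion is
-- `toServantToken = S.Token . TE.encodeUtf8`.
newtype ApiToken     = ApiToken String
newtype ServantToken = ServantToken String deriving (Eq, Show)

toServantTok :: ApiToken -> ServantToken
toServantTok (ApiToken t) = ServantToken t

-- Callers pass an ApiToken; the view pattern runs toServantTok at the
-- binding site, so the body works with the converted token directly.
authHeader :: ApiToken -> ServantToken
authHeader (toServantTok -> tok) = tok
```

This keeps the conversion out of every call site while still exposing the friendlier token type in the helper's signature.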
test/Test/API/UpdateList.hs  (view file @ 59c23118)

...
@@ -10,57 +10,68 @@ module Test.API.UpdateList (
   , newCorpusForUser
   , JobPollHandle(..)
   , pollUntilFinished
+  -- * Useful helpers
+  , updateNode
   ) where

 import Control.Lens (mapped, over)
 import Control.Monad.Fail (fail)
+import Data.Aeson qualified as JSON
 import Data.Aeson.QQ
-import Data.Map.Strict qualified as Map
 import Data.Map.Strict.Patch qualified as PM
+import Data.Map.Strict qualified as Map
 import Data.Set qualified as Set
 import Data.String (fromString)
-import Data.Text qualified as T
 import Data.Text.IO qualified as TIO
+import Data.Text qualified as T
 import Fmt
 import Gargantext.API.Admin.Auth.Types (Token)
 import Gargantext.API.Admin.Orchestrator.Types
 import Gargantext.API.Errors
-import Gargantext.API.Ngrams qualified as APINgrams
+import Gargantext.API.HashedResponse
 import Gargantext.API.Ngrams.List (ngramsListFromTSVData)
-import Gargantext.API.Ngrams.Types ( MSet(..), NgramsPatch(..), NgramsRepoElement(..), NgramsTablePatch(..), NgramsTerm(..), Versioned(..), mSetToList, toNgramsPatch, ne_children, ne_ngrams, vc_data, _NgramsTable )
+import Gargantext.API.Ngrams qualified as APINgrams
+import Gargantext.API.Ngrams.Types
 import Gargantext.API.Node.Corpus.New.Types qualified as FType
 import Gargantext.API.Node.Types
 import Gargantext.API.Routes.Named
 import Gargantext.API.Routes.Named.Corpus
+import Gargantext.API.Routes.Named.Node
 import Gargantext.API.Routes.Named.Private
 import Gargantext.Core qualified as Lang
+import Gargantext.Core.Text.Corpus.Query (RawQuery(..))
 import Gargantext.Core.Text.List.Social
 import Gargantext.Core.Text.Ngrams
-import Gargantext.Core.Types (CorpusId, ListId, ListType(..), NodeId, _NodeId)
+import Gargantext.Core.Types (CorpusId, ListId, NodeId, _NodeId)
 import Gargantext.Core.Types.Individu
+import Gargantext.Core.Types.Main (ListType(..))
+import Gargantext.Core.Types (TableResult(..))
 import Gargantext.Database.Action.User
 import Gargantext.Database.Admin.Types.Hyperdata.Corpus
+import Gargantext.Database.Admin.Types.Hyperdata.Folder (defaultHyperdataFolderPrivate)
+import Gargantext.Database.Query.Facet qualified as Facet
 import Gargantext.Database.Query.Table.Node
 import Gargantext.Database.Query.Tree.Root
 import Gargantext.Prelude hiding (get)
 import Network.Wai.Handler.Warp qualified as Wai
 import Paths_gargantext (getDataFileName)
+import qualified Prelude
 import Servant
 import Servant.Client
 import Servant.Job.Async
-import Test.API.Routes (mkUrl, table_ngrams_get_api, table_ngrams_put_api, toServantToken, clientRoutes)
+import Test.API.Routes (mkUrl, gqlUrl, get_table_ngrams, put_table_ngrams, toServantToken, clientRoutes, get_table, update_node)
 import Test.API.Setup (withTestDBAndPort, setupEnvironment, createAliceAndBob)
 import Test.Database.Types
 import Test.Hspec
+import Test.Hspec.Wai (shouldRespondWith)
 import Test.Hspec.Wai.Internal (withApplication, WaiSession)
 import Test.Hspec.Wai.JSON (json)
-import Test.Hspec.Wai (shouldRespondWith)
 import Test.Types (JobPollHandle(..))
 import Test.Utils (getJSON, pollUntilFinished, postJSONUrlEncoded, protectedJSON, withValidLogin)
+import Text.Printf (printf)
 import Web.FormUrlEncoded

 newCorpusForUser :: TestEnv -> T.Text -> IO NodeId
 newCorpusForUser env uname = flip runReaderT env $ runTestMonad $ do
   uid <- getUserId (UserName uname)
...
@@ -69,11 +80,25 @@ newCorpusForUser env uname = flip runReaderT env $ runTestMonad $ do
   (corpusId:_) <- mk (Just corpusName) (Nothing :: Maybe HyperdataCorpus) parentId uid
   pure corpusId

-uploadJSONList :: Wai.Port -> Token -> CorpusId -> WaiSession () ListId
-uploadJSONList port token cId = do
+-- | Generate a 'Node' where we can append more data into, a bit reminiscent to the
+-- \"Private\" root node we use in the real Gargantext.
+newPrivateFolderForUser :: TestEnv -> T.Text -> IO NodeId
+newPrivateFolderForUser env uname = flip runReaderT env $ runTestMonad $ do
+  uid <- getUserId (UserName uname)
+  parentId <- getRootId (UserName uname)
+  let nodeName = "NodeFolderPrivate"
+  (nodeId:_) <- mk (Just nodeName) (Just defaultHyperdataFolderPrivate) parentId uid
+  pure nodeId
+
+uploadJSONList :: Wai.Port -> Token -> CorpusId -> FilePath -> WaiSession () ListId
+uploadJSONList port token cId pathToNgrams = do
   ([listId] :: [NodeId]) <- protectedJSON token "POST" (mkUrl port ("/node/" <> build cId)) [aesonQQ|{"pn_typename":"NodeList","pn_name":"Testing"}|]
   -- Upload the JSON doc
-  simpleNgrams <- liftIO (TIO.readFile =<< getDataFileName "test-data/ngrams/simple.json")
+  simpleNgrams <- liftIO (TIO.readFile =<< getDataFileName pathToNgrams)
   let jsonFileFormData = [ (T.pack "_wjf_data", simpleNgrams)
                          , ("_wjf_filetype", "JSON")
                          , ("_wjf_name", "simple_ngrams.json")
...
@@ -86,22 +111,6 @@ uploadJSONList port token cId = do
   pure listId

--- uploadListPatch :: Wai.Port
---                 -> Token
---                 -> CorpusId
---                 -> ListId
---                 -> APINgrams.Version
---                 -> PM.PatchMap NgramsTerm NgramsPatch
---                 -> WaiSession () ()
--- uploadListPatch port token cId listId version patch = do
---   let js = JSON.toJSON (Versioned version $ NgramsTablePatch patch)
---   -- panicTrace $ "[uploadListPatch] js: " <> show js
---   -- APINgrams.tableNgramsPut Terms listId (Versioned 0 $ NgramsTablePatch $ fst patch)
---   (_res :: Versioned NgramsTablePatch) <- protectedJSON token "PUT" (mkUrl port ("/node/" <> build cId <> "/ngrams?ngramsType=Terms&list=" <> build listId)) js
---   -- panicTrace $ "[uploadListPatch] res: " <> show res
---   pure ()
-
 tests :: Spec
 tests = sequential $ aroundAll withTestDBAndPort $ do
   describe "UpdateList API" $ do
...
@@ -115,7 +124,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
       cId <- newCorpusForUser testEnv "alice"
       withApplication app $ do
         withValidLogin port "alice" (GargPassword "alice") $ \_clientEnv token -> do
-          listId <- uploadJSONList port token cId
+          listId <- uploadJSONList port token cId "test-data/ngrams/simple.json"
           -- Now check that we can retrieve the ngrams
           let getUrl = "/node/" +| listId |+ "/ngrams?ngramsType=Terms&listType=MapTerm&list=" +| listId |+ "&limit=50"
...
@@ -142,10 +151,10 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
           -- this is the new term, under which importedTerm will be grouped
           let newTerm = NgramsTerm "new abelian group"
-          listId <- uploadJSONList port token cId
+          listId <- uploadJSONList port token cId "test-data/ngrams/simple.json"
           let checkNgrams expected = do
-                eng <- liftIO $ runClientM (table_ngrams_get_api token cId APINgrams.Terms listId 10 Nothing (Just MapTerm) Nothing Nothing Nothing Nothing) clientEnv
+                eng <- liftIO $ runClientM (get_table_ngrams token cId APINgrams.Terms listId 10 Nothing (Just MapTerm) Nothing Nothing Nothing Nothing) clientEnv
                 case eng of
                   Left err -> fail (show err)
                   Right r ->
...
@@ -164,7 +173,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
                        , NgramsReplace { _patch_old = Nothing
                                        , _patch_new = Just nre } )
                        ]
-          _ <- liftIO $ runClientM (table_ngrams_put_api token cId APINgrams.Terms listId (Versioned 1 $ NgramsTablePatch $ fst patch)) clientEnv
+          _ <- liftIO $ runClientM (put_table_ngrams token cId APINgrams.Terms listId (Versioned 1 $ NgramsTablePatch $ fst patch)) clientEnv
           -- check that new term is added (with no parent)
           checkNgrams [ (newTerm, [])
...
@@ -175,7 +184,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
                        (newTerm
                        , toNgramsPatch [importedTerm] )
                        ]
-          _ <- liftIO $ runClientM (table_ngrams_put_api token cId APINgrams.Terms listId (Versioned 32 $ NgramsTablePatch $ fst patchChildren)) clientEnv
+          _ <- liftIO $ runClientM (put_table_ngrams token cId APINgrams.Terms listId (Versioned 32 $ NgramsTablePatch $ fst patchChildren)) clientEnv
           -- check that new term is parent of old one
           checkNgrams [ (newTerm, [importedTerm]) ]
...
@@ -183,7 +192,7 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
           -- finally, upload the list again, the group should be as
           -- it was before (the bug in #313 was that "abelian group"
           -- was created again as a term with no parent)
-          _ <- uploadJSONList port token cId
+          _ <- uploadJSONList port token cId "test-data/ngrams/simple.json"
           -- old (imported) term shouldn't become parentless
           -- (#313 error was that we had [newTerm, importedTerm] instead)
...
@@ -248,20 +257,107 @@ tests = sequential $ aroundAll withTestDBAndPort $ do
     describe "POST /api/v1.0/corpus/:id/add/form/async (JSON)" $ do
-      it "allows uploading a JSON docs file" $ \((testEnv, port), app) -> do
-        cId <- newCorpusForUser testEnv "alice"
+      it "allows uploading a JSON docs file" $ \((testEnv, port), app) ->
         withApplication app $ do
           withValidLogin port "alice" (GargPassword "alice") $ \clientEnv token -> do
+            void $ createFortranDocsList testEnv port clientEnv token
+
+      it "doesn't use trashed documents for score calculation (#385)" $ \((testEnv, port), app) -> do
+        withApplication app $ do
+          withValidLogin port "alice" (GargPassword "alice") $ \clientEnv token -> do
+            corpusId <- createFortranDocsList testEnv port clientEnv token
+            tr1 <- liftIO $ do
+              (HashedResponse _ tr1) <- checkEither $ runClientM (get_table token corpusId (Just APINgrams.Docs) (Just 10) (Just 0) (Just Facet.DateDesc) (Just $ RawQuery "fortran") Nothing) clientEnv
+              length (tr_docs tr1) `shouldBe` 2
+              pure tr1
+            termsNodeId <- uploadJSONList port token corpusId "test-data/ngrams/GarganText_NgramsTerms-nodeId-177.json"
+            liftIO $ do
+              -- Now let's check the score for the \"fortran\" ngram.
+              (VersionedWithCount _ _ (NgramsTable [fortran_ngram])) <- checkEither $ runClientM (get_table_ngrams token corpusId APINgrams.Terms termsNodeId 10 (Just 0) (Just MapTerm) Nothing Nothing Nothing Nothing) clientEnv
+              length (_ne_occurrences fortran_ngram) `shouldBe` 2
+            -- At this point, we need to trash one of the two documents which contains
+            -- the \"fortran\" occurrence, and this should be reflected in the Ngrams.
+            trash_document token (Facet.facetDoc_id $ Prelude.head (tr_docs tr1)) corpusId
+            -- Check that the number of returned documents has decreased
+            liftIO $ do
+              (HashedResponse _ tr2) <- checkEither $ runClientM (get_table token corpusId (Just APINgrams.Docs) (Just 10) (Just 0) (Just Facet.DateDesc) (Just $ RawQuery "fortran") Nothing) clientEnv
+              length (tr_docs tr2) `shouldBe` 1
+            liftIO $ do
+              -- Now let's check the score for the \"fortran\" ngram. It must be decreased
+              -- by 1, because one of the matching documents have been trashed.
+              (VersionedWithCount _ _ (NgramsTable [fortran_ngram'])) <- checkEither $ runClientM (get_table_ngrams token corpusId APINgrams.Terms termsNodeId 10 (Just 0) (Just MapTerm) Nothing Nothing Nothing Nothing) clientEnv
+              length (_ne_occurrences fortran_ngram') `shouldBe` 1
+
+createFortranDocsList :: TestEnv -> Int -> ClientEnv -> Token -> WaiSession () CorpusId
+createFortranDocsList testEnv port clientEnv token = do
+  folderId <- liftIO $ newPrivateFolderForUser testEnv "alice"
+  ([corpusId] :: [NodeId]) <- protectedJSON token "POST" (mkUrl port ("/node/" <> build folderId)) [aesonQQ|{"pn_typename":"NodeCorpus","pn_name":"TestCorpus"}|]
+  -- Import the docsList with only two documents, both containing a \"fortran\" term.
-- Import the docsList with only two documents, both containing a \"fortran\" term.
([
corpusId
]
::
[
CorpusId
])
<-
protectedJSON
token
"POST"
(
mkUrl
port
(
"/node/"
<>
build
cId
))
[
aesonQQ
|
{"pn_typename":"NodeCorpus","pn_name":"Testing"}
|]
simpleDocs
<-
liftIO
(
TIO
.
readFile
=<<
getDataFileName
"test-data/ngrams/GarganText_DocsList-nodeId-177.json"
)
simpleDocs
<-
liftIO
(
TIO
.
readFile
=<<
getDataFileName
"test-data/ngrams/GarganText_DocsList-nodeId-177.json"
)
let
newWithForm
=
mkNewWithForm
simpleDocs
"GarganText_DocsList-nodeId-177.json"
let
newWithForm
=
mkNewWithForm
simpleDocs
"GarganText_DocsList-nodeId-177.json"
(
j
::
JobPollHandle
)
<-
checkEither
$
fmap
toJobPollHandle
<$>
liftIO
(
runClientM
(
add_file_async
token
corpusId
newWithForm
)
clientEnv
)
(
j
::
JobPollHandle
)
<-
checkEither
$
fmap
toJobPollHandle
<$>
liftIO
(
runClientM
(
add_file_async
token
corpusId
newWithForm
)
clientEnv
)
let
mkPollUrl
jh
=
"/corpus/"
<>
fromString
(
show
$
_NodeId
corpusId
)
<>
"/add/form/async/"
+|
_jph_id
jh
|+
"/poll?limit=1"
let
mkPollUrl
jh
=
"/corpus/"
<>
fromString
(
show
$
_NodeId
corpusId
)
<>
"/add/form/async/"
+|
_jph_id
jh
|+
"/poll?limit=1"
j'
<-
pollUntilFinished
token
port
mkPollUrl
j
j'
<-
pollUntilFinished
token
port
mkPollUrl
j
liftIO
(
_jph_status
j'
`
shouldBe
`
"IsFinished"
)
liftIO
(
_jph_status
j'
`
shouldBe
`
"IsFinished"
)
pure
corpusId
updateNode
::
Int
->
ClientEnv
->
Token
->
NodeId
->
WaiSession
()
()
updateNode
port
clientEnv
token
nodeId
=
do
let
params
=
UpdateNodeParamsTexts
Both
(
j
::
JobPollHandle
)
<-
checkEither
$
fmap
toJobPollHandle
<$>
liftIO
(
runClientM
(
update_node
token
nodeId
params
)
clientEnv
)
let
mkPollUrl
jh
=
"/node/"
<>
fromString
(
show
$
_NodeId
nodeId
)
<>
"/update/"
+|
_jph_id
jh
|+
"/poll?limit=1"
j'
<-
pollUntilFinished
token
port
mkPollUrl
j
liftIO
(
_jph_status
j'
`
shouldBe
`
"IsFinished"
)
toJobPollHandle
::
JobStatus
'S
a
fe
JobLog
->
JobPollHandle
toJobPollHandle
::
JobStatus
'S
a
fe
JobLog
->
JobPollHandle
toJobPollHandle
=
either
(
\
x
->
panicTrace
$
"toJobPollHandle:"
<>
T
.
pack
x
)
identity
.
JSON
.
eitherDecode
.
JSON
.
encode
toJobPollHandle
=
either
(
\
x
->
panicTrace
$
"toJobPollHandle:"
<>
T
.
pack
x
)
identity
.
JSON
.
eitherDecode
.
JSON
.
encode
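Both `createFortranDocsList` and `updateNode` above build their polling URL with the fmt package's `+| … |+` interpolation brackets. As an illustrative-only sketch (not part of the diff), the same URL can be assembled with plain `String` concatenation; the numeric ids here merely stand in for the real `NodeId` and job-handle id:

```haskell
-- Hypothetical helper mirroring the shape of 'mkPollUrl' without the fmt package.
mkPollUrlSketch :: Int -> Int -> String
mkPollUrlSketch corpusId jobId =
  "/corpus/" <> show corpusId <> "/add/form/async/" <> show jobId <> "/poll?limit=1"
```

The fmt brackets buy the same thing with less `show` noise once the interpolated values have `Buildable` instances.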
...
@@ -298,3 +394,29 @@ add_file_async (toServantToken -> token) corpusId nwf =
     & ($ corpusId)
     & asyncJobsAPI'
     & (\(_ :<|> submitForm :<|> _) -> submitForm (JobInput nwf Nothing))
+
+-- | Utility to trash a document by performing a raw query towards GQL. Not very type safe,
+-- but it will get the job done for now.
+trash_document :: Token
+               -> NodeId
+               -- ^ The context id to delete, i.e. the document ID.
+               -> CorpusId
+               -- ^ The parent corpus ID this document is attached to.
+               -> WaiSession () ()
+trash_document token docId cpsId =
+  void $ protectedJSON @JSON.Value token "POST" gqlUrl [aesonQQ|
+      {
+        "query": #{operation},
+        "operationName": "update_node_context_category",
+        "variables": {}
+      }
+  |]
+  where
+    operation :: Prelude.String
+    operation = printf "mutation update_node_context_category { update_node_context_category(context_id: %d, node_id: %d, category: 0) }" contextId corpusId
+
+    contextId :: Int
+    contextId = _NodeId docId
+
+    corpusId :: Int
+    corpusId = _NodeId cpsId
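The interesting part of `trash_document` is that the GraphQL mutation is just a `printf`-formatted string spliced into the JSON payload (category 0 marks the document as trashed). A minimal standalone sketch of that string construction, with a hypothetical helper name:

```haskell
import Text.Printf (printf)

-- Builds the raw GraphQL mutation the way 'trash_document' does: splice the
-- context (document) id and corpus id into the query text via printf.
mkTrashMutation :: Int -> Int -> String
mkTrashMutation contextId corpusId =
  printf "mutation update_node_context_category { update_node_context_category(context_id: %d, node_id: %d, category: 0) }" contextId corpusId
```

As the comment in the diff admits, this is not type safe: nothing checks the mutation against the GQL schema until the server parses it.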
test/Test/Utils.hs
...
@@ -7,33 +7,38 @@ module Test.Utils where

 import Control.Exception.Safe ()
 import Control.Monad ()
+import Data.Aeson qualified as JSON
 import Data.Aeson.KeyMap qualified as KM
-import Data.Aeson qualified as JSON
 import Data.ByteString.Char8 qualified as B
 import Data.ByteString.Lazy qualified as L
 import Data.Map.Strict qualified as Map
+import Data.Text qualified as T
 import Data.Text.Encoding qualified as TE
+import Data.Text.Lazy qualified as TL
 import Data.Text.Lazy.Encoding qualified as TLE
-import Data.Text.Lazy qualified as TL
-import Data.Text qualified as T
 import Data.TreeDiff
 import Fmt (Builder)
-import Gargantext.API.Routes.Types (xGargErrorScheme)
 import Gargantext.API.Admin.Auth.Types (AuthRequest(..), Token, authRes_token)
+import Gargantext.API.Admin.Orchestrator.Types
+import Gargantext.API.Routes.Types (xGargErrorScheme)
 import Gargantext.Core.Types.Individu (Username, GargPassword)
 import Gargantext.Prelude
 import Network.HTTP.Client (defaultManagerSettings, newManager)
-import Network.HTTP.Types (Header, Method, status200)
+import Network.HTTP.Client qualified as HTTP
 import Network.HTTP.Types.Header (hAccept, hAuthorization, hContentType)
+import Network.HTTP.Types (Header, Method, status200)
 import Network.Wai.Handler.Warp (Port)
 import Network.Wai.Test (SResponse(..))
 import Prelude qualified
-import Servant.Client (ClientEnv, baseUrlPort, mkClientEnv, parseBaseUrl, runClientM)
+import Servant.Client (ClientEnv, baseUrlPort, mkClientEnv, parseBaseUrl, runClientM, makeClientRequest, defaultMakeClientRequest)
+import Servant.Client.Core (BaseUrl)
+import Servant.Client.Core.Request qualified as Client
+import System.Environment (lookupEnv)
 import System.Timeout qualified as Timeout
 import Test.API.Routes (auth_api, mkUrl)
 import Test.Hspec.Expectations
+import Test.Hspec.Wai (MatchBody(..), WaiExpectation, WaiSession, request)
 import Test.Hspec.Wai.JSON (FromValue(..))
-import Test.Hspec.Wai (MatchBody(..), WaiExpectation, WaiSession, request)
 import Test.Hspec.Wai.Matcher (MatchHeader(..), ResponseMatcher(..), bodyEquals, formatHeader, match)
 import Test.Tasty.HUnit (Assertion, assertBool)
 import Test.Types
...
@@ -183,12 +188,27 @@ withValidLogin port ur pwd act = do
   case result of
     Left err  -> liftIO $ throwIO $ Prelude.userError (show err)
     Right res -> do
+      traceEnabled <- isJust <$> liftIO (lookupEnv "GARG_DEBUG_LOGS")
       let token = res ^. authRes_token
-      act clientEnv0 token
+      act (clientEnv0 { makeClientRequest = gargMkRequest traceEnabled }) token
+
+-- | Allows to enable/disable logging of the input 'Request' to check what the
+-- client is actually sending to the server.
+-- FIXME(adn) We cannot upgrade to servant-client 0.20 due to OpenAlex:
+-- https://gitlab.iscpif.fr/gargantext/crawlers/openalex/blob/main/src/OpenAlex/ServantClientLogging.hs#L24
+gargMkRequest :: Bool -> BaseUrl -> Client.Request -> HTTP.Request
+gargMkRequest traceEnabled bu clientRq =
+  let httpReq = defaultMakeClientRequest bu clientRq
+  in case traceEnabled of
+       True  -> traceShowId httpReq
+       False -> httpReq

 -- | Poll the given job URL every second until it finishes.
 -- Retries up to 60 times (i.e. for 1 minute, before giving up)
+-- /NOTE(adn)/: Check the content of the \"events\" logs as a stopgap
+-- measure for #390.
 pollUntilFinished :: HasCallStack
                   => Token
                   -> Port
...
@@ -208,7 +228,15 @@ pollUntilFinished tkn port mkUrlPiece = go 60
       | _jph_status h == "IsFailure"
       -> panicTrace $ "JobPollHandle contains a failure: " <> TE.decodeUtf8 (L.toStrict $ JSON.encode h)
       | otherwise
-      -> pure h
+      -> case any hasError (_jph_log h) of
+           True  -> panicTrace $ "JobPollHandle contains a failure: " <> TE.decodeUtf8 (L.toStrict $ JSON.encode h)
+           False -> pure h
+
+-- FIXME(adn) This is wrong, errs should be >= 1.
+hasError :: JobLog -> Bool
+hasError JobLog{..} = case _scst_failed of
+  Nothing   -> False
+  Just errs -> errs > 1

 -- | Like HUnit's '@?=', but with a nicer error message in case the two entities are not equal.
 (@??=) :: (HasCallStack, ToExpr a, Eq a) => a -> a -> Assertion
...
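The `hasError` predicate added above is small enough to sketch in isolation. Here `JobLog`'s `_scst_failed` field is stood in by a plain `Maybe Int`; note that, as the FIXME in the diff says, `errs > 1` lets a single failure slip through, so the intended check is arguably `errs >= 1`:

```haskell
-- Pure stand-in for the new 'hasError' check in pollUntilFinished:
-- a job log counts as failed only when the failure counter exceeds 1
-- (which is the bug the FIXME flags).
hasErrorSketch :: Maybe Int -> Bool
hasErrorSketch mFailed = case mFailed of
  Nothing   -> False
  Just errs -> errs > 1
```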
test/Test/Utils/Jobs.hs
...
@@ -367,6 +367,7 @@ testMarkProgress = do
   myEnv <- newTestEnv
   -- evts <- newTBQueueIO 7
   evts <- newTVarIO []
+  let expectedEvents = 7
   let getStatus hdl = do
         liftIO $ threadDelay 100_000
         st <- getLatestJobStatus hdl
...
@@ -375,15 +376,21 @@ testMarkProgress = do
       readAllEvents = do
        -- We will get thread blocking if there is ANY error in the job
        -- Hence we assert the `readAllEvents` test doesn't take too long
-        mRet <- timeout 1_000_000 $ atomically $ do
+        mRet <- timeout 5_000_000 $ atomically $ do
          -- allEventsArrived <- isFullTBQueue evts
          evts' <- readTVar evts
          -- STM retry if things failed
          -- check allEventsArrived
-          check (length evts' == 7)
+          check (length evts' == expectedEvents)
          -- flushTBQueue evts
-          return evts'
-        return $ fromMaybe [] mRet
+          pure evts'
+        case mRet of
+          Nothing -> Prelude.fail $ "testMarkProgress: timeout exceeded, but didn't receive all 7 required events."
+          Just xs
+            | length xs == expectedEvents -> pure xs
+            | otherwise -> Prelude.fail $ "testMarkProgress: received some events, but they were not of the expected number (" <> show expectedEvents <> "): " <> show xs

   withJob_ myEnv $ \hdl _input -> do
     markStarted 10 hdl
...
@@ -410,6 +417,8 @@ testMarkProgress = do
     getStatus hdl
     evts' <- readAllEvents
+    -- This pattern match should never fail, because the precondition is
+    -- checked in 'readAllEvents'.
     let [jl0, jl1, jl2, jl3, jl4, jl5, jl6] = evts'
     -- Check the events are what we expect
...
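The core of the fixed `readAllEvents` is a bounded STM wait: `check` makes the transaction retry until the expected number of events has arrived, while `timeout` guarantees the suite cannot hang forever. A self-contained sketch of just that pattern, with illustrative names rather than the suite's own:

```haskell
import Control.Concurrent.STM
import System.Timeout (timeout)

-- Block inside STM until the TVar holds the expected number of events
-- ('check' retries the transaction otherwise), but give up after 5 seconds
-- of wall-clock time, returning Nothing instead of deadlocking the test.
waitForEvents :: Int -> TVar [a] -> IO (Maybe [a])
waitForEvents expected evts =
  timeout 5000000 $ atomically $ do
    xs <- readTVar evts
    check (length xs == expected)
    pure xs
```

Pattern-matching the result (`Nothing` → timeout, `Just xs` → all events arrived) is what lets the test report a precise failure instead of blocking.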